ShaderReflection; stripping information from an Effect

Started by
27 comments, last by MJP 11 years, 6 months ago
Hey guys... I'm rewriting our Shader/Material framework for our engine and want to know more about "ShaderReflection" and how to reflect shaders and strip information from them... for example, I want to know:


  • Each and every Technique and its name
  • Each and every Pass within each Technique and its name
  • Name and semantic of every global variable
  • [Very important] the type of each effect variable
  • The number and type of all variable members (if any, as in a structure)
  • Information about all cbuffers


As of right now I've been parsing the "Description" member of Effects, EffectTechniques and EffectPasses to learn most of this information from an Effect instance. However, it seems there is no way to get the actual type, nor figure out how many members (and type of members) a variable contains. I've never used the ShaderReflection type/API before, so I'm totally ignorant of how it works and how to use it... and can't seem to find the right information on Google (possibly due to my ignorance of what search queries to use lol).

Any help on this is greatly appreciated!

Regards,

--ATC--


EDIT: I'm also curious about the best way to bind variable values to an Effect (e.g., the fastest and most efficient way). And how might I need to handle the use of multiple materials based on the same shader?
_______________________________________________________________________________
CEO & Lead Developer at ATCWARE™
"Project X-1"; a 100% managed, platform-agnostic game & simulation engine

Please visit our new forums and help us test them and break the ice!
___________________________________________________________________________________
Remember that the Effects framework has been deprecated since the release of the latest Windows SDK, so you might want to think about whether you want to keep on using it. The Effects framework also isn't actually part of DirectX but is built on top of it, so as far as I know the reflection API doesn't provide any info about effects and techniques, only about the underlying shaders themselves.

Everything you might need from the reflection API can be found here
From this interface you can basically get all the data and details you need from your shaders.

Also try not to think in terms of setting individual variables when working with shaders in D3D10/11, but rather in terms of working with constant buffers. The reflection API will give you complete data of where each of your shader variables can be found in constant buffers and what size they are, so that's a good starting point for binding data to your shaders.
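A minimal sketch of what that enumeration looks like, shown here in C# with SlimDX's ShaderReflection wrapper (it mirrors the native ID3D11ShaderReflection interface; the file name and entry point are placeholders, and the exact member names are from memory, so treat them as approximate):

```csharp
using System;
using SlimDX.D3DCompiler;

// Compile a shader and walk its constant buffers via reflection.
// "shader.hlsl" and "mainPS" are placeholder names.
var bytecode = ShaderBytecode.CompileFromFile("shader.hlsl", "mainPS", "ps_5_0",
                                              ShaderFlags.None, EffectFlags.None);
using (var reflection = new ShaderReflection(bytecode))
{
    for (int i = 0; i < reflection.Description.ConstantBuffers; i++)
    {
        var cb = reflection.GetConstantBuffer(i);
        Console.WriteLine("cbuffer {0}: {1} bytes",
                          cb.Description.Name, cb.Description.Size);

        for (int j = 0; j < cb.Description.Variables; j++)
        {
            var variable = cb.GetVariable(j);
            var type = variable.GetVariableType();
            Console.WriteLine("  {0} ({1}) at offset {2}, {3} bytes, {4} members",
                              variable.Description.Name,
                              type.Description.Type,
                              variable.Description.StartOffset,
                              variable.Description.Size,
                              type.Description.Members);
        }
    }
}
```

The start offset and size of each variable are exactly what you need to write raw bytes into a constant buffer yourself.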

I don't really understand your problem of using multiple materials with a shader, you just use the same shader instances for your material but pass in different values depending on what your material requires.

I gets all your texture budgets!


[quote]
Remember that the Effects framework has been deprecated since the release of the latest Windows SDK, so you might want to think about whether you want to keep on using it. The effects framework also isn't actually part of DirectX, but is built on top of it, so as far as I know the reflection API doesn't provide any info about effects and techniques but only about the underlying shaders themselves.

Everything you might need from the reflection API can be found here
From this interface you can basically get all the data and details you need from your shaders.
[/quote]


Hmm, I didn't know that about the Effects Framework being deprecated. As of now I'm only using the actual Effect interface in D3D10 and 11... I haven't really fooled with the FXF, so that's why I was here asking! :-)

I will have to read your link to get on the same page with you.


[quote]
Also try not to think in terms of setting individual variables when working with shaders in D3D10/11, but rather in terms of working with constant buffers. The reflection API will give you complete data of where each of your shader variables can be found in constant buffers and what size they are, so that's a good starting point for binding data to your shaders.
[/quote]


Correct. What I'm trying to do is basically create a strongly-typed list of variables for my "Shader" class so that user code can easily bind variable values. But behind the scenes my code will actually be writing the bytes to a buffer and binding it to the effect's cbuffer(s). I'm not finished, but this is what I'm writing right now. If I understand correctly, this is faster than using the fx.GetVariableByXXX().AsYYY().SetValue(something) method, no?
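For reference, the behind-the-scenes part can be roughly this simple in SlimDX D3D11 - a sketch only, since the exact MapSubresource overloads vary between SlimDX releases, and `size`/`variableOffset` stand in for values your reflection step gathered:

```csharp
using SlimDX;
using SlimDX.Direct3D11;
using Buffer = SlimDX.Direct3D11.Buffer;

// One dynamic buffer per cbuffer; 'size' is the reflected cbuffer size.
var desc = new BufferDescription(size, ResourceUsage.Dynamic,
                                 BindFlags.ConstantBuffer, CpuAccessFlags.Write,
                                 ResourceOptionFlags.None, 0);
var cbuffer = new Buffer(device, desc);

// Write a variable's bytes at the offset reflection reported for it:
DataBox box = context.MapSubresource(cbuffer, MapMode.WriteDiscard, MapFlags.None);
box.Data.Position = variableOffset;   // ShaderVariableDescription.StartOffset
box.Data.Write(worldViewProj);        // e.g. a Matrix
context.UnmapSubresource(cbuffer, 0);

// Bind the buffer to the stage/slot reflection reported:
context.VertexShader.SetConstantBuffer(cbuffer, 0);
```

One map/unmap per cbuffer per frame (writing all dirty variables in one pass) is generally cheaper than setting variables one at a time through the effect interfaces.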


[quote]
I don't really understand your problem of using multiple materials with a shader, you just use the same shader instances for your material but pass in different values depending on what your material requires.
[/quote]


Disregard... dunno what exactly I was trying to say but I figured that out already lol

[quote]
Hmm, I didn't know that about the Effects Framework being deprecated. As of now I'm only using the actual Effect interface in D3D10 and 11... I haven't really fooled with the FXF, so that's why I was here asking! :-)
[/quote]


I mostly try to encourage people who are somewhat more comfortable with D3D to move away from the Effects library in favor of building a system which suits their specific requirements, using the core D3D library and the shader interfaces it provides. You'll get a greater understanding of how D3D manages shaders and how you can use the base shader features to your advantage.
In some setups the concept of techniques and passes being tied directly to shaders also doesn't really make sense, and in that case it's best to just design something that fits your own architecture. I don't know whether this applies to you, but it's something to think about.


[quote]
If I understand correctly this is faster than using the fx.GetVariableByXXX().AsYYY().SetValue(something) method, no?
[/quote]


It's been quite a while since I've used the effects library, so I can't really give you any performance statistics or tell you whether it would perform better or not.


So what you're saying is I need to compile my shaders directly to "ShaderBytecode" rather than using an "Effect" instance? And this will give me all the freedom and flexibility I require? I have tended to always use SlimDX's "Effect" classes for D3D10/11 in the past. If making the switch is what I need to do then I shall do it. This shader/material system has been rather weak and unimpressive to me for months, and it's about time I do something with it.

[quote]
So what you're saying is I need to compile my shaders directly to "ShaderBytecode" rather than using an "Effect" instance? And this will give me all the freedom and flexibility I require? I have tended to always use SlimDX's "Effect" classes for D3D10/11 in the past. If making the switch is what I need to do then I shall do it. This shader/material system has been rather weak and unimpressive to me for months, and it's about time I do something with it.
[/quote]


Ok, I now realize that you're talking about SlimDX and not native DirectX, my bad :D
The link I provided was for native D3D, but I'm sure SlimDX provides the same functionality. I can't really help you out there, though, as I've never used SlimDX myself.
I think the ShaderBytecode class is exactly what you're looking for, but as I said, I couldn't say for sure.


Yes, SlimDX is almost a 1:1 wrapper... the style/names can be a bit different but the functionality is the same! :)

Ok... could you explain to me the essence of how I need to handle a shader... Let's say I do this:

[source lang="csharp"]
/* Compile the effect to bytecode: */
var src = File.ReadAllText(path);
var byteCode = ShaderBytecode.Compile(src, profile.GetProfileString());
[/source]

Now I have the "bytecode" of the shader... What do I do next? Am I NOT to use the "Effect" classes at all? I've been wanting to support shader fragments where I can dynamically pair up free-standing pixel and vertex (and other) shaders. And I don't see how it could be possible with the "Effect" class. But my knowledge of this approach to things is virtually zero...

Not asking you to hold my hand and write my code for me, but if you could explain the stages of how you use the "bytecode" to get the information I need and render with shaders this way (as opposed to the Effect class), that would be great. I keep searching for it on Google but cannot find it lol
I see what's going on here... I've been using SlimDX since I started working in D3D10 and D3D11, and it appears the Effects Framework is built right into SlimDX. Since Direct3D 9 I haven't learned any other way, so I'm not sure now what's part of the FXF and what's not... Now I'm realizing how many lower-level and potentially powerful things can be done, and it really blows a hole in my design (and my mind)... So now I have to scrap everything, it seems, and start anew... How frustrating... Not even sure how to proceed lol...
Scrapping things entirely may not be necessary. If you're aiming to make "Metro"/Windows Store apps, you will need to forgo the use of the effects API, as the D3DCompiler is one of the "banned APIs". Of course, the only managed D3D wrapper that supports Win8 is SharpDX, so it's a moot point for SlimDX.

And even then, you can still use the FX file format - the difference is you'd have an offline compiler tool that parses the format for techniques/passes in order to acquire the necessary shader profiles + entry points. With the profile/entry point in hand, you can run the FX file through the D3DCompiler to get a ShaderBytecode object for each shader, then use that to create a reflection object to query all your metadata. Then write the reflected metadata out to a file that gets consumed by your application at runtime - which would be your own implementation of Effects11 (or something completely different; either way, you use the metadata to automatically set up your constant buffers, bind resources, and manage the shader pipeline by directly using the Direct3D11 shader interfaces).
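A rough sketch of that offline step in C#/SlimDX (the .fx filename and the entry-point/profile pairs below are placeholders for what your parser would produce):

```csharp
using System.Collections.Generic;
using System.IO;
using SlimDX.D3DCompiler;

string fxSource = File.ReadAllText("myEffect.fx");

// Entry points + profiles pulled out of the technique/pass blocks by your parser:
var entryPoints = new Dictionary<string, string>
{
    { "mainVS", "vs_5_0" },
    { "mainPS", "ps_5_0" },
};

foreach (var entry in entryPoints)
{
    // Compile each entry point individually with its stage profile:
    var bytecode = ShaderBytecode.Compile(fxSource, entry.Key, entry.Value,
                                          ShaderFlags.None, EffectFlags.None);

    using (var reflection = new ShaderReflection(bytecode))
    {
        // Query cbuffers, resource bindings, input signature, etc. here,
        // then write the bytecode + metadata out to your own binary format.
    }
}
```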

For parsing, you can use something like Tiny Parser Generator, which is a really neat little tool if you know how to create a grammar. This is a problem that I've been working on myself for my own software and the approach has been working out pretty well (and frankly, I love TinyPG). I also believe the MonoGame folks have adopted this method, so that may be a good place to gather some ideas.

FYI, to answer your original question about how to use an effect with shader reflection - you can get at the shader bytecode via the effect pass (through the shader descriptions, which are queried from the effect shader variable the pass contains), and use that to create a shader reflection object. Even if you were using the effects framework, that's still a useful thing to do for other reasons, like constructing input layouts up front. From that reflection object, you're able to query for other things, like constant buffers, and get information about shader variables and their types (class, size, members, etc). But of course, you don't really need to do that, as most of that information is readily available in the effect descriptions anyway.
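For reference, the "input layouts up front" part is a common pattern in the SlimDX samples when you're still on the effects framework; the vertex element list below is just an example:

```csharp
using SlimDX.Direct3D11;
using SlimDX.DXGI;

var technique = effect.GetTechniqueByIndex(0);
var pass = technique.GetPassByIndex(0);

// The pass description exposes the vertex shader's input signature,
// which is exactly what InputLayout needs:
var elements = new[]
{
    new InputElement("POSITION", 0, Format.R32G32B32_Float, 0, 0),
    new InputElement("NORMAL",   0, Format.R32G32B32_Float, 12, 0),
};
var layout = new InputLayout(device, pass.Description.Signature, elements);
```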
Ah, so I'm going to need to write a parser to parse HLSL code and read all of the shader functions in then compile each one to bytecode separately? Geez, this sounds like a mountain of work... This sets our production schedule "hopes" back dramatically lol...

But now if I understand correctly the whole concept of "Techniques" and "Passes" is just an abstraction for selecting different bits of shader bytecode to bind to the device? And that would also mean I'm going to have to parse the technique/pass definitions and implement a whole new technique and pass system in the engine to use it? If that's the case I can no longer use EffectPass.Apply() anymore either... thus I'm going to need to... I dunno... bind shader bytecode individually to the device interface's "PixelShader", "VertexShader", and "GeometryShader" (and the other stages for D3D11, of course) objects? I've never even seen anyone do this before, hmmm... However, if I understand correctly doing all this work will be worth it because I will have a very powerful/flexible system where I can dynamically piece together complete effects from fragments on-the-fly and (theoretically) generate tons of effect permutations from fragments?
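For what it's worth, binding shaders to the individual stages really is that direct. A bare-bones SlimDX D3D11 sketch, with hypothetical entry-point names and no effects framework involved:

```csharp
using SlimDX.D3DCompiler;
using SlimDX.Direct3D11;

// Compile each stage's entry point with a stage profile (not fx_5_0):
var vsBytecode = ShaderBytecode.Compile(src, "mainVS", "vs_5_0",
                                        ShaderFlags.None, EffectFlags.None);
var psBytecode = ShaderBytecode.Compile(src, "mainPS", "ps_5_0",
                                        ShaderFlags.None, EffectFlags.None);

var vertexShader = new VertexShader(device, vsBytecode);
var pixelShader  = new PixelShader(device, psBytecode);

// At draw time, this is what replaces EffectPass.Apply():
context.VertexShader.Set(vertexShader);
context.PixelShader.Set(pixelShader);
// ...bind constant buffers, samplers, and shader resources per stage...
context.Draw(vertexCount, 0);
```

A "technique" then reduces to a named bundle of shader objects plus render state, and a "pass" to one such bundle being bound before a draw.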

There's something I'm not getting here though... Suppose I do something like this:

[source lang="csharp"]
void processShader(string path)
{
    string fxSrc = File.ReadAllText(path);

    // Compile the "mainVS" entry point as a standalone vertex shader;
    // note the vs_5_0 profile (fx_5_0 is only for compiling whole effects):
    var byteCode = ShaderBytecode.Compile(fxSrc, "mainVS", "vs_5_0",
                                          ShaderFlags.None, EffectFlags.None);
}
[/source]

Ok... well now I have the bytecode of the vertex shader "mainVS" from my .fx file. I could do the same thing to get the pixel and/or geometry shaders... But what about the rest of the fx file? All my global variables and constant buffers? How/when do they get compiled and how do I put all this back together? That's what I'm not getting... and how can I wrap all of this up cleanly to make working with shaders and materials easy for users of the engine (one of which will be myself and my team)?

EDIT:

It would be great if someone could hash out a light-weight pseudo-code example of how this is supposed to work and what I need to be doing with all of this... I'm still searching all over Google and finding a couple bits of decent info here/there but nothing to answer the mounting number of questions... :P

Also... this issue of Windows 8 Metro compatibility... I was already thinking that the best route to go would be to write a utility for my engine that imports effects (e.g., a .fx file) and pre-compiles and gets all the info the engine needs. Then it would create a file of our own custom format that contains all the meta-data we want and packs in the compiled bytecode. The engine would just read the file at run-time and construct an "Effect" instance (I mean our own Effect class, not from Effects Framework) from it... does my thinking seem on the right track here? I was already planning to do this because compiling lots of shaders for any decently complex game can be like watching paint dry...
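A toy sketch of the writer side of such a tool (the file layout here is entirely made up; design it around whatever metadata your reflection step actually gathers, and `vsBytecodeBytes` stands in for the compiled blob):

```csharp
using System.IO;

// Write one compiled shader plus minimal metadata to a custom binary file.
using (var writer = new BinaryWriter(File.Create("myEffect.fxo")))
{
    writer.Write("mainVS");          // entry point (hypothetical)
    writer.Write("vs_5_0");          // profile
    byte[] blob = vsBytecodeBytes;   // raw bytes of the compiled bytecode
    writer.Write(blob.Length);
    writer.Write(blob);
    // ...then cbuffer layouts, variable names/offsets/types, techniques, etc.
}
```

At runtime the engine just reads this back with a BinaryReader and hands the blob straight to the shader constructors, skipping compilation entirely.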

This topic is closed to new replies.
