
ShaderReflection; stripping information from an Effect


28 replies to this topic

#1 ATC   Members   -  Reputation: 551


Posted 10 October 2012 - 03:30 PM

Hey guys... I'm rewriting our Shader/Material framework for our engine and want to know more about "ShaderReflection" and how to reflect shaders and strip information from them... for example, I want to know:

  • Each and every Technique and its name
  • Each and every Pass within each Technique and its name
  • Name and semantic of every global variable
  • [Very important] the type of each effect variable
  • The number and type of all variable members (if any, as in a structure)
  • Information about all cbuffers

As of right now I've been parsing the "Description" member of Effects, EffectTechniques and EffectPasses to learn most of this information from an Effect instance. However, it seems there is no way to get the actual type, nor figure out how many members (and type of members) a variable contains. I've never used the ShaderReflection type/API before, so I'm totally ignorant of how it works and how to use it... and can't seem to find the right information on Google (possibly due to my ignorance of what search queries to use lol).

Any help on this is greatly appreciated!

Regards,

--ATC--


EDIT: I'm also curious about the best way to bind variable values to an Effect (e.g., the fastest and most efficient way). And how might I need to handle the use of multiple materials based on the same shader?

Edited by ATC, 10 October 2012 - 03:40 PM.

_______________________________________________________________________________
CEO & Lead Developer at ATCWARE™
"Project X-1"; a 100% managed, platform-agnostic game & simulation engine


Please visit our new forums and help us test them and break the ice!
___________________________________________________________________________________


#2 Radikalizm   Crossbones+   -  Reputation: 2770


Posted 10 October 2012 - 03:52 PM

Remember that the Effects framework has been deprecated since the release of the latest Windows SDK, so you might want to think about whether you want to keep on using it. The Effects framework also isn't actually part of DirectX but is built on top of it, so as far as I know the reflection API doesn't provide any info about effects and techniques, only about the underlying shaders themselves.

Everything you might need from the reflection API can be found here
From this interface you can basically get all the data and details you need from your shaders.

Also try not to think in terms of setting individual variables when working with shaders in D3D10/11, but rather in terms of working with constant buffers. The reflection API will give you complete data of where each of your shader variables can be found in constant buffers and what size they are, so that's a good starting point for binding data to your shaders.
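For reference, a sketch of what enumerating that layout looks like through SlimDX's wrapper of the reflection interface (written from memory, so treat names and signatures as approximate); note that it reflects a single compiled shader, not a whole effect:

[source lang="csharp"]
using System;
using SlimDX.D3DCompiler;

// Assumes 'bytecode' is an already-compiled ShaderBytecode instance.
var reflection = new ShaderReflection(bytecode);
for (int i = 0; i < reflection.Description.ConstantBuffers; ++i)
{
    ConstantBuffer cb = reflection.GetConstantBuffer(i);
    Console.WriteLine("cbuffer {0}, {1} bytes", cb.Description.Name, cb.Description.Size);

    for (int j = 0; j < cb.Description.Variables; ++j)
    {
        ShaderReflectionVariable v = cb.GetVariable(j);
        ShaderReflectionType t = v.GetVariableType();
        // StartOffset and Size tell you exactly where this variable's bytes
        // live inside the cbuffer; the type description gives class/members.
        Console.WriteLine("  {0} ({1}): offset {2}, size {3}",
            v.Description.Name, t.Description.Class,
            v.Description.StartOffset, v.Description.Size);
    }
}
[/source]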

I don't really understand your problem of using multiple materials with a shader, you just use the same shader instances for your material but pass in different values depending on what your material requires.

Edited by Radikalizm, 10 October 2012 - 03:53 PM.


#3 ATC   Members   -  Reputation: 551


Posted 10 October 2012 - 04:25 PM

Remember that the Effects framework has been deprecated since the release of the latest Windows SDK, so you might want to think about whether you want to keep on using it. The Effects framework also isn't actually part of DirectX but is built on top of it, so as far as I know the reflection API doesn't provide any info about effects and techniques, only about the underlying shaders themselves.

Everything you might need from the reflection API can be found here
From this interface you can basically get all the data and details you need from your shaders.


Hmm, I didn't know that about the Effects Framework being deprecated. As of now I'm only using the actual Effect interface in D3D10 and 11... I haven't really fooled with the FXF, so that's why I was here asking! :-)

I will have to read your link to get on the same page with you.

Also try not to think in terms of setting individual variables when working with shaders in D3D10/11, but rather in terms of working with constant buffers. The reflection API will give you complete data of where each of your shader variables can be found in constant buffers and what size they are, so that's a good starting point for binding data to your shaders.


Correct. What I'm trying to do is basically create a strongly-typed list of variables for my "Shader" class through which user code can easily bind variable values. But behind the scenes my code will actually be writing the bytes to a buffer and binding it to the effect's cbuffer(s). I'm not finished, but this is what I'm writing right now. If I understand correctly this is faster than using the fx.GetVariableByXXX().AsYYY().SetValue(something) method, no?
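A rough sketch of that behind-the-scenes write using the core D3D11 API (SlimDX names from memory; treat the exact signatures as approximate, and note `offset`, `cbSize`, `cbuffer` and `worldViewProj` are placeholders):

[source lang="csharp"]
using SlimDX;
using SlimDX.Direct3D11;

// 'offset' would be the variable's StartOffset reported by shader reflection;
// 'cbSize' is the reflected size of the whole cbuffer (a multiple of 16 bytes).
using (var stream = new DataStream(cbSize, true, true))
{
    stream.Position = offset;
    stream.Write(worldViewProj);    // e.g. a Matrix value at its reflected offset
    stream.Position = 0;
    // Upload the staged bytes into the GPU-side constant buffer:
    context.UpdateSubresource(new DataBox(0, 0, stream), cbuffer, 0);
}
context.VertexShader.SetConstantBuffer(cbuffer, 0);
[/source]

In practice you would keep one CPU-side byte array per cbuffer, write all dirty variables into it, and upload once per draw rather than per variable.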

I don't really understand your problem of using multiple materials with a shader, you just use the same shader instances for your material but pass in different values depending on what your material requires.


Disregard... dunno what exactly I was trying to say but I figured that out already lol
_______________________________________________________________________________
CEO & Lead Developer at ATCWARE™
"Project X-1"; a 100% managed, platform-agnostic game & simulation engine


Please visit our new forums and help us test them and break the ice!
___________________________________________________________________________________

#4 Radikalizm   Crossbones+   -  Reputation: 2770


Posted 10 October 2012 - 04:42 PM

Hmm, I didn't know that about the Effects Framework being deprecated. As of now I'm only using the actual Effect interface in D3D10 and 11... I haven't really fooled with the FXF, so that's why I was here asking! :-)


I mostly try to encourage people who are somewhat more comfortable with D3D to move away from the Effects library in favor of building a system which suits their specific requirements using the core D3D library and the shader interfaces it provides. You'll get a greater understanding of how D3D manages shaders and how you can use the base shader features to your advantage.
In some setups the concept of techniques and passes being tied directly to shaders also doesn't really make sense, and in that case it's best to just design something that fits your own architecture. I don't know whether this applies to you, but it's something to think about.

If I understand correctly this is faster than using the fx.GetVariableByXXX().AsYYY().SetValue(something) method, no?


It's been quite a while since I've used the effects library, so I can't really give you any performance statistics, and I wouldn't be able to tell you whether it would perform better or not.

#5 ATC   Members   -  Reputation: 551


Posted 10 October 2012 - 04:55 PM

So what you're saying is I need to compile my shaders directly to "ShaderBytecode" rather than using an "Effect" instance? And this will give me all the freedom and flexibility I require? I have tended to always use SlimDX's "Effect" classes for D3D10/11 in the past. If making the switch is what I need to do then I shall do it. This shader/material system has been rather weak and unimpressive to me for months, and it's about time I do something with it.

#6 Radikalizm   Crossbones+   -  Reputation: 2770


Posted 10 October 2012 - 04:59 PM

So what you're saying is I need to compile my shaders directly to "ShaderBytecode" rather than using an "Effect" instance? And this will give me all the freedom and flexibility I require? I have tended to always use SlimDX's "Effect" classes for D3D10/11 in the past. If making the switch is what I need to do then I shall do it. This shader/material system has been rather weak and unimpressive to me for months, and it's about time I do something with it.


Ok, I now realize that you're talking about SlimDX and not native DirectX, my bad :D
The link I provided was for native D3D, but I'm sure SlimDX exposes the same functionality; I can't really help you out there, as I've never used SlimDX myself.
I think the ShaderBytecode class is exactly what you're looking for, but as I said, I couldn't say for sure.

#7 ATC   Members   -  Reputation: 551


Posted 10 October 2012 - 05:11 PM

Yes, SlimDX is almost a 1:1 wrapper... the style/names can be a bit different but the functionality is the same!

Ok... could you explain to me the essence of how I need to handle a shader... Let's say I do this:

[source lang="csharp"]
/* Compile the effect to bytecode: */
var src = File.ReadAllText(path);
var byteCode = ShaderBytecode.Compile(src, profile.GetProfileString());
[/source]

Now I have the "bytecode" of the shader... What do I do next? Am I NOT to use the "Effect" classes at all? I've been wanting to support shader fragments where I can dynamically pair up free-standing pixel and vertex (and other) shaders. And I don't see how it could be possible with the "Effect" class. But my knowledge of this approach to things is virtually zero...

Not asking you to hold my hand and write my code for me, but if you could explain the stages of how you use the "bytecode" to get the information I need and render with shader this way (as opposed to the Effect class) that would be great. I keep searching for it on Google but cannot find it lol

Edited by ATC, 10 October 2012 - 05:12 PM.


#8 ATC   Members   -  Reputation: 551


Posted 10 October 2012 - 07:29 PM

I see what's going on here... I've been using SlimDX since I started working in D3D10 and D3D11, and it appears the Effects Framework is built right into SlimDX. So since Direct3D9 I haven't learned any other way. I'm not sure now what's part of the FXF and what's not... Now I'm realizing all the much lower-level and potentially powerful things that can be done and it really blows a hole in my design (and my mind)... So now I have to scrap everything, it seems, and start anew... How frustrating... Not even sure how to proceed lol...

#9 Starnick   Members   -  Reputation: 1136


Posted 10 October 2012 - 09:50 PM

Scrapping things entirely may not be necessary. If you're aiming to make "Metro"/Windows Store apps, you will need to forgo the use of the effects API, as the D3DCompiler is one of the "banned" APIs. Of course, the only managed D3D wrapper that supports Win8 is SharpDX, so it's a moot point for SlimDX.

And even then, you can still use the FX file format - the difference is you'd have an offline compiler tool that parses the format for techniques/passes in order to acquire the necessary shader profiles and entry points. With the profile/entry point in hand, you can run the FX file through the D3DCompiler to get a ShaderByteCode object for each shader, then use that to create a reflection object to query all your metadata. Then write out the reflected metadata to a file that gets consumed by your application at runtime - which would be your own implementation of Effects11 (or something completely different; either way, you use the metadata to automatically set up your constant buffers, bind resources, and manage the shader pipeline by directly using the Direct3D11 shader interfaces).

For parsing, you can use something like Tiny Parser Generator, which is a really neat little tool if you know how to create a grammar. This is a problem that I've been working on myself for my own software and the approach has been working out pretty well (and frankly, I love TinyPG). I also believe the MonoGame folks have adopted this method, so that may be a good place to gather some ideas.

FYI, to answer your original question about how to use an effect with shader reflection - you can get at the shader byte code via the effect pass (the shader descriptions, which are queried from the effect shader variable the pass contains), which you would use to create a shader reflection object. Even if you were using the effects framework, that's still a useful thing to do for other reasons, like constructing input layouts up front. From that reflection variable, you're able to query for other things, like constant buffers, and get information about shader variables and their types (class, size, members, etc). But of course, you don't really need to do that as most of that information is readily available in the effect descriptions anyways.

Edited by Starnick, 10 October 2012 - 10:02 PM.


#10 ATC   Members   -  Reputation: 551


Posted 10 October 2012 - 10:45 PM

Ah, so I'm going to need to write a parser to parse HLSL code and read all of the shader functions in then compile each one to bytecode separately? Geez, this sounds like a mountain of work... This sets our production schedule "hopes" back dramatically lol...

But now if I understand correctly the whole concept of "Techniques" and "Passes" is just an abstraction for selecting different bits of shader bytecode to bind to the device? And that would also mean I'm going to have to parse the technique/pass definitions and implement a whole new technique and pass system in the engine to use it? If that's the case I can no longer use EffectPass.Apply() anymore either... thus I'm going to need to... I dunno... bind shader bytecode individually to the device interface's "PixelShader", "VertexShader", and "GeometryShader" (and the other stages for D3D11, of course) objects? I've never even seen anyone do this before, hmmm... However, if I understand correctly doing all this work will be worth it because I will have a very powerful/flexible system where I can dynamically piece together complete effects from fragments on-the-fly and (theoretically) generate tons of effect permutations from fragments?
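That manual binding can be sketched as follows in SlimDX (hedged: entry-point names, the `inputElements` array, and surrounding variables like `src`, `device` and `context` are illustrative):

[source lang="csharp"]
using SlimDX.D3DCompiler;
using SlimDX.Direct3D11;

// Compile each stage separately from the same source; no Effect involved:
var vsCode = ShaderBytecode.Compile(src, "mainVS", "vs_5_0", ShaderFlags.None, EffectFlags.None);
var psCode = ShaderBytecode.Compile(src, "mainPS", "ps_5_0", ShaderFlags.None, EffectFlags.None);

var vs = new VertexShader(device, vsCode);
var ps = new PixelShader(device, psCode);

// The input layout is validated against the vertex shader's input signature:
var layout = new InputLayout(device, ShaderSignature.GetInputSignature(vsCode), inputElements);

// At draw time, bind each stage yourself; this is what EffectPass.Apply() was doing:
context.InputAssembler.InputLayout = layout;
context.VertexShader.Set(vs);
context.PixelShader.Set(ps);
[/source]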

There's something I'm not getting here though... Suppose I do something like this:

[source lang="csharp"]
void processShader(string path)
{
    string fxSrc = File.ReadAllText(path);
    // Note: an entry-point compile takes a stage profile ("vs_5_0"),
    // not the whole-effect profile ("fx_5_0"):
    byteCode = ShaderBytecode.Compile(fxSrc, "mainVS", "vs_5_0", ShaderFlags.None, EffectFlags.None);
}
[/source]

Ok... well now I have the bytecode of the vertex shader "mainVS" from my .fx file. I could do the same thing to get the pixel and/or geometry shaders... But what about the rest of the fx file? All my global variables and constant buffers? How/when do they get compiled and how do I put all this back together? That's what I'm not getting... and how can I wrap all of this up cleanly to make working with shaders and materials easy for users of the engine (one of which will be myself and my team)?

EDIT:

It would be great if someone could hash out a light-weight pseudo-code example of how this is supposed to work and what I need to be doing with all of this... I'm still searching all over Google and finding a couple bits of decent info here/there but nothing to answer the mounting number of questions...

Also... this issue of Windows 8 Metro compatibility... I was already thinking that the best route to go would be to write a utility for my engine that imports effects (e.g., a .fx file) and pre-compiles and gets all the info the engine needs. Then it would create a file of our own custom format that contains all the meta-data we want and packs in the compiled bytecode. The engine would just read the file at run-time and construct an "Effect" instance (I mean our own Effect class, not from Effects Framework) from it... does my thinking seem on the right track here? I was already planning to do this because compiling lots of shaders for any decently complex game can be like watching paint dry...

Edited by ATC, 11 October 2012 - 12:36 AM.


#11 Radikalizm   Crossbones+   -  Reputation: 2770


Posted 11 October 2012 - 04:18 AM

But now if I understand correctly the whole concept of "Techniques" and "Passes" is just an abstraction for selecting different bits of shader bytecode to bind to the device? And that would also mean I'm going to have to parse the technique/pass definitions and implement a whole new technique and pass system in the engine to use it? If that's the case I can no longer use EffectPass.Apply() anymore either... thus I'm going to need to... I dunno... bind shader bytecode individually to the device interface's "PixelShader", "VertexShader", and "GeometryShader" (and the other stages for D3D11, of course) objects? I've never even seen anyone do this before, hmmm... However, if I understand correctly doing all this work will be worth it because I will have a very powerful/flexible system where I can dynamically piece together complete effects from fragments on-the-fly and (theoretically) generate tons of effect permutations from fragments?


That's pretty much the idea behind it, yes!
You could write a parser for .fx files, but that might not be an ideal situation. In my shader system I separate the actual shader code (in .hlsl files, not .fx files) and the shader program definition (i.e. the matching of vertex shaders, pixel shaders, geometry shaders, etc.) into two different files, so you don't get one huge file with too much differing data in it. This allows for very easy mixing and matching.
The shader program definition file is just a plain XML file specifying where to look for the source code (can be multiple plain .hlsl files), which entry point to use for each stage, which preprocessor defines to use, which compilation flags to use, etc. Blend, rasterizer and depth states are handled by the pipeline itself and not by the shader, and I have no concept of passes or techniques. I do allow for multiple shader 'setups' which correspond to a specific render pipeline setup so I can easily change how my pipeline works on the fly, but that's not really all that relevant here I suppose.
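A hypothetical definition file in that style (all element names here are invented for illustration; the post doesn't specify its schema) might look something like:

[source lang="xml"]
<ShaderProgram name="LitOpaque">
  <Source file="Lighting.hlsl" />
  <VertexShader entry="VSMain" profile="vs_5_0" />
  <PixelShader entry="PSMain" profile="ps_5_0" />
  <Define name="USE_NORMAL_MAP" value="1" />
  <CompileFlags>OptimizationLevel3</CompileFlags>
</ShaderProgram>
[/source]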

Because the shader definition file is just plain XML it becomes very easy to push it through my content pipeline without having to write any fancy parsers. I just use a generic serialization system to store all the values in my XML file, and I use these values to let my compiler build a binary shader definition file (which also encompasses my shader binaries) which can be loaded into my engine very fast with a minimal amount of parsing.
This basically comes down to that tool you mentioned at the end of your post!

Edited by Radikalizm, 11 October 2012 - 04:18 AM.


#12 xoofx   Members   -  Reputation: 777


Posted 11 October 2012 - 06:35 AM

And even then, you can still use the FX file format - the difference is you'd have an offline compiler tool that parses the format for techniques/passes in order to acquire the necessary shader profiles + entry points. So with the profile/entry point in hand, you can run the FX file through the D3DCompiler to get a ShaderByteCode object for each shader, then use that to create a reflection object to query all your meta data. Then write out the reflected meta data to a file, that gets consumed by your application at runtime - which would be your own implementation of Effects11 (or something completely different, either way...you use the meta data to automatically setup your constant buffers, bind resources, and manage the shader pipeline by directly using the Direct3D11 shader interfaces).

For parsing, you can use something like Tiny Parser Generator, which is a really neat little tool if you know how to create a grammar. This is a problem that I've been working on myself for my own software and the approach has been working out pretty well (and frankly, I love TinyPG). I also believe the MonoGame folks have adopted this method, so that may be a good place to gather some ideas.

You are right, that is basically what has been done recently in SharpDX.Toolkit to support parsing of HLSL FX shaders:

- I decided to mostly parse the fx file format (only the content of techniques/passes) in order to keep things fairly compatible with legacy shaders and ease porting (though the syntax is slightly simplified and different; check for example the BasicEffect.fx link below)
- Write a parser by hand (nothing more than a couple of hundred lines) that mainly parses techniques/passes and the content of each pass, with a supported syntax slightly closer to D3D9, producing an AST.
- Write a custom metadata file format (EffectData in the toolkit) that is able to save the whole description of a shader (shader bytecodes, constant buffers, value & resource variables, techniques/passes); a nice thing is that it supports multiple effects inside the same archive, which can be merged (like a zip of effects).
- Write a compiler that takes the AST produced by the parser, compiles all the shaders, and puts them in the metadata format.
- Write a command-line tool called "tkfxc.exe", similar to "fxc.exe", which basically allows you to generate an archive of effects (unlike legacy FX, I can put several effects inside the same tkfxo file); it also produces friendly colored output like the original fxc.exe.
- Then you have to rewrite the whole Effect framework.

Getting all of this written efficiently is really a long way (correct handling of constant buffer updates, minimizing calls to XXSetShaderResource/SetSamplers... for each shader stage, etc.), but in the end the result is pretty close to XNA effects (thus, I have been able to reuse all the stock effects from XNA; for example, BasicEffect.fx is very similar to the original one, only the technique/pass declarations change slightly), but a bit more flexible, as it can provide some extra features like linking to a shader declared in another effect (export/import of a particular shader shareable between effects), constant buffer pooling, or automatic access at runtime to any non-builtin variable declared in the effect pass (if you declare Toto = 1.0f; in the effect pass, you will be able to get this value from C#).

Edited by xoofx, 11 October 2012 - 06:38 AM.


#13 Starnick   Members   -  Reputation: 1136


Posted 11 October 2012 - 07:33 AM

Very nice, I didn't realize you'd released code for your toolkit (I heard mentions of it a while ago). But yeah, that's exactly the process that I went with.

FYI, TinyPG produces code for a scanner/parser/parse tree, so there aren't any runtime dependencies. All you write is a grammar (e.g. terminals, production rules to parse and evaluate technique/pass expressions, etc.). It's a lot simpler than writing out your own parser by hand.

Edited by Starnick, 11 October 2012 - 07:34 AM.


#14 xoofx   Members   -  Reputation: 777


Posted 11 October 2012 - 08:42 AM

Very nice, I didn't realize you'd released code for your toolkit (I heard mentions of it a while ago). But yeah, that's exactly the process that I went with.
FYI, TinyPG produces code for a scanner/parser/parse tree, so there aren't any runtime dependencies. All you write is a grammar (e.g. terminals, production rules to parse and evaluate technique/pass expressions, etc.). It's a lot simpler than writing out your own parser by hand.

TinyPG is indeed really nice, and the generated code is concise and relatively efficient. But there are a couple of things that would probably require some changes to TinyPG in order to correctly parse HLSL with a preprocessor:
  • For example, in my case, I parse and build an AST only for a technique block; everything else is eaten by the parser but skipped. I'm only checking that all braces {}[]() are matching until a technique is found.
  • I'm handling the preprocessor (through D3DCompiler.PreProcess) and parsing #line directives in order to provide the exact line/file error from an included file. This would probably require some changes in TinyPG's generated code, or at least quite some code to re-process all parse tree errors and adjust line numbers (and provide file information). Also, a preprocessor doesn't fit well with a general grammar, as it can be interleaved everywhere, so most of the time you need some entry point between the scanner and the parser in order to handle directives correctly (and I don't think TinyPG has this kind of extension point).


#15 ATC   Members   -  Reputation: 551


Posted 11 October 2012 - 10:06 AM

You guys are the best! :-)

If I'm understanding everything correctly the days of writing big .fx files full of various techniques and passes are waning, and we're heading towards the use of these "shader fragments" to give us absolute power over rendering with effects? If that is indeed the direction the industry is going in then I won't feel so bad about all the work I now have to do lol...

@ xoofx:

This toolkit is already a finished part of SharpDX? I'm already targeting SlimDX, but from the sound of it I could possibly use your code for my effect importer tool?

EDIT:

BTW, I'm about to look in the DirectX SDK for Microsoft's Effects Framework source to see how they wrote the framework... hopefully it will be of use to teaching me how to work with shaders at this new, lower level... Could someone post the path to the source within the SDK in case I have any trouble finding it?

Edited by ATC, 11 October 2012 - 10:12 AM.


#16 Radikalizm   Crossbones+   -  Reputation: 2770


Posted 11 October 2012 - 10:17 AM

BTW, I'm about to look in the DirectX SDK for Microsoft's Effects Framework source to see how they wrote the framework... hopefully it will be of use to teaching me how to work with shaders at this new, lower level... Could someone post the path to the source within the SDK in case I have any trouble finding it?


It can be found in <DXSDK Directory>\Samples\C++\Effects11

The code itself isn't pretty though IMO, so be warned

#17 ATC   Members   -  Reputation: 551


Posted 11 October 2012 - 10:24 AM

It can be found in \Samples\C++\Effects11

The code itself isn't pretty though IMO, so be warned


Yep, I'd just found it a few mins ago and thought the same thing... :P

But it's the functionality and ideas I need, not their code verbatim...

#18 ATC   Members   -  Reputation: 551


Posted 11 October 2012 - 10:58 AM

Could someone please help me understand how on earth I compile the REST of my effect programs (e.g., the global variables, pre-processor directives, includes (e.g., fxh files), etc)? I still don't see how I put all this together by just compiling individual shader functions, or what the hell I'm supposed to do lol....

EDIT:
This just gets more and more confusing... everywhere I look online it appears that virtually everyone is using the Effects Framework despite its deprecation... there's little/nothing available out there to help me... And Microsoft's Effects Framework implementation is huge... thousands and thousands of lines of code. It's going to take forever to figure out how to do this by reading their code... If I understood/knew the steps of how to piece things together and what I need to be doing, writing the code would be the easy part... But I just dunno wtf I'm supposed to be writing lol...

Edited by ATC, 11 October 2012 - 11:21 AM.


#19 Radikalizm   Crossbones+   -  Reputation: 2770


Posted 11 October 2012 - 11:34 AM

Could someone please help me understand how on earth I compile the REST of my effect programs (e.g., the global variables, pre-processor directives, includes (e.g., fxh files), etc)? I still don't see how I put all this together by just compiling individual shader functions, or what the hell I'm supposed to do lol....

EDIT:
This just gets more and more confusing... everywhere I look online it appears that virtually everyone is using the Effects Framework despite its deprecation... there's little/nothing available out there to help me... And Microsoft's Effects Framework implementation is huge... thousands and thousands of lines of code. It's going to take forever to figure out how to do this by reading their code... If I understood/knew the steps of how to piece things together and what I need to be doing, writing the code would be the easy part... But I just dunno wtf I'm supposed to be writing lol...


The compiler interface provided by D3D takes care of all your compilation needs. It will take care of include files, global variables, compilation targets, compilation flags, etc.
Have a look at the various Compile methods in the ShaderBytecode class.
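As a hedged illustration (SlimDX signatures from memory): the fuller Compile overloads accept preprocessor defines and an include handler, which is how globals, cbuffers and #include files all get compiled in one call. Here `FileIncludeHandler` is a hypothetical class of your own implementing the SlimDX.D3DCompiler.Include interface, and the define is illustrative:

[source lang="csharp"]
using System.IO;
using SlimDX.D3DCompiler;

var defines = new[] { new ShaderMacro { Name = "USE_FOG", Value = "1" } };
var byteCode = ShaderBytecode.Compile(
    File.ReadAllText("shader.hlsl"),      // full source: globals, cbuffers, #includes
    "PSMain", "ps_5_0",                   // entry point + target profile
    ShaderFlags.OptimizationLevel3, EffectFlags.None,
    defines,
    new FileIncludeHandler());            // resolves #include "common.fxh" etc.
[/source]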

I wouldn't focus too much on Microsoft's implementation of the effects framework; I can say from my own experience that a similar system including techniques and passes can be written with a much simpler implementation.

An effect is a collection of techniques, a technique is a collection of passes, and a pass encompasses your render state (shaders, rasterizer state, blend state, etc.)
SlimDX provides ways for you to bind data to your shaders using the shader wrapper classes (like VertexShaderWrapper), so that's a starting point for binding your constant buffers, samplers, etc. Where each variable lives within a constant buffer, and what name, size and default value it has, can be found through the shader reflection class.

Applying a pass means binding the state the pass encompasses to the D3D device, executing a technique means executing passes sequentially.

That's really all the info you'll need for building a system like the effects framework tbh.
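The structure described above can be sketched in a few lines (illustrative only, not the Effects11 implementation; SlimDX type names from memory):

[source lang="csharp"]
using System.Collections.Generic;
using SlimDX.Direct3D11;

class Pass
{
    public VertexShader VertexShader;
    public PixelShader PixelShader;
    public RasterizerState RasterizerState;
    public BlendState BlendState;

    // Applying a pass = binding its state to the device context.
    public void Apply(DeviceContext context)
    {
        context.VertexShader.Set(VertexShader);
        context.PixelShader.Set(PixelShader);
        context.Rasterizer.State = RasterizerState;
        context.OutputMerger.BlendState = BlendState;
    }
}

class Technique
{
    // Executing a technique = applying each pass in order, drawing between applies.
    public List<Pass> Passes = new List<Pass>();
}
[/source]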

Edited by Radikalizm, 11 October 2012 - 11:35 AM.


#20 ATC   Members   -  Reputation: 551


Posted 11 October 2012 - 11:55 AM

Thanks again, Rad...

I just realized the DirectX SDK has some sample projects for "HLSL without FX". Now that I'm looking at it this is beginning to make more sense. It shows me what the heck is going on behind the scenes of the FXF which I have not seen very often.

What I'm still not getting though is how we have "shared constant buffers"... In an effect file we have global variables which can be (and are) consumed by multiple shaders of multiple stages. Now how do I achieve that? It would seem stupid to do something like this:

// VS.vsh
*constant buffer(s)*
someFunction();

// PS.psh
*same constant buffer(s)*
anotherFunction();

...and so on...

How do we mitigate this issue/problem?
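The usual mitigation (sketched here from general D3D11 practice; the thread itself doesn't answer this, and `PerFrame`/`perFrameBuffer` are invented names): declare the shared cbuffer once in an include file that both shader files pull in, then create one GPU buffer and bind it to every stage that uses it. Roughly:

[source lang="csharp"]
// Shared.fxh, included by both VS.hlsl and PS.hlsl, declares e.g.:
//   cbuffer PerFrame : register(b0) { float4x4 ViewProj; float3 CameraPos; }
//
// Each compiled shader reflects the same PerFrame layout, so on the CPU side
// you create ONE buffer and bind it to both stages at the same slot:
context.VertexShader.SetConstantBuffer(perFrameBuffer, 0);
context.PixelShader.SetConstantBuffer(perFrameBuffer, 0);
// Update it once per frame; both stages read the same memory.
[/source]

The duplicated declaration in the source text costs nothing at runtime; what matters is that the GPU-side buffer exists only once.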



