Buttacup

D3D10_PASS_DESC.pIAInputSignature


Is pIAInputSignature set when the effect is loaded? What exactly is it: the location, layout and stride? My preference here would be to just tell the CreateInputLayout() function what to expect, as opposed to giving it an actual effect... It obviously can't be a plain address, because the resulting layout can be substituted into any other effect afterward... MSDN is vague on this one :/

Unlike D3D9, D3D10(+) validates most state objects at creation time rather than at render time. While this is good for rendering performance, it does introduce small inconveniences like this one.

You need to create your vertex shader object before your input layout object, since the system verifies that the two are compatible. The "shader bytecode" parameter refers to the bytecode of the vertex shader, and you can extract just the input signature from it with D3D10GetInputSignatureBlob. The effect system merely serves those objects to you.
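To make the connection concrete, here is a sketch of the usual effect-based path: the pass description hands you the vertex shader's input signature bytecode, which you then pass to CreateInputLayout(). The device, effect, and element descriptions (g_pDevice, g_pEffect, the POSITION/TEXCOORD layout) are illustrative assumptions, not from the thread.

```cpp
// Sketch, assuming g_pDevice (ID3D10Device*) and g_pEffect (ID3D10Effect*)
// are already created. The element descriptions are made up for illustration.
D3D10_PASS_DESC passDesc;
g_pEffect->GetTechniqueByIndex(0)->GetPassByIndex(0)->GetDesc(&passDesc);

D3D10_INPUT_ELEMENT_DESC layout[] =
{
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0,  0,
      D3D10_INPUT_PER_VERTEX_DATA, 0 },
    { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT,    0, 12,
      D3D10_INPUT_PER_VERTEX_DATA, 0 },
};

// pIAInputSignature points at the vertex shader's input signature bytecode;
// the runtime validates the element descriptions against it here, at
// creation time, not at draw time.
ID3D10InputLayout* pLayout = NULL;
HRESULT hr = g_pDevice->CreateInputLayout(
    layout, 2,
    passDesc.pIAInputSignature,
    passDesc.IAInputSignatureSize,
    &pLayout);
```

So pIAInputSignature is not "the effect": it is a pointer to signature bytecode, which is why the resulting layout object can later be used with any shader that has a matching signature.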

Since the SM4.0 bytecode format is not publicly documented, it is practically infeasible to come up with your own shader bytecode. In addition, I recall hearing from the D3D team that the runtime does some checksum matching on the data to verify that it came from the official shader compiler.

Note that if several of your shaders use the same input layout, you can use the same input layout object with all of them instead of creating a new one for each shader. The input layout object doesn't care about the shader logic.

I thought maybe I could use an effect header (.fxh)... all the layouts, none of the implementation! I just need to feed my buffer class something, and I don't want it relying on the presence of an active effect. o-O Will try this later. I'm pretty sure it's a sound approach, assuming I'm allowed to make an effect header in 4.0; I haven't really looked. Sucks that the instruction set is undisclosed... is it on lockdown, or is it just not openly accessible? Is it time to reverse engineer, or time to make something new? <=== I'm not doing either, it's not that important... yet!

[edit] Is "instruction set" really the right term for that? Would it be more appropriate to say the compile-time ASM instruction set, or something like that? Is there an intermediate language between the ASM instruction set and the shader model bytecode? [/edit]

Effect headers are just include files (like in C and C++); there is nothing magic about them.

You need to compile a vertex shader before you can create an input layout object; there is no way around that. The input layout specification is encoded in the parameter block of the vertex shader bytecode, regardless of whether you declare the parameters as structs or as individual variables on the HLSL side.

If your vertex input structures do not change, you can of course compile the shader in advance and store the bytecode for when you need to create the input layout. This is very easy and no hacking is needed.
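The compile-in-advance approach can be sketched like this: compile the vertex shader once, pull out only the input signature blob, and keep that around for input layout creation, with no live effect required. The source string, entry point name "VSMain", and the layout/numElements variables are illustrative assumptions.

```cpp
// Sketch, assuming szShaderSource holds the HLSL text and g_pDevice,
// layout, and numElements exist as in the earlier examples.
ID3D10Blob* pVSBlob  = NULL;
ID3D10Blob* pErrors  = NULL;
HRESULT hr = D3D10CompileShader(
    szShaderSource, strlen(szShaderSource), "shader.hlsl",
    NULL, NULL,            // no macros, no include handler
    "VSMain", "vs_4_0",    // entry point and profile (illustrative)
    0, &pVSBlob, &pErrors);

// Extract just the input signature; this small blob is all that
// CreateInputLayout needs, so it can be cached or saved to disk.
ID3D10Blob* pSigBlob = NULL;
hr = D3D10GetInputSignatureBlob(
    pVSBlob->GetBufferPointer(), pVSBlob->GetBufferSize(), &pSigBlob);

ID3D10InputLayout* pLayout = NULL;
hr = g_pDevice->CreateInputLayout(
    layout, numElements,
    pSigBlob->GetBufferPointer(), pSigBlob->GetBufferSize(),
    &pLayout);
```

Since the signature blob is plain data, storing it alongside your assets is a reasonable way to decouple buffer/layout setup from the effect system, exactly the independence asked about above.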

The asm instruction set is actually documented in the SDK, but the resulting bytecode format isn't. In any case, it isn't worth the trouble to try to circumvent the system.

No prob, glad I could help :)

While D3D10(+) can be confusing at times, it is heavily optimized for performance, and some programmer convenience was traded away for that.

Thankfully, the issue discussed here is about as annoying as it gets, and even so it isn't too difficult to deal with, IMHO.
