AgentSnoop

Direct3D10 CreateInputLayout error


When I do CreateInputLayout with the following description,
D3D10_INPUT_ELEMENT_DESC elemDesc2[] =
{
	{ "POSITION",	0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,  D3D10_INPUT_PER_VERTEX_DATA, 0 },
	{ "NORMAL",	0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 12, D3D10_INPUT_PER_VERTEX_DATA, 0 },
	{ "TANGENT",	0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 24, D3D10_INPUT_PER_VERTEX_DATA, 0 },
	{ "BINORMAL",	0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 36, D3D10_INPUT_PER_VERTEX_DATA, 0 },
	{ "TEXCOORD",	0, DXGI_FORMAT_R32G32_FLOAT,	0, 48, D3D10_INPUT_PER_VERTEX_DATA, 0 }
};

hr = pD3DDevice->CreateInputLayout( elemDesc2, numElem, shaderPointer, shaderSize, &pLayout );


I get the error: "An invalid parameter was passed to the returning function". When I remove the BINORMAL and TEXCOORD elements, the error goes away. Does anyone have an idea of what might be going on here, or what else I could check?
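One way to get a more specific reason than the generic invalid-parameter message is the D3D10 debug layer: create the device with D3D10_CREATE_DEVICE_DEBUG and the runtime prints the exact validation failure to the debugger output. A minimal sketch, assuming a plain hardware device and no swap chain; the setup shown here is illustrative, not the code from this post:

// Requires <d3d10.h> and linking d3d10.lib.
// D3D10_CREATE_DEVICE_DEBUG makes CreateInputLayout report the exact
// validation error (e.g. a missing semantic) in the debug output.
ID3D10Device* pD3DDevice = NULL;
HRESULT hr = D3D10CreateDevice(
	NULL,                          // default adapter
	D3D10_DRIVER_TYPE_HARDWARE,
	NULL,                          // no software rasterizer module
	D3D10_CREATE_DEVICE_DEBUG,     // enable the debug layer
	D3D10_SDK_VERSION,
	&pD3DDevice );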

Probably something wrong with the shader you pass as shaderPointer. Make sure the compiled shader's input signature matches the vertex shader's input elements, and that numElem is the correct element count.
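A common way to keep that count in sync with the declaration is the usual array-size idiom; a sketch that reuses elemDesc2 from the post above:

// Derive the element count from the array itself so it can't get
// out of sync with the D3D10_INPUT_ELEMENT_DESC declaration.
UINT numElem = sizeof( elemDesc2 ) / sizeof( elemDesc2[0] );   // 5 here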

I have numElem = 5, and the vertex shader input looks like this:

struct app2vert
{
	float3 pos       : POSITION;
	float3 normal    : NORMAL;
	float3 tangent   : TANGENT;
	float3 bitangent : BINORMAL;
	float2 tex0      : TEXCOORD0;
};

Hi there.

Have you validated that shaderSize is valid?

I've seen it set like this:

D3D10_PASS_DESC PassDesc;
hr = Technique->GetPassByIndex( 0 )->GetDesc( &PassDesc );
if( FAILED( hr ) )
	return error;

Then pass PassDesc.pIAInputSignature and PassDesc.IAInputSignatureSize as your signature pointer and shaderSize.

Not sure if it works, though.
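For completeness, here is roughly how that pass description would feed into the call from the first post. This is only a sketch and assumes the D3D10 Effects framework is in use (elemDesc2, pD3DDevice, and pLayout are the names from the original post):

// Use the effect pass's input signature as the shader-bytecode
// arguments to CreateInputLayout.
hr = pD3DDevice->CreateInputLayout(
	elemDesc2,
	sizeof( elemDesc2 ) / sizeof( elemDesc2[0] ),
	PassDesc.pIAInputSignature,
	PassDesc.IAInputSignatureSize,
	&pLayout );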

Quote:
Original post by ankhd
Hi there.

Have you validated that shaderSize is valid?

I've seen it set like this:

D3D10_PASS_DESC PassDesc;
hr = Technique->GetPassByIndex( 0 )->GetDesc( &PassDesc );
if( FAILED( hr ) )
	return error;

Then pass PassDesc.pIAInputSignature and PassDesc.IAInputSignatureSize as your signature pointer and shaderSize.

Not sure if it works, though.


To be honest, I'm not exactly sure how I would do that right now with Nvidia's Cg runtime.

Now, I'm assuming the shaders themselves are valid, since they work under OpenGL and Direct3D 9, and I'm retrieving the compiled program information the way Nvidia describes. Some input layouts pass and a couple don't. Both of the failing ones have NORMAL, TANGENT, BINORMAL, and either one or two TEXCOORDs. The shaders seem to match up.

If anyone knows how I might check the Cg shaders' inputs, that's probably the only thing left I can think of to check.

EDIT: I was debugging it in Visual Studio 2010, and it gives me this error: "D3D10: ERROR: ID3D10Device::CreateInputLayout: The provided input signature expects to read an element with SemanticName/Index: 'ATTR'/15, but the declaration doesn't provide a matching name. [ STATE_CREATION ERROR #163: CREATEINPUTLAYOUT_MISSINGELEMENT ]"

EDIT2: When I compared ATTR15 against the OpenGL side, it says ATTR15 is BINORMAL, so I'm not sure what the problem is yet.
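One way to see exactly which semantics the compiled vertex shader expects is D3D10 shader reflection. The following is only a sketch (DumpInputSignature is a made-up helper name; shaderPointer and shaderSize are the same values passed to CreateInputLayout):

#include <windows.h>
#include <stdio.h>
#include <d3d10shader.h>   // D3D10ReflectShader, ID3D10ShaderReflection (link d3d10.lib)

// Print the semantic name/index of every element in the shader's input signature.
void DumpInputSignature( const void* shaderPointer, SIZE_T shaderSize )
{
	ID3D10ShaderReflection* pReflector = NULL;
	if( FAILED( D3D10ReflectShader( shaderPointer, shaderSize, &pReflector ) ) )
		return;

	D3D10_SHADER_DESC shaderDesc;
	pReflector->GetDesc( &shaderDesc );

	for( UINT i = 0; i < shaderDesc.InputParameters; ++i )
	{
		D3D10_SIGNATURE_PARAMETER_DESC paramDesc;
		pReflector->GetInputParameterDesc( i, &paramDesc );

		// These are the SemanticName/SemanticIndex values the
		// D3D10_INPUT_ELEMENT_DESC array has to match
		// (e.g. "ATTR"/15 in the error above).
		char line[256];
		sprintf_s( line, sizeof( line ), "%s / %u\n",
			paramDesc.SemanticName, paramDesc.SemanticIndex );
		OutputDebugStringA( line );
	}

	pReflector->Release();
}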

[Edited by - AgentSnoop on February 27, 2010 1:04:58 AM]
