Direct3D10 CreateInputLayout error

3 comments, last by AgentSnoop 14 years, 1 month ago
When I do CreateInputLayout with the following description,

D3D10_INPUT_ELEMENT_DESC elemDesc2[] =
{
	{ "POSITION",	0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,  D3D10_INPUT_PER_VERTEX_DATA, 0 },
	{ "NORMAL",	0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 12, D3D10_INPUT_PER_VERTEX_DATA, 0 },
	{ "TANGENT",	0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 24, D3D10_INPUT_PER_VERTEX_DATA, 0 },
	{ "BINORMAL",	0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 36, D3D10_INPUT_PER_VERTEX_DATA, 0 },
	{ "TEXCOORD",	0, DXGI_FORMAT_R32G32_FLOAT,	0, 48, D3D10_INPUT_PER_VERTEX_DATA, 0 }
};

hr = pD3DDevice->CreateInputLayout( elemDesc2, numElem, shaderPointer, shaderSize, &pLayout );


I get the error: "An invalid parameter was passed to the returning function". When I take away BINORMAL and TEXCOORD, the error goes away. Does anyone have an idea of what might be going on here, or what else I could check?
Probably something wrong with the shader you pass in shaderPointer. Make sure the compiled shader's input signature matches the input elements you declare, and that numElem is the correct count.
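A quick thing to rule out is a stale count: derive it from the array instead of hard-coding it. A minimal sketch, reusing the names from your post:

// Derive the element count from the array so it can't drift out of sync
// with the declaration (sketch only, not code from this thread).
UINT numElem = sizeof( elemDesc2 ) / sizeof( elemDesc2[0] );	// 5 for the array shown

hr = pD3DDevice->CreateInputLayout( elemDesc2, numElem, shaderPointer, shaderSize, &pLayout );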
I have numElem = 5, and the vertex shader input looks like this:

struct app2vert
{
	float3 pos       : POSITION;
	float3 normal    : NORMAL;
	float3 tangent   : TANGENT;
	float3 bitangent : BINORMAL;
	float2 tex0      : TEXCOORD0;
};
Hi there.

Have you checked that shaderSize is valid?

I've seen it set like this.

D3D10_PASS_DESC PassDesc;
hr = Technique->GetPassByIndex( 0 )->GetDesc( &PassDesc );
if(FAILED(hr))
return error;

Then pass these as your shader signature and shaderSize:
PassDesc.pIAInputSignature, PassDesc.IAInputSignatureSize

but I'm not sure if it works.
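Put together with the call from the first post, the pattern I've seen looks roughly like this (untested sketch, reusing elemDesc2, numElem and pLayout from above):

D3D10_PASS_DESC PassDesc;
hr = Technique->GetPassByIndex( 0 )->GetDesc( &PassDesc );
if( FAILED( hr ) )
	return error;

// Feed the pass's input signature blob straight into CreateInputLayout
hr = pD3DDevice->CreateInputLayout( elemDesc2, numElem,
                                    PassDesc.pIAInputSignature,
                                    PassDesc.IAInputSignatureSize,
                                    &pLayout );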
Quote: Original post by ankhd
Hi there.

Have you checked that shaderSize is valid?

I've seen it set like this.

D3D10_PASS_DESC PassDesc;
hr = Technique->GetPassByIndex( 0 )->GetDesc( &PassDesc );
if(FAILED(hr))
return error;

Then pass these as your shader signature and shaderSize:
PassDesc.pIAInputSignature, PassDesc.IAInputSignatureSize

but I'm not sure if it works.


To be honest, I'm not exactly sure how I would do that right now with Nvidia's Cg stuff.

Now, I'm assuming the shaders themselves are valid since they work for OpenGL and Direct3D 9, and I'm getting the shader information the way Nvidia says to. Some input layouts pass and a couple don't. Both of the ones that fail have normal, tangent, binormal, and either one or two texcoords. The shaders seem to match up.

If anyone knows how I might be able to check the Cg shaders' inputs, that's probably the only other thing I can think to check.
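The closest I can come up with is to walk the program's leaf parameters with the Cg core runtime and print their semantics, roughly like this (untested sketch, assuming a compiled CGprogram handle named vertexProgram):

#include <Cg/cg.h>
#include <cstdio>

// Dump the varying inputs of a Cg program together with their semantics.
void DumpVertexInputs( CGprogram vertexProgram )
{
	for( CGparameter p = cgGetFirstLeafParameter( vertexProgram, CG_PROGRAM );
	     p != 0;
	     p = cgGetNextLeafParameter( p ) )
	{
		if( cgGetParameterDirection( p ) == CG_IN &&
		    cgGetParameterVariability( p ) == CG_VARYING )
		{
			const char* sem = cgGetParameterSemantic( p );
			printf( "%s : %s\n", cgGetParameterName( p ), sem ? sem : "(none)" );
		}
	}
}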

EDIT: I was debugging it in Visual Studio 2010, and it is giving me this error: "D3D10: ERROR: ID3D10Device::CreateInputLayout: The provided input signature expects to read an element with SemanticName/Index: 'ATTR'/15, but the declaration doesn't provide a matching name. [ STATE_CREATION ERROR #163: CREATEINPUTLAYOUT_MISSINGELEMENT ]"

EDIT2: When I compared ATTR15 with OpenGL, it says it is BINORMAL, so I'm not sure what the problem is yet.
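If the Cg-compiled D3D10 vertex shader really does declare its inputs with ATTR semantics, as the error message suggests, then I guess the input layout would have to match it by name and index, e.g. for the binormal (just speculation at this point):

	{ "ATTR",	15, DXGI_FORMAT_R32G32B32_FLOAT, 0, 36, D3D10_INPUT_PER_VERTEX_DATA, 0 },

with the remaining elements renamed to whatever ATTR indices the compiled shader actually reports for them.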

[Edited by - AgentSnoop on February 27, 2010 1:04:58 AM]

