Retrieving Vertex Declaration from Effects Framework


Recommended Posts

Hi guys, I have a technical question. Is there a way to programmatically generate a Vertex Declaration from an Effect of the Effects Framework (in either DX9 or DX10)? I've been searching the net, but there isn't much information on this. Thanks! [Edited by - feal87 on October 24, 2009 11:56:11 AM]

Share on other sites
No, not that I know of. But if you'd like, you can assign whatever custom attributes you'd like to techniques and passes, which you could use to provide information about the vertex inputs.

Share on other sites
Quote:
Original post by MJP: No, not that I know of. But if you'd like, you can assign whatever custom attributes you'd like to techniques and passes, which you could use to provide information about the vertex inputs.

Mhn...thanks for the clarification MJP. I'll go follow another route to solve my problem then. :)

Share on other sites
Excuse my curiosity but why would you want to do that?

Share on other sites
Yes, you can do this: you can find out what inputs a vertex shader in an Effect requires and then build a vertex declaration from that. It takes a bit of work on your part, but it's pretty straightforward.

For DirectX 9, use D3DXGetShaderInputSemantics(), which will give you an array of the inputs that a vertex shader requires.

For DirectX 10, use GetInputSignatureElementDesc(), which will get you the same sort of information.

Share on other sites
Quote:
Original post by snake5: Excuse my curiosity but why would you want to do that?

To remove the hassle of creating a vertex declaration manually for each shader I use.
Anyway, thanks andur, I'll take a look at the functions you posted. :)

Share on other sites
I'm also interested in this. It would remove a lot of problems. I've found only scraps of information on the internet. If someone could expand on this or give an example of how to use it, that would be great.

array<ShaderSemantic>^ ShaderBytecode::GetInputSemantics()
{
    UINT count = 0;
    const DWORD* function = (const DWORD*) m_Pointer->GetBufferPointer();

    HRESULT hr = D3DXGetShaderInputSemantics( function, NULL, &count );
    GraphicsException::CheckHResult( hr );
    if( FAILED( hr ) )
        return nullptr;

    array<ShaderSemantic>^ inputs = gcnew array<ShaderSemantic>( count );
    pin_ptr<ShaderSemantic> pinnedInputs = &inputs[0];

    hr = D3DXGetShaderInputSemantics( function, (D3DXSEMANTIC*) pinnedInputs, &count );
    GraphicsException::CheckHResult( hr );
    if( FAILED( hr ) )
        return nullptr;

    return inputs;
}

and

hr = D3DXCompileShaderFromFileA( "pixelshader.txt", NULL, NULL, "PS", "ps_2_0", dwShaderFlags,
                                 &g_pCompiledPSBuffer, &g_pCompiledPSErrorBuffer, &g_pPSConstantTable );
hr = D3DXGetShaderInputSemantics( (DWORD*) g_pCompiledPSBuffer->GetBufferPointer(), NULL, &count );

I just found those examples, not my code. But it would be great to figure this out. Thanks.

Share on other sites
I've looked at the functions. In DX10/11 there is no problem determining the vertex declaration (the function even returns the type of each element), but in DX9 it's a little tricky because the function does not give the type of the element, only the semantic and little other information...

Share on other sites
In straight C++, the above code might look something like:

void getInputSemantics(const ID3DXBuffer *shaderCode, std::vector<D3DXSEMANTIC> &inputs)
{
    const DWORD* function = (const DWORD*) shaderCode->GetBufferPointer();
    UINT count = 0;
    if (FAILED(D3DXGetShaderInputSemantics( function, NULL, &count )))
        return; // Error
    inputs.resize(count);
    if (FAILED(D3DXGetShaderInputSemantics( function, &inputs[0], &count )))
        return; // Error
}

Share on other sites
Quote:
Original post by feal87
Quote:
Original post by snake5: Excuse my curiosity but why would you want to do that?

To remove the hassle of creating a vertex declaration manually for each shader I use.
Anyway, thanks andur, I'll take a look at the functions you posted. :)

Well usually the vertex declaration is determined from your vertex data and the shaders are written to fit that, not the other way around. This is because vertex data is usually packed in certain ways to improve performance or reduce the memory footprint.
