einaros

HLSL vertex shader parameters


I've got a vertex shader, vs_2_0, which gets its parameters in a struct as follows:

struct VS_INPUT
{
    float4 Pos : POSITION;
    float3 Normal : NORMAL;
    float3 Tex0 : TEXCOORD0;
    float4 Diffuse : COLOR0;
};

The shader is used to render a mesh from a .x file. I failed time and again to get the Diffuse color input (I figured this would be given to the shader as long as I remembered to do a device->SetMaterial before DrawSubset, but it seems it isn't. Side note: does anyone know why this happens?). In desperation, I changed the order of the inputs in the above struct to, e.g.:

float4 Pos : POSITION;
float4 Diffuse : COLOR0;
float3 Tex0 : TEXCOORD0;
float3 Normal : NORMAL;

and to my confusion, this seems to mess up both this effect (giving completely messed-up colors) and other effects/shaders used. Is there a logical explanation for this? Must the parameters be given in a particular order? And why on earth does it affect the other shaders as well?

Quote:
The shader is used to render a mesh from a .x file. I failed time and again to get the Diffuse color input (I figured this would be given to the shader as long as I remembered to do a device->SetMaterial before DrawSubset, but it seems it isn't. Side note: does anyone know why this happens?).

SetMaterial() (and other similar functions) only feeds the legacy 'Fixed Function Pipeline' - it doesn't do anything for shaders [smile]
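If you do want the material colour available inside a shader, one option is to upload it yourself as a shader constant. A rough sketch only, assuming a material array like the m_pMeshMaterials you'd use with SetMaterial(), and a shader that declares float4 MatDiffuse : register(c4):

// Hypothetical sketch - pushes the subset's material diffuse into vertex
// shader constant register c4 each time a subset is drawn.
D3DMATERIAL9& mat = m_pMeshMaterials[i];
float diffuse[4] = { mat.Diffuse.r, mat.Diffuse.g, mat.Diffuse.b, mat.Diffuse.a };
m_pDevice->SetVertexShaderConstantF(4, diffuse, 1); // 1 = one float4 register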

When rendering from an .x file you need to verify that the semantics match up - it's very possible that they don't. If the vertex declarations don't match then you're going to need to use ID3DXMesh::CloneMesh() to add in the necessary parts - and then fill those in with meaningful values.

As long as you've got a mismatch in your data you're going to be open to all sorts of unexpected behaviours. It's been a while since I checked, but I seem to recall that the debug runtimes (see the link in my sig. if you're not familiar with them) will spit out warning messages when the vertex declarations don't match up.
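Here's a minimal sketch of how you might dump what the mesh actually contains, so you can compare it against your VS_INPUT struct. DumpMeshDeclaration is just an illustrative name, not a D3DX function; it assumes the mesh was already loaded (e.g. with D3DXLoadMeshFromX()):

#include <d3dx9.h>
#include <stdio.h>

void DumpMeshDeclaration(ID3DXMesh* pMesh)
{
    D3DVERTEXELEMENT9 decl[MAX_FVF_DECL_SIZE];
    if (FAILED(pMesh->GetDeclaration(decl)))
        return;

    // D3DDECL_END() is marked by Stream == 0xFF
    for (UINT i = 0; decl[i].Stream != 0xFF; ++i)
    {
        // Usage is a D3DDECLUSAGE (0=POSITION, 3=NORMAL, 5=TEXCOORD, 10=COLOR);
        // Type is a D3DDECLTYPE (0=FLOAT1 ... 3=FLOAT4, 4=D3DCOLOR)
        char msg[128];
        sprintf(msg, "stream %u offset %u type %u usage %u index %u\n",
                decl[i].Stream, decl[i].Offset, decl[i].Type,
                decl[i].Usage, decl[i].UsageIndex);
        OutputDebugStringA(msg);
    }
}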

Quote:
this seems to mess up both this effect (giving completely messed-up colors) and other effects/shaders used. Is there a logical explanation for this? Must the parameters be given in a particular order? And why on earth does it affect the other shaders as well?

Can you post the vertex declaration that you're using (the input to IDirect3DDevice9::SetVertexDeclaration())? Then we might be able to offer some better information.

I'm not 100% sure on the ordering issue - I just checked the documentation and it didn't explicitly state that they must match. It can't be a bad thing if they do, though.

As for it messing up other shaders... are you using the same struct to describe the inputs to other vertex shaders?

hth
Jack

First of all, I think you should use float2 and not float3 for the texture element.
This alone could be the reason behind your problem.

Anyway, I remember reading somewhere that old hardware (pre-DX9, maybe) could have problems if you don't respect the old FVF vertex format ordering.
So, I think you should try using the following format:

struct VS_INPUT
{
    float4 Pos : POSITION;
    float4 Diffuse : COLOR0;
    float3 Normal : NORMAL;
    float2 Tex0 : TEXCOORD0;
};


As for the mesh rendering, I fear I've misunderstood something severely. All I'm doing is reading the mesh, creating the textures/materials, and doing the following for the render:

for(DWORD i = 0; i < m_dwNumMaterials; i++)
{
    m_pDevice->SetMaterial(&m_pMeshMaterials[i]);
    m_pDevice->SetTexture(0, m_pMeshTextures[i]);
    m_pMesh->DrawSubset(i);
}

The .x I'm rendering is exported from 3D Studio MAX using the Panda plugin. The Mesh Viewer utility included with the DX package was showing it as it should be, so I figured the .x itself was OK.

Quote:
Can you post the vertex declaration that you're using (the input to IDirect3DDevice9::SetVertexDeclaration())? Then we might be able to offer some better information.


Once again I'm somewhat stumped. I never did do a SetVertexDeclaration, and I can't say I've seen a reference to that call in any of the examples I've been following to build the shaders. I'll read up on it now.

Quote:
As for it messing up other shaders... are you using the same struct to describe the inputs to other vertex shaders?


No, the shader is only used for rendering one mesh, but it messes up other shaders that are called as well. It's the kind of thing I'd expect if one of the parameters overflowed from one shader's memory space into another's, but all the references I could find used the same sizes for the input.

Thanks for all help!

Einar

Quote:
Anyway, I remember reading somewhere that old hardware (pre-DX9, maybe) could have problems if you don't respect the old FVF vertex format ordering.


Hm, that may be. I'll give it a try.

Edit: No dice... same odd result, with some other shaders being messed up as well.

Einar

Ok, I'm now doing the following vertex declaration:

LPDIRECT3DVERTEXDECLARATION9 vertexDecl = NULL;
D3DVERTEXELEMENT9 decl[] =
{
    // stream, offset, type, method, usage, usage index
    {0,  0, D3DDECLTYPE_FLOAT4, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0},
    {0, 16, D3DDECLTYPE_FLOAT4, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR,    0},
    {0, 32, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL,   0},
    {0, 44, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0},
    D3DDECL_END()
};
m_pDevice->CreateVertexDeclaration(decl, &vertexDecl);
m_pDevice->SetVertexDeclaration(vertexDecl);

And the following input struct for the shader:

struct VS_INPUT
{
    float4 Pos : POSITION;
    float4 Diffuse : COLOR0;
    float3 Normal : NORMAL;
    float2 Tex0 : TEXCOORD0;
};

The results, however, are as they were before: it messes up two shaders, including the one that gets these parameters. Before using the other affected shader (which is rendered from another vertex buffer), I do a SetFVF with appropriate values.

I'm sure this is both spotty and flawed, but I really can't see how and why.

Einar

I realised I had a few diagrams kicking around (left over from an old journal entry) -

Despite what you and your application know about the data being provided, you can assume that the hardware sees something like this:

[missing diagram: the vertex stream as one undifferentiated block of bits]
That is, a generic block of binary 1's and 0's (the [] grouping is just for convenience).

Your vertex declaration informs the device how to interpret that block of binary into something useful:

[missing diagram: the same block of bits, now grouped by semantic]

RED for vertex position; GREEN for vertex normal; and BLUE for a single texture coordinate.

Now, if you mess that part up, all sorts of strange things happen... the simplest error is that data is read in and interpreted as the wrong semantic - thus you get garbled data. However, if the declaration and stride don't match then you're really screwed (that's usually a fatal error).
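As a quick sanity check, you can compare the vertex size a declaration implies against the stride you actually bind. Just a sketch - CheckStride is a made-up helper name, but D3DXGetDeclVertexSize() is real D3DX:

#include <d3dx9.h>

// Returns true if the declaration's implied vertex size matches the stride
// that was bound with SetStreamSource(). A mismatch means the device will
// read garbage (or worse).
bool CheckStride(const D3DVERTEXELEMENT9* decl, UINT boundStride)
{
    UINT implied = D3DXGetDeclVertexSize(decl, 0); // stream 0
    return implied == boundStride;
}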

This one is part of the same collection, but isn't quite so relevant... it just shows the dataflow through the vertex part of the pipeline:

[missing diagram: dataflow through the vertex part of the pipeline]
So, that's some simple theory.

Now, the difference here is ID3DXMesh. It controls the incoming binary data, not you. It also controls the set-up for the rendering process - so trying to change the vertex declaration yourself is a bit pointless, as it'll go and change it back again anyway [smile]

So, to solve your problem you need to make sure that what ID3DXMesh is sending down the pipeline matches (or is a superset of) what your vertex shader is expecting. You can do this in two ways:

1. Determine what format ID3DXMesh is picking up from the file and tailor your vertex shader appropriately. You can do this either by looking at the way you export it (this varies from tool to tool), or by dumping some debug output from ID3DXMesh::GetDeclaration() (which will require parsing first).

2. Manipulate the mesh so that it contains the data you require. This can be done with a call to ID3DXMesh::CloneMesh() (sketched below). However, a quirk of this is that if the call has to create new data (e.g. the mesh didn't have a COLOR component, so it widened the vertex and dropped one in), the new component is left blank (or with garbage). You have to go through and fill in the new data yourself. D3DX can help with this, but it depends quite how much you're prepared to automate...


The latter option is a bit more robust, but it's harder to implement. Although, if you do it well, you can re-use the code for future meshes and effects [smile]
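For illustration, a minimal sketch of the cloning step, assuming the m_pDevice/m_pMesh members from your earlier posts and a target layout matching your VS_INPUT (a packed D3DCOLOR is one common choice for the diffuse element; the shader still sees it as a float4 COLOR0):

LPD3DXMESH pCloned = NULL;
D3DVERTEXELEMENT9 decl[] =
{
    {0,  0, D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0},
    {0, 12, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR,    0},
    {0, 16, D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL,   0},
    {0, 28, D3DDECLTYPE_FLOAT2,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0},
    D3DDECL_END()
};

if (SUCCEEDED(m_pMesh->CloneMesh(D3DXMESH_MANAGED, decl, m_pDevice, &pCloned)))
{
    // If the source mesh had no COLOR0, the cloned colour bytes are
    // undefined - lock the vertex buffer here and write real values
    // (e.g. the subset's material diffuse) before rendering.
    m_pMesh->Release();
    m_pMesh = pCloned;
}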

hth
Jack

Ah, thanks a lot. I've solved some of the problems now by sanity-checking the .x files, but I'll read up on your post as well.

Einar
