Drawing Indexed Meshes w/ Vertex Shaders

So, I'm a little new to vertex shaders, and there is one bit hanging me up right now. Every example I can find on drawing indexed meshes with vertex shaders shows them being drawn by DrawIndexedPrimitive, not DrawSubset. Why? Apparently I need to create a vertex buffer for SetStreamSource to point to, but why can't I use DrawSubset after that? Also, can I use ppmesh->GetVertexBuffer instead of CreateVertexBuffer to get my vertex buffer pointer for SetStreamSource? An example I found copies the vertices from ppmesh->GetVertexBuffer to a buffer created by CreateVertexBuffer and then hands that to SetStreamSource, but I'm not sure why it doesn't just hand over the pointer from GetVertexBuffer. Any help would be greatly appreciated!

DrawSubset internally calls DrawIndexedPrimitive and SetStreamSource.

The reason most Vertex Shader samples use a vertex + index buffer for drawing instead of a mesh, is that this gives you closer control over the vertex format.

A Mesh doesn't have a hardcoded format, which means if you load one .x file, it might have normals but not texcoords, and another might have texcoords but no normals. Now, a vertex shader relies greatly on which information it is given. If some of it is missing in the .x file, the shader won't draw properly.

For your own use, you can freely use DrawSubset with a mesh, assuming you know the vertex format matches the one expected by the vertex shader. The fact that you're using shaders doesn't change what data you pass the GPU to draw; it only affects how the GPU draws it.

Hope this helps.

Quote:
DrawSubset internally calls DrawIndexedPrimitive and SetStreamSource


OH! That is VERY helpful to know! So, let me see if I understand you correctly: I can use DrawSubset, but if I do, I don't need to call SetStreamSource, and I do need to be sure I've converted my mesh to the appropriate FVF flags (ones that match the shader). But if I want more control over the vertex format (do you mean the FVF here?), I should go ahead and use DrawIndexedPrimitive. Correct?

Quote:
Original post by mike_ix
Quote:
DrawSubset internally calls DrawIndexedPrimitive and SetStreamSource


OH! That is VERY helpful to know! So, let me see if I understand you correctly: I can use DrawSubset, but if I do, I don't need to call SetStreamSource, and I do need to be sure I've converted my mesh to the appropriate FVF flags (ones that match the shader). But if I want more control over the vertex format (do you mean the FVF here?), I should go ahead and use DrawIndexedPrimitive. Correct?


The thing to keep in mind is that D3DXMesh provides a number of abstractions of the base facilities that Direct3D provides. That is, it aims to hide a number of the specific details of how you actually go about getting triangles drawn, for the sake of making it easier to use. DrawSubset() is part of that abstraction - it covers up some of the details.

Directly using SetStreamSource(), DrawIndexedPrimitive(), etc., brings you closer to the base D3D functionality - bypassing the D3DXMesh convenience functions.

If I had an initial codebase that used D3DXMesh and I wanted to get finer control over what was going on, I would start breaking down what D3DXMesh does into the D3D calls, "weaning" the code off of the support code, until I was able to get rid of D3DXMesh entirely.

Btw, I would generally recommend that if you're using shaders, you use vertex declarations rather than FVF formats. They're quite a bit more flexible, but they are also more of a pain to deal with. Take a look at the docs for CreateVertexDeclaration() and that neighborhood to see what that's all about.

Quote:
Btw, I would generally recommend that if you're using shaders, you use vertex declarations rather than FVF formats. They're quite a bit more flexible, but they are also more of a pain to deal with. Take a look at the docs for CreateVertexDeclaration() and that neighborhood to see what that's all about.


I'm glad you said that because it leads me right into my next question ;) I am currently creating a mesh using D3DXCreateMeshFVF, then loading it up with vertices that have been defined as:

#define D3DFVF_CUSTOMVERTEX (D3DFVF_XYZ|D3DFVF_NORMAL|D3DFVF_TEX1)


I'm then using D3DXComputeNormals to get results like this (click to see screenshot). Notice that each face's lighting is distinctly visible; it's not uniform. From my understanding, I need to use vertex blending to make the shadows smoother, and I need vertex shaders to accomplish this (correct?). So, my question is this: to do all of this, should I get rid of D3DXCreateMeshFVF and the FVF format altogether, and if so, how do I calculate normals for my meshes? Is that done by the shader?

Quote:
Original post by mike_ix
So, my question is this: To do all of this should I get rid of D3DXCreateMeshFVF and the FVF format altogether, and if so how do I calculate normals for my meshes? Is that done by the shader?


Usually you don't calculate normals for meshes; it's preferable to have these pre-calculated by the modeler and exported in the .x file. In the vertex shader you will be transforming the normals based on the current transformation matrix, world matrix, projection matrix, however you choose to do it.

Quote:
Usually you don't calculate normals for meshes; it's preferable to have these pre-calculated by the modeler and exported in the .x file. In the vertex shader you will be transforming the normals based on the current transformation matrix, world matrix, projection matrix, however you choose to do it.


OK, but what if you are creating a mesh using D3DXCreateMeshFVF? What if you are doing both, importing .x files and creating meshes from scratch and rendering them together? I can still create a mesh using D3DXCreateMesh, and leave out the FVF flags, but I need to calculate normals somewhere, right? Currently I'm using D3DXComputeNormals, which gives results like my screenshot. Do I use shaders to calculate normals on my terrain mesh, or do I continue to calculate them with D3DXComputeNormals and use shaders to do vertex blending to smooth out my shadows? I'm just trying to figure out exactly how the shaders fit into the scope of things.

I just had to manually draw a mesh with DrawIndexedPrimitive, because I am using hardware or stream instancing, which requires you to set the vertex declaration to include the stream 1 data (see the Instancing sample in the SDK).

Anyway, manually drawing it involves doing things that the DrawSubset() function would do for you:

1) Set the vertex declaration - pDevice->SetVertexDeclaration()

2) Get and set the vertex and index buffers:
pMesh->GetVertexBuffer()
pMesh->GetIndexBuffer()
pDevice->SetIndices()
pDevice->SetStreamSource(0, ...)

3) Get the attribute table so you can draw each subset - pMesh->GetAttributeTable()

4) Loop through the attribute buffer array to draw each one, rather than just looping through your subsets, to set the materials and textures for each subset:

for each attribute item:
    pDevice->DrawIndexedPrimitive(D3DPT_TRIANGLELIST,
                                  0,
                                  0,
                                  pMesh->GetNumVertices(),
                                  pAttrRangeTbl[iSub].FaceStart * 3,
                                  pAttrRangeTbl[iSub].FaceCount);

5) release the vertex and index buffers and delete the attribute table array.


So, those are all the things that DrawSubset() does for you.

So, something like this should be legal, so long as my mesh FVF flags match my vertex declaration or I have no FVF flags at all?


bool MyWorldObj::RenderMesh(LPD3DXMESH rmesh, int numMaterials, LPDIRECT3DTEXTURE9 meshTextures[], D3DMATERIAL9 meshMaterials[], int wire) {

    if (wire > 0) {
        g_pd3dDevice->SetRenderState(D3DRS_FILLMODE, D3DFILL_WIREFRAME);
    }
    for (DWORD i = 0; i < (DWORD)numMaterials; i++)
    {
        g_pd3dDevice->SetVertexDeclaration( g_pVertexDeclaration );
        g_pd3dDevice->SetVertexShader( g_pVertexShader );
        g_pd3dDevice->SetMaterial( &meshMaterials[i] );
        g_pd3dDevice->SetTexture( 0, meshTextures[i] );
        g_pd3dDevice->SetPixelShader( g_pPixelShader );
        rmesh->DrawSubset( i );
        g_pd3dDevice->SetVertexShader( NULL );
        g_pd3dDevice->SetPixelShader( NULL );
    }
    rmesh->Release();
    return true;
}





Or do I even have these statements in the right place?

Thanks for all the help!!!!!!!!


I found a similar discussion about the FVF format vs. D3DVERTEXELEMENT9 that helps answer my question about whether I should use flags. In a nutshell, I guess it depends on the hardware I want to support, right?

[Edited by - mike_ix on July 7, 2006 3:53:37 PM]

Almost correct. The line:

g_pd3dDevice->SetVertexDeclaration( g_pVertexDeclaration );

will accomplish nothing, since the DrawSubset call sets the declaration internally for you. That is why I had to do it manually: DrawSubset would set the declaration to the mesh's actual vertex declaration, overriding my attempt to set it for instancing, and the instancing didn't work.

Also, why are you releasing the mesh in your render function? That makes no sense. You should release it when you are done with it (like when the program exits).

And, you can use FVF as long as your vertex buffer is in a format that can be represented by FVF codes AND the elements in the buffer are in the correct order. If you want to re-order them, or add elements it does not support (like tangents and binormals used for normal mapping), then you gotta use a declaration.

Quote:
Also, why are you releasing the mesh in your render function? That makes no sense. You should release it when you are done with it (like when the program exits).



The code I posted is a mish-mash of snippets from tutorials I've read. Sorry if it didn't make sense. But I think I understand now. So better code would be:

bool MyWorldObj::RenderMesh(LPD3DXMESH rmesh, int numMaterials, LPDIRECT3DTEXTURE9 meshTextures[], D3DMATERIAL9 meshMaterials[], int wire) {

    if (wire > 0) {
        g_pd3dDevice->SetRenderState(D3DRS_FILLMODE, D3DFILL_WIREFRAME);
    }
    for (DWORD i = 0; i < (DWORD)numMaterials; i++)
    {
        g_pd3dDevice->SetVertexShader( g_pVertexShader );
        g_pd3dDevice->SetMaterial( &meshMaterials[i] );
        g_pd3dDevice->SetTexture( 0, meshTextures[i] );
        g_pd3dDevice->SetPixelShader( g_pPixelShader );
        rmesh->DrawSubset( i );
        g_pd3dDevice->SetVertexShader( NULL );
        g_pd3dDevice->SetPixelShader( NULL );
    }
    return true;
}




And to have more control and flexibility with my vertex data in the vertex shader, it is best to avoid the constraints of the FVF format and just use a vertex declaration. I think I'm getting it. I also found this discussion on implementing the vertex declaration very informative.

OK, so understanding it and implementing it are two different things. I'm not sure what I'm doing wrong, but this code is throwing errors at runtime. So far, this is how I'm initializing my shader:

void MyWorldObj::initShader( void )
{
    D3DVERTEXELEMENT9 declaration[] =
    {
        { 0, 0,  D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
        { 0, 12, D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL,   0 },
        { 0, 24, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR,    0 },
        { 0, 40, D3DDECLTYPE_FLOAT2,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
        D3DDECL_END()
    };

    g_pd3dDevice->CreateVertexDeclaration(declaration, &g_pVertexDeclaration);

    HRESULT hr;
    LPD3DXBUFFER pCode;
    DWORD dwShaderFlags = 0;
    LPD3DXBUFFER pBufferErrors = NULL;

    // Compile the vertex shader from the file
    hr = D3DXCompileShaderFromFile( "vertex_shader.vsh", NULL, NULL, "main",
                                    "vs_1_1", dwShaderFlags, &pCode,
                                    &pBufferErrors, &g_pConstantTableVS );

    if( FAILED(hr) )
    {
        LPVOID pCompileErrors = pBufferErrors->GetBufferPointer();
        MessageBox(NULL, (const char*)pCompileErrors, "Vertex Shader Compile Error",
                   MB_OK|MB_ICONEXCLAMATION);
    }

    // Create the vertex shader
    g_pd3dDevice->CreateVertexShader( (DWORD*)pCode->GetBufferPointer(),
                                      &g_pVertexShader );
    pCode->Release();
}




And this is the HLSL vertex_shader.vsh file that is loaded by D3DXCompileShaderFromFile:

float4x4 worldViewProj;

struct VS_INPUT
{
    float3 position : POSITION0;
    float3 normal   : NORMAL0;
    float4 color    : COLOR0;
    float2 texcoord : TEXCOORD0;
};

struct VS_OUTPUT
{
    float4 hposition : POSITION0;
    float3 normal    : NORMAL0;
    float4 color     : COLOR0;
    float2 texcoord  : TEXCOORD0;
};

VS_OUTPUT main( VS_INPUT IN )
{
    VS_OUTPUT vOut;

    float4 v = float4( IN.position.x,
                       IN.position.y,
                       IN.position.z,
                       1.0f );

    vOut.hposition = mul( v, worldViewProj );
    vOut.normal = IN.normal;
    vOut.color = IN.color;
    vOut.texcoord = IN.texcoord;

    return vOut;
}


So far, as far as I understand, everything should work, and the C++ code does compile without errors. Unfortunately the D3DXCompileShaderFromFile HRESULT fails with the error "invalid output semantics 'NORMAL0'" and the VS.NET debugger breaks on g_pd3dDevice->CreateVertexShader. What am I missing here? Is the pixel shader necessary to run without error? From my understanding, this vertex shader is about the simplest that can be written in HLSL: it should just output everything input to it, plus transform world coordinates to screen coordinates, and thus take over the pipeline. Why can't I implement it? Ho-hum,

I do thank everyone who's given their time to this! As soon as I get this running, I'll be sure to post the working code so everyone can understand how to do this. There just aren't enough practical tutorials available for "putting it all together" when it comes to shaders. So, once again, thank you!

OK, so I found this discussion that states that pixel shaders cannot receive INPUT from the vertex shader's OUTPUT with the semantic "NORMAL", so I changed my code so that the .vsh file has two TEXCOORD semantics as suggested and BAM, the darn thing runs. Only, none of my meshes are being rendered! Oh, do I have the vertex shader blues! Here's my .vsh file now (some of the other names have changed as well during my effort to troubleshoot):

float4x4 worldViewProj;

struct VS_INPUT
{
    float3 iposition : POSITION;
    float3 inormal   : NORMAL;
    float4 icolor    : COLOR0;
    float2 itexcoord : TEXCOORD1;
};

struct VS_OUTPUT
{
    float4 hposition : POSITION;
    float3 hnormal   : TEXCOORD0;
    float4 hcolor    : COLOR0;
    float2 htexcoord : TEXCOORD1;
};

VS_OUTPUT main( VS_INPUT myIN )
{
    VS_OUTPUT vOut;

    float4 v = float4( myIN.iposition.x,
                       myIN.iposition.y,
                       myIN.iposition.z,
                       1.0f );

    vOut.hposition = mul( v, worldViewProj );
    vOut.hnormal = myIN.inormal;
    vOut.hcolor = myIN.icolor;
    vOut.htexcoord = myIN.itexcoord;

    return vOut;
}



[Edited by - mike_ix on July 7, 2006 8:02:08 PM]

Quote:
Original post by mike_ix
OK, so I found this discussion that states that pixel shaders cannot receive INPUT from the vertex shader's OUTPUT with the semantic "NORMAL", so I changed my code so that the .vsh file has two TEXCOORD semantics as suggested and BAM, the darn thing runs. Only, none of my meshes are being rendered! Oh, do I have the vertex shader blues! Here's my .vsh file now (some of the other names have changed as well during my effort to troubleshoot):

*** Source Snippet Removed ***


OK, so you changed the normal semantic to TEXCOORD0. Does your pixel shader know about this? Do you even use normals in the pixel shader? I'm not too sure, but I think you will also have to write a pixel shader to be able to see any geometry. Something like this might work:



sampler2D samp;

struct PS_OUTPUT
{
    float4 hcolor : COLOR0;
};

struct PS_INPUT
{
    float4 hposition : POSITION;
    float3 hnormal   : TEXCOORD0;
    float4 hcolor    : COLOR0;
    float2 htexcoord : TEXCOORD1;
};

PS_OUTPUT main( PS_INPUT myIN )
{
    PS_OUTPUT pOut;
    pOut.xyz = tex2D(s_2D, myIN.htexcoord);

    // forgot the w
    pOut.w = 1.0f;

    // when all else fails
    //pOut.x = 1.0f; // will output red color no matter what

    return pOut;
}





[Edited by - deathkrush on July 7, 2006 11:29:59 PM]

Quote:
Does your pixel shader know about this? Do you even use normals in the pixel shader? I'm not too sure, but I think you will also have to write a pixel shader to be able to see any geometry. Something like this might work:


Now we're getting to the crux of this discussion!!!


1) Does this mean I MUST ALSO include a pixel shader to render my mesh files?

2) WHERE ARE NORMALS CALCULATED? If I create a mesh from scratch, I have to calculate normals at some point. Currently I do this with D3DXComputeNormals, but when I mess with the FVF flags of the mesh to make them match my vertex declaration I get Debug errors. Should I be doing this in the shaders instead?

3) If the Pixel shader only outputs color, how are the normals handled? Where do they go?

Remember, I just want to replicate what the fixed pipeline does, but with shaders. In other words, I just want to see my meshes rendered. No fancy tricks just yet ;) I'm going to try to implement your pixel shader, and I'll let you know how it goes. Thanks for the help!

Quote:
Original post by mike_ix
1) Does this mean I MUST ALSO include a pixel shader to render my mesh files?


Well, if you already wrote a vertex shader, then pixel shaders are very easy. Don't worry about it.

Quote:
Original post by mike_ix
2) WHERE ARE NORMALS CALCULATED? If I create a mesh from scratch, I have to calculate normals at some point. Currently I do this with D3DXComputeNormals, but when I mess with the FVF flags of the mesh to make them match my vertex declaration I get Debug errors. Should I be doing this in the shaders instead?


Like I said earlier, you could have the normals stored in the .x file or you could calculate them at runtime using the D3DXComputeNormals() function or using any other function that does the job. The point is that the normals need to be calculated and passed into the vertex shader.

What you do with normals at that point is entirely up to you. You can implement any kind of lighting equation in the vertex shader, calculate the per-vertex color (using the normals and light info) and pass the color to the pixel shader. The only thing that the pixel shader needs to know is the per-vertex color so that it can interpolate and blend the color with the texture.

Quote:
Original post by mike_ix
3) If the Pixel shader only outputs color, how are the normals handled? Where do they go?


You either use the normals in the vertex shader to compute color which will give you per-vertex lighting or you use them in the pixel shader to compute color which gives per-pixel lighting.

Quote:
Original post by mike_ix
Remember, I just want to replicate what the fixed pipeline does, but with shaders. In other words, I just want to see my meshes rendered. No fancy tricks just yet ;) I'm going to try to implement your pixel shader, and I'll let you know how it goes. Thanks for the help!


Unfortunately when you start using shaders, you lose the fixed function pipeline and you have to implement it in the shader. You can take a look at the Nvidia website, they have a pretty huge shader that re-implements the fixed function pipeline. But if you just want your meshes rendered without any fancy stuff, the shaders are very short (no more than 3 lines of code for the main function).

Shouldn't your vertex declaration be:

D3DVERTEXELEMENT9 declaration[] =
{
{ 0, 0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
{ 0, 12, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL, 0 },
{ 0, 24, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR, 0 },
{ 0, 28, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
D3DDECL_END()
};

28 instead of 40. As far as I can tell, D3DCOLOR's size is only 4 bytes.

Quote:
Original post by Zee Man
Shouldn't your vertex declaration be:

D3DVERTEXELEMENT9 declaration[] =
{
{ 0, 0, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
{ 0, 12, D3DDECLTYPE_FLOAT3, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL, 0 },
{ 0, 24, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR, 0 },
{ 0, 28, D3DDECLTYPE_FLOAT2, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
D3DDECL_END()
};

28 instead of 40. As far as I can tell, D3DCOLOR's size is only 4 bytes.


I believe 40 is the right number. The offset is calculated as 4 * sizeof(float) = 16.

No. It is 4. A D3DCOLOR type is just like a D3DCOLOR in your code (4 packed bytes, or 1 DWORD). It gets converted to 4 floats before being sent to the shader, like the D3DXCOLOR format, where each channel is a float from 0.0 to 1.0. In the D3DCOLOR DWORD the RGBA values are 0-255 (0x00 to 0xff). So, if your D3DCOLOR is white it will be 0xffffffff, and before being sent to the shader it gets converted to the four floats 1.0, 1.0, 1.0, 1.0.

Quote:
I believe 40 is the right number. The offset is calculated as 4 * sizeof(float) = 16.


Yeah, I caught that last night. It's now 28.

Quote:
The point is that the normals need to be calculated and passed into the vertex shader.


This is exactly what I wanted to know! Thank you!

Quote:
You either use the normals in the vertex shader to compute color which will give you per-vertex lighting or you use them in the pixel shader to compute color which gives per-pixel lighting.


Bingo! So normal calculations are done before shaders, then used by the shaders to implement tricks.

Quote:
OK, so you changed the normal semantic to TEXCOORD0. Does your pixel shader know about this? Do you even use normals in the pixel shader? I'm not too sure, but I think you will also have to write a pixel shader to be able to see any geometry. Something like this might work:


I'm currently trying to implement the code you gave me, but D3DXCompileShaderFromFile fails with the error "error X3018: invalid subscript 'xyz'", and to be honest I'm not sure what this does yet:

pOut.xyz = tex2D(s_2D, myIN.htexcoord);

The struct pOut doesn't define an 'xyz'?

OK, I've gotten it to compile and run with no errors, but still, nothing renders! I have a background, and display, and camera, but my meshes are nowhere to be found!

This is how I'm initializing the shaders:

LPDIRECT3DVERTEXSHADER9      g_pVertexShader      = NULL;
LPDIRECT3DVERTEXDECLARATION9 g_pVertexDeclaration = NULL;
LPD3DXCONSTANTTABLE          g_pConstantTableVS   = NULL;

LPDIRECT3DPIXELSHADER9       g_pPixelShader       = NULL;
LPD3DXCONSTANTTABLE          g_pConstantTablePS   = NULL;

void MyWorldObj::initShader( void )
{
    D3DVERTEXELEMENT9 declaration[] =
    {
        { 0, 0,  D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
        { 0, 12, D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL,   0 },
        { 0, 24, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR,    0 },
        { 0, 28, D3DDECLTYPE_FLOAT2,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
        D3DDECL_END()
    };

    g_pd3dDevice->CreateVertexDeclaration(declaration, &g_pVertexDeclaration);

    HRESULT hr;
    LPD3DXBUFFER pCode;
    DWORD dwShaderFlags = 0;
    LPD3DXBUFFER pBufferErrors = NULL;

    // Compile the vertex shader from the file
    hr = D3DXCompileShaderFromFile( "vertex_shader.vsh", NULL, NULL, "main",
                                    "vs_1_1", dwShaderFlags, &pCode,
                                    &pBufferErrors, &g_pConstantTableVS );

    if( FAILED(hr) )
    {
        LPVOID pCompileErrors = pBufferErrors->GetBufferPointer();
        MessageBox(NULL, (const char*)pCompileErrors, "Vertex Shader Compile Error",
                   MB_OK|MB_ICONEXCLAMATION);
    }

    // Create the vertex shader
    g_pd3dDevice->CreateVertexShader( (DWORD*)pCode->GetBufferPointer(),
                                      &g_pVertexShader );
    pCode->Release();

    //
    // Create an HLSL-based pixel shader.
    //

    // Compile the pixel shader from the file
    hr = D3DXCompileShaderFromFile( "pixel_shader.psh", NULL, NULL, "main",
                                    "ps_1_1", dwShaderFlags, &pCode,
                                    &pBufferErrors, &g_pConstantTablePS );

    if( FAILED(hr) )
    {
        LPVOID pCompileErrors = pBufferErrors->GetBufferPointer();
        MessageBox(NULL, (const char*)pCompileErrors, "Pixel Shader Compile Error",
                   MB_OK|MB_ICONEXCLAMATION);
    }

    // Create the pixel shader
    g_pd3dDevice->CreatePixelShader( (DWORD*)pCode->GetBufferPointer(),
                                     &g_pPixelShader );
    pCode->Release();
}




This is my HLSL vertex shader (.vsh file). From my understanding, all this shader is doing is transforming 3D position data to 2D screen space, which is the bare minimum a vertex shader can do:
float4x4 worldViewProj;

struct VS_INPUT
{
    float3 iposition : POSITION;
    float3 inormal   : NORMAL;
    float4 icolor    : COLOR0;
    float2 itexcoord : TEXCOORD1;
};

struct VS_OUTPUT
{
    float4 hposition : POSITION;
    float3 hnormal   : TEXCOORD0;
    float4 hcolor    : COLOR0;
    float2 htexcoord : TEXCOORD1;
};

VS_OUTPUT main( VS_INPUT myIN )
{
    VS_OUTPUT vOut;

    float4 v = float4( myIN.iposition.x,
                       myIN.iposition.y,
                       myIN.iposition.z,
                       1.0f );

    vOut.hposition = mul( v, worldViewProj );
    vOut.hnormal = myIN.inormal;
    vOut.hcolor = myIN.icolor;
    vOut.htexcoord = myIN.itexcoord;

    return vOut;
}




This is my HLSL pixel shader (.psh file). From my understanding, all this shader does is take the output of the vertex shader (position, normal, color, and texcoord) and output the final color of the pixel by sampling the texture set by SetTexture in my render function, which is the bare minimum a pixel shader can do:
sampler2D samp;

struct PS_INPUT
{
    float4 hposition : POSITION;
    float3 hnormal   : TEXCOORD0;
    float4 hcolor    : COLOR0;
    float2 htexcoord : TEXCOORD1;
};

struct PS_OUTPUT
{
    float4 pcolor : COLOR0;
};

PS_OUTPUT main( PS_INPUT myIN )
{
    PS_OUTPUT pOut;

    float4 color = tex2D(samp, myIN.htexcoord);
    pOut.pcolor = color;

    return pOut;
}




And finally, this is my render function:
bool MyWorldObj::RenderMesh(LPD3DXMESH rmesh, int numMaterials,
                            LPDIRECT3DTEXTURE9 meshTextures[],
                            D3DMATERIAL9 meshMaterials[], int wire) {

    if (wire > 0) {
        g_pd3dDevice->SetRenderState(D3DRS_FILLMODE, D3DFILL_WIREFRAME);
    }

    for (DWORD i = 0; i < (DWORD)numMaterials; i++)
    {
        g_pd3dDevice->SetVertexShader( g_pVertexShader );
        g_pd3dDevice->SetMaterial( &meshMaterials[i] );
        g_pd3dDevice->SetTexture( 0, meshTextures[i] );
        g_pd3dDevice->SetPixelShader( g_pPixelShader );
        rmesh->DrawSubset( i );
        g_pd3dDevice->SetVertexShader( NULL );
        g_pd3dDevice->SetPixelShader( NULL );
    }

    return true;
}

All of this brings me back to my original question: Since this compiles and runs, but nothing renders, can I render an indexed mesh using DrawSubset AND shaders? Or do I HAVE to, create and load vertex and index buffers from my meshes, then draw them using DrawIndexedPrimitive? Or perhaps I'm missing something in the shaders?

[Edited by - mike_ix on July 8, 2006 1:38:23 PM]

Quote:
Original post by mike_ix

I'm currently trying to implement the code you gave me, but D3DXCompileShaderFromFile fails with the error "error X3018: invalid subscript 'xyz'", and to be honest I'm not sure what this does yet:

pOut.xyz = tex2D(s_2D, myIN.htexcoord);

The struct pOut doesn't define an 'xyz'?


Sorry, I meant

pOut.pcolor.xyz = tex2D(s_2D, myIN.htexcoord);
pOut.pcolor.w = 1.0f;

It just makes sure that alpha is 1.0f.

Quote:
Original post by mike_ix
OK, I've gotten it to compile and run with no errors, but still, nothing renders! I have a background, and display, and camera, but my meshes are nowhere to be found!


Shaders are not fun to debug. You could try a few tricks. What happens when you do this in the pixel shader:

pOut.pcolor.xyzw = 1.0f;

Similarly, you could short-circuit your vertex shader and make it use a hard-coded matrix, or, if all else fails, have it draw at (0, 0, 0) and see if you can find a tiny point in the middle of the screen. That way you at least know whether your shaders are working and the problem is in what gets passed to them.

EDIT: You could probably use the DrawSubset function, but I'm not sure if it works with custom shaders. I heard somewhere that it uses a fixed function pipeline shader with FVF. The reason why example code uses DrawIndexedPrimitive is because that way you get complete control over how your stuff is drawn and you don't have to rely on the magic of DrawSubset.

You are missing two things in the shader. You cannot use

g_pd3dDevice->SetMaterial(&meshMaterials);
g_pd3dDevice->SetTexture(0,meshTextures);

with shaders. You need to create a global variable for the texture in the shader and set it from your program. You also need to bind the sampler to the texture. If you want to use materials, you also need to create globals for them and then use them in your pixel shader.

There are good books out there on DirectX and/or HLSL. The new book from Frank Luna covers Direct3D and uses the effects framework for all rendering. The book Programming Vertex and Pixel Shaders by Engel covers HLSL well.

In shaders you gotta do all the work that the fixed function pipeline does for you. It would be too much to explain it all here. That is why I read books.

Quote:
Original post by DXnut
You are missing two things in the shader. You cannot use

g_pd3dDevice->SetMaterial(&meshMaterials);
g_pd3dDevice->SetTexture(0,meshTextures);


The world view projection matrix is probably missing as well.

D3DXCreateMesh and GetVertexBuffer
GameDev: Questions regarding D3DXLoadMeshFromX and LPD3DXMESH
Toymaker.info - Vertex Shaders
Problems Rendering Mesh Built from Scratch
http://www.xmission.com/~legalize/book/download/05-Modeling.pdf


Everyone has been so very helpful! I have made much progress in my understanding of both meshes and shaders. After doing a little research I found a few other posts about this same topic (links above), and have decided that I should give up trying to use DrawSubset and move to DrawIndexedPrimitive. I have also tried to remove all of my FVF code. I believe that once I have gained control using DrawIndexedPrimitive (but still using the fixed-function pipeline) it will be easier to implement shaders. Unfortunately, getting my meshes to render using this method has turned out to be one of my largest DirectX headaches ever. Below are links to screenshots of my problem. As you can see, when I use DrawIndexedPrimitive there are WAY too many vertices and they are "scrambled", for lack of a better word:

This is a screenshot of my meshes using DrawSubset

This is a screenshot of my meshes using DrawIndexedPrimitive




So, here is my vertex declaration, CUSTOMVERTEX and MESHOBJECT structs:

LPDIRECT3DVERTEXDECLARATION9 g_pVertexDeclaration = NULL;

D3DVERTEXELEMENT9 declaration[] =
{
    { 0, 0,  D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_POSITION, 0 },
    { 0, 12, D3DDECLTYPE_FLOAT3,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_NORMAL,   0 },
    { 0, 24, D3DDECLTYPE_D3DCOLOR, D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_COLOR,    0 },
    { 0, 28, D3DDECLTYPE_FLOAT2,   D3DDECLMETHOD_DEFAULT, D3DDECLUSAGE_TEXCOORD, 0 },
    D3DDECL_END()
};

struct CUSTOMVERTEX
{
    D3DXVECTOR3 position; // The 3-D position for the vertex.
    D3DXVECTOR3 normal;   // The surface normal for the vertex.
    DWORD       color;
    FLOAT       tu, tv;
};

struct MESHOBJECT {
    LPDIRECT3DTEXTURE9*     mText;
    D3DMATERIAL9*           mMat;
    LPD3DXMESH              Mesh;
    LPDIRECT3DINDEXBUFFER9  MeshIndexBuffer;
    LPDIRECT3DVERTEXBUFFER9 MeshVertexBuffer;
    DWORD numVerts;
    DWORD numFaces;
    DWORD numMaterials;
};




I then create the vertex declaration with this call:
g_pd3dDevice->CreateVertexDeclaration(declaration, &g_pVertexDeclaration);




Here is how I'm loading my meshes now; this is where most of the big changes have occurred. A lot of this code I adapted from the Volume Fog sample (DXSDK), which shows how to load and draw meshes with a vertex declaration, and from the Optimized Meshes sample, which shows how to load and draw a mesh using DrawIndexedPrimitive. The normal calculations are commented out because I'm not sure which method to use. The method that clones with an FVF works, but the other doesn't. But don't I want to eliminate FVF from my code? How do I get the other method to work? This is just a side issue; right now I'm more interested in getting the meshes to render correctly with DrawIndexedPrimitive.

HRESULT MyWorldObj::LoadXMesh(LPCSTR filename, MESHOBJECT *workMesh) {

LPD3DXMESH pMeshTemp;
meshMaterials = NULL;
meshTextures = NULL;
d3dxMaterials = NULL;
LPDIRECT3DVERTEXBUFFER9 pVertexBuffer = NULL;
LPDIRECT3DINDEXBUFFER9 pIndexBuffer = NULL;
D3DINDEXBUFFER_DESC IndexDesc;
CUSTOMVERTEX* pVert;
CUSTOMVERTEX* pSource;
void* pSourceI;
void* pDestI;
HRESULT hr;

if (FAILED(hr=D3DXLoadMeshFromX(filename, D3DXMESH_MANAGED,
g_pd3dDevice, &pAdjacencyBuffer,
&materialBuffer, NULL, &numMaterials,
&ppMesh ))) {
MessageBox(NULL, "Could not load .x file!", "Error!",
MB_ICONEXCLAMATION | MB_OK);
return hr;
}
hr = ppMesh->OptimizeInplace(
D3DXMESHOPT_VERTEXCACHE,
(DWORD*)pAdjacencyBuffer->GetBufferPointer(), NULL, NULL, NULL );
/*
if ( !(ppMesh->GetFVF() & D3DFVF_NORMAL) )
{
hr = ppMesh->CloneMeshFVF( ppMesh->GetOptions(), ppMesh->GetFVF() | D3DFVF_NORMAL,
g_pd3dDevice, &pMeshTemp );
ppMesh->Release();
ppMesh = pMeshTemp;
D3DXComputeNormals( ppMesh, NULL );
}
D3DVERTEXELEMENT9 aOldDecl[MAX_FVF_DECL_SIZE];
if( SUCCEEDED( ppMesh->GetDeclaration( aOldDecl ) ) )
{
for( UINT index = 0; index < D3DXGetDeclLength( aOldDecl ); ++index )
if( aOldDecl[index].Usage == D3DDECLUSAGE_NORMAL )
{
D3DXComputeNormals(ppMesh, NULL);
break;
}
}
*/


workMesh->numVerts = ppMesh->GetNumVertices();
workMesh->numFaces = ppMesh->GetNumFaces();

if( FAILED( hr = ppMesh->GetVertexBuffer( &pVertexBuffer ) ) ) {
MessageBox(NULL, "Could not get vertex buffer for .x file!", "Error!",
MB_ICONEXCLAMATION | MB_OK);
}

if( FAILED( hr = g_pd3dDevice->CreateVertexBuffer( D3DXGetDeclVertexSize(declaration,0) * workMesh->numVerts,
0, 0, D3DPOOL_SYSTEMMEM,
&(workMesh->MeshVertexBuffer), NULL ))) {
MessageBox(NULL, "Could not create vertex buffer for .x file!", "Error!",
MB_ICONEXCLAMATION | MB_OK);
}
if( FAILED(hr = workMesh->MeshVertexBuffer->Lock( 0, 0, (void**)(&pVert), 0 ) ) ) {
MessageBox(NULL, "Could not lock vertex buffer for .x file!", "Error!",
MB_ICONEXCLAMATION | MB_OK);
}
if( FAILED(hr = pVertexBuffer->Lock(0, D3DXGetDeclVertexSize(declaration,0) * workMesh->numVerts, (void**)&pSource, 0) ) ) {
MessageBox(NULL, "Could not lock source vertex buffer for .x file!", "Error!",
MB_ICONEXCLAMATION | MB_OK);
}
memcpy( pVert, pSource, D3DXGetDeclVertexSize(declaration,0) * workMesh->numVerts );
pVertexBuffer->Unlock();
workMesh->MeshVertexBuffer->Unlock();

if( FAILED( hr = ppMesh->GetIndexBuffer( &pIndexBuffer ) ) ) {
MessageBox(NULL, "Could not get index buffer for .x file!", "Error!",
MB_ICONEXCLAMATION | MB_OK);
}

pIndexBuffer->GetDesc(&IndexDesc);

if( FAILED( hr = g_pd3dDevice->CreateIndexBuffer(IndexDesc.Size, D3DUSAGE_WRITEONLY,
IndexDesc.Format, D3DPOOL_MANAGED, &(workMesh->MeshIndexBuffer), NULL ) ) )
{
MessageBox(NULL, "Could not create index buffer for .x file!", "Error!",
MB_ICONEXCLAMATION | MB_OK);
}
pIndexBuffer->Lock(0, IndexDesc.Size, &pSourceI, 0);
workMesh->MeshIndexBuffer->Lock(0, IndexDesc.Size, &pDestI, 0);
memcpy(pDestI, pSourceI, IndexDesc.Size);

pIndexBuffer->Unlock();
workMesh->MeshIndexBuffer->Unlock();

d3dxMaterials = (D3DXMATERIAL*)materialBuffer->GetBufferPointer();
meshMaterials = new D3DMATERIAL9[numMaterials];
meshTextures = new LPDIRECT3DTEXTURE9[numMaterials];

for (DWORD i=0; i<numMaterials; i++)
{
    meshMaterials[i] = d3dxMaterials[i].MatD3D;
    meshMaterials[i].Ambient = meshMaterials[i].Diffuse;
    meshTextures[i] = NULL;
    if (d3dxMaterials[i].pTextureFilename) {
        meshTextures[i] = theWorld->loadTexture(d3dxMaterials[i].pTextureFilename);
    }
}
materialBuffer->Release();

workMesh->numMaterials = numMaterials;
workMesh->Mesh = ppMesh;
workMesh->mText = meshTextures;  // hand over the arrays allocated above
workMesh->mMat = meshMaterials;

// ppMesh lives on in workMesh->Mesh, so it is not released here.
return hr;

}




Finally, here is how I'm rendering. If I comment out the DrawIndexedPrimitive call and put DrawSubset back in, everything works fine; the other way gives me scrambled meshes.

bool MyWorldObj::RenderMesh(MESHOBJECT *rmesh) {

meshMaterials = rmesh->mMat;
meshTextures = rmesh->mText;
//g_pd3dDevice->SetRenderState(D3DRS_CULLMODE, D3DCULL_NONE);
if (theOptions->wireframe > 0) {
    g_pd3dDevice->SetRenderState(D3DRS_FILLMODE, D3DFILL_WIREFRAME);
}
g_pd3dDevice->SetVertexDeclaration( g_pVertexDeclaration );
g_pd3dDevice->SetStreamSource( 0,rmesh->MeshVertexBuffer, 0,
D3DXGetDeclVertexSize(declaration,0) );
g_pd3dDevice->SetIndices(rmesh->MeshIndexBuffer);

for ( DWORD i = 0; i < rmesh->numMaterials; i++ )
{

//g_pd3dDevice->SetMaterial(&meshMaterials[i]);
//g_pd3dDevice->SetTexture(0, meshTextures[i]);

g_pd3dDevice->DrawIndexedPrimitive( D3DPT_TRIANGLELIST, 0, 0,
rmesh->numVerts, 0, rmesh->numFaces );

//rmesh->Mesh->DrawSubset( i );

}
return true;
}




I assume my indices are getting messed up, or perhaps something isn't being released? Where are all these extra vertices coming from? I've also tried rendering by getting the attribute table and driving my for loop directly with the attribute IDs, but I get the same results.


Quote:
You are missing two things in the shader. You cannot use

g_pd3dDevice->SetMaterial(&meshMaterials[i]);
g_pd3dDevice->SetTexture(0, meshTextures[i]);

with shaders. You need to create a global variable for the texture in the shader and set it from your program. You also need to bind the sampler to the texture. If you want to use materials, you also need to create globals for them and then use them in your pixel shader.


Thanks! This is good to know. As you can see from my screenshots (above), I've commented those lines out for now and I'm just trying to render in wireframe mode. How do I set the global variable? With SetVertexShaderConstantF?
