Problem with shader; vertex input is wrong.



I seem to have an issue with my shader code. I am trying to work with animated skeletal meshes and ran into trouble, so I changed my shader's vertex function to work like my normal shader, just with the original inputs. My simple cube is displaying completely wrong.

cbuffer MatrixBuffer
{
    matrix world;
    matrix view;
    matrix projection;
};

cbuffer SkinnedBuffer
{
    float4x4 BoneTransforms[96];
};

struct VertexInput
{
    float3 position    : POSITION;
    float2 tex         : TEXCOORD0;
    float3 normal      : NORMAL;
    float3 tangent     : TANGENT;
    float3 binormal    : BINORMAL;
    float3 weights     : BLENDWEIGHT;
    uint4  boneIndices : BLENDINDICES;
};

struct VertexOutput
{
    float4 position : SV_POSITION;
    float2 tex      : TEXCOORD0;
    float3 normal   : NORMAL;
    float3 tangent  : TANGENT;
    float3 binormal : BINORMAL;
};

VertexOutput main( VertexInput input )
{
    VertexOutput output;
    float4 pos = float4(input.position.xyz, 1.0f);

    output.position = mul(pos, world);
    output.position = mul(output.position, view);
    output.position = mul(output.position, projection);

    // store the tex coords
    output.tex = input.tex;

    // transform the normal by the world matrix only, then normalize
    output.normal = normalize(mul(input.normal, (float3x3)world));

    // transform the tangent
    output.tangent = normalize(mul(input.tangent, (float3x3)world));

    // transform the binormal
    output.binormal = normalize(mul(input.binormal, (float3x3)world));

    return output;
}



And here is the shader initialization function:

struct ShaderBuffer
{
    unsigned int Size;
    void* Buffer;
    ShaderBuffer() { Size = 0; Buffer = 0; }
};

// input layout intended to match the HLSL VertexInput struct
D3D11_INPUT_ELEMENT_DESC polygonLayout[7] =
{
    {"POSITION",     0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,  D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"TEXCOORD",     0, DXGI_FORMAT_R32G32_FLOAT,    0, 16, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"NORMAL",       0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 24, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"TANGENT",      0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 36, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"BINORMAL",     0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 48, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"BLENDWEIGHT",  0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 60, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"BLENDINDICES", 0, DXGI_FORMAT_R8G8B8A8_UINT,   0, 72, D3D11_INPUT_PER_VERTEX_DATA, 0}
};
unsigned int numElements;

D3D11_SAMPLER_DESC samplerDesc;
ID3D11Device* device = graphics->GetDevice();

// (vertex and pixel shader creation elided from the post; each failure
// path ends in "return false;")

numElements = sizeof(polygonLayout) / sizeof(polygonLayout[0]);

// create the vertex input layout
// (CreateInputLayout call elided from the post; returns false on failure)

m_matrixBuffer.Initialize(device);
m_lightBuffer.Initialize(device);
m_skinnedBuffer.Initialize(device);

// Create a texture sampler state description.
samplerDesc.Filter = D3D11_FILTER_MIN_MAG_MIP_LINEAR;
// Note: the address modes were never set in the original post, leaving
// them uninitialized; they must be filled in before CreateSamplerState.
samplerDesc.AddressU = D3D11_TEXTURE_ADDRESS_WRAP;
samplerDesc.AddressV = D3D11_TEXTURE_ADDRESS_WRAP;
samplerDesc.AddressW = D3D11_TEXTURE_ADDRESS_WRAP;
samplerDesc.MipLODBias = 0.0f;
samplerDesc.MaxAnisotropy = 1;
samplerDesc.ComparisonFunc = D3D11_COMPARISON_ALWAYS;
samplerDesc.BorderColor[0] = 0;
samplerDesc.BorderColor[1] = 0;
samplerDesc.BorderColor[2] = 0;
samplerDesc.BorderColor[3] = 0;
samplerDesc.MinLOD = 0;
samplerDesc.MaxLOD = D3D11_FLOAT32_MAX;

if(FAILED(device->CreateSamplerState(&samplerDesc, &m_samplerState)))
    return false;

return true;


I think it is an alignment problem. When I debug it in Visual Studio 2012 and check the vertex shader input, the values are way off.

Let me know if there is something I am missing here...

Thanks

EDIT:  Also, my buffer initialization code looks like this:

// Make the constant buffer
D3D11_BUFFER_DESC desc;
desc.Usage = D3D11_USAGE_DYNAMIC;
desc.BindFlags = D3D11_BIND_CONSTANT_BUFFER;
desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
desc.MiscFlags = 0;
desc.ByteWidth = sizeof(T);
desc.StructureByteStride = 0;

device->CreateBuffer(&desc, 0, &m_Buffer);



Edit2: My Vertex Structure

struct SkinnedVertex
{
    DirectX::XMFLOAT3 position;
    DirectX::XMFLOAT2 texture;
    DirectX::XMFLOAT3 normal;
    DirectX::XMFLOAT3 tangent;
    DirectX::XMFLOAT3 binormal;
    DirectX::XMFLOAT3 weights;
    unsigned char boneIndices[4];
};

Edited by Alriightyman


A quick look:

D3D11_INPUT_ELEMENT_DESC polygonLayout[7] =    {
{"POSITION",     0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,  D3D11_INPUT_PER_VERTEX_DATA, 0},
{"TEXCOORD",     0, DXGI_FORMAT_R32G32_FLOAT,    0, 16, D3D11_INPUT_PER_VERTEX_DATA, 0},  // should be 12

The POSITION format is 12 bytes, but you have TEXCOORD offset at 16 bytes.

Rather than specifying each offset, use D3D11_APPEND_ALIGNED_ELEMENT instead. See the docs for D3D11_INPUT_ELEMENT_DESC.

I.e.,

D3D11_INPUT_ELEMENT_DESC polygonLayout[7] =    {
{"POSITION",     0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,  D3D11_INPUT_PER_VERTEX_DATA, 0},
{"TEXCOORD",     0, DXGI_FORMAT_R32G32_FLOAT,    0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0},
{"NORMAL",       0, DXGI_FORMAT_R32G32B32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0},
{"TANGENT",      0, DXGI_FORMAT_R32G32B32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0},

etc.

As you will find, if you don't use D3D11_APPEND_ALIGNED_ELEMENT, then when you change the TEXCOORD offset to 12 you have to change all the offsets after it as well.

Edited by Buckeye


Thanks, but no change. I have changed those offsets a lot, both manually and with D3D11_APPEND_ALIGNED_ELEMENT. When I use D3D11_APPEND_ALIGNED_ELEMENT, though, the graphics debugger shows the input layout reversed for some reason: BLENDINDICES is now at the top, and POSITION is at the bottom.


BLENDINDICES are now at the top, and POSITION is at the bottom.

Top and bottom of what?


https://www.dropbox.com/s/ij91graql0f0x2g/Input.PNG?dl=0

Here, in the debugger.

As you can see, in vertex 22, BLENDINDICES's fourth value is 65, and I only have 2 bones.  It is supposed to be zero.

But here it is setup right:

https://www.dropbox.com/s/l38bxclrolakbml/inputlayout.PNG?dl=0

Here is my vertex buffer, if it helps.

https://www.dropbox.com/s/b606cgmp6b9mm7r/vertexBuffer.PNG?dl=0

Edited by Alriightyman


I don't see a vertex 4 in the image - just vertex 21 and 22. Also, when I use RenderDoc for a similarly constructed input assembly (using 0 for the first element, and D3D11_APPEND_ALIGNED for the remainder), it appears as expected.

If you use offsets instead of APPEND_ALIGNED, does the debugger show a different order? If so, then post the code for both versions of your input layout.

If not, perhaps the list you show reflects data as streamed in. In which case, BLENDINDICES is the last value in the stream and appears at the top as a result. Don't know what debugger you're using, but perhaps, if you state what you're using, others can comment.

Also, when you use APPEND_ALIGNED, do you set the position offset to 0, and then use APPEND_ALIGNED for all the other elements?

in vertex 4, BLENDINDICES's fourth value is 193, and I only have 2 bones. It is supposed to be zero

Do you explicitly set it to 0 when you construct the vertex? And do you explicitly set all 4 blend indices for every vertex?

Here is my vertex buffer, if it helps.

Not with the blend indices, as it appears that image is just an interpretation of every 4 bytes as a float. Blend indices are per-byte.

And the bottom line: are the values as accessed in the shader itself correct? That's what gets rendered.

Edited by Buckeye


I don't see a vertex 4 in the image - just vertex 21 and 22. Also, when I use RenderDoc for a similarly constructed input assembly (using 0 for the first element, and D3D11_APPEND_ALIGNED for the remainder), it appears as expected.

If you use offsets instead of APPEND_ALIGNED, does the debugger show a different order? If so, then post the code for both versions of your input layout.

Yes, I get a different order.

Here are my layout inputs:

D3D11_INPUT_ELEMENT_DESC polygonLayout[7] =
{
    {"POSITION",     0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,                            D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"TEXCOORD",     0, DXGI_FORMAT_R32G32_FLOAT,    0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"NORMAL",       0, DXGI_FORMAT_R32G32B32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"TANGENT",      0, DXGI_FORMAT_R32G32B32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"BINORMAL",     0, DXGI_FORMAT_R32G32B32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"BLENDWEIGHT",  0, DXGI_FORMAT_R32G32B32_FLOAT, 0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"BLENDINDICES", 0, DXGI_FORMAT_R8G8B8A8_UINT,   0, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_VERTEX_DATA, 0}
};



Or this way:

D3D11_INPUT_ELEMENT_DESC polygonLayout[7] =
{
    {"POSITION",     0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,  D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"TEXCOORD",     0, DXGI_FORMAT_R32G32_FLOAT,    0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"NORMAL",       0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 20, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"TANGENT",      0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 32, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"BINORMAL",     0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 44, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"BLENDWEIGHT",  0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 56, D3D11_INPUT_PER_VERTEX_DATA, 0},
    {"BLENDINDICES", 0, DXGI_FORMAT_R8G8B8A8_UINT,   0, 68, D3D11_INPUT_PER_VERTEX_DATA, 0}
};


Here's the other layout like before:

https://www.dropbox.com/s/efrmx6wlge3uwh0/input2.PNG?dl=0

If not, perhaps the list you show reflects data as streamed in. In which case, BLENDINDICES is the last value in the stream and appears at the top as a result. Don't know what debugger you're using, but perhaps, if you state what you're using, others can comment.

I am using Visual Studio 2012's Graphics Debugger.

Also, when you use APPEND_ALIGNED, do you set the position offset to 0, and then use APPEND_ALIGNED for all the other elements?

Yes, see above.

Do you explicitly set it to 0 when you construct the vertex? And do you explicitly set all 4 blend indices for every vertex?

They are set in the model file I created.  It's a text file, so I can look up the data.

Not with the blend indices, as it appears that image is just an interpretation of every 4 bytes as a float. Blend indices are per-byte.

And the bottom line: are the values as accessed in the shader itself correct? That's what gets rendered.

This is what gets rendered; I rotated it and zoomed in a bit.

https://www.dropbox.com/s/xdxw690o7u4uj1g/ingame.PNG?dl=0


They are set in the model file I created. It's a text file, so I can look up the data.

Do you explicitly set it to 0 when you construct the vertex? And do you explicitly set all 4 blend indices for every vertex?

I.e., if the file only specifies 2 indices, do you set just 2, or do you set all 4?

This is what get Rendered

You've already established that something doesn't appear as you expect.   So I asked whether you've examined the actual values in the shader.

The process of debugging is first to find where in your program either the code is incorrect, or the data is incorrect, or both. You may want to consider following the data. Consider: it appears the shader output is incorrect. So, the shader code is incorrect, the data coming into the shader is incorrect, or both.

After you've stared at your code and determined it appears to be correct, you'll have to start examining actual values during program execution. You could use the graphics debugger to examine actual register values during execution of the shader. However, you seem to think the blend indices values are somehow incorrect, so it may be a lot faster and easier to set a breakpoint somewhere in your code.

For instance, see my comment above with regard to setting the blend indices. Check the text file if you want, but verify (don't guess, don't assume) by examining actual values whether you actually load correct values into the buffer that you send to the shader. IF the shader uses all 4 blend index values, verify that all 4 blend indices are actually getting set correctly. [ EDIT: In addition, the blend weights (actually, all the data) must be correct. ] Whether you're creating a vertex buffer with data, or Map it in later, it's likely to be much easier if you examine the actual values of the data which you load into the buffer.

Edited by Buckeye


All data within the vertex struct is assigned a value; I have checked. The input to the vertex shader is wrong. I HAVE been debugging, and I have been stuck on this for some time. I'm not new to programming; I am a semester away from my BS in Computer Science. I just haven't really debugged graphics before, and I'm still new to shaders.

Also, it's not just the blend indices that are incorrect; I'm sorry if I made it seem those were the only problem. The first vertex is correct. After that, they are all offset incorrectly.

Here is the first vertex of the first triangle. The input is correct according to the vertex buffer and the C++ vertex structure.

position = x = -10.000000000, y = 10.000000000, z = -10.000000000, w = NaN
weights = x = 0.500000000, y = 0.500000000, z = 0.000000000
tex = x = 0.375000000, y = 0.750000000
normal = x = 0.000000000, y = 0.000000000, z = -1.000000000
tangent = x = 1.000000000, y = 0.000000000, z = 0.000000000
binormal = x = 0.000000000, y = 1.000000000, z = 0.000000000


Now, this is the second vertex of the same triangle.

position = x = 0.000000000, y = 1.000000000, z = 0.000000000, w = NaN
weights = x = 10.000000000, y = -10.00000000, z = -10.000000000
tex = x = 0.821455000, y = 0.178545000
normal = x = 0.000000000, y = 0.625000000, z = 1.000000000
tangent = x = 0.000000000, y = 0.000000000, z = -1.000000000
binormal = x = 1.000000000, y = 0.000000000, z = 0.000000000


The problem here is that position should be what is shown for the weights, weights should be what tex is plus the x value of normal, etc. It looks as if the input position copied the values from the binormal of the previous vertex, and that is what has me stumped.

Oddly, if I leave the shader and its semantics alone and change the vertex structure to not include blend weights or blend indices, my cube loads just fine.


The problem here is that position should be what is shown for the weights, weights should be what tex is ... change the vertex structure to not include blendweights or blendindices ...

That implies the stride of the vertex buffer in your IASetVertexBuffers call is shorter by the size of the blendweights and blendindices elements. Are you setting the stride = sizeof( SkinnedVertex ) ?

EDIT: That is, it seems you may be calling:

context->IASetVertexBuffers( 0, 1, &myVertexBuffer, &strideOf_Unskinned_MeshVertex, &offset );
// rather than
context->IASetVertexBuffers( 0, 1, &myVertexBuffer, &strideOf_Skinned_MeshVertex, &offset );


where your unskinned mesh vertex structure is identical to the skinned mesh vertex structure, EXCEPT - no blend indices/weights.

I HAVE been debugging, and I have been stuck on this for some time. I'm not new to programming; I am a semester away from my BS in Computer Science. I haven't really debugged graphics before, and am still new to shaders.

Yeah, my journal entry assumes an equivalence between "programming" and "debugging." It isn't intended to imply you're new to programming; I should change it to emphasize familiarity with debugging techniques. Unfortunately, techniques for debugging programs (which experienced programmers know will have to be done sometime) aren't taught as well, if at all, as programming techniques are. Also, I acknowledge you've been "debugging." I'm suggesting a more structured approach.

Note, "following the data," even with your unfamiliarity with shaders, should be taking you down the same path. You believe the shader input is incorrect, and likely an alignment issue. Assume you're correct. Therefore the problem must be before the values are sent to the shader. So, look at the code and values leading up to your draw call (setting up the input assembly, setting the shaders, setting the constant buffers, etc.).  The code may be correct, but somewhere the data/values are incorrect.

Edited by Buckeye


It was the stride. That's what I was missing. I set the vertex buffer somewhere else in my code and didn't even notice I hadn't changed the stride to reflect the skinned vertex.

Yeah, my journal entry assumes an equivalence between "programming" and "debugging." It isn't intended to imply you're new to programming. I should change that to emphasize familiarity with debugging techniques. Unfortunately, it seems that techniques to debug programs (which experienced programmers know will have to be done sometime) aren't taught as well (if at all,) as are programming techniques. Also, I acknowledge you've been "debugging." I'm suggesting a more structured approach.

I took a class on debugging. Most of what it taught were things I had already learned myself.

Thanks for the help, now I can try to implement skinned meshes! Wish me luck!


Thanks for posting the resolution.

Wish me luck!

You've got it.