Debugging a blank view, does order matter?


I'm trying to port some OpenGL code to DirectX 11, and I'm getting a blank screen whose cause I can't seem to find. I've pared the program back to a minimal case, and I found the Graphics Debugger in Visual Studio, but I can't seem to get anywhere with it.

I also posted a Stack Overflow question about it, but I haven't solved this yet. At the risk of posting an XY-problem question, here goes.

Is there a proper order in which we have to make the following API calls in "draw_frame()" for a single object:

First, clear the screen:

- ClearRenderTargetView

- ClearDepthStencilView

Then:

- IASetInputLayout

- IASetIndexBuffer

- IASetPrimitiveTopology

- UpdateSubresource <-- copying the transformation matrix into the constant buffer here

- VSSetShader

- VSSetConstantBuffers <-- binding the constant buffer that holds the transformation matrix here

- PSSetShader

- OMSetRenderTargets

- DrawIndexed

Then swap the back buffer to the front:

- Present

I'm trying to get my program working and to understand the pieces and parts that go into a DX11 program. In the process I'm also trying to learn what debugging tools are available and how to use them, so any advice in that area would be helpful as well.


There's another amazing debugger called RenderDoc that may be of use.

Also, make sure you're creating your device with the D3D11_CREATE_DEVICE_DEBUG flag set during development/testing -- this makes D3D emit useful warnings to the debug output window if you're doing anything wrong. Also check the HRESULT of every call so that no failures are silently ignored.
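For example, a minimal sketch of device creation with the debug layer enabled (swapDesc, swapChain, device, and context stand in for whatever you already have):

UINT flags = 0;
#if defined(_DEBUG)
flags |= D3D11_CREATE_DEVICE_DEBUG;  // enable the debug layer in debug builds
#endif

HRESULT hr = D3D11CreateDeviceAndSwapChain(
    NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, flags,
    NULL, 0, D3D11_SDK_VERSION,
    &swapDesc, &swapChain, &device, NULL, &context );
if( FAILED( hr ) )
    return hr;  // don't let failed HRESULTs pass silently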

The order of your "Then:" calls doesn't matter, except that all of them must come before the Draw call.

e.g.

The IA calls can occur in any order - these are setting different values for vertex input assembly.

The VSSetConstantBuffers call could come before OR after the VSSetShader call -- the device context has a certain number of VS-constant-buffer slots. When changing the shader program, the slots aren't affected.

OMSetRenderTargets could happen at any time; it just specifies where subsequent draw calls will render to.

UpdateSubresource could come before/after VSSetConstantBuffers. The former call memcpy's bytes into a buffer object. The latter places a pointer to a buffer object into a 'slot', for use by a shader program.
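A rough sketch of that pairing, assuming cbTransform is your ID3D11Buffer and cbData is the CPU-side struct holding the matrix (the two calls work in either order, as long as both happen before the draw):

// Copy the matrix bytes into the buffer resource...
context->UpdateSubresource( cbTransform, 0, NULL, &cbData, 0, 0 );

// ...and point VS constant buffer slot 0 at that resource.
context->VSSetConstantBuffers( 0, 1, &cbTransform );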

A second vote for RenderDoc, it really is a superb tool to have in your arsenal for graphics debugging.

Regarding your code: have you set the viewport correctly via RSSetViewports? That's something I've forgotten frequently over the years.
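Something along these lines, assuming width and height match your back buffer:

D3D11_VIEWPORT vp;
vp.TopLeftX = 0.0f;
vp.TopLeftY = 0.0f;
vp.Width    = (FLOAT)width;   // back buffer width
vp.Height   = (FLOAT)height;  // back buffer height
vp.MinDepth = 0.0f;
vp.MaxDepth = 1.0f;
context->RSSetViewports( 1, &vp );

With no viewport set, everything gets clipped away and you see only the clear colour.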



I'm trying to port some OpenGL code to DirectX11

As the order of operations you describe seems correct, there are a couple more things you should check, as they may differ from your OpenGL experience:

1. Commonly on the CPU side, matrices are row-major and matrix multiplication is left-to-right. E.g., world * view * projection. Another example: orientation = scale * rotate * translate.

2. Just to confuse things, when copying those matrices to buffers CPU-side for subresource updating, they should be transposed (to column-major).

3. Just to further confuse things, if you don't set explicit compiler flags and you use the common HLSL conventions, your shaders should still use left-to-right order for matrix multiplication.

E.g.,


// CPU-side
cbuffer.World = XMMatrixTranspose( world );
cbuffer.ViewProj = XMMatrixTranspose( view * projection );
context->UpdateSubresource( ...gpuBuffer, ..., &cbuffer .. );

// GPU-side
output.pos = mul( input.pos, World );
output.pos = mul( output.pos, ViewProj );


There are ways to change those defaults, but, as you will likely see things described as in the examples above, you may want to leave that for a later time.


Thank you for your ideas so far:

- I'm glad to hear that the order doesn't matter and looks good as is

- I checked out RenderDoc. (more below)

- In the Visual Studio Graphics tool's debugger, I sent in an identity matrix and the result looks correct. Is that debugger reliable? (See the pic in my Stack Overflow post.)

- One bit of annoyance is that I'm using pure C, and the DirectXMath library is C++ only. I'm not sure what to do about that except compile the D3D11 part with the C++ compiler. I keep wondering about that "align" keyword they use in the headers; part of me is annoyed about this because I thought a float was an IEEE thing?

In RenderDoc, I see some stuff that could be my problem:

- In the Event Browser, I find the DrawIndexed call, click it, and show the mesh output.

In the mesh output, all of the vertices are identical. The first row looks right. The rest are copies of the first row. So somehow I'm not sending in the full list of points?

- When I go to "Buffer Contents" for that buffer, it's got a bunch of pretty large numbers.

Am I looking at the float data type encoded as an int? (thinking about IEEE float packing format)

- Under Pipeline State, I click on InputAssembler:

- The slot called "index" -> click Go -> doesn't do anything.

- Slot 0 -- is that my transformation matrix? -> click Go -> shows those weird integers, like 3212836864.

In the mesh view, if the input vertices are all identical, that means something is likely wrong with your index buffer setup. The first column shows the vertex ID, i.e. linearly increasing from 0. The second column shows the index from the index buffer, which should vary (e.g. for a triangle list it might be 0, 1, 2, 1, 2, 3, 4, 5, 1 or whatever). Double check your indices are as you expect them to be.

If the output vertices are all the same but the input vertices are OK, it's something in your vertex shader, so you could try right clicking -> debug selected vertex and stepping through two different vertices side by side to see what's wrong.

The raw "buffer contents" view is for when you have just a chunk of memory that you'd like to view in an arbitrary format, and the text box near the bottom lets you enter the format - by default it is probably showing the data as int4s (possibly in hex depending on the version you're using). You could always manually enter your input layout, e.g. "float3 pos; float2 texcoord;" and see it that way to verify that the vertex buffer is correct, but if you're just looking at mesh data the mesh view should show you everything you need as it interprets the vertex buffers and input layouts as D3D11 will.

If clicking Go next to the index slot in the InputAssembler doesn't do anything, that suggests there isn't an index buffer bound somehow. It should open up the "buffer contents" view for the index buffer itself. Is the slot highlighted in red? If so, then there isn't an index buffer set. I've noticed it's not as clear as it could be (it says 'buffer 0' and suggests it's 1 byte long); I'll fix that.
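For reference, a bare-bones index buffer setup looks roughly like this (indices and indexCount are placeholders, and I'm assuming 16-bit indices):

D3D11_BUFFER_DESC ibDesc = { 0 };
ibDesc.Usage     = D3D11_USAGE_DEFAULT;
ibDesc.ByteWidth = sizeof( unsigned short ) * indexCount;
ibDesc.BindFlags = D3D11_BIND_INDEX_BUFFER;

D3D11_SUBRESOURCE_DATA ibData = { 0 };
ibData.pSysMem = indices;

HRESULT hr = device->CreateBuffer( &ibDesc, &ibData, &indexBuffer );
// check hr here

// The format must match the index width (R16_UINT vs R32_UINT).
context->IASetIndexBuffer( indexBuffer, DXGI_FORMAT_R16_UINT, 0 );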

The numbered slots are your vertex buffers; vertex shader data is on the vertex shader tab, under constant buffers. The vertex inputs come from the elements in your input layout, and each element can point to a vertex buffer with a given offset. The concept is fairly analogous to modern OpenGL's vertex attributes and vertex buffers (where attributes have an offset, format, and binding to a vertex buffer slot, and vertex buffer slots bind the actual buffer as well as its stride).
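To illustrate the analogy, a sketch of a two-element input layout (float3 position + float2 texcoord, both in vertex buffer slot 0; vsBytecode is the compiled vertex shader blob):

D3D11_INPUT_ELEMENT_DESC layout[] =
{
    // semantic, index, format, input slot, byte offset, class, step rate
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,  D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT,    0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};
device->CreateInputLayout( layout, 2, vsBytecode, vsBytecodeSize, &inputLayout );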

Still not working yet. The VS Output Preview looks off. It's showing one face of a cube without any shading.

That was a huge help. Thanks, baldurk. (Good work on RenderDoc, BTW.) I figured out "buffer contents" too, and I wondered if there's a way to interrogate the buffer for its format. In the Mesh Output, on VS Input, the second column was blank. I tracked it down, and now I have my indices coming across and the vertices are correct. The VS Input preview looks correct now, and the VS Output grid of vertices looks right too.

I think what I'm trying to track down now is what's causing the VS Output Preview to show an unshaded cube face. Could it be my shader? Can anyone see anything wrong with my shader?


cbuffer cbTransform : register( b0 )
{
    matrix matWorldViewProj;
};


struct VS_INPUT
{
    float3 Position : POSITION0;
    float2 TexCoord : TEXCOORD0;
    float3 Normal   : NORMAL;
    float4 Color    : TEXCOORD1;
};


struct VS_OUTPUT
{
    float4 Position : SV_POSITION;
    float2 TexCoord : TEXCOORD0;
    float3 Normal   : NORMAL;
    float4 Color    : COLOR0;
};


VS_OUTPUT vs_main( VS_INPUT Input )
{
    VS_OUTPUT Output;
    Output.Position = mul( float4( Input.Position, 1 ), matWorldViewProj );
    Output.TexCoord = Input.TexCoord;
    Output.Normal   = mul( Input.Normal, (float3x3)matWorldViewProj );
    Output.Color    = Input.Color;
    return Output;
}


float4 ps_main( VS_OUTPUT Input ) : SV_TARGET
{
    return float4( 0.2f, 0.2f, 0.2f, 1.0f );
}

The shader seems fine to me. The only part of it that will actually have any effect is the position calculation, as the pixel shader isn't using any of the other inputs.

If you say that the grid of vertices seems fine, what exactly is showing as 'unshaded' in the mesh view? By default it's just a wireframe view of the triangles, so the only thing I can think you mean is that you have 'flat shaded' selected on the output and one face is showing up black -- that would imply the winding order is incorrect on that face (meaning the normal is calculated backwards).
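If winding order is the culprit, remember OpenGL treats counter-clockwise triangles as front-facing by default, while D3D11 defaults to clockwise. You can either reverse your index order or tell the rasterizer to match GL; a sketch of the latter:

D3D11_RASTERIZER_DESC rsDesc = { 0 };
rsDesc.FillMode              = D3D11_FILL_SOLID;
rsDesc.CullMode              = D3D11_CULL_BACK;
rsDesc.FrontCounterClockwise = TRUE;   // match OpenGL's CCW front faces
rsDesc.DepthClipEnable       = TRUE;

ID3D11RasterizerState* rsState = NULL;
device->CreateRasterizerState( &rsDesc, &rsState );
context->RSSetState( rsState );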

