DX12 Problem with rendering a simple triangle

Recommended Posts

Here is the code relevant to my problem. Everything else is omitted and causes no problems.


The HLSL, which compiles and is later added to the PSO successfully:

struct VSInput
{
	float4 position : mPOSITION;
	float2 uv : mTEXCOORD;
};

struct PSInput
{
	float4 position : SV_POSITION;
	//float2 uv : TEXCOORD;
};

Texture2D g_texture : register(t0);
SamplerState g_sampler : register(s0);

PSInput VSMain(VSInput input)
{
	PSInput output;

	output.position = input.position;
	//output.uv = input.uv;

	return output;
}

float4 PSMain(PSInput input) : SV_TARGET
{
	//return g_texture.Sample(g_sampler, input.uv);
	return float4(1.0, 0.0, 0.0, 1.0);
}

The part of the C++ code I consider relevant to the problem:

Vertex triangleVertices[] =
{
	{ { 0.0f, 0.25f, 0.0f }, { 0.5f, 0.0f } },
	{ { 0.25f, -0.25f, 0.0f }, { 1.0f, 1.0f } },
	{ { -0.25f, -0.25f, 0.0f }, { 0.0f, 1.0f } }
};

// FAILED macro is omitted
D3DCompileFromFile(shadersPath.c_str(), nullptr, nullptr, "VSMain", "vs_5_0", 0, 0, &mvsByteCode, &errors);
D3DCompileFromFile(shadersPath.c_str(), nullptr, nullptr, "PSMain", "ps_5_0", 0, 0, &mpsByteCode, &errors);

D3D12_INPUT_ELEMENT_DESC mInputLayout[] = { /* entries omitted */ };

renderQuadVertexBufferView.BufferLocation = mRenderQuadBufferDefault->GetGPUVirtualAddress();
renderQuadVertexBufferView.StrideInBytes = sizeof(Vertex);
renderQuadVertexBufferView.SizeInBytes = sizeof(triangleVertices);
mCommandList->IASetVertexBuffers(0, 1, &renderQuadVertexBufferView);
// this command executes painting the screen well
mCommandList->ClearRenderTargetView(RTVHandleCPU, clearColor, 0, nullptr);

// this command does not show the triangle
mCommandList->DrawInstanced(3, 1, 0, 0);

Before attempting to render the triangle, I set the state of the vertex buffer to D3D12_RESOURCE_STATE_VERTEX_AND_CONSTANT_BUFFER. Its heap is of type DEFAULT.
Do you see any problem in the code shown? If I can rule this out as the source of the problem, I can look elsewhere.
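
For context, getting data into a DEFAULT-heap vertex buffer normally means copying through an UPLOAD heap and then transitioning the resource. A minimal sketch of the transition described above (mRenderQuadBufferDefault and mCommandList are the identifiers from the post; the starting COPY_DEST state is an assumption about how the buffer was filled):

```cpp
// Hedged sketch: transition the DEFAULT-heap vertex buffer from the copy
// destination state to the state the input assembler expects.
D3D12_RESOURCE_BARRIER barrier = {};
barrier.Type = D3D12_RESOURCE_BARRIER_TYPE_TRANSITION;
barrier.Transition.pResource = mRenderQuadBufferDefault;   // from the post
barrier.Transition.Subresource = D3D12_RESOURCE_BARRIER_ALL_SUBRESOURCES;
barrier.Transition.StateBefore = D3D12_RESOURCE_STATE_COPY_DEST;  // assumed
barrier.Transition.StateAfter = D3D12_RESOURCE_STATE_VERTEX_AND_CONSTANT_BUFFER;
mCommandList->ResourceBarrier(1, &barrier);
```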


Viewport and scissor rects don't help. No, I haven't tried PIX or the debug layer. I'm checking every HRESULT and everything for NULL. If something were wrong, it wouldn't compile here, it wouldn't serialize there, it wouldn't initialize the descriptors, it wouldn't close the command list. So I decided not to make things more complex with additional debugging APIs.

Now I'm so sleepy... Tomorrow I will read back the vertex buffer and see whether it contains the expected vertex data.

Edited by NikiTo


Absolutely try enabling the debug layer and check the Visual Studio output (or whatever you are using) for warnings or errors. I recently added DX12 support to my engine and the debug output helped me a lot. If that does not help, run the app under RenderDoc and verify that the vertices are transformed correctly.
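
Enabling the debug layer is a one-time call that must happen before device creation; a minimal sketch:

```cpp
// Hedged sketch: enable the D3D12 debug layer. Call this before
// D3D12CreateDevice; enabling it afterwards invalidates the device.
#include <d3d12.h>
#include <wrl/client.h>

Microsoft::WRL::ComPtr<ID3D12Debug> debugController;
if (SUCCEEDED(D3D12GetDebugInterface(IID_PPV_ARGS(&debugController))))
{
    debugController->EnableDebugLayer();
}
```

With the layer on, validation messages show up in the debugger output window rather than as HRESULTs, which is exactly the class of errors plain HRESULT checking misses.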


Edited by GuyWithBeard


I don't know what else to try...

Is this correct?

D3D12_VIEWPORT viewPort = {};
viewPort.TopLeftX = 0;
viewPort.TopLeftY = 0;
viewPort.Width = TextureWidth;
viewPort.Height = TextureHeight;
viewPort.MinDepth = 0;
viewPort.MaxDepth = 1;

D3D12_RECT m_scissorRect = {};
m_scissorRect.left = 0;
m_scissorRect.top = 0;
m_scissorRect.right = TextureWidth;
m_scissorRect.bottom = TextureHeight;

I tried setting Z of the vertices to 0.5 and nothing changed.
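
The values above look reasonable, but viewport and scissor are per-command-list state: they only take effect if they are bound after each Reset() and before the draw. A minimal sketch using the names from this post:

```cpp
// Hedged sketch: bind the viewport and scissor rect on the command list.
// Without these calls the rasterizer state defaults can clip everything.
mCommandList->RSSetViewports(1, &viewPort);
mCommandList->RSSetScissorRects(1, &m_scissorRect);
```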

Edited by NikiTo


Graphics debugging is not a guessing game. You need to run with all the debug tools you can, especially with D3D12, and inspect every element for the mistake.

Enabling the debug layer prior to device creation is also very effective; many graphics issues won't trigger any HRESULT, for example improper descriptors in a heap or missing root parameters.

PIX and RenderDoc are also very valuable once you have no validation errors and still don't see what you expect.


In your case, I would look at the PSO creation parameters, such as the render target write mask and backface culling.
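
Those two fields commonly swallow a first triangle. A hedged sketch of permissive values (psoDesc is assumed to be the poster's D3D12_GRAPHICS_PIPELINE_STATE_DESC):

```cpp
// Hedged sketch: disable culling and write all color channels, so that
// winding order or a zeroed write mask cannot hide the triangle.
psoDesc.RasterizerState.CullMode = D3D12_CULL_MODE_NONE;
psoDesc.RasterizerState.FillMode = D3D12_FILL_MODE_SOLID;
psoDesc.BlendState.RenderTarget[0].RenderTargetWriteMask =
    D3D12_COLOR_WRITE_ENABLE_ALL;
```

Once the triangle shows up, culling can be re-enabled with the correct winding order.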

Edited by galop1n


I would use debug tools for complex applications. When I started such a simple task, I didn't expect to end up in this situation. I mean, it's a single triangle! What will I need next for one triangle, a performance profiling tool?

I will rewrite it all from scratch, and if the problem persists, I'll have no choice but to try the debugging tools too. :(


D3D12 is not for everyone. It is an API for the 1% of applications where D3D11 is not enough: AAA games, large data-set processing and heavyweight rendering tools.

If you are not an expert at D3D11 and don't know why you need D3D12, you don't need it and will just shoot yourself in the foot by using it. D3D12 is not a replacement for D3D11; the two are made to coexist, and that won't change.

Rendering a triangle with D3D12 is already a complex application: you have to deal with GPU/CPU synchronization and lifetime management, manual memory management, and unfamiliar idioms like queues, allocators and barriers. Add a texture to your triangle and you face a whole world of non-trivial design decisions.


You might also try compiling and running Microsoft's DX12 samples from GitHub. See if they work; if they do, just enable the debug layer (which I think you should do anyway).

