AlzPatz

Where is my triangle? Issue moving to DirectX 11 and Metro


Recommended Posts

Hi. After playing about in DirectX 11 and Metro, I've hit a wall: either I'm being the most useless n00b on the planet, or things don't work quite the same as in earlier DirectX. Very simply, I'm trying to render a triangle straight into "normalised / homogeneous" space (or whatever you call it), using the world's most basic shaders. To make sure my initialisation code wasn't screwed up, I've transplanted the code into the Direct3D App template, and it still doesn't show. If anyone could help work out why my triangle isn't showing, it would be appreciated; there must be something easy I've missed.

Note that this code tries to draw two triangles filling the screen, one with clockwise winding and one anti-clockwise, as a test in case my winding was wrong. (Why can't I hit enter on this forum for a new line, lol??)

Anyway, first I hijack the CubeRenderer's CreateDeviceResources code:
[source lang="cpp"]void CubeRenderer::CreateDeviceResources()
{
 Direct3DBase::CreateDeviceResources();

 VertexPos vertices[] =
 {
  DirectX::XMFLOAT3(0.0f, 0.0f, 0.2f),
  DirectX::XMFLOAT3(1.0f, 0.0f, 0.2f),
  DirectX::XMFLOAT3(0.0f, 1.0f, 0.2f),
  DirectX::XMFLOAT3(0.0f, 1.0f, 0.2f),
  DirectX::XMFLOAT3(1.0f, 1.0f, 0.2f),
  DirectX::XMFLOAT3(1.0f, 0.0f, 0.2f),
 };
 D3D11_BUFFER_DESC TT_vertexDesc;
 ZeroMemory(&TT_vertexDesc, sizeof(TT_vertexDesc));
 TT_vertexDesc.Usage = D3D11_USAGE_DEFAULT;
 TT_vertexDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
 TT_vertexDesc.ByteWidth = sizeof(VertexPos) * 6;
 D3D11_SUBRESOURCE_DATA sResource;
 ZeroMemory(&sResource, sizeof(sResource));
 sResource.pSysMem = vertices;
 DX::ThrowIfFailed(m_d3dDevice->CreateBuffer(&TT_vertexDesc, &sResource, &TT_vertexBuffer));

 //Load the Shaders
 //Load the vertex shader (the build configuration / properties of the shader file deals with compile to .cso)
 auto task_LoadTTVS = DX::ReadDataAsync("VS_TriangleTest.cso");
 auto task_CreateTTVS = task_LoadTTVS.then([this](DX::ByteArray ba)
 {
  //This async load / helper classes are taken straight from the template. Why not.
  auto bc_VS = ba.data;
  DX::ThrowIfFailed(m_d3dDevice->CreateVertexShader(bc_VS->Data, bc_VS->Length, nullptr, &VS_TT));

  //Create Input Layout Description
  D3D11_INPUT_ELEMENT_DESC vertexLayout[] =
  {
   {"POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0}
  };

  //Create the Input Layout
  DX::ThrowIfFailed(m_d3dDevice->CreateInputLayout(vertexLayout, ARRAYSIZE(vertexLayout), bc_VS->Data, bc_VS->Length, &TT_inputLayout));
 });

 //Load the pixel shader (again, the build and configuration properties of the shader file deal with compilation)
 auto task_LoadTTPS = DX::ReadDataAsync("PS_TriangleTest.cso");
 auto task_CreateTTPS = task_LoadTTPS.then([this](DX::ByteArray ba)
 {
  auto bc_PS = ba.data;
  DX::ThrowIfFailed(m_d3dDevice->CreatePixelShader(bc_PS->Data, bc_PS->Length, nullptr, &PS_TT));
  m_loadingComplete = true; //Not strictly right, as the vertex shader load might finish later, but that won't be the issue here
 });

}[/source]
Then I hijack the rendering bit. FYI, I've confirmed that all of these context calls are run:
[source lang="cpp"]
const float Red[] = { 1.0f, 0.0f, 0.0f, 1.0f };
m_d3dContext->ClearRenderTargetView(m_renderTargetView.Get(), Red);
//Clear the depth stencil view
m_d3dContext->ClearDepthStencilView(m_depthStencilView.Get(), D3D11_CLEAR_DEPTH, 1.0f, 0);
if (!m_loadingComplete)
return;
m_d3dContext->IASetInputLayout(TT_inputLayout.Get());
m_d3dContext->OMSetRenderTargets(1, m_renderTargetView.GetAddressOf(), m_depthStencilView.Get());
//Test triangle following some book code to check that everything is working
UINT str = sizeof(VertexPos);
UINT oset = 0;
m_d3dContext->IASetVertexBuffers(0, 1, &TT_vertexBuffer, &str, &oset);
m_d3dContext->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
m_d3dContext->VSSetShader(VS_TT.Get(), 0, 0);
m_d3dContext->PSSetShader(PS_TT.Get(), 0, 0);
m_d3dContext->Draw(6, 0);
[/source]
Then the world's simplest shaders (even desperately rewritten to copy ones from the net that use a struct; obviously this pass-through could just be a main function returning a float4). The vertex shader:
[source lang="cpp"]
struct VOut
{
float4 position : SV_POSITION;
};
VOut main( float4 pos : POSITION )
{
VOut output;
output.position = pos;
return output;
}
[/source]
...and the pixel shader:
[source lang="cpp"]
float4 main(float4 position : SV_POSITION) : SV_TARGET
{
return float4 (0.0f, 1.0f, 1.0f, 1.0f);
}
[/source]
And wahay, I always get a red screen. I've tried making the triangles screen-sized in case the range isn't -1,-1 to 1,1, but still no joy. Any help for a tired n00b? Thanks!!
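(For reference, by "normalised / homogeneous" space I mean the post-projection range where x and y run from -1 to +1. A screen-filling pair of triangles there would look roughly like this; just a sketch for clarity, not the exact code above.)
[source lang="cpp"]//Two triangles covering the full screen in normalised device coordinates,
//with z inside the 0..1 depth range. Both are wound clockwise, which is
//D3D11's default front face.
DirectX::XMFLOAT3 fullScreenVerts[] =
{
 DirectX::XMFLOAT3(-1.0f, -1.0f, 0.2f),
 DirectX::XMFLOAT3(-1.0f,  1.0f, 0.2f),
 DirectX::XMFLOAT3( 1.0f,  1.0f, 0.2f),

 DirectX::XMFLOAT3(-1.0f, -1.0f, 0.2f),
 DirectX::XMFLOAT3( 1.0f,  1.0f, 0.2f),
 DirectX::XMFLOAT3( 1.0f, -1.0f, 0.2f),
};[/source]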

I'm not sure if this will help but it seems to me your input layout defines vertex position as a 3 element vector (DXGI_FORMAT_R32G32B32_FLOAT), while your vertex shader parameter is expecting a 4 element one (float4 pos). I'd suggest you try and change the vertex shader parameter to float3 and see if that helps!

Hi, thanks so much for your suggestion. My shaders would not compile with a float3 input, but I changed the code to use XMFLOAT4s instead. Unfortunately, no luck. I'm getting really depressed now. I've tried this in Windows 7 DX11 C++, in Metro C++/CX DX11, and with SharpDX in C#, and it's always the same. I even followed the code from the book "Beginning Game Programming in DirectX 11" that does this, and it didn't work (even though the book's own demo compiled and worked!). I seriously cannot for the life of me work this out. It's infuriating; this was supposed to be a small step just to check things are working... :(

Hey again. Don't forget to change your input layout to DXGI_FORMAT_R32G32B32A32_FLOAT (and your vertex struct's position element, if you haven't already) in order for it to work with XMFLOAT4.

If you wish to stick with XMFLOAT3, I'm guessing your shader won't compile because the second line in your vertex shader's main has to be changed from "output.position = pos" to "output.position = float4(pos, 1.0f)".
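For illustration, the C++ side of the XMFLOAT4 route might look roughly like this (just a sketch; I'm assuming your VertexPos struct holds nothing but a position):
[source lang="cpp"]//Vertex struct matching a float4 POSITION input in the vertex shader
struct VertexPos
{
 DirectX::XMFLOAT4 pos;
};

//Input layout element matching the 4-component position
D3D11_INPUT_ELEMENT_DESC vertexLayout[] =
{
 {"POSITION", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0}
};

//The vertex data then needs an explicit w of 1.0f, e.g.
//DirectX::XMFLOAT4(0.0f, 0.0f, 0.2f, 1.0f)[/source]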

Sorry, I wasn't very clear. I did change the input layout format to the 4-component format, and I also changed the vertex shader to build a float4 from the float3 (pos, 1.0f). I'll double-check again tonight anyhow.

Thanks Papulko - you are right, I do not set one. However, I'm pretty certain that DX11 sets up a default rasteriser state that culls back faces and fills solid. I'll add one explicitly later, but I don't hold too much hope this will work. :(
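For reference, what I plan to add is roughly this (untested, reusing the m_d3dDevice / m_d3dContext members from the template):
[source lang="cpp"]//Explicit rasterizer state with back-face culling disabled, to rule winding out
D3D11_RASTERIZER_DESC rastDesc;
ZeroMemory(&rastDesc, sizeof(rastDesc));
rastDesc.FillMode = D3D11_FILL_SOLID;
rastDesc.CullMode = D3D11_CULL_NONE;
rastDesc.DepthClipEnable = TRUE;

Microsoft::WRL::ComPtr<ID3D11RasterizerState> rastState;
DX::ThrowIfFailed(m_d3dDevice->CreateRasterizerState(&rastDesc, rastState.GetAddressOf()));
m_d3dContext->RSSetState(rastState.Get());[/source]
If the triangles appear with culling off, then the winding (or my assumption about the defaults) is the problem.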

Change it to disable culling or reverse the winding of your triangles.

Use PIX to confirm the positions of your vertices in screen space after the vertex shader is run.
Use PIX to perform general debugging of the scene and verification of the state at the time of rendering.
Use DirectX Debug Mode and watch the console for important messages on errors and warnings.
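On the last point, enabling the debug layer at device creation looks roughly like this (a sketch only; the Metro template's Direct3DBase may already set this flag in debug builds):
[source lang="cpp"]//Request the debug layer in debug builds so D3D prints errors and warnings
//to the debugger's Output window
UINT creationFlags = D3D11_CREATE_DEVICE_BGRA_SUPPORT;
#if defined(_DEBUG)
creationFlags |= D3D11_CREATE_DEVICE_DEBUG;
#endif

D3D_FEATURE_LEVEL featureLevels[] =
{
 D3D_FEATURE_LEVEL_11_1,
 D3D_FEATURE_LEVEL_11_0,
 D3D_FEATURE_LEVEL_10_1,
 D3D_FEATURE_LEVEL_10_0,
};

Microsoft::WRL::ComPtr<ID3D11Device> device;
Microsoft::WRL::ComPtr<ID3D11DeviceContext> context;
D3D_FEATURE_LEVEL featureLevel;

DX::ThrowIfFailed(D3D11CreateDevice(
 nullptr,                  //default adapter
 D3D_DRIVER_TYPE_HARDWARE,
 nullptr,
 creationFlags,
 featureLevels,
 ARRAYSIZE(featureLevels),
 D3D11_SDK_VERSION,
 &device,
 &featureLevel,
 &context));[/source]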


L. Spiro
