
trevex

Posted 22 December 2012 - 07:28 AM

I am currently working on a small deferred rendering engine for this semester's coursework assignment. The code compiles fine and runs from Visual Studio without any problems, except that nothing is rendered to the screen, and that's where the trouble starts.

The deferred rendering system is nothing special; it is basically a first draft of the algorithms presented in "Practical Rendering with DirectX 11" (Jason Zink, Matt Pettineo, Jack Hoxley, 2011). No optimizations such as attribute packing are used, and only a simple lighting system is in place (no shadows).

The problem seems to be the fullscreen quad (simple, admittedly inefficient fullscreen quads are used for the lighting passes) or something more general. Since I get no errors in the debug log, I tried PIX and PerfStudio to get more information on the GPU side. Unfortunately, both PIX and PerfStudio fail before the first frame with this error:

Invalid allocation size: 4294967295 bytes

So for whatever reason something seems to allocate -1 bytes. Oddly, everything is fine when debugging in Visual Studio... and if I attach a debugger to the PIX process and break when the error happens, I end up in a debugger header file.
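For reference, 4294967295 is 0xFFFFFFFF, i.e. -1 stored in a 32-bit unsigned integer, so the bad size is probably an unsigned wraparound somewhere rather than a literal -1. A minimal standalone illustration (not code from the engine):

    // Any size computed as "something - 1" in a 32-bit unsigned type
    // wraps around when the result would be negative.
    #include <cstdio>

    int main()
    {
        unsigned int byteWidth = 0; // e.g. a count that was never filled in
        byteWidth -= 1;             // wraps to 4294967295 (UINT_MAX)
        std::printf("%u\n", byteWidth);
        return 0;
    }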

I have only just started using DirectX, with prior OpenGL experience, so I hope I did not get anything fundamentally wrong. I used the executable produced by a debug build.

To rule out general logic mistakes, here is roughly what I currently do:

1. SetDepthStencilState (with depth testing enabled)
2. clear all RenderTargetViews (I was unsure about the gBuffer, but it is currently being cleared as well) and the DepthStencilBuffer
3. bind the gBuffer and DepthStencilBuffer
4. render geometry
5. disable depth testing
6. bind the backbuffer
7. render all lights with the associated shader (since I am using the effects framework, the BlendState is set in the shader; an equivalent C++ blend description is sketched after this list)
8. render CEGUI (works fine even though nothing else produces output)
9. present()
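Regarding step 7: the additive blending for the light accumulation lives in the effect file, but an equivalent blend state set from the C++ side would look roughly like this (a sketch with placeholder variable names for the device and context, not the engine's actual code):

    // Hypothetical additive blend state, equivalent to what the lighting
    // passes need: each light's contribution is added onto the backbuffer.
    D3D11_BLEND_DESC blendDesc = {};
    blendDesc.RenderTarget[0].BlendEnable           = TRUE;
    blendDesc.RenderTarget[0].SrcBlend              = D3D11_BLEND_ONE;
    blendDesc.RenderTarget[0].DestBlend             = D3D11_BLEND_ONE;
    blendDesc.RenderTarget[0].BlendOp               = D3D11_BLEND_OP_ADD;
    blendDesc.RenderTarget[0].SrcBlendAlpha         = D3D11_BLEND_ONE;
    blendDesc.RenderTarget[0].DestBlendAlpha       = D3D11_BLEND_ONE;
    blendDesc.RenderTarget[0].BlendOpAlpha          = D3D11_BLEND_OP_ADD;
    blendDesc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

    ID3D11BlendState* additiveBlend = nullptr;
    device->CreateBlendState(&blendDesc, &additiveBlend);
    context->OMSetBlendState(additiveBlend, nullptr, 0xFFFFFFFF);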

The lights are, as already mentioned, fullscreen quads. The lighting technique simply passes the position through, so the quad's vertices are specified directly in the [-1, 1] clip-space range.

If you need any additional information, let me know.

Thanks, Nik

P.S. Sorry for the bad English...


EDIT:

For further information, here are the vertices and indices of the fullscreen quad:

    glm::vec3 vertices[] =
    {
        glm::vec3(-1.0f, -1.0f,  1.0f),
        glm::vec3(-1.0f,  1.0f,  1.0f),
        glm::vec3( 1.0f, -1.0f,  1.0f),
        glm::vec3( 1.0f,  1.0f,  1.0f),
    };
    UINT indices[] = { 0, 3, 2, 2, 0, 1 };
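For completeness, the corresponding buffer creation is roughly the following (a sketch with placeholder names, not the engine's actual upload code). The ByteWidth fields come straight from sizeof; an uninitialized or miscomputed size here is exactly the kind of place a -1 allocation could sneak in:

    // Hypothetical immutable vertex/index buffers for the quad above.
    D3D11_BUFFER_DESC vbDesc = {};
    vbDesc.Usage     = D3D11_USAGE_IMMUTABLE;
    vbDesc.ByteWidth = sizeof(vertices);        // 4 * sizeof(glm::vec3)
    vbDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;

    D3D11_SUBRESOURCE_DATA vbData = { vertices, 0, 0 };
    ID3D11Buffer* quadVB = nullptr;
    device->CreateBuffer(&vbDesc, &vbData, &quadVB);

    D3D11_BUFFER_DESC ibDesc = {};
    ibDesc.Usage     = D3D11_USAGE_IMMUTABLE;
    ibDesc.ByteWidth = sizeof(indices);         // 6 * sizeof(UINT)
    ibDesc.BindFlags = D3D11_BIND_INDEX_BUFFER;

    D3D11_SUBRESOURCE_DATA ibData = { indices, 0, 0 };
    ID3D11Buffer* quadIB = nullptr;
    device->CreateBuffer(&ibDesc, &ibData, &quadIB);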


And a rough walkthrough of the code:

// before the geometry pass
    m_d3dImmediateContext->RSSetState(m_RasterState);
    m_d3dImmediateContext->OMSetDepthStencilState(m_GeometryDepthStencilState, 1);
    m_d3dImmediateContext->ClearRenderTargetView(m_RenderTargetView, reinterpret_cast<const float*>(&clearColor));
    m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[0], reinterpret_cast<const float*>(&clearColor));
    m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[1], reinterpret_cast<const float*>(&clearColor));
    m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[2], reinterpret_cast<const float*>(&clearColor));
    m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[3], reinterpret_cast<const float*>(&clearColor));
    m_d3dImmediateContext->ClearDepthStencilView(m_DepthStencilView, D3D11_CLEAR_DEPTH | D3D11_CLEAR_STENCIL, 1.0f, 0);
    m_d3dImmediateContext->OMSetRenderTargets(4, m_gBuffer, m_DepthStencilView);

// before the lighting pass
    m_d3dImmediateContext->OMSetDepthStencilState(m_LightingDepthStencilState, 1);
    m_d3dImmediateContext->OMSetRenderTargets(1, &m_RenderTargetView, m_DepthStencilView);
    DXLightingShader->enable();

// DXLightingShader::enable (the static_cast is necessary because the engine
// supports both OpenGL and DirectX; this is my dirty way of handling that)
    static_cast<SDXRenderInfo*>(g_RenderInfo)->context->IASetInputLayout(m_InputLayout);
    static_cast<SDXRenderInfo*>(g_RenderInfo)->context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
    m_fxNormalMap->SetResource(m_NormalView);
    m_fxDiffuseMap->SetResource(m_DiffuseView);
    m_fxSpecularMap->SetResource(m_SpecularView);
    m_fxPositionMap->SetResource(m_PositionView);
    m_fxCameraPos->SetFloatVector(Camera->getPosition());

// depending on the light type, this is how a light is drawn
    for(UINT p = 0; p < m_DirectionalLightDesc.Passes; ++p)
    {
        m_DirectionalLight->GetPassByIndex(p)->Apply(0, static_cast<SDXRenderInfo*>(g_RenderInfo)->context);
        static_cast<SDXRenderInfo*>(g_RenderInfo)->context->DrawIndexed(6, 0, 0);
    }
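The quad's vertex and index buffers are bound elsewhere in the engine; for a DrawIndexed call like the one above they would have to be on the input assembler along these lines (again a sketch, using the placeholder buffers from the quad snippet):

    // Hypothetical input-assembler binding for the quad draw above.
    UINT stride = sizeof(glm::vec3);
    UINT offset = 0;
    context->IASetVertexBuffers(0, 1, &quadVB, &stride, &offset);
    context->IASetIndexBuffer(quadIB, DXGI_FORMAT_R32_UINT, 0); // UINT indices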

// present function, called after the light passes
    DXLightingShader->disable();
    CEGUI::System::getSingleton().renderGUI();
    HR(m_SwapChain->Present(0, 0));

// DXLightingShader::disable
    m_fxNormalMap->SetResource(NULL);
    m_fxDiffuseMap->SetResource(NULL);
    m_fxSpecularMap->SetResource(NULL);
    m_fxPositionMap->SetResource(NULL);
    m_DirectionalLight->GetPassByIndex(0)->Apply(0, static_cast<SDXRenderInfo*>(g_RenderInfo)->context);
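A note on the disable() step: with the effects framework, setting a resource variable to NULL only reaches the device context once a pass is applied again, which is why the extra Apply call is there; it actually unbinds the G-buffer textures from the shader stage before they are reused as render targets in the next frame.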



If you need more information or details of the shader implementation, let me know.
