trevex

Member Since 21 Dec 2012
Offline Last Active Mar 10 2013 08:58 AM

Topics I've Started

BlendState during Lighting Pass

12 January 2013 - 09:05 AM

For my coursework I started working on a small deferred renderer. I am not using any optimizations such as g-buffer packing or a light pre-pass.

The shaders in general seem to work fine, but when I add a second light, I notice that the blending is not working.

So instead of the spotlight's contribution being added on top, the spotlight overdraws everything, and the previously visible areas lit by the directional light turn black.

My BlendState is defined in the FX file as follows:

BlendState AdditiveBlending
{
	BlendEnable[0] = TRUE;
	SrcBlend[0] = ONE;
	DestBlend[0] = ONE;
	BlendOp[0] = ADD;
	SrcBlendAlpha[0] = ZERO;
	DestBlendAlpha[0] = ZERO;
	BlendOpAlpha[0] = ADD;
	RenderTargetWriteMask[0] = 0x0F;
};

It is set in the associated technique like this:

SetBlendState(AdditiveBlending, float4(0.0f, 0.0f, 0.0f, 0.0f), 0xffffffff);
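My understanding is that with this state the pixel shader output should simply be added to what is already in the render target (src * 1 + dest * 1). For reference, here is how I understand the same state would be created directly through the D3D11 API, bypassing the effects framework (an untested sketch; device and context stand in for the usual ID3D11Device/ID3D11DeviceContext pointers):

// Sketch: the same additive state built on the plain D3D11 API, to rule out
// the effects framework. 'device' and 'context' are assumed to be valid
// ID3D11Device / ID3D11DeviceContext pointers.
D3D11_BLEND_DESC desc = {};
desc.RenderTarget[0].BlendEnable           = TRUE;
desc.RenderTarget[0].SrcBlend              = D3D11_BLEND_ONE;
desc.RenderTarget[0].DestBlend             = D3D11_BLEND_ONE;
desc.RenderTarget[0].BlendOp               = D3D11_BLEND_OP_ADD;
desc.RenderTarget[0].SrcBlendAlpha         = D3D11_BLEND_ZERO;
desc.RenderTarget[0].DestBlendAlpha        = D3D11_BLEND_ZERO;
desc.RenderTarget[0].BlendOpAlpha          = D3D11_BLEND_OP_ADD;
desc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

ID3D11BlendState* additiveBlend = NULL;
HR(device->CreateBlendState(&desc, &additiveBlend));

const float blendFactor[4] = { 0.0f, 0.0f, 0.0f, 0.0f };
context->OMSetBlendState(additiveBlend, blendFactor, 0xffffffff);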

I am new to blending and thought I had figured it out, but unfortunately not...

Thanking you in anticipation,

Nik


Texture2D Load only returns red channel

10 January 2013 - 07:11 AM

I am currently planning to implement a few post-process effects. The first thing I did was render into a texture; when I have a look at the texture in PixWin, it looks totally fine.

The texture has the format "DXGI_FORMAT_R8G8B8A8_UNORM".
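For context, the texture and its views are created roughly like this (a condensed sketch of my setup; width, height and device are the obvious variables, and error handling is omitted):

// Condensed sketch of the render-to-texture setup (error handling omitted).
D3D11_TEXTURE2D_DESC texDesc = {};
texDesc.Width            = width;
texDesc.Height           = height;
texDesc.MipLevels        = 1;
texDesc.ArraySize        = 1;
texDesc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
texDesc.SampleDesc.Count = 1;
texDesc.Usage            = D3D11_USAGE_DEFAULT;
texDesc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

ID3D11Texture2D*          backBufferTexture = NULL;
ID3D11RenderTargetView*   backBufferRTV     = NULL;
ID3D11ShaderResourceView* backBufferSRV     = NULL;
HR(device->CreateTexture2D(&texDesc, NULL, &backBufferTexture));
HR(device->CreateRenderTargetView(backBufferTexture, NULL, &backBufferRTV));
HR(device->CreateShaderResourceView(backBufferTexture, NULL, &backBufferSRV));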

The next step was to render the texture to the screen, but the output is simply "red": only the red channel is rendered.

float4 VSMain(in float3 Position : POSITION) : SV_Position 
{ 
	return float4(Position, 1.0f); 
}

float PSDisabledMain(in float4 screenPos : SV_Position) : SV_Target0 
{ 
	int3 sampleIndices = int3(screenPos.xy, 0);
	float3 color = BackBufferMap.Load(sampleIndices).xyz;
	return color;
}
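Before the fullscreen pass, the texture's shader resource view is bound to BackBufferMap via the effects framework, roughly like this (a sketch; m_Effect and backBufferSRV are placeholder names for my effect and the view created above):

// Sketch: binding the render target's SRV to the BackBufferMap effect variable.
ID3DX11EffectShaderResourceVariable* fxBackBufferMap =
	m_Effect->GetVariableByName("BackBufferMap")->AsShaderResource();
fxBackBufferMap->SetResource(backBufferSRV);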

I played around with the shader code, but nothing seems to fix it. I am new to DirectX, coming from an OpenGL background, and seem to have overlooked something.

Thanking you in anticipation,
Nik

P.S. If you need more information or anything else, let me know.

EDIT:

My assumption is that it has something to do with the texture format, but I am unsure how to fix it...


Deferred Rendering Problem

22 December 2012 - 07:12 AM

I am currently working on a small deferred rendering engine for this semester's coursework assignment. The code compiles fine and runs in Visual Studio without any problems, except that nothing is output to the screen, and that's where the problems start.

The deferred rendering system is nothing special; it is basically a first draft of the algorithms presented in "Practical Rendering with DirectX 11" [Jason Zink, Matt Pettineo, Jack Hoxley, 2011]. No optimizations like attribute packing are used, and only a simple lighting system (no shadows).

The problem seems to be either the fullscreen quad (simple, inefficient fullscreen quads are used for the lighting passes) or something more general. Since I get no errors in the debug log, I tried to use PIX and PerfStudio to get some more information on the GPU side. Unfortunately, PIX and PerfStudio fail before the first frame with this error:

Invalid allocation size: 4294967295 bytes

So for whatever reason it seems to allocate a block of 4294967295 bytes, i.e. (UINT)-1. Oddly, everything is fine when debugging in Visual Studio... and if I attach a debugger to the PIX process and break when the error happens, I land in a debugger header file.

I just started using DirectX with prior OpenGL experience, so I hope I did not do something fundamentally wrong. I used the executable that the compiler produced in debug mode.

To avoid general logic mistakes, here is roughly what I currently do:

1. SetDepthStencilState (with DepthTest enabled)
2. clear all RenderTargetViews (I was unsure about the gBuffer, but it is currently being cleared as well) and the DepthStencilBuffer
3. bind gBuffer and DepthStencilBuffer
4. render Geometry
5. disable DepthTest
6. bind Backbuffer
7. render all lights with the associated shader (since I am using the effects framework I set the BlendState in the shader)
8. render CEGUI (works fine even though the rest doesn't output anything)
9. present()

The lights are, as already mentioned, fullscreen quads. The lighting technique simply passes the position through, so the quad's vertices are in the range [-1, 1].

If you need any additional information, let me know.

Thanks, Nik

P.S. Sorry for the bad English...


EDIT:

For further information, here are the vertices and indices of the fullscreen quad:

glm::vec3 vertices[] =
{
	glm::vec3(-1.0f, -1.0f,  1.0f),
	glm::vec3(-1.0f,  1.0f,  1.0f),
	glm::vec3( 1.0f, -1.0f,  1.0f),
	glm::vec3( 1.0f,  1.0f,  1.0f),
};

UINT indices[] = { 0, 3, 2, 2, 0, 1 };
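The corresponding GPU buffers are created in the usual way (a condensed sketch; error handling omitted, device stands in for my ID3D11Device pointer):

// Sketch: immutable vertex and index buffers for the fullscreen quad.
D3D11_BUFFER_DESC vbDesc = {};
vbDesc.ByteWidth = sizeof(vertices);
vbDesc.Usage     = D3D11_USAGE_IMMUTABLE;
vbDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
D3D11_SUBRESOURCE_DATA vbData = { vertices, 0, 0 };
ID3D11Buffer* quadVB = NULL;
HR(device->CreateBuffer(&vbDesc, &vbData, &quadVB));

D3D11_BUFFER_DESC ibDesc = {};
ibDesc.ByteWidth = sizeof(indices);
ibDesc.Usage     = D3D11_USAGE_IMMUTABLE;
ibDesc.BindFlags = D3D11_BIND_INDEX_BUFFER;
D3D11_SUBRESOURCE_DATA ibData = { indices, 0, 0 };
ID3D11Buffer* quadIB = NULL;
HR(device->CreateBuffer(&ibDesc, &ibData, &quadIB));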


And a rough walkthrough of the code:

// before geometry pass
m_d3dImmediateContext->RSSetState(m_RasterState);
m_d3dImmediateContext->OMSetDepthStencilState(m_GeometryDepthStencilState, 1);
m_d3dImmediateContext->ClearRenderTargetView(m_RenderTargetView, reinterpret_cast<const float*>(&clearColor));
m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[0], reinterpret_cast<const float*>(&clearColor));
m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[1], reinterpret_cast<const float*>(&clearColor));
m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[2], reinterpret_cast<const float*>(&clearColor));
m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[3], reinterpret_cast<const float*>(&clearColor));
m_d3dImmediateContext->ClearDepthStencilView(m_DepthStencilView, D3D11_CLEAR_DEPTH|D3D11_CLEAR_STENCIL, 1.0f, 0);
m_d3dImmediateContext->OMSetRenderTargets(4, m_gBuffer, m_DepthStencilView);

// before lighting pass
m_d3dImmediateContext->OMSetDepthStencilState(m_LightingDepthStencilState, 1);
m_d3dImmediateContext->OMSetRenderTargets(1, &m_RenderTargetView, m_DepthStencilView);
DXLightingShader->enable();

// DXLightingShader::enable (the static_cast is necessary because the engine
// supports both OpenGL and DirectX; this is my dirty way of handling that)
static_cast<SDXRenderInfo*>(g_RenderInfo)->context->IASetInputLayout(m_InputLayout);
static_cast<SDXRenderInfo*>(g_RenderInfo)->context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
m_fxNormalMap->SetResource(m_NormalView);
m_fxDiffuseMap->SetResource(m_DiffuseView);
m_fxSpecularMap->SetResource(m_SpecularView);
m_fxPositionMap->SetResource(m_PositionView);
m_fxCameraPos->SetFloatVector(Camera->getPosition());

// depending on the light type, this is how a light is drawn
for(UINT p = 0; p < m_DirectionalLightDesc.Passes; ++p)
{
	m_DirectionalLight->GetPassByIndex(p)->Apply(0, static_cast<SDXRenderInfo*>(g_RenderInfo)->context);
	static_cast<SDXRenderInfo*>(g_RenderInfo)->context->DrawIndexed(6, 0, 0);
}

// present function, called after the light passes
DXLightingShader->disable();
CEGUI::System::getSingleton().renderGUI();
HR(m_SwapChain->Present(0, 0));

// DXLightingShader::disable
m_fxNormalMap->SetResource(NULL);
m_fxDiffuseMap->SetResource(NULL);
m_fxSpecularMap->SetResource(NULL);
m_fxPositionMap->SetResource(NULL);
m_DirectionalLight->GetPassByIndex(0)->Apply(0, static_cast<SDXRenderInfo*>(g_RenderInfo)->context);



If you need more information or details of the shader implementation, let me know.

CreateInputLayout throws E_INVALIDARG

21 December 2012 - 10:03 AM

I am currently working with DirectX on a coursework assignment. My first simple shader worked fine, but after adding some complexity and changing the input layout, the code throws an unexpected error. I found a few related posts around here that unfortunately didn't help me.

The input layout in the shader is defined as follows:

struct VSInput
{
	float3 Position : POSITION;
	float2 TexCoord : TEXCOORD0;
	float3 Normal   : NORMAL;
};

I try to set up the input layout in my small material class like this (I use the effects framework):

const D3D11_INPUT_ELEMENT_DESC basicInput[] = 
{
	{"POSITION",  0, DXGI_FORMAT_R32G32B32_FLOAT, 0,  0, D3D11_INPUT_PER_VERTEX_DATA, 0},
	{"TEXCOORD0", 0, DXGI_FORMAT_R32G32_FLOAT,    0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0},
	{"NORMAL",    0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 20, D3D11_INPUT_PER_VERTEX_DATA, 0}
};


D3DX11_PASS_DESC passDesc;
m_Tech->GetPassByIndex(0)->GetDesc(&passDesc);
HR(static_cast<SDXRenderInfo*>(g_RenderInfo)->device->CreateInputLayout(basicInput, 3,
	passDesc.pIAInputSignature, passDesc.IAInputSignatureSize, &m_InputLayout));

(HR is a small macro I found in the Frank Luna book; the static_cast might look irritating, but the engine supports both DirectX and OpenGL, and this is part of my dirty solution for that.)
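For what it's worth, the offsets 0, 12 and 20 in the element descriptions correspond to a vertex struct of this shape (a sketch of what the layout assumes, not my actual code):

// Vertex layout the offsets above assume: 12 + 8 + 12 = 32 bytes per vertex.
struct BasicVertex
{
	glm::vec3 position;  // POSITION, offset  0, 12 bytes
	glm::vec2 texCoord;  // TEXCOORD, offset 12,  8 bytes
	glm::vec3 normal;    // NORMAL,   offset 20, 12 bytes
};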

EDIT:

Let me know if you need more information...
