About trevex

  1. Can't believe it, sorry for the trouble. Basically I was still executing the wrong shader, the point-light shader... I was totally stressed because of coursework... Thanks for the advice :)
  2. For my coursework I started working on a small deferred renderer. I am not using any optimizations like G-buffer packing or a light pre-pass. The shaders themselves seem to work fine, but when I added a second light, I noticed the blending is not working. Instead of the spotlight being added, the spotlight overdraws everything, so the areas previously lit by the directional light are black.

     My BlendState is defined in the FX file as follows:

         BlendState AdditiveBlending
         {
             BlendEnable[0] = TRUE;
             SrcBlend[0] = ONE;
             DestBlend[0] = ONE;
             BlendOp[0] = ADD;
             SrcBlendAlpha[0] = ZERO;
             DestBlendAlpha[0] = ZERO;
             BlendOpAlpha[0] = ADD;
             RenderTargetWriteMask[0] = 0x0F;
         };

     And it is set in the associated technique like this:

         SetBlendState(AdditiveBlending, float4(0.0f, 0.0f, 0.0f, 0.0f), 0xffffffff);

     I am new to blending and thought I had figured it out, but unfortunately not...

     Thanking you in anticipation, Nik
  3. Oh god can't believe I didn't notice that! Thank you so much!
  4. I am totally confused right now: when I use Sample the screen is simply red as well, so the general problem persists. Why is Load only returning the red value? The color variable always receives just the red value, and the rest are underscores in PIX...

     The shader, with vertex shader:

         float4 VSMain(in float3 Position : POSITION) : SV_Position
         {
             return float4(Position, 1.0f);
         }

         float PSDisabledMain(in float4 screenPos : SV_Position) : SV_Target0
         {
             //BackBufferMap.Sample(TexureSampler, screenPos.xy);
             int3 sampleIndices = int3(screenPos.xy, 0);
             float4 color = BackBufferMap.Load(sampleIndices);
             return color;
         }

     The vertices used:

         glm::vec3 vertices[] =
         {
             glm::vec3(-1.0f, -1.0f, 0.0f),
             glm::vec3(-1.0f,  1.0f, 0.0f),
             glm::vec3( 1.0f, -1.0f, 0.0f),
             glm::vec3( 1.0f,  1.0f, 0.0f),
         };

     The vertices and the Load call are the same ones I am using in my deferred rendering shaders...

     And here's how I create the back buffer:

         D3D11_TEXTURE2D_DESC backTextureDesc;
         D3D11_RENDER_TARGET_VIEW_DESC backTargetViewDesc;
         D3D11_SHADER_RESOURCE_VIEW_DESC backResourceViewDesc;

         ZeroMemory(&backTextureDesc, sizeof(backTextureDesc));
         backTextureDesc.Width = width;
         backTextureDesc.Height = height;
         backTextureDesc.MipLevels = 1;
         backTextureDesc.ArraySize = 1;
         backTextureDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
         backTextureDesc.SampleDesc.Count = 1;
         backTextureDesc.Usage = D3D11_USAGE_DEFAULT;
         backTextureDesc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
         backTextureDesc.CPUAccessFlags = 0;
         backTextureDesc.MiscFlags = 0;
         HR(m_d3dDevice->CreateTexture2D(&backTextureDesc, NULL, &m_BackTargetTexture));

         backTargetViewDesc.Format = backTextureDesc.Format;
         backTargetViewDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;
         backTargetViewDesc.Texture2D.MipSlice = 0;
         HR(m_d3dDevice->CreateRenderTargetView(m_BackTargetTexture, &backTargetViewDesc, &m_BackTargetView));

         backResourceViewDesc.Format = backTextureDesc.Format;
         backResourceViewDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
         backResourceViewDesc.Texture2D.MostDetailedMip = 0;
         backResourceViewDesc.Texture2D.MipLevels = 1;
         HR(m_d3dDevice->CreateShaderResourceView(m_BackTargetTexture, &backResourceViewDesc, &m_BackResourceView));
  5. I am currently planning to implement a few post-process effects. The first thing I did was render into a texture; when I have a look at the texture in PIXWin, it looks totally fine. The texture has the format DXGI_FORMAT_R8G8B8A8_UNORM. The next step was to render the texture to the screen, but the output is simply "red": only the red channel is rendered.

         float4 VSMain(in float3 Position : POSITION) : SV_Position
         {
             return float4(Position, 1.0f);
         }

         float PSDisabledMain(in float4 screenPos : SV_Position) : SV_Target0
         {
             int3 sampleIndices = int3(screenPos.xy, 0);
             float3 color = BackBufferMap.Load(sampleIndices).xyz;
             return color;
         }

     I played around with the shader code but nothing seems to fix it. I am new to DirectX, coming from an OpenGL background, and seem to have overlooked something. Thanking you in anticipation, Nik

     P.S. If you need more information or anything else, let me know.

     EDIT: My assumption is it has something to do with the texture format, but I am unsure how to fix it...
  6. trevex

    Deferred Rendering Problem

    You are right, thanks a lot for all the help; I am successfully rendering loads of point lights, spotlights, etc. now... Best Christmas present so far :)
  7. trevex

    Deferred Rendering Problem

    Thanks a lot indeed! I forgot to set the vertex buffer, and since the vertices of my cube are quite similar, I didn't notice that. Can't believe I hadn't noticed it... The only thing left now is a bug in my lighting code: some surfaces stay black. The scene currently uses a single directional light for testing!

        float3 CalcLighting(in float3 normal,
                            in float3 position,
                            in float3 diffuseAlbedo,
                            in float3 specularAlbedo,
                            in float specularPower,
                            uniform int gLightingMode)
        {
            float3 L = 0;
            float attenuation = 1.0f;
            if (gLightingMode == POINTLIGHT || gLightingMode == SPOTLIGHT)
            {
                L = LightPos - position;
                float dist = length(L);
                attenuation = max(0, 1.0f - (dist / LightRange.x));
                L /= dist;
            }
            else if (gLightingMode == DIRECTIONALLIGHT)
            {
                L = -LightDirection;
            }

            if (gLightingMode == SPOTLIGHT)
            {
                float3 L2 = LightDirection;
                float rho = dot(-L, L2);
                attenuation *= saturate((rho - SpotlightAngles.y) / (SpotlightAngles.x - SpotlightAngles.y));
            }

            float nDotL = saturate(dot(normal, L));
            float3 diffuse = nDotL * LightColor * diffuseAlbedo;
            float3 V = CameraPos - position;
            float3 H = normalize(L + V);
            float3 specular = pow(saturate(dot(normal, H)), specularPower) * LightColor * specularAlbedo.xyz * nDotL;
            return (diffuse + specular) * attenuation;
        }

    The basic algorithm is taken from a book, so I assumed there would be nothing wrong; the only thing I changed is using if-statements and a uniform to embed it in an effects file. The buffers are filled with information and nothing is missing; these are values of a black pixel that is supposed to have some color:

    EDIT: Lighting shader code on pastebin http://pastebin.com/8gbYBLCX, because the code tags seem to be buggy currently, either escaping HTML as well or completely breaking formatting.
  8. trevex

    Deferred Rendering Problem

    Ok, I just noticed I was setting the last value to 0.0 and not 1.0... The problem now: I changed the z value of the input vertices to 0.0f with no change; when I changed the shader to set the z value to 0.0f, I get some output, still not the correctly shaded cube, but that seems to be a problem with my lighting code... This would be kind of a dirty fix, though, so why is the z value being set to -1...?
  9. trevex

    Deferred Rendering Problem

    Sure, sorry for the delayed reply, Christmas time...

        float4 VSMain(in float3 Position : POSITION) : SV_Position
        {
            return float4(Position, 0.0f);
        }

    For whatever reason the z value is always -1.0, as is the preVS value... but this is the vertex data in the associated buffer:

        glm::vec3 vertices[] =
        {
            glm::vec3(-1.0f, -1.0f,  1.0f),
            glm::vec3(-1.0f,  1.0f,  1.0f),
            glm::vec3( 1.0f, -1.0f,  1.0f),
            glm::vec3( 1.0f,  1.0f,  1.0f),
        };

    As stated previously, I am new to DirectX, so is there anything that can influence how vertex data is interpreted, since even the preVS value is -1?
  10. trevex

    Deferred Rendering Problem

    So I started debugging a frame with PIX: the 4 G-buffer textures are successfully being rendered. The problem seems to be the fullscreen quad: since the viewport output shows only a white line, it seems to get discarded?
  11. trevex

    Deferred Rendering Problem

    Thanks for the tip. I am currently in Christmas stress, but I am trying to debug the application again now. So I figured out what the problem was, and it was quite simple actually... Visual Studio has a different runtime environment, so PIX wasn't able to find some files... I am investigating why it is not rendering now; I'll hopefully come back with more information later...
  12. I am currently working on a small deferred rendering engine for this semester's coursework assignment. The code compiles fine and runs in Visual Studio without any problems, except that there is no output to the screen, and that's where the problems start. The deferred rendering system is nothing special; it is basically the first draft of the algorithms presented in "Practical Rendering with DirectX 11" [2011, by Jason Zink, Matt Pettineo, Jack Hoxley]. No optimizations like attribute packing etc. are used, and only a simple lighting system (no shadows). The problem seems to be the fullscreen quad (simply inefficient fullscreen quads used for the lighting passes) or something more general. Since I get no errors in the debug log, I tried to use PIX and PerfStudio to get some more information on the GPU side. Unfortunately, PIX and PerfStudio fail before the first frame with this error:

        Invalid allocation size: 4294967295 bytes

    So for whatever reason it seems to allocate some space with -1 bytes. Awkwardly, everything is fine when debugging in Visual Studio... and if I attach a debugger to the PIX process and break when the error happens, I land in a debugger's header file. I just started using DirectX with prior OpenGL experience, so I hope I did not do something generally wrong. I used the executable that was output by the compiler in debug mode.

    To avoid general logic mistakes, here is roughly what I currently do:

    1. SetDepthStencilState (with depth test enabled)
    2. clear all RenderTargetViews (I was unsure about the G-buffer, but it is being cleared as well currently) and the DepthStencilBuffer
    3. bind the G-buffer and DepthStencilBuffer
    4. render geometry
    5. disable depth test
    6. bind the back buffer
    7. render all lights with the associated shader (since I am using the effects framework, I set the BlendState in the shader)
    8. render CEGUI (works fine even if the rest doesn't output anything)
    9. Present()

    The lights are, as already mentioned, fullscreen quads. The lighting technique simply passes the position through, so the quad's vertices are in the range [-1, 1]. If you need any additional information, let me know. Thanks, Nik

    P.S. Sorry for the bad English...

    EDIT: For further information, the vertices and indices of the fullscreen quad:

        glm::vec3 vertices[] =
        {
            glm::vec3(-1.0f, -1.0f, 1.0f),
            glm::vec3(-1.0f,  1.0f, 1.0f),
            glm::vec3( 1.0f, -1.0f, 1.0f),
            glm::vec3( 1.0f,  1.0f, 1.0f),
        };

        UINT indices[] = { 0, 3, 2, 2, 0, 1 };

    And a rough walkthrough of the code:

        // before geometry pass
        m_d3dImmediateContext->RSSetState(m_RasterState);
        m_d3dImmediateContext->OMSetDepthStencilState(m_GeometryDepthStencilState, 1);
        m_d3dImmediateContext->ClearRenderTargetView(m_RenderTargetView, reinterpret_cast<const float*>(&clearColor));
        m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[0], reinterpret_cast<const float*>(&clearColor));
        m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[1], reinterpret_cast<const float*>(&clearColor));
        m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[2], reinterpret_cast<const float*>(&clearColor));
        m_d3dImmediateContext->ClearRenderTargetView(m_gBuffer[3], reinterpret_cast<const float*>(&clearColor));
        m_d3dImmediateContext->ClearDepthStencilView(m_DepthStencilView, D3D11_CLEAR_DEPTH | D3D11_CLEAR_STENCIL, 1.0f, 0);
        m_d3dImmediateContext->OMSetRenderTargets(4, m_gBuffer, m_DepthStencilView);

        // before lighting pass
        m_d3dImmediateContext->OMSetDepthStencilState(m_LightingDepthStencilState, 1);
        m_d3dImmediateContext->OMSetRenderTargets(1, &m_RenderTargetView, m_DepthStencilView);
        DXLightingShader->enable();

        // DXLightingShader::enable (the static_cast is necessary because the engine
        // supports OpenGL and DirectX; this is my dirty way)
        static_cast<SDXRenderInfo*>(g_RenderInfo)->context->IASetInputLayout(m_InputLayout);
        static_cast<SDXRenderInfo*>(g_RenderInfo)->context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
        m_fxNormalMap->SetResource(m_NormalView);
        m_fxDiffuseMap->SetResource(m_DiffuseView);
        m_fxSpecularMap->SetResource(m_SpecularView);
        m_fxPositionMap->SetResource(m_PositionView);
        m_fxCameraPos->SetFloatVector(Camera->getPosition());

        // depending on light-type this is how it is drawn
        for (UINT p = 0; p < m_DirectionalLightDesc.Passes; ++p)
        {
            m_DirectionalLight->GetPassByIndex(p)->Apply(0, static_cast<SDXRenderInfo*>(g_RenderInfo)->context);
            static_cast<SDXRenderInfo*>(g_RenderInfo)->context->DrawIndexed(6, 0, 0);
        }

        // present function called after light passes
        DXLightingShader->disable();
        CEGUI::System::getSingleton().renderGUI();
        HR(m_SwapChain->Present(0, 0));

        // DXLightingShader::disable
        m_fxNormalMap->SetResource(NULL);
        m_fxDiffuseMap->SetResource(NULL);
        m_fxSpecularMap->SetResource(NULL);
        m_fxPositionMap->SetResource(NULL);
        m_DirectionalLight->GetPassByIndex(0)->Apply(0, static_cast<SDXRenderInfo*>(g_RenderInfo)->context);

    If you need more information or some details of the shader implementation, let me know.
  13. Sorry for the delayed answers; for whatever reason I wasn't able to log in anymore... Anyway, thanks a lot, that fixed it :)
  14. I am currently working with DirectX on a coursework assignment. My first simple shader worked fine, but now, after adding some complexity and changing the input layout, the code throws an unexpected error. I found a few related posts around here that unfortunately didn't help me.

    The input layout in the shader is defined as follows:

        struct VSInput
        {
            float3 Position : POSITION;
            float2 TexCoord : TEXCOORD0;
            float3 Normal   : NORMAL;
        };

    I try to set up the input layout in my small material class like this (I use the effects framework):

        const D3D11_INPUT_ELEMENT_DESC basicInput[] =
        {
            {"POSITION",  0, DXGI_FORMAT_R32G32B32_FLOAT, 0,  0, D3D11_INPUT_PER_VERTEX_DATA, 0},
            {"TEXCOORD0", 0, DXGI_FORMAT_R32G32_FLOAT,    0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0},
            {"NORMAL",    0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 20, D3D11_INPUT_PER_VERTEX_DATA, 0}
        };

        D3DX11_PASS_DESC passDesc;
        m_Tech->GetPassByIndex(0)->GetDesc(&passDesc);
        HR(static_cast<SDXRenderInfo*>(g_RenderInfo)->device->CreateInputLayout(basicInput, 3, passDesc.pIAInputSignature, passDesc.IAInputSignatureSize, &m_InputLayout));

    (HR is a small macro I found in the Frank Luna book; the static_cast might be irritating, but the engine supports DirectX and OpenGL and this is part of my dirty solution.)

    EDIT: Let me know if you need more information...