Anthom

Member
  1. Indeed, you were right, thanks! This fixed it:

         // in the C++ code
         struct CameraConstData
         {
             urd::Matrix projection;   // 64 bytes (16 floats)
             urd::Matrix view;         // 64 bytes (16 floats)
             urd::Vec3 viewPosition;   // 12 bytes ( 3 floats)
             float a;                  // explicit padding
             urd::Vec3 viewDir;        // 12 bytes ( 3 floats)
             float b;                  // explicit padding
             float offset[26];         // 104 bytes (26 * 4)
         };

         // in the shader
         // desc heap cbv
         cbuffer CameraConstBuffer : register(b0)
         {
             float4x4 projectionMatrix;
             float4x4 viewMatrix;
             float3 viewPos;
             float a;
             float3 viewDir;
             float b;
         }

     So HLSL gets its 16-byte boundary.
  2. Hello,

     my shader scrambles the values of my constant buffer.

     I have a struct like:

         struct CameraConstData
         {
             urd::Matrix projection;   // 64 bytes (16 floats)
             urd::Matrix view;         // 64 bytes (16 floats)
             urd::Vec3 viewPosition;   // 12 bytes ( 3 floats)
             urd::Vec3 viewDir;        // 12 bytes ( 3 floats)
             float offset[26];         // 104 bytes (26 * 4)
         };

     Displaying the constant buffer in the GPU debugger looks fine: two matrices and the two vectors at the end.

     [attachment=29125:Unbenannt.PNG]

     Now if I visualize the viewDir vector in the shader like:

         // desc heap cbv
         cbuffer CameraConstBuffer : register(b0)
         {
             float4x4 projectionMatrix;
             float4x4 viewMatrix;
             float3 viewPos;
             float3 viewDir;
         }

         // in the pixel shader
         float3 vpos = viewDir;
         float value = vpos.x;   // uses the passed y value
         //float value = vpos.y; // uses the passed z value
         color = float4(value, value, value, 1.0);

     then viewDir.x is the y value of the view direction, viewDir.y is the z value of the view direction, and viewDir.z is 0.0.

     It looks like it all got shifted back one position, and it's reading buffer indices 36-38 instead of 35-37!

     If I use float viewPos[2] to shift back one offset, it still doesn't work. So it's reading the CBV values wrong, not just indexing them at the wrong offset.

     If I swap viewPos with viewDir, then viewPos is affected; it's always the last member of the constant buffer struct.
  3. Create a texture with D3D12_RESOURCE_DIMENSION_TEXTURE2D and D3D12_TEXTURE_LAYOUT_UNKNOWN. After that, create a view on a DSV heap. While recording, get the D3D12_CPU_DESCRIPTOR_HANDLE of the DSV and pass it as the last parameter of OMSetRenderTargets. Before that, of course, clear the view.

     Looking at your code above, have you forgotten to set it in the OMSetRenderTargets call? Like this:

         OMSetRenderTargets(1, &rtvHandle, false, &dsvHandle);
  4. Edit:

     Just read: https://msdn.microsoft.com/en-us/library/windows/desktop/dn788714(v=vs.85).aspx

     "Currently, there is one graphics and one compute root signature per app."

     So my only option would be to deserialize the root signature and serialize it again with new parameters? Or just define all possible tables at the beginning?

     Original:

     OK, I isolated the problem, and it's my pipeline state object.

     If I use the same root parameters as for the main shader, it works. If I use different root parameters for each graphics pipeline (a root signature and PSO for each), and bind the shadow texture at descriptor table 0 instead of 2, I get the behavior I explained in the other posts.

     In the multithreaded example from Microsoft, they are also reusing the same root parameters (or root signatures) for both PSOs?!

     The debug layer is not complaining about anything.
  5. Unbound? I created a DSV and an SRV pointing to the shadow texture resource. I use the DSV in the shadow pass and the SRV in the rendering/sampling pass. In between, I transition the resource state from depth write to shader resource, and back after sampling.

     Edit:

     OK, I got it working by unwrapping my class. Looks like a resource lifetime problem, or maybe a descriptor lifetime problem. Now I'm able to wrap it step by step back into a class and check for errors.
  6. Yes I do. No error or warning. The problem is that it binds my default depth stencil to the shadow texture slot: the depth the camera is "seeing" (the default depth stencil texture) ends up as the texture I bind the shadow texture to. Really weird behavior. I checked the shadow texture and the depth stencil resources in the GPU debugger and they are fine, as seen in the picture I posted.
  7. Hello,

     I'm currently experiencing a weird problem. My shadow mapping is only applied to the RTV I'm not seeing (using a double-buffered swap chain).

     Checking with Visual Studio's graphics debugger shows that the shadow map is written fine (a depth stencil texture with a state transition to shader resource).

     In the middle is the picture I see while running the application. On the left is the debugger output of the other RTV; it always has shadow mapping applied. On the right is the shadow map.

     The shadow map is always on one RTV, but that never seems to be the visible one. It's not even flickering (one RTV with the shadow map and one without); the output is constantly without any shadows. Only the debugger shows the shadows.

     Any idea what could cause this problem?

     Edit:

     Outputting the shadow depth texture as a diffuse texture just shows that the visible RTV somehow has a depth texture from the camera's perspective (the default depth stencil). In the debugger, the other RTV has the real shadow map applied. What the hell?
  8. Thanks for the hint, ajmiles. Seems like my suspicion was right. After enabling the debug layer, I got the following output:

     So only one descriptor heap of each heap type (CBV_SRV_UAV or SAMPLER) can be set at the same time!

     So the way to go is to have one big descriptor heap that can store all the needed descriptors (at least per command list).
  9. It's really noticeable that the D3D12 examples only use one descriptor heap for all SRVs and CBVs (plus maybe another sampler heap). So I thought I'd try to split that into two heaps, one for SRVs and one for CBVs.

     I've made a Stack Overflow question documenting my failed try: http://stackoverflow.com/questions/32114174/directx-12-how-to-use-commandlist-with-multiple-descriptor-heaps

     Maybe I missed something in the documentation, like "only one descriptor heap per type per command list allowed"?