
DX11 Shader debugger in DX9


Recommended Posts

I would like to do real shader-debugging on my Windows 7 PC: being able to capture a frame, select a pixel and step through its pixel shader. The real deal, just like I can on my Xbox 360 devkit. Which tools can I use for that on Windows 7?

Supposedly Visual Studio 2012 has a really good PIX-like graphics debugger built in, but it only supports DX10 and DX11, while my game is still DX9, so I cannot use that.

NVIDIA's PerfHUD also used to be a nice graphics debugger (I don't know whether it supported stepping through pixel shaders, though), but I cannot get PerfHUD to work on my Windows 7 computer: it keeps giving an error that the drivers are not instrumented. PerfHUD also doesn't seem to have been updated in six years, suggesting it is simply dead.

So, what to do? Are there other debugging/profiling tools I could use for graphics?

My game is made in DX9 (with the Ogre engine) with shader model 3.0, and I can switch between HLSL and Cg, so either would be good. I use Windows 7 and have an NVIDIA GTX 480.

Thanks in advance! :)


As Yourself mentioned, Nsight is the replacement for PerfHUD. Nsight is the only debugger available for PC that actually debugs shaders on the hardware; all other debuggers work by emulating the shader on the host PC.

The old PIX for Windows will still work for debugging shaders (via emulation, as mentioned above). However, you can't use it alongside the Windows 7 hotfix that brought DX11.1 to Win7: with that hotfix installed, PIX will crash. It is possible to uninstall the hotfix and use PIX again, though.

I am okay with software emulation, as long as the results are correct, since I need it for debugging, not so much for profiling. :)

I had a look at Nsight before, but I got the impression it only does shader debugging in DX10/11, not in DX9. Am I misinterpreting the description here, then? https://developer.nvidia.com/nsight-visual-studio-edition-features

I had also found Intel GPA, but I had automatically assumed it would only support Intel video cards. Does it fully function on NVIDIA cards as well? I also couldn't find any description of full shader debugging in the GPA documentation; its product brief only mentions "shader experiments", which suggests things like swapping shaders, not stepping through a shader for an individual pixel.

Since you are mentioning three different alternatives (Nsight, GPA and PIX), which would be my best choice?

It does look like you're correct regarding Nsight and DX9. Honestly, it's been so long since I used DX9 that I hadn't checked which tools support it. I think PIX might be your best bet here; it's the only tool that I can confirm works for debugging DX9 shaders.

The Microsoft DirectX SDK (June 2010) includes a 64-bit PIX in the package. I have it on my Win7 machine and successfully debug 32-bit DX9 applications with it. Please post your results, whether you get it working or not. Thanks.

I tried PIX now and it indeed does exactly what I was looking for! :D

I did have trouble with PIX in the DirectX June 2010 SDK, which gave some weird error when I tried to debug a pixel. But it turns out that the 32-bit PIX in the August 2007 SDK does work. Great, on to some good debugging fun!

Thanks for the advice, folks! :)

Weird error? You could have been more specific. Wasn't it alerting "Multisampled surfaces cannot be pixel shader debugged."?

Since PIX 2007 works fine, I wasn't particularly looking for a solution, but a newer PIX would of course be better, so if you know how to solve it, I'd love to hear the solution! :)

When I try to debug a shader for a specific pixel, I first get a popup that tells me PIX is going to "Enable Shader Debugging" in the DirectX Control Panel. I click Yes and allow it to make changes through a Windows popup, and then I get the message "Shader debugging could not be enabled", followed by "An error occurred while preparing to debug the shader".

I tried enabling shader debugging myself in all the DirectX control panels I could find (2007/2010, 32/64-bit), and also setting them to the Debug instead of the Release DirectX runtime, but neither helped. PIX 2007 works fine right away.

I have that exact same problem with PIX when trying to debug vertex or pixel shaders in my XNA project... I guess I'll go see if the old version works for me.

Quote: "I have that same exact problem with PIX while trying to debug vertex or pixel shaders in my XNA project... I guess I'll go see if the old version works for me."

 

If you're using XNA, make sure you check "Disable D3DX Analysis" in the Target Program tab of "More Options" in PIX.

  • Similar Content

    • By Matt_Aufderheide
      I am trying to draw a screen-aligned quad with arbitrary sizes.
       
      currently I just send 4 vertices to the vertex shader like so:
      pDevCon->IASetPrimitiveTopology(D3D_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP);
      pDevCon->Draw(4, 0);
       
      then in the vertex shader I am doing this:
      float4 main(uint vI : SV_VERTEXID) : SV_POSITION
      {
          float2 texcoord = float2(vI & 1, vI >> 1);
          return float4((texcoord.x - 0.5f) * 2, -(texcoord.y - 0.5f) * 2, 0, 1);
      }
      that gets me a screen-sized quad... OK, so what's the correct way to get arbitrary sizes? I have messed around with various numbers, but I think I don't quite get something in these relationships.
      One thing I tried is:
       
      float4 quad = float4((texcoord.x - (xpos/screensizex)) * (width/screensizex), -(texcoord.y - (ypos/screensizey)) * (height/screensizey), 0, 1);
       
      ...where xpos and ypos are the number of pixels from the upper right corner, and width and height are the desired size of the quad in pixels.
      This gets me somewhat close, but not right... a bit too small, so I'm missing something. Any ideas?
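Not the poster's shader, but a small Python sketch of the math (names like `xpos`, `screen_w` follow the question; the top-left pixel origin is an assumption): the usual fix is to first map the vertex position to pixels, then normalize to [0, 1], and only then remap to [-1, 1] NDC with the y flip, rather than scaling the offset and the size by the screen separately.

```python
def quad_corner_ndc(tx, ty, xpos, ypos, width, height, screen_w, screen_h):
    """Map a quad-local texcoord (tx, ty in [0, 1]) to NDC.

    xpos/ypos: corner of the quad in pixels (assumed top-left origin).
    width/height: desired quad size in pixels.
    """
    # Vertex position in pixels, normalized to [0, 1] screen space.
    u = (xpos + tx * width) / screen_w
    v = (ypos + ty * height) / screen_h
    # [0, 1] -> [-1, 1], flipping y because NDC y points up.
    return (u * 2.0 - 1.0, -(v * 2.0 - 1.0))

# A 200x100 quad at pixel (0, 0) on an 800x600 screen: top-left corner
# lands at NDC (-1, 1), bottom-right at (-0.5, ~0.667).
print(quad_corner_ndc(0, 0, 0, 0, 200, 100, 800, 600))
print(quad_corner_ndc(1, 1, 0, 0, 200, 100, 800, 600))
```

The same two lines translate directly back into the vertex shader; the "a bit too small" symptom is consistent with dividing the size by the screen size but skipping the final *2 remap.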
       
    • By Stewie.G
      Hi,
      I've been trying to implement a gaussian blur recently, it would seem the best way to achieve this is by running a bur on one axis, then another blur on the other axis.
      I think I have successfully implemented the blur part per axis, but now I have to blend both calls with a proper BlendState, at least I think this is where my problem is.
      Here are my passes:
      RasterizerState DisableCulling { CullMode = BACK; };
      
      BlendState AdditiveBlend
      {
          BlendEnable[0] = TRUE;
          BlendEnable[1] = TRUE;
          SrcBlend[0] = SRC_COLOR;
          BlendOp[0] = ADD;
          BlendOp[1] = ADD;
          SrcBlend[1] = SRC_COLOR;
      };
      
      technique11 BlockTech
      {
          pass P0
          {
              SetVertexShader(CompileShader(vs_5_0, VS()));
              SetGeometryShader(NULL);
              SetPixelShader(CompileShader(ps_5_0, PS_BlurV()));
              SetRasterizerState(DisableCulling);
              SetBlendState(AdditiveBlend, float4(0.0, 0.0, 0.0, 0.0), 0xffffffff);
          }
          pass P1
          {
              SetVertexShader(CompileShader(vs_5_0, VS()));
              SetGeometryShader(NULL);
              SetPixelShader(CompileShader(ps_5_0, PS_BlurH()));
              SetRasterizerState(DisableCulling);
          }
      }
      D3DX11_TECHNIQUE_DESC techDesc;
      mBlockEffect->mTech->GetDesc(&techDesc);
      for (UINT p = 0; p < techDesc.Passes; ++p)
      {
          deviceContext->IASetVertexBuffers(0, 2, bufferPointers, stride, offset);
          deviceContext->IASetIndexBuffer(mIB, DXGI_FORMAT_R32_UINT, 0);
          mBlockEffect->mTech->GetPassByIndex(p)->Apply(0, deviceContext);
          deviceContext->DrawIndexedInstanced(36, mNumberOfActiveCubes, 0, 0, 0);
      }
      [screenshot: No blur]
      [screenshot: PS_BlurV]
      [screenshot: PS_BlurH]
      [screenshot: P0 + P1]

      As you can see, it does not work at all.
      I think the issue is in my BlendState, but I am not sure.
      I've seen many articles going with the render-to-texture approach, but I've also seen articles where both shaders were called in succession and it worked just fine; I'd like to go with that second approach. Unfortunately, the code was in OpenGL, where the syntax for running multiple passes is quite different (http://rastergrid.com/blog/2010/09/efficient-gaussian-blur-with-linear-sampling/). So I need some help doing the same in HLSL :-)
       
      Thanks!
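This isn't an effect-file fix, just a numeric sketch in plain Python (hypothetical 3-tap kernel) of why the two passes have to be composed rather than blended: a separable blur is the horizontal pass applied to the *output* of the vertical pass, which is what render-to-texture gives you, while additively blending two independent passes over the original image produces a different and roughly twice-as-bright result.

```python
def blur_1d(img, kernel, axis):
    """Convolve a 2D list with a 1D kernel along one axis (edges clamped)."""
    h, w = len(img), len(img[0])
    r = len(kernel) // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for k, wgt in enumerate(kernel):
                dy, dx = (k - r, 0) if axis == 0 else (0, k - r)
                yy = min(max(y + dy, 0), h - 1)
                xx = min(max(x + dx, 0), w - 1)
                acc += wgt * img[yy][xx]
            out[y][x] = acc
    return out

kernel = [0.25, 0.5, 0.25]                    # hypothetical 3-tap blur, sums to 1
img = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]       # single bright pixel

bv = blur_1d(img, kernel, 0)                  # vertical pass
bh = blur_1d(img, kernel, 1)                  # horizontal pass
separable = blur_1d(bv, kernel, 1)            # H applied to V's output: correct
additive = [[bv[y][x] + bh[y][x] for x in range(3)]
            for y in range(3)]                # what additive blending computes

print(separable[1][1])  # 0.25 -- matches the full 2D blur
print(additive[1][1])   # 1.0  -- too bright, not a blur of the image
```

In D3D terms, that composition means pass P0 renders into an intermediate texture that pass P1 then samples, with no blend state needed at all.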
    • By Fleshbits
      Back around 2006 I spent a good year or two reading books, articles on this site, and gobbling up everything game dev related I could. I started an engine in DX10 and got through basics. I eventually gave up, because I couldn't do the harder things.
      Now, my C++ is 12 years stronger, my mind is trained better, and I am thinking of giving it another go.
A lot has changed. There is no more standalone SDK, there is evidently a DX Toolkit, XNA died, all the sweet sites I used to go to are 404, and Google searches all point to Unity and Unreal.
      I plainly don't like Unity or Unreal, but might learn them for reference.
      So, what is the current path? Does everyone pretty much use the DX Toolkit? Should I start there? I also read that DX12 is just expert level DX11, so I guess I am going DX 11.
Is there a current and up-to-date list of learning resources anywhere? I am about tired of 404s...
       
       
    • By Stewie.G
      Hi,
       
      I've been trying to implement a basic gaussian blur using the gaussian formula, and here is what it looks like so far:
      float gaussian(float x, float sigma)
      {
          float pi = 3.14159;
          float sigma_square = sigma * sigma;
          float a = 1 / sqrt(2 * pi*sigma_square);
          float b = exp(-((x*x) / (2 * sigma_square)));
          return a * b;
      }
      My problem is that I don't quite know what sigma should be.
      It seems that if I provide a random value for sigma, weights in my kernel won't add up to 1.
      So I ended up calling my gaussian function with sigma == 1, which gives me weights adding up to 1, but also a very subtle blur.
      Here is what my kernel looks like with sigma == 1
              [0]    0.0033238872995488885    
              [1]    0.023804742479357766    
              [2]    0.09713820127276819    
              [3]    0.22585307043511713    
              [4]    0.29920669915475656    
              [5]    0.22585307043511713    
              [6]    0.09713820127276819    
              [7]    0.023804742479357766    
              [8]    0.0033238872995488885    
       
      I would have liked it to be more "rounded" at the top, or have a better spread, instead of wasting [0], [1] and [2] on values below 0.1.
      Based on my experiments, the key to this is to provide a different sigma, but if I do, my kernel values no longer add up to 1, which results in a darker blur.
      I've found this post 
      ... which helped me a bit, but I am really confused by the part where he divides sigma by 3.
      Can someone please explain how sigma works? How is it related to my kernel size, how can I balance my weights with different sigmas, etc.?
       
      Thanks :-)
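For what it's worth, a small Python sketch (illustrative only, not the poster's HLSL) of the usual answer: compute the raw Gaussian weights for your taps, then divide each by their sum so they always total 1, for any sigma. The "divide by 3" trick is just choosing sigma so the outermost tap sits about three standard deviations out, where the Gaussian has decayed to nearly zero.

```python
import math

def gaussian_kernel(radius, sigma=None):
    """1D Gaussian weights for taps -radius..+radius, normalized to sum to 1."""
    if sigma is None:
        sigma = radius / 3.0  # put the edge taps ~3 standard deviations out
    raw = [math.exp(-(x * x) / (2.0 * sigma * sigma))
           for x in range(-radius, radius + 1)]
    total = sum(raw)
    # Normalizing by the sum makes the 1/sqrt(2*pi*sigma^2) factor irrelevant.
    return [w / total for w in raw]

weights = gaussian_kernel(4)        # 9 taps, like the kernel in the question
print(sum(weights))                 # 1.0 (within float precision), for any sigma
print(gaussian_kernel(4, sigma=2.5))  # larger sigma: flatter, more "spread"
```

So a "darker blur" with a hand-picked sigma just means the un-normalized weights summed to less than 1; normalizing fixes it regardless of the sigma chosen.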
    • By mc_wiggly_fingers
      Is it possible to asynchronously create a Texture2D using DirectX11?
      I have a native Unity plugin that downloads 8K textures from a server and displays them to the user for a VR application. This works well, but there's a large frame drop when calling CreateTexture2D. To remedy this, I've tried creating a separate thread that creates the texture, but the frame drop is still present.
      Is there anything else that I could do to prevent that frame drop from occurring?