cephalo

Debugging DX11 Shaders as a hobbyist

Recommended Posts

The new graphics debugging functionality in VS2012 sounds great, but you can't use it with VS2012 Express. I won't be making money from my endeavors any time soon, and the family budget can't justify $500 for VS Pro just so Dad can mess around with computer graphics.

 

I'm trying to learn DirectX 11, and my very first shader code is drawing nothing. I'm not having any luck with PIX from the old June 2010 SDK: it crashes when I try to run my program with it, even though my program runs fine on its own. (It's just not drawing properly.) DebugView, which was quite useful for DirectX 9 debugging, doesn't seem to work either.

 

Is there a free way to debug DirectX 11 shaders on a Windows 7 platform?

Share this post


Link to post
Share on other sites

You can get far without spending any money on development software.

 

Concerning PIX: [url="http://www.gamedev.net/topic/639532-d3d11createdevice-failed/"]read here[/url].  Luckily I was warned by that thread at the time and blocked that update. Hopefully uninstalling it helps (if that's actually the issue and solves it ;))

 

Debug output will only appear if you passed D3D11_CREATE_DEVICE_DEBUG at device creation. (Not entirely correct: the DirectX control panel lets you force it for selected apps, but I've never used that, so I can't comment on it.)
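For instance, a minimal device-creation sketch with the debug flag (error handling omitted; purely illustrative):

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

UINT flags = 0;
#if defined(_DEBUG)
flags |= D3D11_CREATE_DEVICE_DEBUG;   // enables the debug layer's output
#endif

ID3D11Device* device = nullptr;
ID3D11DeviceContext* context = nullptr;
HRESULT hr = D3D11CreateDevice(
    nullptr,                    // default adapter
    D3D_DRIVER_TYPE_HARDWARE,
    nullptr,                    // no software rasterizer
    flags,
    nullptr, 0,                 // default feature levels
    D3D11_SDK_VERSION,
    &device, nullptr, &context);
```

With the flag set, warnings and errors from the runtime then show up in the debug output.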

 

That thread finally made me look into alternative GPU debuggers. The only one that worked for me was Intel's GPA Frame Analyzer (it works great on my NVIDIA card and is the only one apart from PIX that works with my C# stuff). NVIDIA and AMD have their own debuggers, too (NSight, GPU PerfStudio).

 

Good luck. Shaders are fun, especially with D3D11.

 

Edit: Important: I'm still using VS Express 2010 and Win7, so use my uninstall approach at your own risk.

Edited by unbird


You can try your luck with GPU PerfStudio if you have an AMD GPU or Nsight if you have an Nvidia GPU, but I'll warn you that the overall experience isn't great with either of those tools.


I didn't manage to make PIX show shader source code when using the new D3DCompileFromFile instead of the June SDK's D3DX11CompileFromFile (I have the Windows SDK 8 installed on my Windows 7 machine), so I gave up and went back to the D3DX one.
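For what it's worth, PIX can only show HLSL source if the shader was compiled with debug info. With the newer compiler API that would look roughly like this (file name and entry point are made up for illustration):

```cpp
#include <d3dcompiler.h>
#pragma comment(lib, "d3dcompiler.lib")

ID3DBlob* code = nullptr;
ID3DBlob* errors = nullptr;
// Embed debug info and keep the code un-optimized so the source maps cleanly.
UINT flags = D3DCOMPILE_DEBUG | D3DCOMPILE_SKIP_OPTIMIZATION;
HRESULT hr = D3DCompileFromFile(
    L"tris.hlsl", nullptr, D3D_COMPILE_STANDARD_FILE_INCLUDE,
    "VSMain", "vs_5_0", flags, 0, &code, &errors);
if (FAILED(hr) && errors)
    OutputDebugStringA((const char*)errors->GetBufferPointer());
```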


Does NSight have a standalone version? I thought it was integrated into Visual Studio, which would require the Pro version, since Express doesn't support add-ins. I've been using it myself recently, and although it's better than nothing, I still prefer PIX as a graphics debugger. Intel GPA is another alternative that might give you some info.

 

As for PIX crashing: can you attach a debugger to the process to see why it crashes? I had a PIX crash last week which ended up being caused by the app trying to create a SW or REF device. Apparently PIX only works with a HAL device (this was D3D9, of course).


Ha! I uninstalled KB2670838, and now PIX runs my app farther: my window opens to a white client area. But instead of throwing an exception like before, the computer completely freezes, so that I have to do a hard reset. When I run the app by itself, I get my cornflower-blue screen (so the screen clear is working, or seems to be), and it stays responsive enough to shut down.

 

The thing is, this is my very first run of my very first DX11 application, and I'm sure I have many misconceptions. I was hoping I could find out what those misconceptions are through debugging. It's enormously hard to learn something when your tools are taken away. I don't understand why Microsoft would conclude that only professionals need to debug. I don't see myself progressing in this environment.


[quote]Does NSight have a standalone version? I thought it was integrated into Visual Studio, which would require the Pro version, since Express doesn't support add-ins. I've been using it myself recently, and although it's better than nothing, I still prefer PIX as a graphics debugger.[/quote]

Ah, true, forgot that. There's an NSight Eclipse version that comes with the CUDA Toolkit. Unfortunately that download alone is 1 GB, and my system partition can't take that load. Nor do I know whether it only works for CUDA stuff. Could anybody comment on that?

[quote]As for PIX crashing: can you attach a debugger to the process to see why it crashes? I had a PIX crash last week which ended up being caused by the app trying to create a SW or REF device. Apparently PIX only works with a HAL device (this was D3D9, of course).[/quote]

Hmmm, IIRC you can force the REF device in PIX, but it will be terribly slow.

[quote]Ha! I uninstalled KB2670838, and now PIX runs my app farther: my window opens to a white client area. But instead of throwing an exception like before, the computer completely freezes, so that I have to do a hard reset. When I run the app by itself, I get my cornflower-blue screen (so the screen clear is working, or seems to be), and it stays responsive enough to shut down.

The thing is, this is my very first run of my very first DX11 application, and I'm sure I have many misconceptions. I was hoping I could find out what those misconceptions are through debugging. It's enormously hard to learn something when your tools are taken away. I don't understand why Microsoft would conclude that only professionals need to debug. I don't see myself progressing in this environment.[/quote]

This is really painful if you're just starting out. Although PIX has some trouble with more advanced DX11 features and occasionally hiccups, it's a bloody essential app.

After you've done your initial boilerplate you could at least read back textures/buffers and debug that way. But if you can't even render the famous first triangle, that's a vicious circle. I'd be glad, too, if the graphics debugger became available for VS 2012 Express (until then, I'll probably hold off on installing it).
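The read-back idea can be sketched like this with the native API (assuming an existing device, context, and a source buffer; error handling omitted):

```cpp
#include <d3d11.h>

// Assumes 'device', 'context' and a source 'gpuBuffer' of 'byteWidth' bytes.
D3D11_BUFFER_DESC desc = {};
desc.ByteWidth      = byteWidth;
desc.Usage          = D3D11_USAGE_STAGING;   // CPU-readable copy target
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;

ID3D11Buffer* staging = nullptr;
device->CreateBuffer(&desc, nullptr, &staging);
context->CopyResource(staging, gpuBuffer);   // GPU -> staging copy

D3D11_MAPPED_SUBRESOURCE mapped = {};
if (SUCCEEDED(context->Map(staging, 0, D3D11_MAP_READ, 0, &mapped)))
{
    // Inspect mapped.pData here (e.g. cast it to your vertex struct).
    context->Unmap(staging, 0);
}
staging->Release();
```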

Also, have you tried the other suggestions from the thread I linked (again: at your own risk)? And try running one or two of the SDK tutorials through PIX. Maybe you're really doing something so weird that PIX would give up anyway.

Did you at least have success with the debug flag? Feel free to show your code and debug messages; maybe we'll see something.


Well, it's not exactly my first 3D graphics program, since I had some experience with DX9. For my first DX11 app I went for broke with some fairly complex bezier-triangle instancing, making full use of tessellation.

 

I'll have to work on it for a while, but my suspicion at this point is that I'm instancing something ad infinitum, creating massive amounts of debug data that PIX can't handle, while the standalone program just gives up in a less dramatic fashion.

Ha! Your first D3D11 app and you went straight for tessellation. Well, if that isn't bold.

As mentioned, PIX isn't of much use here. It doesn't crash, but you can't, e.g., debug these shaders.


[quote]Ha! Your first D3D11 app and you went straight for tessellation. Well, if that isn't bold.

As mentioned, PIX isn't of much use here. It doesn't crash, but you can't, e.g., debug these shaders.[/quote]

I've never actually been able to get PIX to debug any shader, even back in the DX9 days. It always had crazy bugs that changed with every update. Still, it was useful just to see what the vertex streams were composed of. I'm quite sure my vertex buffers aren't what I imagine they ought to be. Staring at my code waiting for an epiphany is discouraging, but I've already found a couple of bugs that way.

 

I'm frustrated because PIX currently ends my computer session without any kind of error message, and without PIX my CPU code steps through fine and each frame appears to make it to the draw call. BTW I am also using C# with SharpDX.

Edited by cephalo

Well, I admit that though shader debugging works with my PIX, I rarely use it. I never found it particularly useful. I'd rather put some "logging" into my shaders by outputting a debug value. This is easy for pixel shaders; for other stages you have to pass the value along first (maybe with an additional semantic).
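For example, a pixel-shader "log" might look like this (the input struct is made up; here the normal is dumped instead of the lit color):

```hlsl
struct PSInput
{
    float4 position : SV_Position;
    float3 normal   : NORMAL;
};

float4 PSMain(PSInput input) : SV_Target
{
    float3 n = normalize(input.normal);
    // Debug "logging": output the normal instead of the final color.
    // Remap [-1,1] to [0,1] so negative components stay visible.
    return float4(n * 0.5 + 0.5, 1.0);
}
```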

Checking the streams, on the other hand, is indeed useful. How about geometry shader stream-out and reading back through a staging resource? Are you familiar with that?

I use SlimDX, but you really should give Intel's GPA Frame Analyzer a shot. At least the input stream is viewable; you can even look at the geometry with an arcball camera.

Edited by unbird


It is an unfortunate situation... Are you able to run the program from your VS IDE? If so, do you get any error messages or anything like that? And do you have the debug layer of the device enabled?

 

To prove that your system isn't messed up, you could try downloading the Hieroglyph 3 project and running the 'BasicTessellation' demo. That will tell you whether the problem is coming from your program or something systemic on your machine.


Ok, I brought this setup to my home computer, and it works! PIX is working here. Unfortunately, all the vertex data looks fine too, both on the input and after the instancing. I wonder what the deal was with my work computer?

 

Anyway, I'm not sure why I'm only seeing the cleared background. I may have some render state set wrong or the wrong primitive topology chosen. My bezier triangle has 10 control points, so I'm using PrimitiveTopology.PatchListWith10ControlPoints because it seems right, but in PIX on the mesh tab I'm seeing only points, not triangles. Obviously, if you draw points without setting any size, you aren't likely to see anything.
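(For reference, in the native D3D11 API that topology would be set like this, assuming `context` is the immediate context:)

```cpp
// Patch lists carry raw control points into the tessellation stages.
context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_10_CONTROL_POINT_PATCHLIST);
```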


Also, something very simple that helped me a lot in debugging actual shader code: have a way to recompile your shaders at run-time. This has really saved me a lot of time, especially since reloading resources at application startup takes quite a while. Now I can tweak my shaders without having to restart the whole app.

 

I know it's pretty trivial, but it really increased my productivity.


Nothing has changed yet. Your only option for your first DirectX 11 app is to hope you see something on the first run. If you don't, there are many things that could prevent anything from being visible, and without a debugger it's like feeling your way out of a dark cave, hoping you aren't miles from daylight. Not impossible, but highly unpleasant and time-consuming.

 

It's a bad time to start learning DirectX.

Edited by cephalo


I've found Intel's GPA to be a suitable replacement. I recently moved from DX9 to DX11, and although I had to go right back to a simple triangle to get anything on screen, I was able to debug the issues I had. After that I was able to bring back most if not all of my framework's capabilities within a week or so.


That's a good tip, Burnt_Fyr. I tried it myself, unsuccessfully, but I think that's because I'm using some DX 11.1 features.

