
Direct3D11 crashing display driver


4 replies to this topic

#1 jdub   Members   -  Reputation: 419

Posted 15 October 2012 - 01:49 PM

I have a program which draws a model (> 100k triangles). For testing purposes, I am printing out values to the console after loading the model. When I minimize the program to look at the output in visual studio, my display driver crashes and I get a DXGI_ERROR_DEVICE_REMOVED error.

What kind of behavior tends to cause this exception?
J.W.

#2 MJP   Moderators   -  Reputation: 10917


Posted 15 October 2012 - 02:36 PM

Probably the most typical case is when the GPU spends a long time doing work and the display driver times out.
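That timeout is Windows' Timeout Detection and Recovery (TDR), which resets the display driver if a GPU job runs longer than the limit (about 2 seconds by default). For debugging long-running shaders only, the timeout can be raised via the documented `TdrDelay` registry value; the value of 8 seconds below is just an example, and a reboot is required for it to take effect:

```
Windows Registry Editor Version 5.00

; Raise the GPU timeout for debugging only (TdrDelay is in seconds; default is 2)
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:00000008
```

Don't ship with this changed; the real fix is to split the GPU work into smaller chunks.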

#3 jdub   Members   -  Reputation: 419


Posted 15 October 2012 - 03:31 PM

So what is the solution? Should I dispose of the device/context and recreate them from scratch?
J.W.

#4 RythBlade   Members   -  Reputation: 152


Posted 16 October 2012 - 09:32 AM

Recreating your device from scratch will take a considerable amount of time, as the majority of DirectX objects depend on the device and context and so would need to be recreated as well.
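If you do end up having to recreate everything, the usual pattern is to check the `HRESULT` from `Present` and query the device for the removal reason before tearing things down. A minimal sketch (Windows/D3D11 only; `ReleaseAllResources` and `RecreateDeviceAndResources` are hypothetical stand-ins for your own init/teardown paths):

```cpp
#include <d3d11.h>
#include <dxgi.h>
#include <cstdio>

// swapChain/device are whatever your app created at startup.
void presentAndCheck(IDXGISwapChain* swapChain, ID3D11Device* device)
{
    HRESULT hr = swapChain->Present(1, 0);
    if (hr == DXGI_ERROR_DEVICE_REMOVED || hr == DXGI_ERROR_DEVICE_RESET)
    {
        // Ask the device why it was removed (hung, internal error, driver upgrade, ...).
        HRESULT reason = device->GetDeviceRemovedReason();
        std::printf("Device removed, reason: 0x%08X\n", (unsigned)reason);

        // Recovery means releasing *everything* created from this device
        // (buffers, shaders, views, the context) and re-running your init path.
        // ReleaseAllResources();              // hypothetical
        // RecreateDeviceAndResources();       // hypothetical
    }
}
```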

I can think of a few possibilities for your problem. As MJP says, when a GPU task takes too long your graphics driver will time out. For instance, if you attempt to render an obscene number of triangles and run heavy shader calculations on them all in one go, your graphics driver will probably die. I had this problem while writing a DirectCompute shader that sometimes took a very long time and kept killing my drivers.

Also, the device object has some quirks when it's windowed. For example, in a multi-screen setup, dragging the window from one screen to the other will invalidate your current graphics device object, since you've changed which physical adapter you want to use.

The DXUT framework is really good at handling this sort of thing for you with some handy event handler stub functions.

This may also happen when you minimise the window (though I can't remember if I've actually ever tried this).

Also, I believe that if you're running full-screen and you minimise, this can cause problems similar to those above. If you're in full-screen mode, use the Alt+Enter shortcut to switch back to windowed mode. Then you should be able to get to Visual Studio without minimising the window; you're just moving focus to another window, which in my experience has always been fine.

You could also try placing some timers around your draw calls that output to the debug window. Compare the times in situations where it works and doesn't work to see if it's a long-running GPU task that kills the device; in that case you'd probably see a long running time, or a start time without a finish time, immediately before the crash.

Hope this has been helpful!!

#5 mhagain   Crossbones+   -  Reputation: 7822


Posted 16 October 2012 - 09:51 AM

NVIDIA hardware? I had the very same thing recently, and it turned out that I was overflowing a buffer object. Run your program under the debug runtime and it should tell you which event triggered the device's removal.





