
How to "lose device" on purpose?

Recommended Posts

Cluq    122
I want to induce a "lost device" in my game. How can this be done in code? I can of course ALT+TAB, but I want to trigger it programmatically. I have tried d3ddevice->Reset(), but that did not do the trick (the Reset call failed). The reason for all this is that I want the video memory to be completely flushed. I use 3rd-party code for my resource manager, which is why I want to force the lost device - all resources are then recreated by the manager, without me doing anything. I'm trying to keep all resources in the managed pool, so that I can call d3ddevice->EvictManagedResources(). But it does not seem to flush the video memory as thoroughly as an ALT+TAB does. Does anyone have some insight into this?
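For reference, the standard D3D9 lost-device recovery path (the state the poster wants to trigger) looks roughly like the sketch below. `g_pDevice`, `g_d3dpp`, and the Release/Recreate helpers are placeholder names, not anything from this thread:

```cpp
// Sketch of the standard D3D9 lost-device recovery loop (names are
// placeholders). Reset() only succeeds once TestCooperativeLevel()
// reports D3DERR_DEVICENOTRESET and all D3DPOOL_DEFAULT resources
// have been released.
HRESULT hr = g_pDevice->TestCooperativeLevel();
if (hr == D3DERR_DEVICELOST)
{
    // Device is lost and cannot be reset yet; skip rendering this frame.
    Sleep(50);
}
else if (hr == D3DERR_DEVICENOTRESET)
{
    ReleaseDefaultPoolResources();       // everything in D3DPOOL_DEFAULT
    hr = g_pDevice->Reset(&g_d3dpp);     // g_d3dpp: the D3DPRESENT_PARAMETERS
    if (SUCCEEDED(hr))
        RecreateDefaultPoolResources();  // managed-pool resources survive Reset
}
```

As far as I know there is no documented D3D9 call that forces the device into the lost state from within the same process, which is why the workarounds in this thread all rely on external events.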

Scoob Droolins    258
I don't think you can do this with code. If you're trying to debug your app's response to a lost device, you can force one, in windowed mode, by simply starting another D3D app in full-screen mode. Just run this other app long enough for it to create its full-screen D3D device, then exit.

Another option is to remote-debug your app from a second machine: run your app in full-screen mode, then just ALT+TAB.

don    431
Resizing your window won't cause a lost device. It will just cause Present to stretch/shrink during the copy operation from the back buffer to the primary surface.

The easiest way to do this manually is to hit WINKEY+L to lock the desktop. From the original post, though, I gather that he's trying to force a 3rd-party library to do something, and that library needs to receive a D3DERR_DEVICELOST in order to know when to do it (correct me if I'm wrong).

I'm curious as to why Reset failed. This usually happens when your app hasn't released all of its default-pool video memory resources before calling Reset. If the device gets into this state, the 3rd-party library should detect it and clean up its resources. I'm just guessing here. Who wrote the library that you're using?
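To illustrate the point about Reset failing: Reset() returns an error (typically D3DERR_INVALIDCALL) while any D3DPOOL_DEFAULT resource is still alive. A minimal sketch, where `g_pDynamicVB` and `g_pRenderTarget` are hypothetical default-pool objects:

```cpp
// Reset() fails while any D3DPOOL_DEFAULT resource still exists.
// Release them first, then Reset, then recreate them afterwards.
// g_pDynamicVB / g_pRenderTarget are made-up names for illustration.
if (g_pDynamicVB)    { g_pDynamicVB->Release();    g_pDynamicVB = NULL; }
if (g_pRenderTarget) { g_pRenderTarget->Release(); g_pRenderTarget = NULL; }

HRESULT hr = g_pDevice->Reset(&g_d3dpp);
if (FAILED(hr))
{
    // Something (possibly inside the 3rd-party library) still holds
    // a default-pool resource or an unreleased state block.
}
```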

jbarcz1    265
You might be using DXUT. IIRC, DXUT tears down and rebuilds the device when the window is resized, which looks to your app like a lost device but actually isn't.

Cluq    122
Thanks for all the replies - very helpful! I'll briefly explain my problem, so that you might help me a bit more - so feel free to comment randomly (but in context, please :)).

My setup is that I have X games running in sequence, and all have to be loaded into system memory from the get-go - no loading from disk whatsoever while any of the games is running. First I load all the games' data from disk into system memory. Then the first game's resources are loaded into video memory (I call the Direct3D PreLoad() on all resources, which should copy managed-pool resources from system memory to video memory). When that game has finished, its memory is evicted from video memory (either with d3ddevice->EvictManagedResources() or some other way), and the next game's resources are loaded into video memory.

My problem is that not all memory is evicted from video memory, and I don't know why. When I call EvictManagedResources(), some memory stays behind (about 60 MB, so it is not just a back buffer or something like that). It could be default-pool resources, but I have no default-pool textures, only default-pool vertex buffers, which shouldn't amount to that much...?
This leftover memory is, however, removed when I get a lost device. On the other hand, losing the device does not evict everything - the memory is just "cleaned" somehow - all managed-pool resources seem to stay in video memory (which makes sense, of course).

At first, resizing the window worked fine. I'm using Ogre3D, so I dug into it and found a method to make Ogre think it had lost the device - all video memory is cleaned and all resources are recreated.

So now, when a game ends, it evicts all memory from video memory with EvictManagedResources(), and when the next game starts, I cause a "lost device" that "cleans" the video memory, and then I PreLoad() all of its resources. Quite hacky in my book, but it works :)
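The swap sequence described above could be sketched like this; `EndGame`, `StartGame`, `ForceOgreDeviceLost`, and `g_nextGameResources` are placeholder names for whatever the actual engine hooks are:

```cpp
// Hypothetical per-game swap, following the steps described above.
void SwapToNextGame(IDirect3DDevice9* dev)
{
    EndGame();                     // current game stops rendering

    dev->EvictManagedResources();  // push managed copies out of VRAM

    ForceOgreDeviceLost();         // the "fake lost device" hack: Ogre
                                   // releases and recreates its
                                   // device-dependent resources

    // Promote the next game's managed-pool resources into video memory.
    for (size_t i = 0; i < g_nextGameResources.size(); ++i)
        g_nextGameResources[i]->PreLoad();

    StartGame();                   // next game begins rendering
}
```

Note that PreLoad() is only a hint to the managed-pool paging logic; the runtime is still free to evict or defer those uploads under memory pressure, which may be related to the leftover allocations described above.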

Now I just have to figure out why Ogre3D seems to fill up video memory while the game is running (which is odd, since I PreLoad() all resources).
