
poozer

Preventing the loss of offscreen surfaces


Hello everyone. I'm somewhat new to DirectX and need some help. I'm designing a "windowed" DirectX application (not a game) that will store several offscreen surfaces in video memory. The thing is, the application CANNOT RECREATE/RERENDER these offscreen surfaces if they are lost. And, from what I understand, whenever the video mode changes, DirectX may deallocate offscreen surfaces stored in video memory. This is bad.

Is there a way to prevent the loss of offscreen surfaces (without storing them in system memory)? Or is there a way to prevent the video mode from being changed?

Note that I'll be developing two versions of my app: one for NT, which uses DirectX 3.0, and another which uses DirectX 7.0 or 8.0. The NT version does not need to be perfect, however... Any ideas? Thanks in advance!

Well, you could respond to the WM_ACTIVATEAPP message, or whatever it's called (it's been a couple of months, bear with me), and re-create the surfaces when your app becomes activated again... it sounds like you can't be sure when DirectX loses your surfaces, so that's all I can think of off the top of my head.
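A minimal sketch of that approach, assuming a DirectDraw 7 surface; g_pOffscreen and the commented-out re-render call are hypothetical names standing in for your own objects:

```cpp
#include <ddraw.h>

// Hypothetical global surface, for illustration only.
extern LPDIRECTDRAWSURFACE7 g_pOffscreen;

// Call from your WndProc when WM_ACTIVATEAPP arrives with wParam != 0.
void OnActivateApp()
{
    if (g_pOffscreen->IsLost() == DDERR_SURFACELOST)
    {
        // Restore() re-allocates the surface memory, but the contents are
        // undefined afterwards -- you still have to re-render or re-copy.
        g_pOffscreen->Restore();
        // RedrawOffscreen();  // your own re-render routine (hypothetical)
    }
}
```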

MOV ax, bx
asm programming
written by Jello

I appreciate the suggestion, Jello.

Any other ideas? I'd prefer not to have to store/restore the offscreen surfaces every time the application is activated or deactivated. This will occur a lot. Plus, the app will write to the offscreen surfaces while the application is deactivated...

It would probably help if I elaborated more. I wrote a GDI-based X Server for Windows (like Hummingbird Exceed). Now I'm incorporating DirectX to hopefully improve performance. Like other X Servers, mine can run in a "multiple window mode", so it has to coexist and play nicely with other Win32 applications. This means it should still work (but not necessarily perfectly) if the user or another application changes the display mode. The problem is that X clients can create offscreen buffers on the server called Pixmaps. I want to store the Pixmaps in video memory, but there isn't a way to inform X clients that their Pixmaps have been lost. I would not mind preventing the display mode from being changed, but I don't think there is a way.

Thanks!




I guess there is no simple way to ensure offscreen surfaces are retained in video memory when the display mode changes. This kind of sucks for me.

The best that you'll get out of D3D is to let it manage your texture surfaces for you (IIRC you set the DDSCAPS2_TEXTUREMANAGE cap in DX7, or create the texture in D3DPOOL_MANAGED in DX8). That way you don't have to keep a copy of the surface: D3D will do that for you and maintain consistency between the backup and the actual surface in VRAM.
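A minimal sketch of what that looks like in Direct3D 8, assuming pDevice is an already-created IDirect3DDevice8 (the size and format here are placeholders):

```cpp
#include <d3d8.h>

// Create a texture in the managed pool: D3D keeps a system-memory backup
// and transparently re-uploads it to video memory after it is evicted.
// (In DX7/DirectDraw, the rough equivalent is DDSCAPS2_TEXTUREMANAGE.)
IDirect3DTexture8* pTex = NULL;
HRESULT hr = pDevice->CreateTexture(
    256, 256,          // width, height (placeholders)
    1,                 // mip levels
    0,                 // usage flags
    D3DFMT_A8R8G8B8,   // pixel format (placeholder)
    D3DPOOL_MANAGED,   // the important part: let D3D manage the backup
    &pTex);
```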

HTH


Iain Hutchison
Programmer, Silicon Dreams
The views expressed here are my own, not those of my employer.

Hello,

You could dump them to regular areas of memory. When you receive the message that the screen mode is changing, just create buffers (NOT DX surfaces) in system memory to hold the data, then copy the data back in when you want the surfaces again.

Oh, I just saw that you don't want to store them aside when you lose focus... hmmm... then I don't know, sorry. About changing them during that time: that may be dangerous, because you don't know what is going to be using that memory. If you ended up going with what I suggested, you could edit the temporary buffers in memory.
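A sketch of the backup step, assuming a 32-bpp IDirectDrawSurface7 (the function name and the use of std::vector are just for illustration):

```cpp
#include <ddraw.h>
#include <cstring>
#include <vector>

// Copy a surface's pixels into a plain system-memory buffer so they
// survive a mode change. Assumes a 32-bpp surface of size w x h.
std::vector<BYTE> BackupSurface(LPDIRECTDRAWSURFACE7 pSurf, DWORD w, DWORD h)
{
    DDSURFACEDESC2 desc = { sizeof(desc) };
    std::vector<BYTE> backup(w * h * 4);

    if (SUCCEEDED(pSurf->Lock(NULL, &desc, DDLOCK_READONLY | DDLOCK_WAIT, NULL)))
    {
        const BYTE* src = static_cast<const BYTE*>(desc.lpSurface);
        for (DWORD y = 0; y < h; ++y)  // row by row: pitch may exceed w * 4
            memcpy(&backup[y * w * 4], src + y * desc.lPitch, w * 4);
        pSurf->Unlock(NULL);
    }
    return backup;
}
```

Restoring is the same loop in reverse: Lock the re-created surface and memcpy each row back out of the buffer.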

BrianH

You CANNOT keep the data stored in video memory when your app is minimized. The ONLY way is to keep a backup in system memory.

When your app is minimized, the device window loses focus and DX cleans everything up. There is a reason for this apparent madness: if another D3D app is put into focus after your app has lost it, that app can re-create its surfaces and use up video memory. If every application didn't free video memory, you'd run out very quickly. The Windows desktop, for example, would be in video memory, as would the surfaces of any windows that are minimized.

Steve

Plus, since you are playing nice, a fullscreen program that grabs exclusive control will kick you out of DX for a while, so you won't have access to anything. Like said before, keep a backup in system memory, recreate surfaces when lost AND active, and copy from the system-memory backups you have (see the sketch below). And since you are not using DX for anything except the actual drawing, you should be fine when something gets exclusive mode (I am unsure how DX deals with system-memory buffers, so you may be better off making your own). I don't see why you would need to use DD7 instead of just using DD3 for everything. Also, don't make the NT version less than perfect: either make one or don't.
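The usual restore-on-lost pattern around a blit looks something like this sketch (SafeBlt and the commented-out refill calls are hypothetical names; the refill would copy from your system-memory backups):

```cpp
#include <ddraw.h>

// Blit with the standard DDERR_SURFACELOST recovery: Restore() brings the
// video memory back, but the pixels are gone, so re-fill from the backup.
HRESULT SafeBlt(LPDIRECTDRAWSURFACE7 pDest, RECT* pDstRect,
                LPDIRECTDRAWSURFACE7 pSrc, RECT* pSrcRect)
{
    HRESULT hr = pDest->Blt(pDstRect, pSrc, pSrcRect, DDBLT_WAIT, NULL);
    if (hr == DDERR_SURFACELOST)
    {
        pDest->Restore();
        pSrc->Restore();
        // RefillFromBackup(pDest);  // hypothetical: copy system-memory
        // RefillFromBackup(pSrc);   // backups into the restored surfaces
        hr = pDest->Blt(pDstRect, pSrc, pSrcRect, DDBLT_WAIT, NULL);
    }
    return hr;
}
```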

Ok. Thanks for the input everyone. As I said, I'm somewhat new to DirectX.

I'll write some small programs to try out your suggestions. And I'll run some tests to measure the potential performance gains offered by DirectX (given the requirements of the X protocol).

Thanks for the help.

Just create a very small SYSTEM-MEMORY surface (4x4 pixels or so).
Then allocate your own buffer the size you want the surface to be.
Then call GetSurfaceDesc() into a surface description "foo".
Then modify lpSurface, lPitch, dwWidth and dwHeight in "foo" and call SetSurfaceDesc() with it.

This way you have your DD surface in system memory (which is somewhat slower, but who cares) and it will never get lost (!).

BTW: DirectDraw doesn't guarantee it, but it will not throw away any system-memory surfaces, even if you didn't allocate the memory yourself. Then again, it's only guaranteed for self-allocated surface memory.

You could (if you don't change the contents very much) create a video-memory copy of your surface that you update every time the original surface is lost or changed. That way you could use the (fast) vid-mem surface for blitting.

And: if you change the surface very frequently, then it's probably better to have it in system memory anyway.
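A sketch of that trick, assuming DirectDraw 7 and a 32-bpp client buffer (the function name is hypothetical; error handling omitted):

```cpp
#include <ddraw.h>

// Wrap a client-allocated pixel buffer in a DirectDraw surface. DirectDraw
// never frees client-allocated memory, so the contents cannot be lost.
LPDIRECTDRAWSURFACE7 CreateClientMemSurface(LPDIRECTDRAW7 pDD,
                                            DWORD w, DWORD h, void* pPixels)
{
    // 1. Create a tiny placeholder surface in system memory.
    DDSURFACEDESC2 desc = { sizeof(desc) };
    desc.dwFlags = DDSD_CAPS | DDSD_WIDTH | DDSD_HEIGHT;
    desc.ddsCaps.dwCaps = DDSCAPS_OFFSCREENPLAIN | DDSCAPS_SYSTEMMEMORY;
    desc.dwWidth  = 4;
    desc.dwHeight = 4;

    LPDIRECTDRAWSURFACE7 pSurf = NULL;
    pDD->CreateSurface(&desc, &pSurf, NULL);

    // 2. Re-point the surface at our own buffer. Pitch must be supplied
    //    alongside DDSD_LPSURFACE.
    DDSURFACEDESC2 newDesc = { sizeof(newDesc) };
    newDesc.dwFlags   = DDSD_WIDTH | DDSD_HEIGHT | DDSD_PITCH | DDSD_LPSURFACE;
    newDesc.dwWidth   = w;
    newDesc.dwHeight  = h;
    newDesc.lPitch    = (LONG)(w * 4);   // assumes 32 bpp
    newDesc.lpSurface = pPixels;
    pSurf->SetSurfaceDesc(&newDesc, 0);  // second parameter must be 0

    return pSurf;
}
```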

bye,

--- foobar
We push more polygons before breakfast than most people do in a day

