

DrawPrimitive() fails after device reset?


7 replies to this topic

#1 m_a_s_gp   Members   -  Reputation: 107

Posted 27 September 2012 - 03:13 PM

Hi,

I am trying to handle the case where the D3D device is lost (due to window minimization in a fullscreen application)..
Everything seems to be working fine, except for the DrawPrimitive() function, which keeps failing after the device is successfully reset!

Here is what my rendering function looks like:

1. Call TestCooperativeLevel() at the start of the frame.
2. If it returned D3DERR_DEVICELOST, return without doing anything.
3. If it returned D3DERR_DEVICENOTRESET, call ReleaseResources(), reset the device (passing the same presentation parameters used when creating it), and then call RecreateResources().
4. If it returned D3D_OK, continue rendering.

In ReleaseResources() I release an array of dynamic vertex buffers, which I use to render all the geometry in my application, as well as all the textures used in the application (I don't create any other resources). In RecreateResources() I recreate the textures. (The vertex buffers are created and released as needed each frame, so they don't need recreating.)

After debugging, I found that the Reset() call succeeds, as do Clear(), BeginScene(), SetStreamSource(), EndScene() and Present(), and the Lock() and Unlock() calls used when writing vertex data to the vertex buffers also succeed.
The only function that fails is DrawPrimitive().

Since the device was reset successfully, I know it is not a problem with releasing resources..
So what could the problem be?

Thanks for any help, and let me know if you need any further details..


#2 m_a_s_gp   Members   -  Reputation: 107


Posted 27 September 2012 - 10:12 PM

Solved!

The problem was simply that, in addition to releasing/recreating resources and resetting the device, you must also restore the render states you were using before the device was lost; Reset() returns them to their defaults..

#3 kubera   Members   -  Reputation: 971


Posted 28 September 2012 - 01:30 AM

You might also consider using the managed pool, which will work better on Windows Vista and newer.

#4 m_a_s_gp   Members   -  Reputation: 107


Posted 28 September 2012 - 05:38 AM

I think the managed pool is only suitable if your vertex data is not updated frequently; otherwise you should use the default pool. Isn't that right?

#5 kubera   Members   -  Reputation: 971


Posted 28 September 2012 - 06:09 AM

A good question, but why?
(Both are stored on the GPU while in use.)
Device loss on Windows Vista and later is implemented only for compatibility with earlier versions of Windows.

#6 m_a_s_gp   Members   -  Reputation: 107


Posted 28 September 2012 - 06:54 AM

I didn't know that..
So you mean there is no difference at all (especially in performance) between using the default pool and the managed pool? And what if I am publishing a game and some players have older versions of Windows?

#7 mhagain   Crossbones+   -  Reputation: 8282


Posted 28 September 2012 - 10:43 AM

I've never seen a performance difference between the default pool and the managed pool. Where things are different is that the managed pool mirrors the resource in system memory whereas the default pool does not (even if the definition of "driver optimal" memory for a default-pool resource turns out to be system memory, depending on other creation flags). In general usage a managed-pool resource will therefore exist in two places, system memory and "driver optimal" memory, with the latter being the same as what is used by the default pool. (On the GPU hardware there is no distinction; the concept of memory pools is purely an API artefact.)

Note that the need to support older Windows systems is often quite overstated. According to the latest Steam hardware survey, Vista or 7 with a DX10/11-class GPU is now just a couple of percent short of 90%. Of course, your own target audience may be different from those surveyed, so you've got to profile your audience, know what kind of kit they've got, and base that decision on hard facts. But do base it on actual facts and not just a vague fear of locking out XP users though. ;)

It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#8 m_a_s_gp   Members   -  Reputation: 107


Posted 28 September 2012 - 12:04 PM

So there is no big difference between the two.. maybe I'll try the managed pool later.. thanks for clarifying.

Of course, your own target audience may be different from those surveyed, so you've got to profile your audience, know what kind of kit they've got, and base that decision on hard facts. But do base it on actual facts and not just a vague fear of locking out XP users though. ;)

That sounds a little scary!
I hope this will not be a big issue for the 2D game I am developing..

Thanks kubera, thanks mhagain.



