Shared surfaces with WPF and SlimDX

Started by MyMikeD · 2 comments, last by MyMikeD 12 years, 3 months ago
Hi,

Our application uses shared surfaces to interop between Direct3D10 and WPF (using D3DImage). Basically the concept is to create a shared Direct3D9 surface, attach it to the D3DImage, and then have Direct3D10 render to that shared surface. There is a SlimDX example that does this as well, and we use the code from that example in our application.
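For reference, that setup boils down to roughly the following sketch. The variable names, format, and flags are my own illustration, not code lifted from the sample (it assumes an existing SlimDX.Direct3D10.Device, a Direct3D9Ex DeviceEx, and the D3DImage instance), and in particular it assumes SlimDX exposes the DXGI shared handle through SlimDX.DXGI.Resource the way its WPF interop samples do:

// Assumes these already exist:
//   SlimDX.Direct3D10.Device d3d10Device;   SlimDX.Direct3D9.DeviceEx d3d9DeviceEx;
//   System.Windows.Interop.D3DImage d3dImage;   int width, height;

// 1) Create the D3D10 render target with the "shared" option. B8G8R8A8 is used
//    because D3D9Ex has to be able to open the surface in a format it understands.
var desc = new SlimDX.Direct3D10.Texture2DDescription
{
    Width = width,
    Height = height,
    MipLevels = 1,
    ArraySize = 1,
    Format = SlimDX.DXGI.Format.B8G8R8A8_UNorm,
    SampleDescription = new SlimDX.DXGI.SampleDescription(1, 0),
    Usage = SlimDX.Direct3D10.ResourceUsage.Default,
    BindFlags = SlimDX.Direct3D10.BindFlags.RenderTarget | SlimDX.Direct3D10.BindFlags.ShaderResource,
    CpuAccessFlags = SlimDX.Direct3D10.CpuAccessFlags.None,
    OptionFlags = SlimDX.Direct3D10.ResourceOptionFlags.Shared
};
var renderTexture = new SlimDX.Direct3D10.Texture2D(d3d10Device, desc);

// 2) Grab the DXGI shared handle of that texture.
IntPtr sharedHandle;
using (var dxgiResource = new SlimDX.DXGI.Resource(renderTexture))
    sharedHandle = dxgiResource.SharedHandle;

// 3) Open the same surface on the D3D9Ex device and hand it to the D3DImage.
var d3d9Texture = new SlimDX.Direct3D9.Texture(
    d3d9DeviceEx, width, height, 1,
    SlimDX.Direct3D9.Usage.RenderTarget, SlimDX.Direct3D9.Format.A8R8G8B8,
    SlimDX.Direct3D9.Pool.Default, ref sharedHandle);

using (var surface = d3d9Texture.GetSurfaceLevel(0))
{
    d3dImage.Lock();
    d3dImage.SetBackBuffer(System.Windows.Interop.D3DResourceType.IDirect3DSurface9, surface.ComPointer);
    d3dImage.Unlock();
}

D3D10 then renders into renderTexture, and WPF displays whatever is in the shared surface when the D3DImage is invalidated.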

We have been seeing some strange drawing anomalies with this technique on some cards: it looks like the image is only partially rendered. My current theory is that this is caused by synchronization issues between the shared surfaces. I stumbled upon this article with info about synchronization: http://archive.msdn.microsoft.com/D3D9ExDXGISharedSurf. From what I can tell, the SlimDX code for sharing a surface with WPF doesn't do any synchronization of this kind. Should it?
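For context, the per-frame hand-off in this kind of setup is roughly the following (field names are placeholders, not the sample's actual code). The important detail, which the linked article spells out, is that Flush only submits the D3D10 commands; nothing here waits for the GPU to finish before WPF reads the shared surface, which would explain partially rendered frames:

// ... render the D3D10 scene into the shared render target ...

_device.Flush();   // submits the D3D10 work, but does NOT wait for it to complete

_d3dImage.Lock();
_d3dImage.AddDirtyRect(new System.Windows.Int32Rect(0, 0, _d3dImage.PixelWidth, _d3dImage.PixelHeight));
_d3dImage.Unlock(); // WPF copies the D3D9 surface for composition; if the GPU hasn't
                    // finished the D3D10 work yet, it picks up a partially rendered frame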

Also, is any of the queue functionality described in this D3D9ExDXGISharedSurf API (i.e. ISurfaceQueue, ISurfaceProducer, ISurfaceConsumer) exposed via SlimDX?

Thanks,

Mike
Found this thread after I had posted http://www.gamedev.net/topic/606138-slimdx-issues-with-keyedmutex-and-wpf/. I'm going to try the "staging resource" technique for synchronizing and see if that fixes the problem. Will post the results!
Hello MyMikeD,

Are you using the D3DImage as explained in this article: Introduction to D3DImage? I have also tried it before and it worked for me. That said, getting WPF and SlimDX (or SharpDX) to work together is some gnarly business. I never actually got them to work together the way I wanted (WPF's dynamic frame rate ended up controlling my own frame rate). In the end I dropped it altogether and just did it myself with SlimDX.

In case you can't get it to work, there is also the option of overlaying a WPF window on top of your 'render' window (which can be native). By making the WPF window transparent, borderless, and topmost, it will appear to be merged with the render window; see the sketch below. That way you can let the render window draw at its desired 60 Hz frame rate while WPF runs on its own terms.
The Desktop Window Manager (DWM) takes care of the rest for you. Only while resizing the render window might the user see some black borders as artifacts, but if that doesn't matter to you, you are set.
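A minimal sketch of such an overlay window (the bounds are placeholder values; in a real application you would track the native render window's screen rectangle):

// Placeholder for the native render window's screen bounds.
var renderWindowBounds = new System.Windows.Rect(100, 100, 800, 600);

// Borderless, transparent, topmost WPF window positioned over the render window.
var overlay = new System.Windows.Window
{
    WindowStyle = System.Windows.WindowStyle.None,
    AllowsTransparency = true,          // requires WindowStyle.None
    Background = System.Windows.Media.Brushes.Transparent,
    Topmost = true,
    ShowInTaskbar = false,
    Left = renderWindowBounds.Left,
    Top = renderWindowBounds.Top,
    Width = renderWindowBounds.Width,
    Height = renderWindowBounds.Height
};
overlay.Show();
// Keep the overlay aligned with the render window by handling the render window's
// move/resize notifications and updating Left/Top/Width/Height accordingly.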
Yes, we are using D3DImage to render the Direct3D 10 scene via a shared texture... there is also some Direct2D mixed in there as well, so it's a fairly complicated scenario making use of shared surfaces.

It turns out that, yes indeed, it was a synchronization issue. The different results I saw on different cards came down to how fast the card could render my scene: if it finished rendering before the frame was pulled into the D3DImage I was fine; if not, part of the frame was dropped. This article describes the issues and the solutions: http://archive.msdn.microsoft.com/D3D9ExDXGISharedSurf

All I had to do was the "map a staging texture" trick (in my case I'm not worried about multiple threads) and voilà, things now seem to be synchronized fine on all cards.

Below is the piece of SlimDX code I use (I'm on D3D10) to synchronize after rendering my D3D10 scene:



const int subResourceNumber = 0;

// Copy the finished D3D10 render target into the CPU-readable staging texture.
ResourceRegion resourceRegion = new ResourceRegion { Left = 0, Top = 0, Front = 0, Right = _stagingTexture.Description.Width, Bottom = _stagingTexture.Description.Height, Back = 1 };
_device.CopySubresourceRegion(_renderTexture, subResourceNumber, resourceRegion, _stagingTexture, subResourceNumber, 0, 0, 0);

// Mapping the staging texture for read blocks the CPU until the copy (and therefore all
// rendering into _renderTexture queued before it) has completed; the data itself is ignored.
_stagingTexture.Map(subResourceNumber, MapMode.Read, MapFlags.None);
_stagingTexture.Unmap(subResourceNumber);
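For completeness, _stagingTexture here is just an ordinary CPU-readable staging texture. Its exact description isn't in the post; a sketch of how it might be created, matched to the render target, would be:

var stagingDescription = new SlimDX.Direct3D10.Texture2DDescription
{
    Width = _renderTexture.Description.Width,
    Height = _renderTexture.Description.Height,
    MipLevels = 1,
    ArraySize = 1,
    Format = _renderTexture.Description.Format,
    SampleDescription = new SlimDX.DXGI.SampleDescription(1, 0),
    Usage = SlimDX.Direct3D10.ResourceUsage.Staging,   // CPU-readable, not bindable to the pipeline
    BindFlags = SlimDX.Direct3D10.BindFlags.None,
    CpuAccessFlags = SlimDX.Direct3D10.CpuAccessFlags.Read,
    OptionFlags = SlimDX.Direct3D10.ResourceOptionFlags.None
};
_stagingTexture = new SlimDX.Direct3D10.Texture2D(_device, stagingDescription);

Since the map only exists to make the CPU wait, the staging texture (and the copied region) doesn't have to cover the full render target; copying and mapping even a small region is enough to guarantee that all the D3D10 work queued before the copy has finished.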

This topic is closed to new replies.
