Sharing a GPU across multiple Virtual Machines

This topic is 1860 days old, which is more than the 365-day threshold we allow for new replies. Please post a new topic.

If you intended to correct an error in the post then please contact us.

Recommended Posts

There is talk about what the future of gaming will look like. Some people predict that in the future a central server packed with GPUs will run virtual machines that thin clients connect to, i.e. Remote Desktop. Technology that explores this concept exists today: Microsoft has its RemoteFX, and Nvidia is launching its VGX technology soon.

I want to test this technology by installing Windows Server and running virtual machines in Hyper-V with RemoteFX on my PC. RemoteFX supports up to 12 clients per GPU, so a quad-SLI configuration would enable up to 48 users to play video games on a single server.

Does anyone have experience with this technology? Specifically, I want to know whether I can start a game in a virtual machine, have the user disconnect, and still have the game running in the background. I have read reports that Direct3D applications crash after the client disconnects; can this be prevented?

The crashes are usually due to incorrect handling of the "lost device" state (I assume you know what this means in an ordinary D3D application). The processes themselves can continue to run even after the user disconnects. There can also be additional complications regarding the user's various handles (including window handles and mutexes), so this type of scenario may not be as easy as one would initially think.

I am handling "lost device"; do I perhaps need a timeout routine that calls SetFocus on my window again?
If the user disconnects their RDP session, will Windows Server do anything to the running Direct3D device other than take away focus?

So if the user starts the program over RDP, the handle is tied to the connection, and if the connection terminates, the handle to the window will be corrupted?

The Task Manager has a Users tab; I guess when an RDP connection comes in, a user gets listed there as logged in. But if the user closes the window, the user is still present on the Users tab, right? So why does it crash?

Are you using D3D9? It was not designed for multi-desktop sharing at all, and there may be internal bugs that cause the crash. I believe the problem is that D3D over RDP uses a global mutex (one that can be signaled and waited on across sessions) for synchronization, and if you disconnect the RDP connection, the mutex can inadvertently be discarded or corrupted by the server.

D3D11 alleviates this problem a bit by separating window and memory handling into DXGI and using fewer global locks. The global kernel resource sharing described above can still be a source of a lot of instability, though, and yet it is very much necessary for the operation of RDP.

You are not on an easy path, by far.


