

Sharing a GPU across multiple Virtual Machines


3 replies to this topic

#1 Tispe   Members   -  Reputation: 1038


Posted 12 December 2012 - 07:07 AM

There is a lot of talk about what the future of gaming will look like. Some people predict that in the future a central server packed with GPUs will run virtual machines that thin clients connect to, i.e. Remote Desktop. There is technology today that explores this concept: Microsoft has its RemoteFX, and Nvidia is launching its VGX technology soon.

I want to test this technology by installing Windows Server and running Virtual Machines in Hyper-V with RemoteFX on my PC. RemoteFX supports up to 12 clients per GPU. In a Quad-SLI configuration this will enable up to 48 users to play video games on a single server.

Does anyone have any experience with this technology? Specifically, I want to know if I can start a game in the virtual machine, have the user disconnect, and still have the game running in the background. I have read reports of Direct3D applications crashing after the client disconnects; can this be prevented?


#2 Nik02   Crossbones+   -  Reputation: 2869


Posted 12 December 2012 - 08:05 AM

The crashes are usually due to incorrect handling of "lost device" (I assume you know what this means in an ordinary D3D application). The processes themselves can continue to run even if the user disconnects. There can potentially be additional complications regarding the user's various handles (including window handles and mutexes), so this type of scenario may not be as easy as one would initially think.
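
For reference, a minimal sketch of the standard D3D9 lost-device pattern a render loop would run each frame. The names g_pDevice, g_d3dpp, OnDeviceLost and OnDeviceReset are placeholders for the application's own device, present parameters and resource release/re-create routines; they are not taken from this thread.

    #include <windows.h>
    #include <d3d9.h>

    // Placeholders for the application's own objects and routines.
    extern IDirect3DDevice9*     g_pDevice;
    extern D3DPRESENT_PARAMETERS g_d3dpp;
    void OnDeviceLost();    // release D3DPOOL_DEFAULT resources
    void OnDeviceReset();   // re-create D3DPOOL_DEFAULT resources

    // Returns true when it is safe to render this frame.
    bool DeviceReadyToRender()
    {
        HRESULT hr = g_pDevice->TestCooperativeLevel();

        if (hr == D3DERR_DEVICELOST)
        {
            // Lost and not yet resettable (e.g. the session is still detached);
            // keep the process alive and try again later instead of bailing out.
            Sleep(100);
            return false;
        }

        if (hr == D3DERR_DEVICENOTRESET)
        {
            OnDeviceLost();                            // free D3DPOOL_DEFAULT resources
            if (SUCCEEDED(g_pDevice->Reset(&g_d3dpp)))
                OnDeviceReset();                       // re-create them
            return false;                              // render on the next frame
        }

        return hr == D3D_OK;
    }

The point for the disconnect scenario is that the loop keeps spinning and retrying rather than assuming the device comes back immediately.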

Niko Suni


#3 Tispe   Members   -  Reputation: 1038


Posted 13 December 2012 - 02:49 AM

I am handling "lost device". Do I perhaps need a timeout routine that calls SetFocus on my window again?
If the user disconnects his RDP session, will Windows Server do anything to the running Direct3D device other than just take away focus?

So if the user starts the program over RDP, the window handle is tied to the connection, and if the connection terminates the handle to the window will be corrupted?

In Task Manager there is a Users tab. I guess that when an RDP connection comes in, a user gets listed there as logged in, but if the user closes the RDP window, the user is still shown on the Users tab, right? So why does it crash then?

#4 Nik02   Crossbones+   -  Reputation: 2869


Posted 13 December 2012 - 07:55 AM

Are you using D3D9? It was not designed for multi-desktop sharing at all, and there may be internal bugs that cause the crash. I believe the problem is that D3D over RDP uses a global mutex (which can be signaled and waited on across the network) for synchronization, and if you drop the RDP connection, the mutex can inadvertently be discarded or corrupted by the server.

D3D11 alleviates this problem a bit by moving window and memory handling into DXGI and using fewer global locks. The global kernel resource sharing described above can still be a source of a lot of instability, though, and yet it is very much necessary for the operation of RDP.
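
For comparison, a sketch of how a D3D11 application typically detects the same situation: instead of TestCooperativeLevel, it checks the HRESULT returned by IDXGISwapChain::Present. Here g_pDevice, g_pSwapChain and RecreateDeviceAndResources are placeholder names, not anything defined in this thread.

    #include <windows.h>
    #include <d3d11.h>

    // Placeholders for the application's own objects and recovery routine.
    extern ID3D11Device*   g_pDevice;
    extern IDXGISwapChain* g_pSwapChain;
    void RecreateDeviceAndResources();   // tear down and rebuild device-dependent state

    void PresentFrame()
    {
        HRESULT hr = g_pSwapChain->Present(1, 0);   // vsync on, no flags

        if (hr == DXGI_ERROR_DEVICE_REMOVED || hr == DXGI_ERROR_DEVICE_RESET)
        {
            // GetDeviceRemovedReason reports the driver's explanation
            // (hang, reset, internal error, ...); worth logging before recovering.
            HRESULT reason = g_pDevice->GetDeviceRemovedReason();
            (void)reason;

            RecreateDeviceAndResources();
        }
    }

Whether the RDP-related global locks actually drive that path on disconnect would still need to be verified on the RemoteFX setup itself.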

You are not on an easy path, by any means.

Niko Suni




