DX11, multiple video cards, SLI/Crossfire


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

3 replies to this topic

#1 Naruto-kun   Members   -  Reputation: 373


Posted 03 March 2014 - 07:13 AM

Hi guys

 

Can anyone here share an example of how to set up a DX10/11 device to use a specific video card, or to take advantage of what SLI/Crossfire offers?

 

Thanks

JB




#2 kauna   Crossbones+   -  Reputation: 2894


Posted 03 March 2014 - 10:27 AM

- SLI/Crossfire compatibility can be tested easily by renaming your executable to AFR-FriendlyD3D.exe.

- AFR rendering requires certain things to be done with multiple GPUs in mind, such as updating GPU-generated resources (shadow maps, GPU-generated cube maps) as many times as there are GPUs.

- As far as I know, SLI works in windowed modes, but Crossfire only in fullscreen (somebody correct me if I'm wrong).

- There's documentation on the net about best SLI practices.

 

Cheers!



#3 Naruto-kun   Members   -  Reputation: 373


Posted 03 March 2014 - 11:50 AM

Hmm... my D3D11 devices are all offscreen renderers, separate from the main program's D3D device (which is actually D3D9). Is there nothing I can do in code to take advantage of this when I create the device?



#4 MJP   Moderators   -  Reputation: 14281


Posted 03 March 2014 - 04:49 PM

D3D11 has no knowledge of SLI or Crossfire; they're both systems set up by the driver such that they're transparent to D3D. In fact, the only ways to enable SLI/Crossfire are to use vendor-specific APIs, or to have your game placed on a whitelist, either by the user or by the vendor itself.

 

As far as choosing which GPU your device is bound to, this is definitely supported by D3D11. It's all controlled by the first parameter (pAdapter) that you pass to D3D11CreateDevice. Normally you pass NULL as this parameter and then pass D3D_DRIVER_TYPE_HARDWARE, which causes D3D to use the primary adapter (the GPU to which the primary display is connected). If you want to use a different GPU, then you need to get a pointer to the IDXGIAdapter1 interface for that GPU. This is done by creating an IDXGIFactory1 using CreateDXGIFactory1, and then calling EnumAdapters1 to get a pointer to an adapter. Then you pass that adapter to D3D11CreateDevice, along with D3D_DRIVER_TYPE_UNKNOWN.
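The steps above can be sketched roughly like this. This is an untested, Windows-only sketch (it assumes d3d11.h/dxgi.h and linking against d3d11.lib and dxgi.lib); adapterIndex is a placeholder for however you decide which GPU to use, e.g. by inspecting each adapter's DXGI_ADAPTER_DESC1:

```cpp
#include <d3d11.h>
#include <dxgi.h>

// Create a D3D11 device on a specific GPU instead of the primary adapter.
// 'adapterIndex' is a placeholder: enumerate all adapters first and pick one
// by its DXGI_ADAPTER_DESC1 (description string, dedicated video memory, etc.).
HRESULT CreateDeviceOnAdapter(UINT adapterIndex,
                              ID3D11Device** outDevice,
                              ID3D11DeviceContext** outContext)
{
    IDXGIFactory1* factory = nullptr;
    HRESULT hr = CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory);
    if (FAILED(hr))
        return hr;

    IDXGIAdapter1* adapter = nullptr;
    hr = factory->EnumAdapters1(adapterIndex, &adapter);
    if (SUCCEEDED(hr))
    {
        // When an explicit adapter is passed, the driver type must be
        // D3D_DRIVER_TYPE_UNKNOWN, or D3D11CreateDevice fails.
        hr = D3D11CreateDevice(adapter,
                               D3D_DRIVER_TYPE_UNKNOWN,
                               nullptr,            // no software rasterizer
                               0,                  // creation flags
                               nullptr, 0,         // default feature levels
                               D3D11_SDK_VERSION,
                               outDevice,
                               nullptr,            // chosen feature level (unused)
                               outContext);
        adapter->Release();
    }
    factory->Release();
    return hr;
}
```

If you pass NULL as pAdapter together with D3D_DRIVER_TYPE_UNKNOWN, or a non-NULL adapter together with D3D_DRIVER_TYPE_HARDWARE, the call fails, so the two parameters have to be chosen as a pair.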








