DX11, multiple video cards, SLI/Crossfire

2 comments, last by MJP 10 years, 1 month ago

Hi guys

Can anyone here share an example of how to set up a DX10/11 device to use a specific video card, or to take advantage of what SLI/Crossfire offers?

Thanks

JB


- SLI/Crossfire compatibility can be tested easily by renaming your executable to AFR-FriendlyD3D.exe

- AFR rendering requires certain things to be done with multiple GPUs in mind, such as updating GPU-generated resources (shadow maps, GPU-generated cubemaps) as many times as there are GPUs.

- As far as I know, SLI works in windowed modes, but Crossfire only in fullscreen (somebody correct me if I'm wrong)

- There's documentation on the net about best SLI practices.
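To illustrate the second point, here's a minimal sketch of the bookkeeping AFR forces on you. All names are hypothetical, and the GPU count is hard-coded; in a real engine you'd query it through a vendor API such as NVAPI or AGS:

```cpp
#include <cstdio>

// Hypothetical sketch: under AFR each GPU holds its own copy of render-target
// resources, so a texture that is rendered once and then reused (e.g. a static
// shadow map) must be re-rendered on as many consecutive frames as there are
// GPUs, so that every GPU's copy gets refreshed.
constexpr int kGpuCount = 2;   // assumption: queried via a vendor API in practice
int g_shadowFramesLeft = 0;    // frames the shadow map still needs re-rendering

void RenderShadowMap() { std::printf("shadow map pass\n"); } // stub
void RenderScene()     { std::printf("scene pass\n"); }      // stub

void OnShadowCastersChanged() { g_shadowFramesLeft = kGpuCount; }

void RenderFrame()
{
    if (g_shadowFramesLeft > 0) {
        RenderShadowMap();     // repeat the update once per GPU in the AFR group
        --g_shadowFramesLeft;
    }
    RenderScene();
}
```

If you only rendered the shadow map on one frame, only one GPU's copy would be up to date and the others would present stale data on their frames.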

Cheers!

Urm... my D3D11 devices are all offscreen renderers, separate from the main program's device, which is actually D3D9. Is there nothing I can do in code to take advantage of this when I create the device?

D3D11 has no knowledge of SLI or Crossfire; they're both systems set up by the driver such that they're transparent to D3D. In fact, the only ways to enable SLI/Crossfire are to use vendor-specific APIs, or to have your game placed on a whitelist, either by the user or by the vendor itself.

As far as choosing which GPU your device is bound to, this is definitely supported by D3D11. It's all controlled by the first parameter (pAdapter) that you pass to D3D11CreateDevice. Normally you pass NULL for this parameter along with D3D_DRIVER_TYPE_HARDWARE, which causes D3D to use the primary adapter (the GPU to which the primary display is connected). If you want to use a different GPU, you need to get a pointer to the IDXGIAdapter1 interface for that GPU. This is done by creating an IDXGIFactory1 using CreateDXGIFactory1, and then calling EnumAdapters1 to get a pointer to an adapter. Then you pass that adapter to D3D11CreateDevice, along with D3D_DRIVER_TYPE_UNKNOWN.
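The steps above can be sketched like this (Windows-only; error handling abbreviated, and the adapter index is just an example parameter, not anything from the original post):

```cpp
#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

// Create a D3D11 device on the adapter at the given DXGI enumeration index.
HRESULT CreateDeviceOnAdapter(UINT adapterIndex,
                              ComPtr<ID3D11Device>& device,
                              ComPtr<ID3D11DeviceContext>& context)
{
    ComPtr<IDXGIFactory1> factory;
    HRESULT hr = CreateDXGIFactory1(IID_PPV_ARGS(&factory));
    if (FAILED(hr)) return hr;

    ComPtr<IDXGIAdapter1> adapter;
    hr = factory->EnumAdapters1(adapterIndex, &adapter);
    if (FAILED(hr)) return hr;  // DXGI_ERROR_NOT_FOUND if the index is out of range

    DXGI_ADAPTER_DESC1 desc;
    adapter->GetDesc1(&desc);   // desc.Description names the GPU, if you want to log it

    // With an explicit adapter the driver type must be UNKNOWN, not HARDWARE,
    // or D3D11CreateDevice fails with E_INVALIDARG.
    return D3D11CreateDevice(adapter.Get(),
                             D3D_DRIVER_TYPE_UNKNOWN,
                             nullptr, 0,
                             nullptr, 0,            // default feature levels
                             D3D11_SDK_VERSION,
                             &device, nullptr, &context);
}
```

Pass 0 for adapterIndex to get the primary adapter, 1 for the second GPU, and so on; you can loop EnumAdapters1 until it returns DXGI_ERROR_NOT_FOUND to list everything present.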

This topic is closed to new replies.
