Can anyone here share an example of how to set up a DX10/11 device to use a specific video card, or to take advantage of what SLI/Crossfire offers?
Posted 03 March 2014 - 10:27 AM
- SLI/Crossfire compatibility can be tested easily by renaming your executable to AFR-FriendlyD3D.exe
- AFR rendering requires certain things to be done with multiple GPUs in mind, such as updating GPU-generated resources (e.g. shadow maps, GPU-generated cubemaps) as many times as there are GPUs.
- As far as I know, SLI works in windowed modes, but Crossfire only in fullscreen (somebody correct me if I'm wrong)
- There's documentation on the net about best SLI practices.
Posted 03 March 2014 - 04:49 PM
D3D11 has no knowledge of SLI or Crossfire; both are systems that are set up by the driver so that they're transparent to D3D. In fact, the only way to enable SLI/Crossfire is to use vendor-specific APIs, or to have your game placed on a whitelist either by the user or by the vendor itself.
As for choosing which GPU your device is bound to, this is definitely supported by D3D11. It's all controlled by the first parameter (pAdapter) that you pass to D3D11CreateDevice. Normally you pass NULL for this parameter along with D3D_DRIVER_TYPE_HARDWARE, which causes D3D to use the primary adapter (the GPU to which the primary display is connected). If you want to use a different GPU, then you need to get a pointer to the IDXGIAdapter1 interface for that GPU. This is done by creating an IDXGIFactory1 using CreateDXGIFactory1 and then calling EnumAdapters1 to get a pointer to an adapter. You then pass that adapter to D3D11CreateDevice, along with D3D_DRIVER_TYPE_UNKNOWN.
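The steps above can be sketched roughly as follows. This is a minimal, Windows-only sketch (it won't compile elsewhere); the function name CreateDeviceOnAdapter and the choice of adapter index are my own for illustration, and error handling is kept to the bare minimum:

```cpp
// Sketch: create a D3D11 device on a specific GPU by enumerating DXGI adapters.
// Link against d3d11.lib and dxgi.lib.
#include <d3d11.h>
#include <dxgi.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

HRESULT CreateDeviceOnAdapter(UINT adapterIndex,
                              ID3D11Device** outDevice,
                              ID3D11DeviceContext** outContext)
{
    // Create a DXGI factory so we can enumerate the adapters (GPUs).
    ComPtr<IDXGIFactory1> factory;
    HRESULT hr = CreateDXGIFactory1(IID_PPV_ARGS(&factory));
    if (FAILED(hr)) return hr;

    // Grab the adapter at the requested index.
    ComPtr<IDXGIAdapter1> adapter;
    hr = factory->EnumAdapters1(adapterIndex, &adapter);
    if (FAILED(hr)) return hr; // DXGI_ERROR_NOT_FOUND if the index is out of range

    DXGI_ADAPTER_DESC1 desc = {};
    adapter->GetDesc1(&desc); // desc.Description holds the GPU's name if you want to log it

    // When an explicit adapter is supplied, the driver type must be
    // D3D_DRIVER_TYPE_UNKNOWN, per the D3D11CreateDevice contract.
    return D3D11CreateDevice(adapter.Get(), D3D_DRIVER_TYPE_UNKNOWN,
                             nullptr, 0,            // no software rasterizer, no flags
                             nullptr, 0,            // default feature levels
                             D3D11_SDK_VERSION,
                             outDevice, nullptr, outContext);
}
```

To pick a particular GPU rather than a hard-coded index, you would typically loop over EnumAdapters1 until it returns DXGI_ERROR_NOT_FOUND, inspecting each DXGI_ADAPTER_DESC1 to find the one you want.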