Volgogradetzzz

DX11 [D3D11] Enumerate output fail on notebook with 2 GPUs


Recommended Posts

Hello. I have a notebook with an integrated Intel HD card and a dedicated NVIDIA card. When I enumerate adapters I get two adapters. If I choose the Intel card and enumerate its outputs, everything works fine, but if I select the NVIDIA card then EnumOutputs fails. I set NVIDIA as the main card in the control panel, and I set the power mode to performance in the power options. However, when I look at the monitor properties in the control panel, I see that the notebook always uses the Intel card, and there's no way to change it. I tried running several games and they all use the NVIDIA card (although not modern games, without DX11), so it seems that using the good card is possible.
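For reference, here is a minimal sketch of the enumeration described above. It is Windows-only (DXGI), and on an Optimus notebook the discrete NVIDIA adapter is expected to report zero outputs, which is the symptom in question:

```cpp
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT a = 0; factory->EnumAdapters(a, &adapter) != DXGI_ERROR_NOT_FOUND; ++a)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        wprintf(L"Adapter %u: %s\n", a, desc.Description);

        // Count the outputs attached to this adapter.
        IDXGIOutput* output = nullptr;
        UINT o = 0;
        for (; adapter->EnumOutputs(o, &output) != DXGI_ERROR_NOT_FOUND; ++o)
            output->Release();
        wprintf(L"  outputs: %u\n", o);  // 0 on the NVIDIA adapter reproduces the problem

        adapter->Release();
    }
    factory->Release();
    return 0;
}
```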


The way this kind of setup works is that the NVIDIA card is used for most GPU operations, but the final present goes through the Intel GPU.  The NVIDIA adapter therefore has no valid outputs of its own.

 

In order to force it to be used you need to export a global variable from your program; the following code will do it for you (just put it near the top of one of your source files):

 

extern "C" {
    // Exported global read by the NVIDIA driver at startup; DWORD requires <windows.h>.
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}

 

You can also access the NVIDIA control panel and create a profile for your program, selecting to use the NVIDIA card when it runs.

 

Be aware that for certain classes of program - low rendering overhead, low vertex/poly count, not much blending, mainly CPU-bound - the Intel card may actually be faster.  If your program falls into this class I'd encourage you to benchmark with both and select the one that works best.


Thank you. I also found this document: http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf. Sadly, none of the methods work for me.

 

Update, to clarify: I can select the dedicated GPU from the list, but that GPU always has zero output devices. And I don't know how games use it; maybe there's some workaround in the driver for games (for example, NVIDIA Experience sees some of the installed games on my machine, but not all of them).

Edited by Volgogradetzzz


As I said above, 0 output devices is expected because the NVIDIA GPU isn't actually used for output.  Just enumerate modes for the default adapter and use one of those; with the documented procedures, the NVIDIA card should kick in automatically.  So the NVIDIA is used for most GPU work, the Intel is used for output, and there's nothing else special you need to do.
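A minimal sketch of that approach, assuming adapter 0 is the one that owns the display (the Intel GPU on an Optimus notebook): enumerate the default adapter's first output and list its modes with GetDisplayModeList, then create the swap chain from one of them.

```cpp
#include <dxgi.h>
#include <vector>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    // Adapter 0 owns the display on an Optimus system; it is the one with outputs.
    IDXGIAdapter* adapter = nullptr;
    if (FAILED(factory->EnumAdapters(0, &adapter)))
        return 1;

    IDXGIOutput* output = nullptr;
    if (FAILED(adapter->EnumOutputs(0, &output)))
        return 1;

    // First call gets the count, second call fills the list.
    UINT count = 0;
    output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, nullptr);
    std::vector<DXGI_MODE_DESC> modes(count);
    output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, modes.data());

    for (const DXGI_MODE_DESC& m : modes)
        printf("%ux%u @ %u/%u Hz\n", m.Width, m.Height,
               m.RefreshRate.Numerator, m.RefreshRate.Denominator);

    output->Release();
    adapter->Release();
    factory->Release();
    return 0;
}
```

With the NvOptimusEnablement export (or a driver profile) in place, rendering on a device created from this default adapter is still routed to the NVIDIA GPU.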

 

If it helps, this is kind of similar to the old days of the 3DFX Voodoo, where you'd have a main card handling 2D and display, but the Voodoo was an add-on card that did the actual acceleration.

