NVIDIA Optimus enablement


NVIDIA Optimus enablement seems not to work anymore.

I have set "Auto-select" as the preferred graphics processor in the NVIDIA Control Panel.

My code contains the following:

extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001; // the behavior is the same with "_declspec"
}

But my NVIDIA GPU has no IDXGIOutput attached; EnumOutputs(0, output.GetAddressOf()) does not return one.
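
For reference, the failing check looks roughly like this (a sketch; the HasOutput helper name is illustrative, and the use of Microsoft::WRL::ComPtr matches the GetAddressOf() call above):

#include <dxgi.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Ask the (already selected) adapter for its first output.
// On an Optimus system this fails for the dedicated GPU,
// because no display is wired to it.
bool HasOutput(IDXGIAdapter* adapter)
{
    ComPtr<IDXGIOutput> output;
    const HRESULT result = adapter->EnumOutputs(0u, output.GetAddressOf());
    return SUCCEEDED(result); // DXGI_ERROR_NOT_FOUND when no output is attached
}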

Any solutions?

17 hours ago, matt77hias said:

But my NVIDIA GPU has no IDXGIOutput attached; EnumOutputs(0, output.GetAddressOf()) does not return one.

 

If you're using automatic GPU selection, you should just use the default adapter (pass a null pointer) rather than specifying one manually.

The NVIDIA GPU in an Optimus system is not connected to any display output hardware (the Intel GPU is), so it makes sense that there would be no outputs listed.
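
A minimal sketch of that approach, assuming D3D11 (the creation parameters shown are just illustrative defaults): create the device with a null adapter and let the runtime, steered by the NvOptimusEnablement export, pick the GPU.

#include <d3d11.h>

// Let D3D pick the adapter (honoring the Optimus / hybrid selection)
// instead of enumerating and passing an explicit IDXGIAdapter.
ID3D11Device*        device  = nullptr;
ID3D11DeviceContext* context = nullptr;
const HRESULT result = D3D11CreateDevice(
    nullptr,                  // default adapter
    D3D_DRIVER_TYPE_HARDWARE, // hardware driver
    nullptr,                  // no software rasterizer module
    0u,                       // creation flags
    nullptr, 0u,              // default feature levels
    D3D11_SDK_VERSION,
    &device, nullptr, &context);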

1 hour ago, Hodgman said:

The NVIDIA GPU in an Optimus system is not connected to any display output hardware (the Intel GPU is), so it makes sense that there would be no outputs listed.

But the code worked before: the dedicated GPU had an output because of the NvOptimusEnablement export.

I need this output to iterate over its supported display modes.
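
For context, this is roughly the enumeration in question (a sketch; the GetDisplayModes helper and the chosen format are illustrative), and it only works when the adapter actually exposes an output:

#include <dxgi.h>
#include <vector>

// Enumerate the display modes supported by the adapter's first output.
// Returns an empty list when no IDXGIOutput is attached (the Optimus dGPU case).
std::vector<DXGI_MODE_DESC> GetDisplayModes(IDXGIAdapter* adapter)
{
    std::vector<DXGI_MODE_DESC> modes;

    IDXGIOutput* output = nullptr;
    if (FAILED(adapter->EnumOutputs(0u, &output)))
    {
        return modes; // no output attached to this adapter
    }

    UINT count = 0u;
    output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0u, &count, nullptr);
    modes.resize(count);
    output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0u, &count, modes.data());

    output->Release();
    return modes;
}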


In D3D11 and upwards you don't need Optimus or the AMD equivalent to select the graphics device; you can detect whether an adapter has dedicated memory assigned to it, and if so, it is a dGPU. When you select your DXGI adapter, pass it to the device creation function and you will always get a dGPU. The following piece of code shows how to achieve what that article describes for D3D11, except that it will accept either an NVIDIA or an AMD chip as the rendering adapter.

IDXGIAdapter* adapter = nullptr;
IDXGIFactory* pFactory = nullptr;
HRESULT hr = CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)(&pFactory));
if (hr == S_OK)
{
	for (UINT counter = 0; hr == S_OK; ++counter)
	{
		adapter = nullptr;
		hr = pFactory->EnumAdapters(counter, &adapter);
		if (adapter != nullptr)
		{
			DXGI_ADAPTER_DESC adapterDesc;
			adapter->GetDesc(&adapterDesc);

			// Keep the first non-Intel adapter (i.e. the discrete GPU).
			if (adapterDesc.VendorId != 0x8086)
			{
				break;
			}

			// Not the one we want: release it before enumerating the next one.
			adapter->Release();
			adapter = nullptr;
		}
	}

	pFactory->Release();
}

Within Windows the D3D11 adapter selection will respect your choice; the only thing the preferred-GPU setting in the NV control panel does is list the NV GPU first in the adapter list, which works for applications that always choose adapter 0.

 

BTW: VendorId 0x8086 is Intel; you can easily find AMD (0x1002, which is actually ATI) and NVIDIA (0x10DE) on Google.
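
As a variation on the snippet above (a sketch, not from the original post; the constant names and the memory-based heuristic are illustrative), you can combine the vendor check with the dedicated-memory test mentioned earlier:

#include <dxgi.h>

// Common PCI vendor IDs, listed for reference.
constexpr UINT kVendorIntel  = 0x8086;
constexpr UINT kVendorAMD    = 0x1002; // historically ATI
constexpr UINT kVendorNVIDIA = 0x10DE;

// Skip the integrated Intel GPU and prefer the adapter with the most
// dedicated video memory, which on a hybrid laptop is normally the dGPU.
IDXGIAdapter* SelectDiscreteAdapter(IDXGIFactory* factory)
{
    IDXGIAdapter* best       = nullptr;
    SIZE_T        bestMemory = 0u;

    IDXGIAdapter* candidate = nullptr;
    for (UINT i = 0u; factory->EnumAdapters(i, &candidate) == S_OK; ++i)
    {
        DXGI_ADAPTER_DESC desc;
        candidate->GetDesc(&desc);

        if (desc.VendorId != kVendorIntel && desc.DedicatedVideoMemory > bestMemory)
        {
            if (best != nullptr) { best->Release(); }
            best       = candidate;
            bestMemory = desc.DedicatedVideoMemory;
        }
        else
        {
            candidate->Release();
        }
    }
    return best; // caller releases
}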

1 hour ago, NightCreature83 said:

In D3D11 and upwards you don't need Optimus or the AMD equivalent to select the graphics device; you can detect whether an adapter has dedicated memory assigned to it, and if so, it is a dGPU.

This selects the right adapter. But still no output is associated with it, so the supported display modes cannot be iterated?


As Hodgman mentioned earlier, in an Optimus setup there are no outputs bound to the dGPU; they are all bound to the iGPU. This is also why VR doesn't work on these kinds of cards, and why we are now getting laptops, like we had in the past, that have direct connections again.

Can you post what dxdiag says? If it tells you the NV chip is render-only, it has no outputs to enumerate.

11 hours ago, NightCreature83 said:

Can you post what dxdiag says? If it tells you the NV chip is render-only, it has no outputs to enumerate.

What do you mean by this?

Furthermore, it is also possible to have an integrated and a dedicated GPU in a desktop where the dedicated GPU is not connected to any monitor (it is just used as a GPGPU chip, with no output). In this case, you want the integrated GPU.
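
For that desktop case, a possible fallback (a sketch; the SelectAdapterWithOutput helper is illustrative) is to prefer an adapter that actually has an output attached:

#include <dxgi.h>

// Return the first adapter that has at least one output attached,
// e.g. the integrated GPU when the dedicated GPU is a headless GPGPU chip.
IDXGIAdapter* SelectAdapterWithOutput(IDXGIFactory* factory)
{
    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0u; factory->EnumAdapters(i, &adapter) == S_OK; ++i)
    {
        IDXGIOutput* output = nullptr;
        if (SUCCEEDED(adapter->EnumOutputs(0u, &output)))
        {
            output->Release();
            return adapter; // caller releases
        }
        adapter->Release();
    }
    return nullptr;
}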


He means you should just post your DxDiag.txt from your machine. I'd be curious to see whether we've detected your machine as hybrid or not.

These days, Optimus doesn't really exist; it's a Windows-implemented hybrid graphics solution. The only piece that's still left up to the individual GPU vendor is determining which applications should be affected by hybrid graphics and use the discrete GPU, versus which ones shouldn't. And when I say "these days," I mean since Windows 8.1.

When your machine is classified as hybrid, and your app qualifies for the discrete GPU, the OS will fake the output connections to make the system appear as you describe: with outputs connected to the discrete GPU, making it enumerate first. We'll also do the things necessary to optimize presenting in such a configuration, and enable fullscreen mode even with that kind of indirect connection.
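
To see what the OS actually presents to an application, here is a small diagnostic sketch (illustrative, not part of the original post) that lists every adapter and how many outputs it reports:

#include <dxgi.h>
#include <cstdio>

// Print every DXGI adapter together with its reported output count.
void DumpAdapters()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
    {
        return;
    }

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0u; factory->EnumAdapters(i, &adapter) == S_OK; ++i)
    {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);

        UINT outputCount = 0u;
        IDXGIOutput* output = nullptr;
        while (adapter->EnumOutputs(outputCount, &output) == S_OK)
        {
            output->Release();
            ++outputCount;
        }

        wprintf(L"Adapter %u: %ls, %u output(s)\n", i, desc.Description, outputCount);
        adapter->Release();
    }
    factory->Release();
}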

2 minutes ago, SoldierOfLight said:

Are you sure you're exporting that symbol out of your exe (and not, e.g., a renderer DLL)?

The symbol is exported in a .cpp file of a static .lib which is used by my .exe.

Since you mentioned this, I started by moving the export to a .cpp of the .exe itself (the NVIDIA GPU is selected and has an output). But it also works when I move the export to another .cpp of my static .lib. Based on these observations and the fact that it worked once, I looked at the changes to the .cpp file for which it no longer works. Apparently, I had moved all other declarations out of this file. So basically:

#include "rendering\rendering.hpp"
extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}
extern "C" {
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

is all that is left in the file. "rendering.obj" is still generated by the compiler. So could the linker be the culprit?
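
If the linker is indeed discarding the .lib object because nothing in the .exe references it, one commonly suggested workaround (a sketch, assuming MSVC; the symbol decoration differs between x86 and x64) is to force the symbols to be pulled in from a translation unit that is compiled into the .exe itself:

// In a .cpp that is part of the .exe (MSVC-specific linker directives).
// On x64 the C symbols are undecorated; on x86 they get a leading underscore.
#ifdef _WIN64
#pragma comment(linker, "/include:NvOptimusEnablement")
#pragma comment(linker, "/include:AmdPowerXpressRequestHighPerformance")
#else
#pragma comment(linker, "/include:_NvOptimusEnablement")
#pragma comment(linker, "/include:_AmdPowerXpressRequestHighPerformance")
#endif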

 

Anyway, thanks to everyone who looked into this.

Quote

Card name: Intel(R) HD Graphics 5500
Manufacturer: Intel Corporation
Chip type: Intel(R) HD Graphics Family
DAC type: Internal
Device Type: Full Device  
        
Card name: NVIDIA GeForce GTX 960M
Manufacturer: NVIDIA
Chip type: GeForce GTX 960M
DAC type: Integrated RAMDAC
Device Type: Render-Only Device

You can see here that the NV chip is not connected to any real outputs, and as mentioned earlier the OS fakes all of this stuff for you. I ran dxdiag on a desktop PC and the "Device Type" lines don't show up in that log, which to me means all devices in there are full devices. Besides, you don't really want to transport your framebuffer over PCI-E to the display each time.

If the GPU is used for GPGPU you don't need it to output anything, that is true, but it's also most likely not doing rendering work at that moment in time.

 

My NV GTX 970M says the same thing, Render-Only Device; this means that most things work but stuff like VR is impossible. It can render everything, but the latency of the VR kit itself is so bad you can't use it.

2 hours ago, NightCreature83 said:

My NV GTX 970M says the same thing, Render-Only Device; this means that most things work but stuff like VR is impossible. It can render everything, but the latency of the VR kit itself is so bad you can't use it.

How do GTX 1060M etc. "VR-Ready" notebooks handle VR in the presence of an integrated and dedicated GPU?


They actually have dedicated output paths to HDMI, so in the case of the hybrids they all run the external outputs over the iGPU, which adds a frame of delay to the frame being rendered. Also, there is no 1060M; NVIDIA did away with the mobile lineup and just puts desktop-grade chips in laptops now (Maxwell 2 already paved the way for that, which is why a 970M is so powerful and is actually comparable to a 960 desktop chip).

For video and display purposes this doesn't matter that much, but for VR it does, because the tracking needs to match the frame rendering.

This is why it's important, if you have a laptop and you want to buy a VR kit, to check that it is VR Ready.
