matt77hias

NVIDIA Optimus enablement


NVIDIA Optimus enablement no longer seems to work for me.

I have set "Auto-select" as the preferred graphics processor in the NVIDIA Control Panel.

My code contains the following:

extern "C" {
    // Exported symbol that tells the NVIDIA driver to prefer the dGPU.
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001; // "_declspec" behaves the same
}

But my NVIDIA GPU has no IDXGIOutput attached: EnumOutputs(0, output.GetAddressOf()) finds none.

Any solutions?

17 hours ago, matt77hias said:

But my NVIDIA GPU has no IDXGIOutput attached: EnumOutputs(0, output.GetAddressOf()) finds none.

 

If you're using automatic GPU selection, just use the default adapter (pass a null pointer) rather than specifying one manually.

The NVIDIA GPU in an Optimus system is not connected to any display output hardware (the Intel GPU is), so it makes sense that no outputs are listed.

1 hour ago, Hodgman said:

The NVIDIA GPU in an Optimus system is not connected to any display output hardware (the Intel GPU is), so it makes sense that no outputs are listed.

But the code worked before: the dedicated GPU had an output, thanks to NvOptimusEnablement.

I need that output to iterate its supported display modes.


In DX11 and upwards you don't need Optimus or the AMD equivalent to select the graphics device: you can detect whether an adapter has dedicated memory assigned to it, and if so it is a dGPU. Pass the DXGI adapter you select to the device-creation function and you will always get a dGPU. The following piece of code shows how to achieve what that article describes for D3D11, except that it will accept either an NVIDIA or an AMD chip as the rendering adapter.

IDXGIAdapter* adapter = nullptr;
IDXGIFactory* pFactory = nullptr;
HRESULT hr = CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&pFactory);
if (SUCCEEDED(hr))
{
	for (UINT counter = 0; hr == S_OK; ++counter)
	{
		adapter = nullptr;
		hr = pFactory->EnumAdapters(counter, &adapter);
		if (adapter != nullptr)
		{
			DXGI_ADAPTER_DESC adapterDesc;
			adapter->GetDesc(&adapterDesc);

			// Keep the first non-Intel adapter (0x8086 is Intel's vendor ID).
			if (adapterDesc.VendorId != 0x8086)
			{
				break;
			}

			// Not the adapter we want; release it before moving on.
			adapter->Release();
			adapter = nullptr;
		}
	}

	pFactory->Release();
}

Within Windows, DX11 adapter selection will respect your choice; the only thing the preferred-GPU setting in the NVIDIA Control Panel does is list the NVIDIA GPU first in the adapter list, which works for applications that always choose adapter 0.

 

BTW: VendorId 0x8086 is Intel; you can easily find AMD (0x1002, actually ATI's ID) and NVIDIA (0x10DE) with a quick search.
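For quick reference, those well-known PCI vendor IDs as named constants (the hex values are the standard assigned IDs; the constant names themselves are just suggestions):

```cpp
// Well-known PCI vendor IDs as seen in DXGI_ADAPTER_DESC::VendorId.
constexpr unsigned int kVendorIdIntel  = 0x8086;
constexpr unsigned int kVendorIdAMD    = 0x1002; // historically ATI's ID
constexpr unsigned int kVendorIdNVIDIA = 0x10DE;
```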

1 hour ago, NightCreature83 said:

that always choose adapter 0.

My integrated Intel comes out as adapter 0, the NVIDIA as 1, and the reference adapter as 2.

1 hour ago, NightCreature83 said:

In DX11 and upwards you don't need Optimus or the AMD equivalent to select the graphics device: pass the DXGI adapter you select to the device-creation function and you will always get a dGPU. [...]

This selects the right adapter, but there is still no output associated with it, so the supported display modes cannot be iterated?


As Hodgman mentioned earlier, in an Optimus setup there are no outputs bound to the dGPU; they are all bound to the iGPU. This is also why VR doesn't work on these kinds of cards, and why we are now getting laptops with direct display connections again, like in the past.

Can you list what dxdiag says? If it tells you the NV chip is render-only, it has no outputs to enumerate.

