NVIDIA Optimus enablement

Started by matt77hias; 17 comments, last by NightCreature83 6 years, 7 months ago

NVIDIA Optimus enablement seems not to work anymore.

I have set "Auto-select" as the preferred graphics processor in the NVIDIA Control Panel.

My code contains the following:


#include <windows.h> // for DWORD

extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001; // "_declspec" works the same
}

But my NVIDIA GPU has no IDXGIOutput attached (checked via EnumOutputs(0, output.GetAddressOf())).

Any solutions?

🧙


Maybe try this: http://onemanmmo.com/index.php?cmd=newsitem&comment=news.1.211.0

Didn't try it myself, but could be worth a shot.

12 hours ago, DeathBuffer said:

Maybe try this: http://onemanmmo.com/index.php?cmd=newsitem&comment=news.1.211.0

Didn't try it myself, but could be worth a shot.

Due to the includes, this is not GPU-independent?

🧙

It's Nvidia only, but so is Optimus.
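For reference, AMD's switchable-graphics stack honours a similar exported symbol; a minimal sketch, assuming the export name from AMD's PowerXpress documentation:


extern "C" {
    // AMD equivalent of NvOptimusEnablement (assumed name per AMD's
    // switchable-graphics documentation): request the high-performance GPU.
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}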

17 hours ago, matt77hias said:

But my NVIDIA GPU has no IDXGIOutput attached (checked via EnumOutputs(0, output.GetAddressOf())).

 

If using automatic GPU selection, you should just use the default adapter (a null pointer) rather than specifying one manually; see the sketch below.

The NVIDIA GPU in an Optimus system is not connected to any display output hardware (the Intel GPU is), so it makes sense that there would be no outputs listed.
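A minimal sketch of that (not this thread's code; assumes d3d11.h and a linked d3d11.lib):


#include <d3d11.h>

// Pass nullptr as the adapter: the driver stack (Optimus) then picks the
// GPU, honouring the exported NvOptimusEnablement hint.
ID3D11Device*        device  = nullptr;
ID3D11DeviceContext* context = nullptr;
HRESULT hr = D3D11CreateDevice(
    nullptr,                  // default adapter
    D3D_DRIVER_TYPE_HARDWARE, // hardware device
    nullptr,                  // no software rasterizer module
    0,                        // no creation flags
    nullptr, 0,               // default feature levels
    D3D11_SDK_VERSION,
    &device, nullptr, &context);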

1 hour ago, Hodgman said:

The NVIDIA GPU in an Optimus system is not connected to any display output hardware (the Intel GPU is), so it makes sense that there would be no outputs listed.

But the code worked before: the dedicated GPU had an output due to the NvOptimusEnablement.

I need this output to iterate over its supported display modes.

🧙

In DX11 and upwards you don't need Optimus or the AMD equivalent to select the graphics device: you can detect whether an adapter has dedicated memory assigned to it, and if so, it's a dGPU. When you select your DXGI adapter, pass it to the device construction function and you will always get a dGPU. The following piece of code shows how to achieve what that article describes for D3D11, with the exception that it will accept either an NVIDIA or an AMD chip as the rendering adapter.


#include <dxgi.h>

IDXGIAdapter* adapter = nullptr;
IDXGIFactory* pFactory = nullptr;
HRESULT hr = CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&pFactory);
if (hr == S_OK)
{
	for (UINT counter = 0; hr == S_OK; ++counter)
	{
		adapter = nullptr;
		hr = pFactory->EnumAdapters(counter, &adapter);
		if (adapter != nullptr)
		{
			DXGI_ADAPTER_DESC adapterDesc;
			adapter->GetDesc(&adapterDesc);

			// Take the first non-Intel adapter (0x8086 is Intel's vendor ID).
			if (adapterDesc.VendorId != 0x8086)
			{
				break;
			}

			// Skipped (Intel) adapter: release it so the reference doesn't leak.
			adapter->Release();
		}
	}

	pFactory->Release();
}

Within Windows, the DX11 adapter selection will respect your choice. The only thing the preferred-GPU setting in the NV control panel does is list the NV GPU first in the adapter list, which works for applications that always choose adapter 0.

 

BTW: VendorId 0x8086 is Intel; you can find AMD (0x1002, which is actually ATI) and NV (0x10DE) on Google easily.
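To pass the selected adapter to the device construction function, the call looks roughly like this (a sketch, not this thread's code; note that with an explicit adapter, D3D11CreateDevice requires D3D_DRIVER_TYPE_UNKNOWN):


#include <d3d11.h>
#include <dxgi.h>

// 'adapter' is the IDXGIAdapter* picked by the enumeration loop above.
ID3D11Device*        device  = nullptr;
ID3D11DeviceContext* context = nullptr;
HRESULT hr = D3D11CreateDevice(
    adapter,                 // explicit adapter...
    D3D_DRIVER_TYPE_UNKNOWN, // ...so the driver type must be UNKNOWN
    nullptr, 0,
    nullptr, 0,
    D3D11_SDK_VERSION,
    &device, nullptr, &context);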

Worked on titles: CMR:DiRT2, DiRT 3, DiRT: Showdown, GRID 2, theHunter, theHunter: Primal, Mad Max, Watch Dogs: Legion

1 hour ago, NightCreature83 said:

that always choose adapter 0.

My integrated Intel comes out as 0, Nvidia as 1, and the reference adapter as 2.

🧙

1 hour ago, NightCreature83 said:

In DX11 and upwards you don't need Optimus or the AMD equivalent to select the graphics device: you can detect whether an adapter has dedicated memory assigned to it, and if so, it's a dGPU. […]

This selects the right adapter. But still no output is associated with it, so its supported display modes cannot be iterated?

🧙

As Hodgman mentioned earlier, in an Optimus setup there are no outputs bound to the dGPU; they are all bound to the iGPU. This is also why VR doesn't work on these kinds of cards, and why we are now getting laptops with direct connections again, like we had in the past.
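If you need the display modes anyway, one option (a sketch under that assumption, not this thread's code) is to enumerate them from an output of the adapter that actually owns the display, i.e. the iGPU's IDXGIOutput, via GetDisplayModeList:


#include <dxgi.h>
#include <vector>

// Enumerate display modes from an output of the adapter that owns the
// display (on Optimus, typically adapter 0 = the Intel iGPU).
std::vector<DXGI_MODE_DESC> EnumerateModes(IDXGIOutput* output)
{
    UINT count = 0;
    // First call: query the number of matching modes.
    output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, nullptr);

    std::vector<DXGI_MODE_DESC> modes(count);
    // Second call: fill the mode descriptions.
    output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, modes.data());
    return modes;
}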

Can you list what dxdiag says? If it tells you the NV chip is render-only, it has no outputs to enumerate.

Worked on titles: CMR:DiRT2, DiRT 3, DiRT: Showdown, GRID 2, theHunter, theHunter: Primal, Mad Max, Watch Dogs: Legion

This topic is closed to new replies.
