

D3D11CreateDevice returns wrong feature level



#1 sunnysideup   Members   -  Reputation: 105


Posted 27 July 2013 - 04:15 AM

I have a GTX 570 graphics card which SHOULD support DirectX 11, but the highest feature level that D3D11CreateDevice selects is 10_1. I had a look around on these forums and it seems that other people have had this problem and that it might be Nvidia Optimus selecting the Intel integrated graphics chipset instead of the dedicated Nvidia GPU. So I added my .exe to the list of programs in the Nvidia control panel and then used the following code to iterate through the available adapters to hopefully select the correct adapter.

 

Here's the code:

	/* Create the dxgi factory. */
	Microsoft::WRL::ComPtr<IDXGIFactory1> dxgi_factory;
	HRESULT hr = CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&dxgi_factory);

	if (FAILED(hr)) {
		throw std::runtime_error("Failed to create dxgi factory.");
	}

	if (FAILED(dxgi_factory.As(&m_dxgi_factory))) {
		throw std::runtime_error("Failed to obtain IDXGIFactory2 interface.");
	}

	D3D_FEATURE_LEVEL feature_levels[] = 
	{
		D3D_FEATURE_LEVEL_11_1,
		D3D_FEATURE_LEVEL_11_0,
		D3D_FEATURE_LEVEL_10_1,
		D3D_FEATURE_LEVEL_10_0,
		D3D_FEATURE_LEVEL_9_3
	};

	int num_feature_levels = ARRAYSIZE(feature_levels);

	Microsoft::WRL::ComPtr<ID3D11Device> device;
	Microsoft::WRL::ComPtr<ID3D11DeviceContext> context;
	
	/* Iterate through each feature level. */
	for (int f = 0; f < num_feature_levels; f++) {

		/* Iterate through available adapters. */
		while (m_dxgi_factory->EnumAdapters1(f, &m_dxgi_adapter) != DXGI_ERROR_NOT_FOUND) { 

			hr = D3D11CreateDevice(m_dxgi_adapter.Get(), D3D_DRIVER_TYPE_UNKNOWN, 0, 
				create_device_flags, feature_levels, num_feature_levels, D3D11_SDK_VERSION, 
				&device, &m_feature_level, &context);

			/* If success break out of loop. */
			if(SUCCEEDED(hr))
				break;
		}

		if(SUCCEEDED(hr))
			break;
	}

This still doesn't work. Someone on these forums also suggested looking in the DirectX Caps Viewer, and under DXGI 1.1 > NVIDIA GeForce GTX 570 > Direct3D 11 it only lists D3D_FEATURE_LEVEL_10_1. It's like my GPU isn't being recognized as a DirectX 11 capable device.

 

Does anyone have any suggestions? Thanks.




#2 ajmiles   Members   -  Reputation: 194


Posted 27 July 2013 - 06:37 AM

The GTX 570 should support DX11, you're right about that, but the code you've written above doesn't do what you think it does.

 

You're passing 'f' (the feature level index) into EnumAdapters1, so you're not actually iterating over the available adapters in that loop. In fact, if EnumAdapters1 returned a valid adapter but D3D11CreateDevice then failed, you'd be stuck in an infinite loop.

 

Try this:

D3D_FEATURE_LEVEL feature_levels[] = 
{
	D3D_FEATURE_LEVEL_11_1,
	D3D_FEATURE_LEVEL_11_0
};

Microsoft::WRL::ComPtr<ID3D11Device> device;
Microsoft::WRL::ComPtr<ID3D11DeviceContext> context;

UINT i = 0;

/* Iterate through available adapters. */
while (m_dxgi_factory->EnumAdapters1(i++, &m_dxgi_adapter) != DXGI_ERROR_NOT_FOUND) { 

	hr = D3D11CreateDevice(m_dxgi_adapter.Get(), D3D_DRIVER_TYPE_UNKNOWN, 0, 
		create_device_flags, feature_levels, ARRAYSIZE(feature_levels), D3D11_SDK_VERSION, 
		&device, &m_feature_level, &context);

	/* If success break out of loop. */
	if(SUCCEEDED(hr))
		break;
}

if (device != nullptr) {
	// Hurray
} else {
	// Boo
}

Secondly, have you tried disabling your integrated GPU in the BIOS and then trying to create an 11.0 feature level device? I have definitely heard of issues where conflicting WDDM driver versions across the two adapters cause both to fall back to the lowest common denominator. What is your integrated GPU?



#3 sunnysideup   Members   -  Reputation: 105


Posted 27 July 2013 - 06:59 AM

Thank you ajmiles. Fixing my code solved the problem, as long as I only specify D3D_FEATURE_LEVEL_11_1 and D3D_FEATURE_LEVEL_11_0 in the feature level array. But if I add D3D_FEATURE_LEVEL_10_1 to the array, it still selects that feature level. I guess if I actually started using some Direct3D 11-only features then it would work?



#4 ajmiles   Members   -  Reputation: 194


Posted 27 July 2013 - 07:10 AM

What I expect was happening is that your Intel GPU is adapter 0 and your Nvidia GPU is adapter 1. Your original code did the following:

 

"Try to make me a device of level 11.1, 11.0, 10.1, 10.0 or 9.3 on adapter 0".

 

It succeeds at level 10.1 and breaks out of both loops. Removing 10.1/10.0/9.3 from the feature level list ensures that device creation fails on the Intel GPU, so it goes on to try adapter 1 (the Nvidia GPU), which succeeds at 11.0. If you add 10.1 back in, device creation succeeds on the Intel GPU again and a device is never attempted on the Nvidia GPU.



#5 osmanb   Crossbones+   -  Reputation: 1608


Posted 27 July 2013 - 07:51 AM

Yes. Alternatively, you could try to create a device on ALL adapters and pick the one that yields the highest feature level, releasing the others...
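That approach boils down to a max-over-adapters search. Since D3D_FEATURE_LEVEL values are plain integers that grow with capability (10_1 is 0xA100, 11_0 is 0xB000, and so on), picking the winner is a simple comparison once each adapter has been probed. A minimal sketch of just the selection step, with hypothetical per-adapter results standing in for what D3D11CreateDevice would actually report:

```cpp
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// Values match the real D3D_FEATURE_LEVEL enum in d3dcommon.h.
enum FeatureLevel : uint32_t {
	FL_9_3  = 0x9300,
	FL_10_0 = 0xA000,
	FL_10_1 = 0xA100,
	FL_11_0 = 0xB000,
	FL_11_1 = 0xB100,
};

struct AdapterResult {
	size_t       adapter_index;  // index that was passed to EnumAdapters1
	FeatureLevel level;          // level the created device reported
};

// Return the index (into `results`) of the adapter whose device
// achieved the highest feature level.
size_t pick_best_adapter(const std::vector<AdapterResult>& results) {
	size_t best = 0;
	for (size_t i = 1; i < results.size(); ++i) {
		if (results[i].level > results[best].level)
			best = i;
	}
	return best;
}
```

In the real version, each AdapterResult would come from a D3D11CreateDevice call on that adapter, and the losing devices are released (which ComPtr does automatically when it goes out of scope).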



#6 kubera   Members   -  Reputation: 946


Posted 27 July 2013 - 08:23 AM

Please consider reading this thread:

http://xboxforums.create.msdn.com/forums/p/99471/593344.aspx#593344



#7 sunnysideup   Members   -  Reputation: 105


Posted 27 July 2013 - 09:16 AM

Now I get the following error whenever I try to ALT+ENTER.

DXGI ERROR: IDXGISwapChain::GetContainingOutput: The swapchain's adapter does not control the output on which the swapchain's window resides.


#8 ajmiles   Members   -  Reputation: 194


Posted 27 July 2013 - 02:35 PM

Sounds like you're trying to go full-screen on a monitor that isn't plugged into the GPU you created the device and swap chain on; in this case, I expect the monitor the window is located on is plugged into your Intel GPU. I believe the default behaviour of ALT+ENTER is to go full-screen on whichever monitor the majority of the window is displayed on.
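The "majority of the window" rule is just an intersection-area comparison between the window rectangle and each monitor's desktop rectangle. A small sketch of that geometry (the monitor layout and window position here are made-up examples, not DXGI calls):

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <vector>

struct Rect { int left, top, right, bottom; };

// Area of overlap between two rectangles; 0 if they don't intersect.
long long overlap_area(const Rect& a, const Rect& b) {
	long long w = std::min(a.right, b.right) - std::max(a.left, b.left);
	long long h = std::min(a.bottom, b.bottom) - std::max(a.top, b.top);
	return (w > 0 && h > 0) ? w * h : 0;
}

// Index of the monitor containing the majority of the window --
// the one a full-screen transition would target.
size_t majority_monitor(const Rect& window, const std::vector<Rect>& monitors) {
	size_t best = 0;
	long long best_area = -1;
	for (size_t i = 0; i < monitors.size(); ++i) {
		long long a = overlap_area(window, monitors[i]);
		if (a > best_area) { best_area = a; best = i; }
	}
	return best;
}
```

If that monitor is driven by the Intel adapter while the swap chain was created on the Nvidia one, GetContainingOutput fails with exactly the error above; creating the device on the adapter that drives the target monitor (or moving the window there before ALT+ENTER) avoids it.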





