
DirectX 11 can't find all of my graphics adapters


10 replies to this topic

#1 RythBlade   Members   -  Reputation: 152

Posted 03 January 2012 - 06:01 PM

Hi folks,
I'm in a long battle with DirectX 11. Originally my aim was to do deferred shading (the original thread is here), but the code got a little hacky, so, long story short, I started again, trying to keep things clean and tidy.

Now my problem is with the creation of a device and swap chain using D3D11CreateDeviceAndSwapChain(): it won't build me a context at feature level 11.

I'll try and explain everything I've tried and give you a code listing below.

I read somewhere that sometimes, especially on laptops, the primary graphics adapter is a low-powered DX10.1 chip and the more powerful DX11 card has to be selected explicitly. I'm developing on a laptop, so along these lines I enumerated all of the available graphics adapters using an IDXGIFactory and tried to set up my D3D objects at feature level 11 on each of them until one succeeded. It only returns one adapter before returning DXGI_ERROR_NOT_FOUND.
I extracted the adapter's description in the debugger and it says it's an Intel graphics family chip. Inspecting Display Adapters in my device manager, I can see two adapters listed: the Intel® HD Graphics Family (which DX11 seems to be able to find) and my NVIDIA GeForce GT 555M (which DX11 can't seem to find).
I've checked the NVIDIA website and my card is definitely fully DX11 compatible. In fact, I can run all of the DX11 samples in the SDK sample browser! I've also made sure all of my drivers are up to date, and still I can't get it to build my D3D objects at feature level 11...

I've also tried letting DX pick the adapter for me by leaving the corresponding argument NULL and simply specifying a feature level, but it only manages feature level 10.1.

So... any ideas on why it won't let me build my device and swap chain at feature level 11?

Thanks very much in advance!

P.S. Here is all of my set-up code so far. It fills out the swap chain description and creates a DXGI factory. The creation code enumerates the adapters, trying to create everything at feature level 11; if an adapter fails, it moves on to the next. As a last resort, if it really can't do it, it falls back to software emulation, but lets you opt out if you want.
HRESULT result = S_OK;
D3D_DRIVER_TYPE driverType;
// a hardware driver is preferred so this is default.
driverType = D3D_DRIVER_TYPE_HARDWARE;
//driverType = D3D_DRIVER_TYPE_REFERENCE;
IDXGIFactory* factory = NULL;
IDXGIAdapter* adapter = NULL;
// this will be used to step through the devices to find a DX11 compatible one
result = CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory);
if(FAILED(result))
{
  m_crashed = true;
  return result;
}
DXGI_SWAP_CHAIN_DESC swapChainDesc;
D3D_FEATURE_LEVEL featureLevel;
// Initialize the swap chain description.
ZeroMemory(&swapChainDesc, sizeof(swapChainDesc));
// Set to a single back buffer.
swapChainDesc.BufferCount = 1;
// Set the width and height of the back buffer.
swapChainDesc.BufferDesc.Width = screenWidth;
swapChainDesc.BufferDesc.Height = screenHeight;
// Set regular 32-bit surface for the back buffer.
swapChainDesc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;
// TODO:FLAG - These can be altered to make the rendering match the refresh rate of the hardware
swapChainDesc.BufferDesc.RefreshRate.Numerator = 0;
swapChainDesc.BufferDesc.RefreshRate.Denominator = 1;
// Set the usage of the back buffer.
swapChainDesc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
// Set the handle for the window to render to.
swapChainDesc.OutputWindow = m_hWnd;
// Turn multisampling off. (D3D_SAMPLE_DESC_COUNT and D3D_SAMPLE_DESC_QUALITY
// are this project's own constants - presumably 1 and 0 for no MSAA.)
swapChainDesc.SampleDesc.Count = D3D_SAMPLE_DESC_COUNT;
swapChainDesc.SampleDesc.Quality = D3D_SAMPLE_DESC_QUALITY;

swapChainDesc.Windowed = !fullscreen;
// Set the scan line ordering and scaling to unspecified.
swapChainDesc.BufferDesc.ScanlineOrdering = DXGI_MODE_SCANLINE_ORDER_UNSPECIFIED;
swapChainDesc.BufferDesc.Scaling = DXGI_MODE_SCALING_UNSPECIFIED;
// Discard the back buffer contents after presenting.
swapChainDesc.SwapEffect = DXGI_SWAP_EFFECT_DISCARD;
// Don't set the advanced flags.
swapChainDesc.Flags = 0;
D3D_FEATURE_LEVEL targetLevels[] =
{
  D3D_FEATURE_LEVEL_11_0,
  //D3D_FEATURE_LEVEL_10_1,
  //D3D_FEATURE_LEVEL_10_0,
  //D3D_FEATURE_LEVEL_9_3,
  //D3D_FEATURE_LEVEL_9_2,
  //D3D_FEATURE_LEVEL_9_1,
};
UINT levelCount = ARRAYSIZE(targetLevels); // stays in sync with the array above
// initialise to true, as if there aren't any adapters - then we've failed to create the device and swap chain
bool failedToCreateDeviceAndSwapChain = true;

char videoCardDescription[128];
DXGI_ADAPTER_DESC adapterDesc;
size_t stringLength; // wcstombs_s takes a size_t*, not an unsigned int*
// step through the adapters until we find a compatible adapter and successfully create a device and swap chain
for(int i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; i++)
{
  failedToCreateDeviceAndSwapChain = false;
  adapter->GetDesc(&adapterDesc);
  int error = wcstombs_s(&stringLength, videoCardDescription, 128, adapterDesc.Description, 128);
  driverType = D3D_DRIVER_TYPE_UNKNOWN;// as we're specifying an adapter to use, we must specify that the driver type is unknown!!!
  result = D3D11CreateDeviceAndSwapChain(
   adapter,
   driverType,
   NULL,
   0,
   targetLevels,
   levelCount,
   D3D11_SDK_VERSION,
   &swapChainDesc,
   &m_pSwapChain,
   &m_pDevice,
   &featureLevel,
   &m_pDeviceContext);
  // this adapter failed to create the device and swap chain - ensure no handles are left by safely releasing what we were trying to create,
  // set the failure flag and then try again with the next adapter
  if(FAILED(result))
  {
   SafeRelease(m_pSwapChain);
   SafeRelease(m_pDevice);
   SafeRelease(m_pDeviceContext);
   failedToCreateDeviceAndSwapChain = true;
   // try again with the next device
   continue;
  }
  // if we've reached this point then a compatible graphics device must've been found so break from the loop
  break;
}
// after looping through the devices we still couldn't find what we needed, therefore see if the user will use software emulation as a last attempt
if(failedToCreateDeviceAndSwapChain)
{
  int messageAnswer = MessageBox(
   m_hWnd,
   "No compatible graphics hardware was detected. Do you want to use software emulation instead? Note: This will be very slow!",
   "No compatible graphics hardware",
   MB_YESNO | MB_ICONWARNING);
  // the user doesn't want to use software emulation - therefore initialisation has failed. Quit here
  if(messageAnswer == IDNO)
  {
   m_crashed = true;
   return E_FAIL;
  }
  // the user is happy to use software emulation so make a final attempt at creating the device and swap chain at the desired level using the
  // reference device
  driverType = D3D_DRIVER_TYPE_REFERENCE;
  result = D3D11CreateDeviceAndSwapChain(
   NULL,
   driverType,
   NULL,
   0,
   targetLevels,
   levelCount,
   D3D11_SDK_VERSION,
   &swapChainDesc,
   &m_pSwapChain,
   &m_pDevice,
   &featureLevel,
   &m_pDeviceContext);
}
// Ensure all temporary COM objects are released
SafeRelease(factory);
// report the outcome of the final creation attempt rather than assuming success
if(FAILED(result))
{
  m_crashed = true;
}
return result;

I've got a new blog!! I post details of my projects, useful things I find around and about the place and some tutorials on various technologies I'm experimenting with.

#2 iedoc   Members   -  Reputation: 522

Posted 03 January 2012 - 09:53 PM

Go to Start -> Programs -> Microsoft DirectX SDK -> DirectX Utilities -> DirectX Caps Viewer. In the left window, under where it says DXGI 1.1 Devices, click the + sign next to your graphics card. A couple of folders should pop up below your graphics card, one for each version of Direct3D from 9 to 11. Click the + sign next to the Direct3D 11 folder. If you do not see a feature level of 11, then your device does not support it. I'm guessing you'll probably see a feature level of 10.1, since you can create a 10.1 feature level device. This is just to see whether your device actually supports feature level 11 - maybe it does, but just check anyway. Just a suggestion ;)

The folder should say D3D_FEATURE_LEVEL_11_0 if your device supports it; otherwise it might say D3D_FEATURE_LEVEL_10_1 or D3D_FEATURE_LEVEL_10_0. Whatever it says, that's the highest feature level supported by your card.
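
If you'd rather check from code than the caps viewer, D3D11CreateDevice accepts NULL for the device and context out-parameters and will just report the best feature level it can reach. A minimal sketch (illustrative, assuming the DirectX SDK headers and d3d11.lib are set up):

// Minimal sketch: ask the runtime for the highest feature level the default
// hardware adapter supports, without keeping a device around.
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

D3D_FEATURE_LEVEL QueryMaxFeatureLevel()
{
  const D3D_FEATURE_LEVEL levels[] =
  {
    D3D_FEATURE_LEVEL_11_0,
    D3D_FEATURE_LEVEL_10_1,
    D3D_FEATURE_LEVEL_10_0,
    D3D_FEATURE_LEVEL_9_3,
    D3D_FEATURE_LEVEL_9_2,
    D3D_FEATURE_LEVEL_9_1,
  };
  D3D_FEATURE_LEVEL achieved = (D3D_FEATURE_LEVEL)0;
  // NULL device/context out-parameters are allowed; the call reports the
  // highest supported level from the array in 'achieved'.
  D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
                    levels, sizeof(levels) / sizeof(levels[0]),
                    D3D11_SDK_VERSION, NULL, &achieved, NULL);
  return achieved;
}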
Braynzar Soft - DirectX Lessons & Game Programming Resources!

#3 RythBlade   Members   -  Reputation: 152

Posted 04 January 2012 - 03:18 AM

Hi iedoc, thanks very much for the quick response. I've had a look in the caps viewer (64-bit version).
I can see the good old Intel graphics family chip, which confirms that it only goes up to 10.1. My NVIDIA GeForce GT 555M doesn't have anything listed under it: it expands, but only lists a folder called "Outputs" which can't be expanded further. There are no Direct3D folders listed under it at all...

Scanning down the rest of the sections in the viewer, all I can see listed is the Intel Graphics Family chip...

I take it this means that DirectX doesn't fully recognise my graphics card for some reason, or that it's missing some obscure set-up code needed to fully activate it? Any more ideas?

[Edit: Also forgot to mention, I tried the Heaven DX11 Benchmark 2.5 app in what it says is full DX11 mode and all seems to work fine.]

#4 RythBlade   Members   -  Reputation: 152

Posted 04 January 2012 - 04:08 AM

I've managed to make a little progress with this, but it's still not working. I've got it to find my NVIDIA card when enumerating the graphics adapters by altering the following lines in the code I posted earlier:
IDXGIFactory* factory = NULL;
has been changed to:
IDXGIFactory1* factory = NULL;

and
result = CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory);
has been changed to:
result = CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory);
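
For reference, here's that DXGI 1.1 enumeration as a stand-alone sketch (the function name and printing are mine, purely illustrative) - handy for checking which adapters DXGI actually reports:

// Stand-alone sketch: enumerate adapters through the DXGI 1.1 factory and
// print each description, to see which cards DXGI can see.
#include <dxgi.h>
#include <stdio.h>
#pragma comment(lib, "dxgi.lib")

void ListAdapters()
{
  IDXGIFactory1* factory = NULL;
  if(FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
    return;
  IDXGIAdapter1* adapter = NULL;
  for(UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; i++)
  {
    DXGI_ADAPTER_DESC1 desc;
    adapter->GetDesc1(&desc);
    wprintf(L"Adapter %u: %ls\n", i, desc.Description);
    adapter->Release();
  }
  factory->Release();
}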

It still won't create my objects at feature level 11, and in fact, as the caps viewer showed, if I force it to choose the NVIDIA adapter it can't create any feature level at all!
The create device and swap chain call always returns -2005270524 - in hex that's 0x887A0004, DXGI_ERROR_UNSUPPORTED - i.e. the device doesn't support the requested feature level.
The Intel graphics family adapter still only creates devices at feature level 10.1...
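
(One small aside, not from the project: raw HRESULTs are much easier to look up if you print them in hex and search winerror.h or the DXGI headers for the value. An illustrative helper:)

// Illustrative helper: print an HRESULT in hex so it can be looked up in
// winerror.h or the DXGI headers.
#include <windows.h>
#include <stdio.h>

void PrintHResult(HRESULT hr)
{
  // e.g. -2005270524 prints as 0x887A0004 (DXGI_ERROR_UNSUPPORTED)
  printf("HRESULT = 0x%08lX\n", (unsigned long)hr);
}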

#5 iedoc   Members   -  Reputation: 522

Posted 04 January 2012 - 05:17 AM

The way your adapter selection is set up, it looks like you'll be choosing the default adapter, which is your Intel chip, since the runtime probably doesn't even see your NVIDIA card. Since your NVIDIA card is not showing any feature level under it at all, it almost certainly means you need to install the latest drivers. I think you can download them here:

http://www.nvidia.com/object/product-geforce-gt-555m-us.html

After you have installed the drivers, check what feature level your card supports and try to run your app again! Good luck!

#6 RythBlade   Members   -  Reputation: 152

Posted 04 January 2012 - 06:49 AM

OMG I'VE GOT IT!!!

Right then, I'll explain my fix, which actually poses another question. I'm quite certain it wasn't the drivers - the NVIDIA updater said I was up to date - but nevertheless I forced the install of the latest drivers from their site, opened up the caps viewer, and still got the same strange results as last time (see above).

But that started me thinking again. I've been stepping through a DX sample - the car tessellator one - which successfully sets up a DX11 feature level, in a more roundabout fashion than I do.
I had a look through the NVIDIA Control Panel application and stumbled onto the graphics adapter settings. There's a combo box that says "preferred graphics processor". It was set to auto-select between the Intel chip and the GeForce card. I set it to always use the GeForce card and now my set-up works perfectly. The caps viewer also now displays what we were expecting under the graphics card. It's a little weird though: the original empty entry is still there, and the Intel one has been replaced by another GeForce entry that displays everything we were expecting.

For everyone's reference, follow these steps to alter the default display adapter in the NVIDIA Control Panel:
- Open the NVIDIA Control Panel by right-clicking on an empty area of the desktop and selecting it from the drop-down menu
- If it asks, make sure you're using the advanced view
- Expand 3D Settings in the window on the left and click Manage 3D Settings
- In the main window, select the Global Settings tab
- In the combo box marked Preferred Graphics Processor, select your graphics card instead of Auto-Select

So, for my own reference: there's obviously a way to make DX11 choose the NVIDIA card properly, since the samples manage it - any ideas on how this might be achieved?
It's almost as if the max-feature-level selection code is choosing the minimum feature level needed...
I'll investigate further and let you know if I find anything!

Thanks very much for your help with this though - it's been great to get another viewpoint! Let me know if you need any help in the future - I promise I'm not as bad at this as I may seem :P

#7 RythBlade   Members   -  Reputation: 152

Posted 04 January 2012 - 10:33 AM

I've found some more information on this problem. Like I said earlier, some machines - especially laptops - have a second low-power graphics chip to minimise power consumption when high performance isn't needed. NVIDIA's Optimus technology is in charge of determining this for NVIDIA cards, and that is what was interfering with my ability to create a feature level 11 device.
Optimus is apparently triggered whenever DirectX, DXVA or CUDA calls are made. You can also create an application profile via the NVIDIA Control Panel which targets an executable and enables/disables features according to what you set. Most games register with NVIDIA and add themselves to the profile database, which is distributed as part of Optimus. The DirectX SDK has several of these profiles, which is why the samples can see and make use of the GeForce card and why I couldn't.
I'm not sure about Optimus triggering itself when it detects DirectX - if that worked reliably, I wouldn't have had this problem in the first place. It seems a temperamental utility at present.

So anyway - I've added my own profile to the NVIDIA Control Panel to enable the graphics card when my application starts, and reset the global setting back to auto-select (just so I don't forget to switch it back off and then wonder where all my battery is going...), and everything works fine.

I've found a link to the white paper explaining this. Pages 15, 16, 17 are the ones to read on this issue.
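
For what it's worth, newer Optimus drivers also honour a programmatic opt-in: exporting a well-known global from your executable asks the driver to prefer the high-performance GPU, so no hand-made profile is needed. A sketch, assuming a driver recent enough to support it:

// Sketch, assuming a recent-enough Optimus driver: exporting this global from
// the .exe tells the driver to prefer the high-performance (discrete) GPU.
#include <windows.h>

extern "C"
{
  __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
}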

Thanks again for your help with this matter!

#8 iedoc   Members   -  Reputation: 522

Posted 04 January 2012 - 11:18 PM

Thanks for posting all that information on it ;)

#9 RythBlade   Members   -  Reputation: 152

Posted 05 January 2012 - 04:12 AM

No worries! Thought I'd better make sure I document my solution as I know how helpful these boards are!

Update - PIX suffers from the same problem! Make sure you make similar alterations in the NVIDIA Control Panel for PIX as we did above: either add a new profile for the PIX executable or modify the global settings.
Make sure that you have closed PIX while making these changes - or at least restart it after you've made them.
Note that your application will run fine while you capture the experiment, but when you attempt to inspect the rendering and debug pixels afterwards it will constantly fail, as PIX isn't able to create the feature level 11 device the way the application did.

I assume this will be the same for all of the DX utilities, as I can't see them in the profile list in the NVIDIA Control Panel.

#10 black_light   Members   -  Reputation: 100

Posted 11 February 2012 - 03:22 AM

Thanks for this, I was having the same problem following the tutorials at rastertek.com and your solution worked fine!

#11 RythBlade   Members   -  Reputation: 152

Posted 11 February 2012 - 06:30 AM

No worries, that's why I put in all the detail!
Those are some good tutorials! I'd also have a look through the tutorials available on directtutorial.com; they give some nice descriptions of what everything does, an explanation of the parameters, and why they've done things that way. You have to pay for some of them, but I did perfectly fine with the free ones.



