RythBlade

DX11
DirectX 11 can't find all of my graphics adapters

10 posts in this topic

Hi folks,
I'm in a long battle with DirectX 11. Originally my aim was to do deferred shading (the original thread is [url="http://www.gamedev.net/topic/617867-incorrect-depth-test-result/page__fromsearch__1"]here[/url]), but the code got a little hacky, so to cut a long story short, I started again, trying to keep things clean and tidy.

Now my problem is with the creation of a device and swap chain using D3D11CreateDeviceAndSwapChain(). It won't build me a context at feature level 11.

I'll try and explain everything I've tried and give you a code listing below.

I read somewhere that sometimes, especially on laptops, the primary graphics adapter is a low-powered DX10.1 chip and the more powerful DX11 card has to be selected explicitly. I'm developing on a laptop, so along these lines I enumerated through all of the available graphics adapters using an IDXGIFactory and tried to set up my D3D objects at feature level 11 on each of them until I succeeded. It only returns 1 adapter before returning DXGI_ERROR_NOT_FOUND.
I extracted the adapter's description in the debugger and it says it's an Intel graphics family chip. On inspection of my device manager under Display Adapters I can see 2 adapters listed - this Intel(R) HD Graphics Family (which DX11 seems able to find) and my NVIDIA GeForce GT 555M (which DX11 can't seem to find).
I've checked the NVIDIA website and my card is definitely fully DX11 compatible. In fact I can run all of the DX11 samples in the SDK sample browser! I've also made sure all of my drivers are up to date, and still I can't get it to build my D3D objects at feature level 11...

I've also tried letting DirectX pick the adapter for me by passing NULL for the corresponding argument and simply specifying a feature level, but it only manages feature level 10.1.
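Roughly what I mean by that is the following (a sketch rather than my exact code - I'm using plain D3D11CreateDevice for brevity, and the variable names here are made up):

```cpp
#include <d3d11.h>

// Sketch: let the runtime pick the default adapter and fall back through
// feature levels. Passing NULL for the adapter with D3D_DRIVER_TYPE_HARDWARE
// means the default (primary) adapter is used.
D3D_FEATURE_LEVEL levels[] =
{
    D3D_FEATURE_LEVEL_11_0,
    D3D_FEATURE_LEVEL_10_1,
    D3D_FEATURE_LEVEL_10_0,
};
D3D_FEATURE_LEVEL achieved;
ID3D11Device* device = NULL;
ID3D11DeviceContext* context = NULL;
HRESULT hr = D3D11CreateDevice(
    NULL,                        // default adapter
    D3D_DRIVER_TYPE_HARDWARE,
    NULL, 0,
    levels, _countof(levels),
    D3D11_SDK_VERSION,
    &device, &achieved, &context);
// 'achieved' reports the highest level the runtime managed - 10.1 in my case.
```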

So... any ideas on why it won't let me build my device and swap chain at feature level 11?

Thanks very much in advance!

P.S. Here is all of my set-up code so far. It just fills out the swap chain description and creates a DXGI factory. The creation code enumerates through the adapters, trying to create everything at feature level 11; if it fails, it moves on to the next. As a last ditch, if it really can't do it, it'll fall back to software emulation, but lets you opt out if you want.
[CODE]
HRESULT result = S_OK;

// A hardware driver is preferred, so this is the default.
D3D_DRIVER_TYPE driverType = D3D_DRIVER_TYPE_HARDWARE;
//driverType = D3D_DRIVER_TYPE_REFERENCE;

// This factory will be used to step through the adapters to find a DX11-compatible one.
IDXGIFactory* factory = NULL;
IDXGIAdapter* adapter = NULL;
result = CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory);
if(FAILED(result))
{
    m_crashed = true;
    return result;
}

DXGI_SWAP_CHAIN_DESC swapChainDesc;
D3D_FEATURE_LEVEL featureLevel;

// Initialize the swap chain description.
ZeroMemory(&swapChainDesc, sizeof(swapChainDesc));

// Set to a single back buffer.
swapChainDesc.BufferCount = 1;

// Set the width and height of the back buffer.
swapChainDesc.BufferDesc.Width = screenWidth;
swapChainDesc.BufferDesc.Height = screenHeight;

// Set a regular 32-bit surface for the back buffer.
swapChainDesc.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;

// TODO:FLAG - These can be altered to make the rendering match the refresh rate of the hardware.
swapChainDesc.BufferDesc.RefreshRate.Numerator = 0;
swapChainDesc.BufferDesc.RefreshRate.Denominator = 1;

// Set the usage of the back buffer.
swapChainDesc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;

// Set the handle for the window to render to.
swapChainDesc.OutputWindow = m_hWnd;

// Turn multisampling off.
swapChainDesc.SampleDesc.Count = D3D_SAMPLE_DESC_COUNT;
swapChainDesc.SampleDesc.Quality = D3D_SAMPLE_DESC_QUALITY;

swapChainDesc.Windowed = !fullscreen;

// Set the scan line ordering and scaling to unspecified.
swapChainDesc.BufferDesc.ScanlineOrdering = DXGI_MODE_SCANLINE_ORDER_UNSPECIFIED;
swapChainDesc.BufferDesc.Scaling = DXGI_MODE_SCALING_UNSPECIFIED;

// Discard the back buffer contents after presenting.
swapChainDesc.SwapEffect = DXGI_SWAP_EFFECT_DISCARD;

// Don't set the advanced flags.
swapChainDesc.Flags = 0;

D3D_FEATURE_LEVEL targetLevels[] =
{
    D3D_FEATURE_LEVEL_11_0,
    //D3D_FEATURE_LEVEL_10_1,
    //D3D_FEATURE_LEVEL_10_0,
    //D3D_FEATURE_LEVEL_9_3,
    //D3D_FEATURE_LEVEL_9_2,
    //D3D_FEATURE_LEVEL_9_1,
};
int levelCount = _countof(targetLevels);

// Initialise to true: if there aren't any adapters at all, then we've failed
// to create the device and swap chain.
bool failedToCreateDeviceAndSwapChain = true;

char videoCardDescription[128];
DXGI_ADAPTER_DESC adapterDesc;
size_t stringLength; // wcstombs_s writes the converted length through a size_t*

// Step through the adapters until we find a compatible one and successfully
// create a device and swap chain.
for(int i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; i++)
{
    failedToCreateDeviceAndSwapChain = false;
    adapter->GetDesc(&adapterDesc);
    wcstombs_s(&stringLength, videoCardDescription, 128, adapterDesc.Description, _TRUNCATE);

    // As we're specifying an adapter to use, we must set the driver type to unknown.
    driverType = D3D_DRIVER_TYPE_UNKNOWN;
    result = D3D11CreateDeviceAndSwapChain(
        adapter,
        driverType,
        NULL,
        0,
        targetLevels,
        levelCount,
        D3D11_SDK_VERSION,
        &swapChainDesc,
        &m_pSwapChain,
        &m_pDevice,
        &featureLevel,
        &m_pDeviceContext);

    // EnumAdapters hands us a reference, so release the adapter once we're done with it;
    // a successfully created device holds its own reference.
    SafeRelease(adapter);

    // This adapter failed to create the device and swap chain - ensure no handles are
    // left by safely releasing what we were trying to create, set the failure flag
    // and try again with the next adapter.
    if(FAILED(result))
    {
        SafeRelease(m_pSwapChain);
        SafeRelease(m_pDevice);
        SafeRelease(m_pDeviceContext);
        failedToCreateDeviceAndSwapChain = true;
        continue;
    }

    // If we've reached this point then a compatible adapter must've been found,
    // so break from the loop.
    break;
}

// After looping through the adapters we still couldn't find what we needed,
// so see if the user will accept software emulation as a last attempt.
if(failedToCreateDeviceAndSwapChain)
{
    int messageAnswer = MessageBox(
        m_hWnd,
        "No compatible graphics hardware was detected. Do you want to use software emulation instead? Note: This will be very slow!",
        "No compatible graphics hardware",
        MB_YESNO | MB_ICONWARNING);

    // The user doesn't want to use software emulation - initialisation has failed, so quit here.
    if(messageAnswer == IDNO)
    {
        m_crashed = true;
        SafeRelease(factory);
        return E_FAIL;
    }

    // The user is happy to use software emulation, so make a final attempt at creating
    // the device and swap chain at the desired level using the reference device.
    driverType = D3D_DRIVER_TYPE_REFERENCE;
    result = D3D11CreateDeviceAndSwapChain(
        NULL,
        driverType,
        NULL,
        0,
        targetLevels,
        levelCount,
        D3D11_SDK_VERSION,
        &swapChainDesc,
        &m_pSwapChain,
        &m_pDevice,
        &featureLevel,
        &m_pDeviceContext);

    // If even the reference device failed, report the failure rather than returning S_OK.
    if(FAILED(result))
    {
        m_crashed = true;
        SafeRelease(factory);
        return result;
    }
}

// Ensure all temporary COM objects are released.
SafeRelease(factory);
return S_OK;
[/CODE]
Go to Start -> Programs -> Microsoft DirectX SDK -> DirectX Utilities -> DirectX Caps Viewer. In the left window, under where it says DXGI 1.1 Devices, click the + sign next to your graphics card. A couple of folders should pop up below your graphics card, one for each version of Direct3D from about 9 to 11. Click the + sign next to the Direct3D 11 folder. If you do not see a feature level of 11, then your device does not support it. I'm guessing you'll probably see a feature level of 10.1, since you can create a 10.1 feature level device. This is just to check whether your device actually supports feature level 11 - maybe it does, but check anyway. Just a suggestion ;)

The folder should say D3D_FEATURE_LEVEL_11_0 if your device supports it; otherwise it might say D3D_FEATURE_LEVEL_10_1 or D3D_FEATURE_LEVEL_10_0. Whatever it says, that's the highest feature level supported by your card.
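You can also check this programmatically: D3D11CreateDevice accepts NULL for the returned device and context pointers and will just report the highest feature level the adapter supports. A sketch, assuming you already have an IDXGIAdapter* called adapter from EnumAdapters:

```cpp
#include <d3d11.h>

// Query the maximum feature level an adapter supports without keeping a device.
// Passing NULL for ppDevice and ppImmediateContext is allowed; the call just
// fills in the achieved feature level.
D3D_FEATURE_LEVEL levels[] =
{
    D3D_FEATURE_LEVEL_11_0,
    D3D_FEATURE_LEVEL_10_1,
    D3D_FEATURE_LEVEL_10_0,
    D3D_FEATURE_LEVEL_9_3,
};
D3D_FEATURE_LEVEL maxSupported;
HRESULT hr = D3D11CreateDevice(
    adapter,                   // the IDXGIAdapter* you're checking
    D3D_DRIVER_TYPE_UNKNOWN,   // must be UNKNOWN when an explicit adapter is given
    NULL, 0,
    levels, _countof(levels),
    D3D11_SDK_VERSION,
    NULL,                      // no device needed
    &maxSupported,
    NULL);                     // no context needed
if(SUCCEEDED(hr))
{
    // maxSupported now holds the highest level in the array the adapter can do.
}
```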
Hi iedoc, thanks very much for the quick response. I've had a look in the caps viewer (64-bit version).
I can see the good old Intel graphics family chip, which confirms that it only goes up to 10.1. My NVIDIA GeForce GT 555M doesn't have anything listed under it. It expands, but only lists a folder called "Outputs" which can't be expanded further. There are no Direct3D folders listed under it at all...

Scanning down the rest of the sections in the viewer, all I can see listed is the Intel graphics family chip...

I take it this means that DirectX doesn't fully recognise my graphics card for some reason, or it's missing some obscure set-up code needed to fully activate it? Any more ideas?

[Edit: Also forgot to mention, I tried the Heaven DX11 Benchmark 2.5 app in what it says is full DX11 mode and all seems to work fine.]
I've made a little progress with this, but it's still not working. It now finds my NVIDIA card when enumerating through the graphics adapters, after altering the following lines in the code I posted earlier:
[CODE]
IDXGIFactory* factory = NULL;
[/CODE]
has been changed to:
[CODE]
IDXGIFactory1* factory = NULL;
[/CODE]

and
[CODE]
result = CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory);
[/CODE]
has been changed to:
[CODE]
result = CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory);
[/CODE]

It still won't create my objects at feature level 11, and actually, as the caps viewer displayed, if I force it to choose the NVIDIA adapter it can't create any feature level at all!
The create device and swap chain function always returns -2005270524, which is 0x887A0004, i.e. DXGI_ERROR_UNSUPPORTED - the device doesn't support the requested feature level.
The Intel graphics family adapter still only creates devices at feature level 10.1...
The way your adapter selection was set up, it looks like you were choosing the default adapter, which is your Intel chip, since it probably doesn't even see your NVIDIA card. Since your NVIDIA card is not showing any feature level under it at all, it almost certainly means you need to install the latest drivers. I think you can download them here:

[url="http://www.nvidia.com/object/product-geforce-gt-555m-us.html"]http://www.nvidia.com/object/product-geforce-gt-555m-us.html[/url]

After you have installed the drivers, check what feature level is supported by your card and try to run your app again! Good luck!
OMG I'VE GOT IT!!!

Right then, I'll explain my fix, which actually poses another question. I'm quite certain it wasn't the drivers - the NVIDIA updater said I was up to date - but nevertheless I forced the install of the latest drivers from their site, opened up caps viewer, and still got the same strange results as last time (see above).

But that started me thinking again. I've been stepping through a DX sample - that car tessellation one - which successfully sets up a DX11 feature level, in a more roundabout fashion than I do.
I had a look through the NVIDIA Control Panel application and stumbled onto the graphics adapter settings. There's a combo box that says "preferred graphics processor". It was set to auto-select between the Intel chip and the GeForce card. I set it to always use the GeForce card, and now my set-up works perfectly. Caps viewer also now displays what we were expecting under the graphics card. It's a little weird though: the original empty entry is still there, and the Intel one has been replaced by another GeForce folder that displays everything we were expecting.

For everyone's reference, follow these steps to alter the default display adapter in the NVIDIA Control Panel:
-Open the NVIDIA Control Panel by right-clicking on an empty area of the desktop and selecting it from the drop-down menu
-If it asks, make sure you're using the advanced view
-Expand 3D Settings in the window on the left and click Manage 3D Settings
-In the main window, select the Global Settings tab
-In the combo box marked Preferred Graphics Processor, expand the box and select your graphics card instead of Auto-Select

So, for my own reference: as there's obviously a way to make DX11 choose the NVIDIA card properly (the samples manage it), any ideas on how that might be achieved?
It's almost as if the feature level selection code is choosing the minimum feature level needed...
I'll investigate further and let you know if I find anything!

Thanks very much for your help with this though - it's been great to get another viewpoint! Let me know if you need any help in the future - I promise I'm not as bad at this as I may seem :P
I've found some more information on this problem. Like I said earlier, some machines - especially laptops - have a second low-power graphics chip to minimise power consumption when high performance isn't needed. NVIDIA's Optimus technology is in charge of deciding which chip to use for NVIDIA cards, and that is what is interfering with my ability to create feature level 11.
Optimus is apparently triggered whenever DirectX, DXVA or CUDA calls are made. You can also create an application profile via the NVIDIA Control Panel which targets an executable and enables/disables its features according to what you set. Most games register with NVIDIA and add themselves to their profile database, which is distributed as part of Optimus. The DirectX SDK has several of these profiles, which is why the samples can see and make use of the GeForce card and why I can't.
I'm not sure about Optimus triggering itself when it detects DirectX... as then I wouldn't have had this problem in the first place. It seems a temperamental utility at present.

So anyway - I've added my own profile to the NVIDIA Control Panel to enable the graphics card when my application starts, and reset the global settings back to auto-select the graphics adapter (just so I don't forget to switch it back off and then wonder where all my battery is going...) and everything works fine.

I've found a link to the [url="http://www.nvidia.com/object/LO_optimus_whitepapers.html"]white paper[/url] explaining this. Pages 15, 16, 17 are the ones to read on this issue.
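Worth noting for anyone finding this thread later: NVIDIA's Optimus rendering policies documentation also describes a programmatic way to request the discrete GPU, so you don't need a control panel profile at all - export a global variable called NvOptimusEnablement from your executable. A sketch (this must live in the .exe itself, not in a DLL):

```cpp
// Exporting this symbol from the executable tells the Optimus driver to
// prefer the high-performance (discrete NVIDIA) GPU for this application.
// The documented type is DWORD; unsigned long is used here to keep the
// snippet free of windows.h.
extern "C"
{
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
}
```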

Thanks again for your help with this matter!
No worries! Thought I'd better make sure I document my solution, as I know how helpful these boards are!

Update - PIX suffers from the same problem! Make sure you make similar alterations in the NVIDIA Control Panel for PIX as we did above: either add a new profile for the PIX executable or modify the global settings.
Make sure that you have closed PIX while making these changes - or at least restart it after you've made them.
Note that your application will run fine during the experiment, but when you attempt to inspect the rendering and debug the pixels afterwards, it will constantly fail, as PIX isn't able to create the feature level 11 devices the way the application did.

I assume this will be the same for all of the DX utilities, as I can't see them in the profile list in the NVIDIA Control Panel.
No worries, that's why I put in all the detail :)
Those are some good tutorials! I'd also have a look through the tutorials available on [url="http://www.directxtutorial.com/Tutorial11/tutorials.aspx"]directxtutorial.com[/url]; they give some nice descriptions of what everything does, an explanation of the parameters, and why they've done it. You have to pay for some of them, but I did perfectly fine with the free ones.