Windows OpenGL and choosing from multiple video cards
Members - Reputation: 144
Posted 16 September 2011 - 01:15 PM
I've run into a small issue on a machine with two video cards. The secondary card is used mainly to run web applications alongside the high-end stuff running on the main NVidia card. For some reason, no matter how I initialize the OpenGL window, it uses the low-end video card and its drivers. I assume this is because that card is also OpenGL-capable and was installed before the NVidia card, so its drivers show up first in the registry.
I know this is mainly a user/OS-side driver/configuration issue, but I'd love to create a user-friendly way around it: simply let the user choose the display driver from the ones available if the default doesn't suit their needs.
So, is there a way to force a Windows OpenGL application to use a different video card/driver than the one Windows defaults to?
I didn't manage to find any articles or topics about this, but feel free to point me to one if it has already been discussed elsewhere.
Crossbones+ - Reputation: 4313
Posted 16 September 2011 - 03:44 PM
Normally, it usually (always?) uses the adapter that drives the monitor you have set as primary in the display control panel. As far as I know, there's no reliable way to override that. It's rather annoying, though: you can select one monitor as primary, create an OpenGL context (which will use that monitor's card), then go into the control panel and switch the primary monitor while that context is still running, create another OpenGL context, and end up with one context on each card.
I've actually tested that on my system (Windows 7 64-bit with one AMD and one NVidia card) and it works, but it's a pain and not really a viable solution... though I guess you could change the primary monitor programmatically if you only need it for yourself. Since it's that simple, there's _probably_ a way to hack it without changing the primary monitor too, but I don't know how.
Also, it's definitely not guaranteed to work on other configurations. For example, I don't know what happens if the same driver controls both cards; in my case, the two cards use completely different drivers.
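The programmatic primary-monitor switch mentioned above could be sketched roughly like this, using EnumDisplayDevices and ChangeDisplaySettingsEx. This is an untested, Windows-only sketch with minimal error handling, meant as an illustration of the idea rather than a drop-in solution:

```cpp
#include <windows.h>

// Sketch: make the display device at the given index the primary one,
// so that a subsequently created OpenGL context lands on its adapter.
bool SetPrimaryDisplay(DWORD deviceIndex)
{
    DISPLAY_DEVICE dd = {};
    dd.cb = sizeof(dd);
    if (!EnumDisplayDevices(NULL, deviceIndex, &dd, 0))
        return false;

    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);
    if (!EnumDisplaySettings(dd.DeviceName, ENUM_CURRENT_SETTINGS, &dm))
        return false;

    // The primary display must sit at (0,0); CDS_SET_PRIMARY marks it primary.
    dm.dmPosition.x = 0;
    dm.dmPosition.y = 0;
    dm.dmFields |= DM_POSITION;

    LONG result = ChangeDisplaySettingsEx(
        dd.DeviceName, &dm, NULL,
        CDS_SET_PRIMARY | CDS_UPDATEREGISTRY | CDS_NORESET, NULL);
    if (result != DISP_CHANGE_SUCCESSFUL)
        return false;

    // Apply the queued changes for all displays.
    ChangeDisplaySettingsEx(NULL, NULL, NULL, 0, NULL);
    return true;
}
```

A fully correct version would also reposition the remaining monitors relative to the new (0,0) origin; this sketch glosses over that.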
Members - Reputation: 1434
Posted 17 September 2011 - 01:19 AM
Crossbones+ - Reputation: 4313
Posted 17 September 2011 - 05:47 AM
Aren't you supposed to do this by calling EnumDisplayDevices to get the list of video cards, then using CreateDC to create HDCs on those devices? Then you can use wglChoosePixelFormat (which takes the HDC) to get a rendering context of the right type on the right device?
That doesn't work; it still creates the OpenGL context on the primary adapter on every system I've tried (some drivers won't even allow a GL context to be created on such a DC). I've read that if you have two separate ATI cards in the same machine, it works simply by creating the window on the correct monitor, but I've never had two ATI cards, so I haven't tried it. I have tried with two NVidia cards and with one NVidia + one ATI in almost every way I could think of, and it never worked.
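For reference, the EnumDisplayDevices + CreateDC approach described above would look roughly like this (using the plain GDI ChoosePixelFormat for brevity instead of the wglChoosePixelFormat extension). As noted, on most driver combinations the resulting context still ends up on the primary adapter, so treat this as an illustration of the attempt, not a working fix:

```cpp
#include <windows.h>

// Sketch: try to create a GL-capable DC on a specific display adapter.
// Many drivers ignore the device and use the primary adapter anyway.
HDC CreateDCForAdapter(DWORD deviceIndex)
{
    DISPLAY_DEVICE dd = {};
    dd.cb = sizeof(dd);
    if (!EnumDisplayDevices(NULL, deviceIndex, &dd, 0))
        return NULL;

    // Create a device context directly on that display device.
    HDC hdc = CreateDC(NULL, dd.DeviceName, NULL, NULL);
    if (!hdc)
        return NULL;

    PIXELFORMATDESCRIPTOR pfd = {};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;

    int format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0 || !SetPixelFormat(hdc, format, &pfd)) {
        DeleteDC(hdc);
        return NULL;  // some drivers refuse GL formats on such a DC
    }
    return hdc;  // the caller would then call wglCreateContext(hdc)
}
```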
Members - Reputation: 805
Posted 17 September 2011 - 07:23 AM
If both cards are from the same vendor (AMD or NVidia), then it should work. They have some "special" coding in their drivers to handle multi-card situations.
If they are different, then I think the second card just runs the default Microsoft driver (opengl32.dll). Am I correct?
Members - Reputation: 102
Posted 09 May 2012 - 11:37 AM
In the context of switchable graphics, I run into the problem that opengl32.dll uses the wrong device when an accelerated window is created with CreateWindow. Although the display adapter (in my problematic case, a Mobile Intel 4 Express Chipset) appears to be available with OpenGL 2.1 support (I've checked this with OpenGL Extension Viewer 4.0), the GL_RENDERER used for my application is Microsoft's generic GDI renderer.
First idea, of course: a driver problem. On closer inspection, though, it looks like the system simply chooses the wrong candidate out of the two (Intel, GDI). So an appropriate fix would be to choose the display device instead of using the default one. Well, as this thread shows, that is not so easy...
I stumbled across this thread that describes a (hacky) way to choose the device manually. Unfortunately, it did not work for me, and I'm not convinced the solution would be robust even if it did, but it's the best I've found so far.
Another post on stackoverflow sums up ideas also expressed in this thread.
It seems there is a stable solution to this issue: the OpenGL Extension Viewer 4.0 I mentioned above is able to switch between display adapters, show each one's OpenGL capabilities, and(!) run rendering tests. I wonder how this is possible!
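For the switchable-graphics case specifically, there is a later vendor-documented hint worth knowing: NVidia Optimus and AMD PowerXpress drivers look for specific exported global symbols in the executable and, if present, route the application to the high-performance GPU. This only helps on laptops whose drivers check for these symbols, and it is not a general multi-adapter selector, but it is far more robust than the hacks above where it applies. The exports must live in the .exe itself, not in a DLL:

```cpp
// Ask NVidia Optimus / AMD PowerXpress drivers to use the
// high-performance GPU for this application.
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
```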