I need at least an OpenGL 2.1 context, and preferably higher. I am interfacing directly with the Win32 API.
I'm setting up a context using one of two methods; neither works correctly all of the time. A minimal sketch of each method follows its step list.
1: Create an invisible dummy window that "hosts" the context
2: Make context with wglCreateContext
3: Set context to be current
4: Load extensions
5: Make a new (visible) window and use the context in it.
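Roughly, my method 1 code looks like the following. This is a simplified sketch: it uses the predefined STATIC window class as the throwaway host so no RegisterClass call is needed, all error handling is omitted, and create_context_method1 is an illustrative name, not my actual function.

    #include <windows.h>
    #include <GL/gl.h>

    /* Simplified sketch of method 1: a hidden dummy window hosts the
       context. Error handling omitted; link against opengl32 and gdi32. */
    HGLRC create_context_method1(void)
    {
        /* 1: invisible dummy window ("STATIC" is a predefined class,
           so it works as a throwaway host without registration) */
        HWND wnd = CreateWindowA("STATIC", "", WS_OVERLAPPEDWINDOW,
                                 0, 0, 1, 1, NULL, NULL,
                                 GetModuleHandleA(NULL), NULL);
        HDC dc = GetDC(wnd);

        /* wglCreateContext requires the DC to have a pixel format set */
        PIXELFORMATDESCRIPTOR pfd = {0};
        pfd.nSize      = sizeof(pfd);
        pfd.nVersion   = 1;
        pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
        pfd.iPixelType = PFD_TYPE_RGBA;
        pfd.cColorBits = 32;
        pfd.cDepthBits = 24;
        SetPixelFormat(dc, ChoosePixelFormat(dc, &pfd), &pfd);

        /* 2, 3: make a legacy context and make it current */
        HGLRC rc = wglCreateContext(dc);
        wglMakeCurrent(dc, rc);

        /* 4: extensions are loaded here via wglGetProcAddress */

        /* 5: the caller later makes rc current on the visible window's DC */
        return rc;
    }

As I understand it, step 5 only works if the visible window's DC is given a pixel format compatible with the dummy's, so both windows go through the same pixel-format selection.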
Method 1 seems to work fine for most programs that use this code, and glGetString(GL_VERSION) typically reports a 4.2 context (the highest my card supports). However, in one particular program it instead reports a 1.2 context, and advanced functionality subsequently fails.
To try to solve this, I changed the code to implement method 2.
1: Create an invisible dummy window that "hosts" all contexts
2: Make a dummy context with wglCreateContext
3: Set the dummy context to be current
4: Load extensions
5: Make a new context with wglCreateContextAttribsARB having the desired properties
6: Set the dummy context to be not current and then set the new context to be current
7: Delete dummy context
8: Make a new (visible) window and use the new context in it.
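Method 2 reduces to something like this sketch. Again simplified: the two WGL_CONTEXT_*_ARB constants and the function pointer typedef come from the WGL_ARB_create_context extension, error handling is omitted, and create_context_method2 is an illustrative name.

    #include <windows.h>
    #include <GL/gl.h>

    /* From the WGL_ARB_create_context extension */
    #define WGL_CONTEXT_MAJOR_VERSION_ARB 0x2091
    #define WGL_CONTEXT_MINOR_VERSION_ARB 0x2092

    typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)
        (HDC hDC, HGLRC hShareContext, const int *attribList);

    /* Simplified sketch of method 2; dc belongs to the hidden dummy
       window and already has a pixel format set. */
    HGLRC create_context_method2(HDC dc)
    {
        /* 2, 3: dummy context, made current so wglGetProcAddress works */
        HGLRC dummy = wglCreateContext(dc);
        wglMakeCurrent(dc, dummy);

        /* 4: load the extension entry point */
        PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
            (PFNWGLCREATECONTEXTATTRIBSARBPROC)
            wglGetProcAddress("wglCreateContextAttribsARB");

        /* 5: request a 3.1 context with the desired properties */
        const int attribs[] = {
            WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
            WGL_CONTEXT_MINOR_VERSION_ARB, 1,
            0
        };
        HGLRC rc = wglCreateContextAttribsARB(dc, NULL, attribs);

        /* 6, 7: unbind the dummy, bind the new context, delete the dummy */
        wglMakeCurrent(NULL, NULL);
        wglMakeCurrent(dc, rc);
        wglDeleteContext(dummy);

        /* 8: the caller reuses rc in the visible window */
        return rc;
    }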
Using this, I can get an OpenGL 3.1 context to work correctly (since my programs use OpenGL 2 functionality, I don't request a 3.2 context). However, for that one particular program, something very odd happens: glGetString(GL_VERSION) indicates a 1.2 context, but checking it with this:

    int version[2];
    glGetIntegerv(GL_MAJOR_VERSION, &version[0]);
    glGetIntegerv(GL_MINOR_VERSION, &version[1]);
    printf("OpenGL %d.%d\n", version[0], version[1]);

. . . indicates the 3.1 context as requested! However, the advanced functionality still fails, so I suspect that report is wrong.
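To rule out the query itself silently failing, I can run a variant of the check with sentinel values and glGetError. This is a diagnostic sketch: the GL_MAJOR_VERSION and GL_MINOR_VERSION enums only exist in 3.0+ contexts, and a failed glGetIntegerv leaves its output variables untouched, so sentinels would be printed unchanged.

    #include <stdio.h>
    #include <windows.h>
    #include <GL/gl.h>

    /* These enums are absent from the old GL 1.1 header on Windows */
    #ifndef GL_MAJOR_VERSION
    #define GL_MAJOR_VERSION 0x821B
    #define GL_MINOR_VERSION 0x821C
    #endif

    void print_gl_version(void)
    {
        GLint version[2] = { -1, -1 };   /* sentinels survive a failed query */
        glGetError();                    /* clear any stale error */
        glGetIntegerv(GL_MAJOR_VERSION, &version[0]);
        glGetIntegerv(GL_MINOR_VERSION, &version[1]);
        if (glGetError() != GL_NO_ERROR)
            printf("version query failed (pre-3.0 context?)\n");
        else
            printf("OpenGL %d.%d\n", version[0], version[1]);
    }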
It's worth noting that the code for the one particular program where both methods fail was copied directly from a program that works. For some reason the compiled binaries don't hash to the same value, which suggests that some build configuration difference is perturbing this problem into existence.