OpenGL Erroneous Context Version

Hi,

I need at least an OpenGL 2.1 context, and preferably more. I am interfacing directly to the Win32 layer.

I'm setting up a context using one of two methods; neither works correctly all of the time.

Method 1:
1: Create an invisible dummy window that "hosts" the context
2: Make context with wglCreateContext
3: Set context to be current
4: Load extensions
5: Make a new (visible) window and use the context in it.
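For reference, steps 1-3 look roughly like this (a sketch, not the actual code; the "DummyGL" window class and the styles/sizes are placeholders):

[source]
// Rough sketch of steps 1-3. The "DummyGL" class is assumed to be registered
// elsewhere with RegisterClass; note the window is never shown (no WS_VISIBLE).
HWND dummyWnd = CreateWindowEx(0, "DummyGL", "", WS_OVERLAPPEDWINDOW,
                               0, 0, 1, 1, NULL, NULL, GetModuleHandle(NULL), NULL);
HDC dummyDC = GetDC(dummyWnd);

PIXELFORMATDESCRIPTOR pfd = { sizeof(PIXELFORMATDESCRIPTOR), 1 };
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cDepthBits = 24;
SetPixelFormat(dummyDC, ChoosePixelFormat(dummyDC, &pfd), &pfd);

HGLRC ctx = wglCreateContext(dummyDC);   // step 2
wglMakeCurrent(dummyDC, ctx);            // step 3
// step 4: load extensions; step 5: reuse ctx with the visible window's DC
[/source]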

Method 1 seems to work fine for most programs using this code, and glGetString(GL_VERSION) typically indicates a 4.2 context (as high as my card supports). However, in one particular program, for some reason it instead indicates a 1.2 context and advanced functionality subsequently fails.

To try to solve this, I changed the code to implement method 2.

Method 2:
1: Create an invisible dummy window that "hosts" all contexts
2: Make a dummy context with wglCreateContext
3: Set the dummy context to be current
4: Load extensions
5: Make a new context with wglCreateContextAttribsARB having the desired properties
6: Set the dummy context to be not current and then set the new context to be current
7: Delete dummy context
8: Make a new (visible) window and use the new context in it.
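Step 5 looks roughly like this (a sketch; the WGL_CONTEXT_* tokens come from wglext.h / WGL_ARB_create_context, and hDC stands for the dummy window's device context):

[source]
// Sketch of step 5: create the "real" context, requesting 3.1.
int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 1,
    0                               // the attribute list is zero-terminated
};
HGLRC newCtx = wglCreateContextAttribsARB(hDC, NULL, attribs);
if (newCtx == NULL) {
    // creation returns NULL if the requested version/attributes are unsupported
}
[/source]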

Using this, I can get an OpenGL 3.1 context to work correctly (since my programs use OpenGL 2 functionality, a 3.2 context is not used). However, for that one particular program, something very odd happens. glGetString(GL_VERSION) indicates a 1.2 context, but trying to check it with this:
[source]
int version[2];
glGetIntegerv(GL_MAJOR_VERSION, version);
glGetIntegerv(GL_MINOR_VERSION, version + 1);
printf("OpenGL %d.%d\n", version[0], version[1]);
[/source]
. . . indicates the 3.1 context as requested! However, the advanced functionality still fails, so I suspect it is wrong in saying so.

It's worth noting that the code for the one particular program where both methods fail is directly copied from a program that works. For some reason the compiled binaries don't hash to the same value, which suggests that some build configuration option is perturbing this problem into existence.

-G

[size="1"]And a Unix user said rm -rf *.* and all was null and void...|There's no place like 127.0.0.1|The Application "Programmer" has unexpectedly quit. An error of type A.M. has occurred.
[size="2"]


Why are you creating a dummy window?

That should be done only if you need multisampling without FBO, in order to find an appropriate pixel format.

In all other cases, you should do the following:

1. Make a new (visible) window and set an appropriate pixel format
2. Create a dummy context with wglCreateContext
3. Set the dummy context to be current
4. Create a new (GL 3.0+) context with wglCreateContextAttribsARB having the desired properties
5. Set the new context to be current
6. Delete the dummy context
7. Load extensions using the new context
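Roughly, as a sketch (hWnd is the visible window from step 1 with its pixel format already set; the typedef and tokens come from wglext.h):

[source]
HDC hDC = GetDC(hWnd);                              // 1: visible window, format set

HGLRC dummyCtx = wglCreateContext(hDC);             // 2: dummy context
wglMakeCurrent(hDC, dummyCtx);                      // 3: make it current

// Fetch wglCreateContextAttribsARB while the dummy context is current.
PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
    (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");

const int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,               // 4: request the desired version
    WGL_CONTEXT_MINOR_VERSION_ARB, 1,
    0
};
HGLRC realCtx = wglCreateContextAttribsARB(hDC, NULL, attribs);

wglMakeCurrent(hDC, realCtx);                       // 5: switch to the new context
wglDeleteContext(dummyCtx);                         // 6: dummy no longer needed
// 7: load the remaining extensions against realCtx
[/source]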

Yeah, creating the context on a separate window seems weird - in Windows (at least), your OpenGL context is specific to your HDC, which in turn is specific to your window. So even if it does work for you, you're still relying on undefined behaviour to "do the right thing".


Yeah, creating the context on a separate window seems weird - in Windows (at least), your OpenGL context is specific to your HDC, which in turn is specific to your window. So even if it does work for you, you're still relying on undefined behaviour to "do the right thing".

Check the documentation for wglMakeCurrent. It states that you can activate a context on any device context, as long as the pixel format and the device are the same.

Yeah, creating the context on a separate window seems weird - in Windows (at least), your OpenGL context is specific to your HDC, which in turn is specific to your window. So even if it does work for you, you're still relying on undefined behaviour to "do the right thing".

Check the documentation for wglMakeCurrent. It states that you can activate a context on any device context, as long as the pixel format and the device are the same.

Hmm - you're right: http://msdn.microsoft.com/en-us/library/windows/desktop/dd374387%28v=vs.85%29.aspx

It need not be the same hdc that was passed to wglCreateContext when hglrc was created, but it must be on the same device and have the same pixel format.

Can someone -1 me? :)


Why are you creating a dummy window?
That should be done only if you need multisampling without FBO, in order to find an appropriate pixel format.

That's actually the eventual plan.

However, the real reason is to make the design cleaner. Contexts require a window to be created, but this supposes that that window will be around forever. The way I've structured the architecture is to have a context wrapper object contain its own invisible window. So, the window that "owns" the context is guaranteed to be around for as long as the life of the context. This allows the user to create and destroy windows at will without affecting the context's existence.
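In other words, something like this (a rough sketch of the wrapper, not the actual code):

[source]
/* Rough sketch of the wrapper: the context object owns an invisible window
   whose lifetime is tied to the context's. */
typedef struct GLContext {
    HWND  hiddenWnd;   /* invisible window that "hosts" the context */
    HDC   hiddenDC;
    HGLRC hRC;
} GLContext;
/* User-created windows can come and go; rendering into one just means calling
   wglMakeCurrent on that window's DC with ctx->hRC, provided the pixel formats match. */
[/source]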

In all other cases, you should do the following:
[...]

Don't I need to load extensions before using wglCreateContextAttribsARB?

[size="1"]And a Unix user said rm -rf *.* and all was null and void...|There's no place like 127.0.0.1|The Application "Programmer" has unexpectedly quit. An error of type A.M. has occurred.
[size="2"]

Why are you creating a dummy window?
That should be done only if you need multisampling without FBO, in order to find an appropriate pixel format.

That's actually the eventual plan.

However, the real reason is to make the design cleaner. Contexts require a window to be created, but this supposes that that window will be around forever. The way I've structured the architecture is to have a context wrapper object contain its own invisible window. So, the window that "owns" the context is guaranteed to be around for as long as the life of the context. This allows the user to create and destroy windows at will without affecting the context's existence.

Just create the context, there's no need for the hidden window there. If you want to create windows at will, then do that, and keep the context as a separate object and bind the two at some point (for example when rendering you attach the desired context to the desired window).
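For example, something along these lines (a sketch; the names are placeholders):

[source]
// Sketch: the context is its own object; it is attached to whichever window
// is being rendered into, provided the pixel formats match.
void RenderInto(HWND hWnd, HGLRC hRC)
{
    HDC hDC = GetDC(hWnd);
    wglMakeCurrent(hDC, hRC);     // bind context and window here

    /* ... draw ... */

    SwapBuffers(hDC);
    ReleaseDC(hWnd, hDC);
}
[/source]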

In all other cases, you should do the following:
[...]

Don't I need to load extensions before using wglCreateContextAttribsARB?

You don't need to, no, as long as the pixel format doesn't change. The function pointers returned by wglGetProcAddress are required to be the same for different contexts as long as the pixel format is the same. If you just create a dummy context in order to be able to create another context, then that's fine. If you create a dummy window also, then you have to make sure that the pixel format is the same.

One reason for multiple windows is if you want to use wglChoosePixelFormatARB to select your pixel format, and the final pixel-format might differ from the one selected with wglChoosePixelFormat for the dummy context. A window can't change its pixel-format once set.
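As a sketch of that case (tokens from WGL_ARB_pixel_format / WGL_ARB_multisample; dummyDC and realDC are placeholders for the dummy and final windows' DCs):

[source]
// Choose a multisampled format via the dummy context, then apply it to the
// real window, whose pixel format can only be set once.
const int attribs[] = {
    WGL_DRAW_TO_WINDOW_ARB, GL_TRUE,
    WGL_SUPPORT_OPENGL_ARB, GL_TRUE,
    WGL_DOUBLE_BUFFER_ARB,  GL_TRUE,
    WGL_PIXEL_TYPE_ARB,     WGL_TYPE_RGBA_ARB,
    WGL_COLOR_BITS_ARB,     32,
    WGL_DEPTH_BITS_ARB,     24,
    WGL_SAMPLE_BUFFERS_ARB, 1,
    WGL_SAMPLES_ARB,        4,    // e.g. 4x MSAA
    0
};
int format = 0;
UINT numFormats = 0;
if (wglChoosePixelFormatARB(dummyDC, attribs, NULL, 1, &format, &numFormats) && numFormats > 0) {
    PIXELFORMATDESCRIPTOR pfd;
    DescribePixelFormat(realDC, format, sizeof(pfd), &pfd);
    SetPixelFormat(realDC, format, &pfd);
}
[/source]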

As for the original question, do you mean that an active context claims to be 3.1 through GL_MAJOR_VERSION/GL_MINOR_VERSION, but functions that are guaranteed to be in core 3.1 are not available?

If they are extension functions, check if they are available in the extensions string.

EDIT: If I understand correctly, the problem is that calling glGetIntegerv(GL_MAJOR_VERSION) and glGetString(GL_VERSION) one after the other, on the same context, returns conflicting information. Which seems strange, to say the least. Can you make a minimal example and post the code?

Also, update your drivers; it might be a driver bug.

My guess is that glGetIntegerv(GL_MAJOR_VERSION, ..) returns an error (check with glGetError()), and does not overwrite the integers in int version[2]. Therefore some old values that happen to be 3 and 1 are still there and the real version is 1.2, for which GL_MAJOR_VERSION is not supported.
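Something like this would show it (a sketch; the sentinel values stay put if the query fails):

[source]
GLint version[2] = { -1, -1 };                 /* sentinels: unchanged if the query fails */
glGetIntegerv(GL_MAJOR_VERSION, &version[0]);
glGetIntegerv(GL_MINOR_VERSION, &version[1]);
GLenum err = glGetError();                     /* expect GL_INVALID_ENUM on a pre-3.0 context */
printf("OpenGL %d.%d (glGetError = 0x%x)\n", version[0], version[1], err);
[/source]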

Just create the context, there's no need for the hidden window there. If you want to create windows at will, then do that, and keep the context as a separate object and bind the two at some point (for example when rendering you attach the desired context to the desired window).

Ummm . . . both wglCreateContext and wglCreateContextAttribsARB take a device context as an argument; I assumed that can only come from a valid window?

One reason for multiple windows is if you want to use wglChoosePixelFormatARB to select your pixel format, and the final pixel-format might differ from the one selected with wglChoosePixelFormat for the dummy context. A window can't change its pixel-format once set.

Right. Although that's not implemented now, that's my eventual plan.

As for the original question, do you mean that an active context claims to be 3.1 through GL_MAJOR_VERSION/GL_MINOR_VERSION, but functions that are guaranteed to be in core 3.1 are not available?
If they are extension functions, check if they are available in the extensions string.


EDIT: If I understand correctly, the problem is that calling glGetIntegerv(GL_MAJOR_VERSION) and glGetString(GL_VERSION) one after the other, on the same context, returns conflicting information. Which seems strange, to say the least. Can you make a minimal example and post the code?
Also, update your drivers; it might be a driver bug.

The point is that checking the context version returns different results. I recently found out that GL_MAJOR_VERSION/GL_MINOR_VERSION are only supported on OpenGL 3.0 or later.

Unfortunately, I can't really make a minimal sample; an identical program's source works in one project, but the same code fails when recompiled in a different project. It's very bizarre.

At any rate, what's happening is that the context somehow fails to be even OpenGL 2 compatible. It's 1.2, apparently. Since this code is currently being tested on Windows 7, I suspect that somehow it's getting the system default OpenGL instead of that provided by the graphics card vendor? I don't know why that would be though.
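If that's what's happening, the driver strings should give it away; something like this would tell (the Microsoft software implementation typically reports "Microsoft Corporation" / "GDI Generic" and only exposes GL 1.1):

[source]
/* Sanity check: a software-fallback context names Microsoft's generic
   implementation instead of the GPU vendor. */
const char *vendor   = (const char*)glGetString(GL_VENDOR);
const char *renderer = (const char*)glGetString(GL_RENDERER);
const char *version  = (const char*)glGetString(GL_VERSION);
printf("Vendor: %s\nRenderer: %s\nVersion: %s\n", vendor, renderer, version);

/* Parsing GL_VERSION works on any context, unlike GL_MAJOR/MINOR_VERSION. */
int major = 0, minor = 0;
sscanf(version, "%d.%d", &major, &minor);
[/source]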

My guess is that glGetIntegerv(GL_MAJOR_VERSION, ..) returns an error (check with glGetError()), and does not overwrite the integers in int version[2]. Therefore some old values that happen to be 3 and 1 are still there and the real version is 1.2, for which GL_MAJOR_VERSION is not supported.

Initializing the data shows that they are being set.

[size="1"]And a Unix user said rm -rf *.* and all was null and void...|There's no place like 127.0.0.1|The Application "Programmer" has unexpectedly quit. An error of type A.M. has occurred.
[size="2"]


After reading up a bit more, I think it is relevant to mention that the pixel format found by both the working and non-working programs is the same (i.e. they (should?) both be hardware accelerated).
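One way to verify that the format that actually got set is hardware accelerated (a sketch; hDC stands for the window's device context):

[source]
/* Sketch: inspect the pixel format actually set on the window's DC.
   PFD_GENERIC_FORMAT without PFD_GENERIC_ACCELERATED = Microsoft software renderer;
   both flags set = accelerated MCD; neither flag = full ICD. */
PIXELFORMATDESCRIPTOR pfd;
int fmt = GetPixelFormat(hDC);
DescribePixelFormat(hDC, fmt, sizeof(pfd), &pfd);

BOOL generic     = (pfd.dwFlags & PFD_GENERIC_FORMAT) != 0;
BOOL accelerated = (pfd.dwFlags & PFD_GENERIC_ACCELERATED) != 0;
BOOL hardware    = !generic || accelerated;
[/source]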

[size="1"]And a Unix user said rm -rf *.* and all was null and void...|There's no place like 127.0.0.1|The Application "Programmer" has unexpectedly quit. An error of type A.M. has occurred.
[size="2"]

