
OpenGL Erroneous Context Version


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

14 replies to this topic

#1 Geometrian   Crossbones+   -  Reputation: 1602


Posted 25 April 2013 - 12:17 AM

Hi,

I need at least an OpenGL 2.1 context, and preferably more. I am interfacing directly to the Win32 layer.

I'm setting up a context using one of two methods; neither works correctly all of the time.

Method 1:
1: Create an invisible dummy window that "hosts" the context
2: Make context with wglCreateContext
3: Set context to be current
4: Load extensions
5: Make a new (visible) window and use the context in it.

Method 1 seems to work fine for most programs using this code, and glGetString(GL_VERSION) typically indicates a 4.2 context (as high as my card supports). However, in one particular program, for some reason it instead indicates a 1.2 context and advanced functionality subsequently fails.

To try to solve this, I changed the code to implement method 2.

Method 2:
1: Create an invisible dummy window that "hosts" all contexts
2: Make a dummy context with wglCreateContext
3: Set the dummy context to be current
4: Load extensions
5: Make a new context with wglCreateContextAttribsARB having the desired properties
6: Set the dummy context to be not current and then set the new context to be current
7: Delete dummy context
8: Make a new (visible) window and use the new context in it.
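In code, steps 2 through 7 might look roughly like the sketch below. This is not the poster's actual code: createModernContext and buildContextAttribs are hypothetical names, the WGL_CONTEXT_* token values come from the WGL_ARB_create_context extension specification, and error handling is abbreviated.

```cpp
// Tokens from the WGL_ARB_create_context extension specification.
constexpr int WGL_CONTEXT_MAJOR_VERSION_ARB = 0x2091;
constexpr int WGL_CONTEXT_MINOR_VERSION_ARB = 0x2092;

// Build the zero-terminated attribute list for wglCreateContextAttribsARB.
// Kept as a pure helper so it can be checked without a window system.
void buildContextAttribs(int major, int minor, int out[5]) {
    out[0] = WGL_CONTEXT_MAJOR_VERSION_ARB; out[1] = major;
    out[2] = WGL_CONTEXT_MINOR_VERSION_ARB; out[3] = minor;
    out[4] = 0;                                    // list terminator
}

#ifdef _WIN32
#include <windows.h>

typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int*);

// Steps 2-7 of Method 2, assuming hdc already has a pixel format set.
HGLRC createModernContext(HDC hdc, int major, int minor) {
    HGLRC dummy = wglCreateContext(hdc);           // 2: legacy dummy context
    wglMakeCurrent(hdc, dummy);                    // 3: make it current
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)        // 4: load the entry point
        wglGetProcAddress("wglCreateContextAttribsARB");
    HGLRC ctx = NULL;
    if (wglCreateContextAttribsARB) {
        int attribs[5];
        buildContextAttribs(major, minor, attribs);
        ctx = wglCreateContextAttribsARB(hdc, NULL, attribs);  // 5: new context
    }
    wglMakeCurrent(NULL, NULL);                    // 6: release the dummy
    wglDeleteContext(dummy);                       // 7: delete it
    if (ctx) wglMakeCurrent(hdc, ctx);             // 6 (cont.): new context current
    return ctx;
}
#endif
```

Only the wglCreateContextAttribsARB pointer is strictly needed before step 5; the remaining extensions can be loaded once the new context is current.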

Using this, I can get an OpenGL 3.1 context to work correctly (since my programs use OpenGL 2 functionality, a 3.2 context is not used). However, for that one particular program, something very odd happens. glGetString(GL_VERSION) indicates a 1.2 context, but trying to check it with this:
int version[2];
glGetIntegerv(GL_MAJOR_VERSION, version);
glGetIntegerv(GL_MINOR_VERSION, version + 1);
printf("OpenGL %d.%d\n", version[0], version[1]);
. . . indicates the 3.1 context as requested! However, the advanced functionality still fails, so I suspect it is wrong in saying so.

It's worth noting that the code for the one particular program where both methods fail is copied directly from a program that works. For some reason, the compiled binaries don't hash to the same value, which suggests that some configuration option is perturbing this problem into existence.

-G

And a Unix user said rm -rf *.* and all was null and void...|There's no place like 127.0.0.1|The Application "Programmer" has unexpectedly quit. An error of type A.M. has occurred.


#2 Aks9   Members   -  Reputation: 920


Posted 25 April 2013 - 02:27 AM

Why are you creating a dummy window?

That should be done only if you need multisampling without an FBO, in order to find an appropriate pixel format.

In all other cases, you should do the following:

 

1. Make a new (visible) window and set an appropriate pixel format
2. Create a dummy context with wglCreateContext
3. Set the dummy context to be current
4. Create a new (GL 3.0+) context with wglCreateContextAttribsARB having the desired properties
5. Set the new context to be current
6. Delete the dummy context
7. Load extensions using the new context



#3 mhagain   Crossbones+   -  Reputation: 8285


Posted 25 April 2013 - 03:19 AM

Yeah, creating the context on a separate window seems weird - in Windows (at least), your OpenGL context is specific to your HDC, which in turn is specific to your window.  So even if it does work for you, you're still relying on undefined behaviour to "do the right thing".


It appears that the gentleman thought C++ was extremely difficult and he was overjoyed that the machine was absorbing it; he understood that good C++ is difficult but the best C++ is well-nigh unintelligible.


#4 Brother Bob   Moderators   -  Reputation: 8632


Posted 25 April 2013 - 04:43 AM

Yeah, creating the context on a separate window seems weird - in Windows (at least), your OpenGL context is specific to your HDC, which in turn is specific to your window.  So even if it does work for you, you're still relying on undefined behaviour to "do the right thing".

Check the documentation for wglMakeCurrent. It states that you can activate a context on any device context, as long as the pixel format and the device are the same.
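That rule from the wglMakeCurrent documentation can be sketched in code. canMakeCurrentOn and renderInto are hypothetical helpers, not WGL functions; error handling is omitted.

```cpp
// The wglMakeCurrent rule as a predicate: a context may be made current on a
// device context only if both are on the same device and carry the same
// pixel format index. (canMakeCurrentOn is an illustrative helper.)
bool canMakeCurrentOn(int contextPixelFormat, int targetPixelFormat, bool sameDevice) {
    return sameDevice && contextPixelFormat == targetPixelFormat;
}

#ifdef _WIN32
#include <windows.h>

// Sketch: render with a context created on one window into another window,
// assuming the target already had SetPixelFormat called with the same format.
void renderInto(HWND target, HGLRC ctx, int contextPixelFormat) {
    HDC dc = GetDC(target);
    if (canMakeCurrentOn(contextPixelFormat, GetPixelFormat(dc), true)) {
        wglMakeCurrent(dc, ctx);        // legal: same device, same pixel format
        // ... draw, SwapBuffers(dc), etc. ...
        wglMakeCurrent(NULL, NULL);
    }
    ReleaseDC(target, dc);
}
#endif
```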



#5 mhagain   Crossbones+   -  Reputation: 8285


Posted 25 April 2013 - 06:47 AM

Yeah, creating the context on a separate window seems weird - in Windows (at least), your OpenGL context is specific to your HDC, which in turn is specific to your window.  So even if it does work for you, you're still relying on undefined behaviour to "do the right thing".

Check the documentation for wglMakeCurrent. It states that you can activate a context on any device context, as long as the pixel format and the device are the same.

 

Hmm - you're right: http://msdn.microsoft.com/en-us/library/windows/desktop/dd374387%28v=vs.85%29.aspx

 

It need not be the same hdc that was passed to wglCreateContext when hglrc was created, but it must be on the same device and have the same pixel format.

 

Can someone -1 me? :)




#6 Geometrian   Crossbones+   -  Reputation: 1602


Posted 25 April 2013 - 09:53 AM

Why are you creating a dummy window?
That should be done only if you need multisampling without FBO, in order to find appropriate pixel format.

That's actually the eventual plan.

However, the real reason is to make the design cleaner. Contexts require a window to be created, but that assumes the window will be around forever. The way I've structured the architecture, a context wrapper object contains its own invisible window, so the window that "owns" the context is guaranteed to exist for as long as the context does. This lets the user create and destroy windows at will without affecting the context's existence.

 

In all other cases, you should do the following:
[...]

Don't I need to load extensions before using wglCreateContextAttribsARB?


Edited by Geometrian, 25 April 2013 - 10:00 AM.


#7 Brother Bob   Moderators   -  Reputation: 8632


Posted 25 April 2013 - 10:26 AM

Why are you creating a dummy window?
That should be done only if you need multisampling without FBO, in order to find appropriate pixel format.

That's actually the eventual plan.

However, the real reason is to make the design cleaner. Contexts require a window to be created, but that assumes the window will be around forever. The way I've structured the architecture, a context wrapper object contains its own invisible window, so the window that "owns" the context is guaranteed to exist for as long as the context does. This lets the user create and destroy windows at will without affecting the context's existence.

Just create the context, there's no need for the hidden window there. If you want to create windows at will, then do that, and keep the context as a separate object and bind the two at some point (for example when rendering you attach the desired context to the desired window).

 

In all other cases, you should do the following:
[...]

Don't I need to load extensions before using wglCreateContextAttribsARB?

You don't need to, no, as long as the pixel format doesn't change. The function pointers returned by wglGetProcAddress are required to be the same for different contexts as long as the pixel format is the same. If you just create a dummy context in order to be able to create another context, then that's fine. If you create a dummy window also, then you have to make sure that the pixel format is the same.



#8 Erik Rufelt   Crossbones+   -  Reputation: 3644


Posted 25 April 2013 - 11:17 AM

One reason for multiple windows is if you want to use wglChoosePixelFormatARB to select your pixel format, since the final pixel format might differ from the one selected with ChoosePixelFormat for the dummy context. A window can't change its pixel format once it is set.

 

 

As for the original question, do you mean that an active context claims to be 3.1 through GL_MAJOR_VERSION/GL_MINOR_VERSION, but functions that are guaranteed to be in core 3.1 are not available?

If they are extension functions, check if they are available in the extensions string.

 

 

EDIT: If I understand correctly, the problem is that calling glGetIntegerv(GL_MAJOR_VERSION) and glGetString(GL_VERSION) one after the other, on the same context, returns conflicting information. Which seems strange, to say the least... can you make a minimal example and post the code?

As well as update your drivers, might be a bug.

 

 

My guess is that glGetIntegerv(GL_MAJOR_VERSION, ..) returns an error (check with glGetError()), and does not overwrite the integers in int version[2]. Therefore some old values that happen to be 3 and 1 are still there and the real version is 1.2, for which GL_MAJOR_VERSION is not supported.
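That failure mode can be made visible directly: pre-fill the output array with sentinel values and check glGetError after the query. The sketch below assumes an active context; printContextVersion and integerQueryValid are hypothetical helpers, not part of any API.

```cpp
// Decide whether the integer version query can be trusted: it must not have
// raised a GL error and must have overwritten the sentinel. GL_NO_ERROR is 0.
// (integerQueryValid is an illustrative helper.)
bool integerQueryValid(unsigned glError, int majorValue) {
    return glError == 0 && majorValue >= 0;
}

#ifdef _WIN32
#include <windows.h>
#include <GL/gl.h>
#include <cstdio>

// Older gl.h headers stop at GL 1.1; token values are from the GL spec.
#ifndef GL_MAJOR_VERSION
#define GL_MAJOR_VERSION 0x821B
#define GL_MINOR_VERSION 0x821C
#endif

// Sketch: query the version defensively on whatever context is current.
void printContextVersion() {
    GLint version[2] = { -1, -1 };                 // sentinels, not stale data
    glGetIntegerv(GL_MAJOR_VERSION, &version[0]);
    glGetIntegerv(GL_MINOR_VERSION, &version[1]);
    if (integerQueryValid(glGetError(), version[0])) {
        std::printf("OpenGL %d.%d\n", version[0], version[1]);
    } else {
        // Pre-3.0 context: GL_MAJOR_VERSION is not a valid query there.
        std::printf("GL_VERSION = %s\n", (const char*)glGetString(GL_VERSION));
    }
}
#endif
```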


Edited by Erik Rufelt, 25 April 2013 - 11:27 AM.


#9 Geometrian   Crossbones+   -  Reputation: 1602


Posted 25 April 2013 - 05:13 PM

Just create the context, there's no need for the hidden window there. If you want to create windows at will, then do that, and keep the context as a separate object and bind the two at some point (for example when rendering you attach the desired context to the desired window).

Ummm . . . both wglCreateContext and wglCreateContextAttribsARB take a device context as an argument; I assumed that can only come from a valid window?
 

One reason for multiple windows is if you want to use wglChoosePixelFormatARB to select your pixel format, and the final pixel-format might differ from the one selected with wglChoosePixelFormat for the dummy context. A window can't change its pixel-format once set.

Right. Although that's not implemented now, that's my eventual plan.
 

As for the original question, do you mean that an active context claims to be 3.1 through GL_MAJOR_VERSION/GL_MINOR_VERSION, but functions that are guaranteed to be in core 3.1 are not available?
If they are extension functions, check if they are available in the extensions string.
 
 
EDIT: If I understand correctly, the problem is that calling glGetIntegerv(GL_MAJOR_VERSION) and glGetString(GL_VERSION) one after the other, on the same context, returns conflicting information. Which seems strange, to say the least... can you make a minimal example and post the code?
As well as update your drivers, might be a bug.

The point is that checking the context version returns different results. I recently found out that GL_MAJOR_VERSION/GL_MINOR_VERSION are only supported on OpenGL 3.0 or later.
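Since glGetString(GL_VERSION) is valid on every context version, a check that works everywhere is to parse the version string instead. A minimal sketch; parseGLVersion is a hypothetical helper, not part of the GL API.

```cpp
#include <cstdio>

// Parse a GL_VERSION string such as "1.2.0" or "3.1.0 <vendor info>" into
// major/minor numbers. Unlike the GL_MAJOR_VERSION query, which requires a
// 3.0+ context, the version string is available on every GL version.
bool parseGLVersion(const char* versionString, int* major, int* minor) {
    if (!versionString) return false;
    return std::sscanf(versionString, "%d.%d", major, minor) == 2;
}
```

For example, parseGLVersion("3.1.0", &major, &minor) sets major to 3 and minor to 1.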

Unfortunately, I can't really make a minimal sample; an identical program's source works in one project, but the same code fails when recompiled in a different project. It's very bizarre.

At any rate, what's happening is that the context somehow fails to be even OpenGL 2 compatible. It's 1.2, apparently. Since this code is currently being tested on Windows 7, I suspect that somehow it's getting the system default OpenGL instead of that provided by the graphics card vendor? I don't know why that would be though.

My guess is that glGetIntegerv(GL_MAJOR_VERSION, ..) returns an error (check with glGetError()), and does not overwrite the integers in int version[2]. Therefore some old values that happen to be 3 and 1 are still there and the real version is 1.2, for which GL_MAJOR_VERSION is not supported.

Initializing the data shows that they are being set.


Edited by Geometrian, 25 April 2013 - 05:15 PM.


#10 Geometrian   Crossbones+   -  Reputation: 1602


Posted 25 April 2013 - 05:37 PM


After reading up a bit more, I think it is relevant to mention that the pixel format found by both the working and non-working programs is the same (i.e. they (should?) both be hardware accelerated).



#11 Erik Rufelt   Crossbones+   -  Reputation: 3644


Posted 25 April 2013 - 05:45 PM

Unfortunately, I can't really make a minimal sample; an identical program's source works in one project, but the same code fails when recompiled in a different project. It's very bizarre.

 

Run a diff on the project files to find exactly what lines are different, then change one after another until it works.

If you make a minimal example, depending on your environment, that should be just one .cpp file (identical) and one project file (different).



#12 Brother Bob   Moderators   -  Reputation: 8632


Posted 25 April 2013 - 06:12 PM

Just create the context, there's no need for the hidden window there. If you want to create windows at will, then do that, and keep the context as a separate object and bind the two at some point (for example when rendering you attach the desired context to the desired window).

Ummm . . . both wglCreateContext and wglCreateContextAttribsARB take a device context as an argument; I assumed that can only come from a valid window?
 

That doesn't mean the context is tied to that window in any way (it is tied to its pixel format though, so you could say there is some connection, but that only limits which contexts can be tied to which windows). In order to create a rendering context, you need a window, yes. But you can move that context around as you like, with and without a window. You don't need the context to have a hidden window, you only need a window to create it.

 

It is perfectly fine to separate the concepts of windows and rendering contexts. The window holds a window, and the rendering context holds a rendering context; no need for hidden windows anywhere for this reason. Just tie a rendering context to a window before rendering.



#13 Geometrian   Crossbones+   -  Reputation: 1602


Posted 25 April 2013 - 07:55 PM

Run a diff on the project files to find exactly what lines are different, then change one after another until it works.
If you make a minimal example, depending on your environment, that should be just one .cpp file (identical) and one project file (different).

The .sln, .vcxproj, .vcxproj.filters, and .vcxproj.user files are the same, except for (some) hash values and the project names. I'll see if I can perturb it into/out of existence another way.

The program that fails uses a library that uses a library that uses the library where the windowing code is defined.

 

In order to create a rendering context, you need a window, yes. But you can move that context around as you like, with and without a window. You don't need the context to have a hidden window, you only need a window to create it.

Yes. To clarify, the hidden window exists only to create the context. This hidden window is local to my context class. User windows can be created and destroyed completely independently of the context--in fact, this is exactly the point of this design.



#14 Geometrian   Crossbones+   -  Reputation: 1602


Posted 25 April 2013 - 08:26 PM

I'll see if I can perturb it into/out of existence another way.

Amazingly, the differences continue to shrink. I can literally copy working project files to the same directory, rename them, add them to the solution, and the original project works while the copied one breaks.

 

I strongly suspect the hash values are magically the problem. Can anyone guess why they'd cause a weird error like this?



#15 Brother Bob   Moderators   -  Reputation: 8632


Posted 26 April 2013 - 02:57 AM

In order to create a rendering context, you need a window, yes. But you can move that context around as you like, with and without a window. You don't need the context to have a hidden window, you only need a window to create it.

Yes. To clarify, the hidden window exists only to create the context. This hidden window is local to my context class. User windows can be created and destroyed completely independently of the context--in fact, this is exactly the point of this design.

Ok, then I apparently misunderstood you. I thought the hidden window followed the context, but if you create it temporarily just for creating the context, then destroy it immediately and forget about it as soon as the context is created, that's a bit better. But I would use one of your primary windows instead, since that forces you to actually have a real window, as opposed to just a temporary throw-away window, in order to have a rendering context. It also ensures that the pixel format of the rendering context is compatible with the window(s) it is supposed to be used with.





