uhfath

OpenGL Some strangeness in OpenGL 3.3 context creation



Hello, everyone.

Recently I tried to understand the process of creating an OpenGL 3.3 context. If I understand it correctly, the steps are: create a dummy window, set its pixel format, create a dummy context, fetch the WGL extension function addresses, create the real window and context with them, then destroy the dummy context and the dummy window.
Are these the correct steps?
If so, there must be something I'm missing. I've attached a small sample (no dependencies, just the VS2010 project).
What I don't understand is that the sample works correctly as it is (it just prints the glGetString() info to the console). But if I check it under gDEBugger or GLIntercept, both show an error right after the first glGetString() call, and that error is not caught by the app itself (there is a check in it). More than that, gDEBugger shows an access violation. I've tested this on an ATI/AMD Radeon 5400 (Mobility) and an AMD Radeon 6870, with three driver versions starting from the newest one.
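For reference, the context-creation sequence from the first paragraph can be sketched like this. This is my own Windows-only sketch, not the attached sample: error handling is omitted, the two device contexts are assumed to belong to already-created windows with pixel formats set, and the WGL_ARB token values are copied from the WGL_ARB_create_context extension (normally they come from wglext.h):

```cpp
#include <windows.h>
#include <GL/gl.h>

// Tokens from WGL_ARB_create_context; usually provided by wglext.h.
#define WGL_CONTEXT_MAJOR_VERSION_ARB    0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB    0x2092
#define WGL_CONTEXT_PROFILE_MASK_ARB     0x9126
#define WGL_CONTEXT_CORE_PROFILE_BIT_ARB 0x00000001

typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int*);

// dcDummy/dcReal: device contexts of the dummy and real windows,
// each with a pixel format already set.
HGLRC createGl33Context(HDC dcDummy, HDC dcReal)
{
    // 1. Legacy context on the dummy window, only to load the extension.
    HGLRC dummyRc = wglCreateContext(dcDummy);
    wglMakeCurrent(dcDummy, dummyRc);

    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)
            wglGetProcAddress("wglCreateContextAttribsARB");

    // 2. Real 3.3 core-profile context on the real window.
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 3,
        WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
        0
    };
    HGLRC realRc = wglCreateContextAttribsARB(dcReal, nullptr, attribs);
    wglMakeCurrent(dcReal, realRc);

    // 3. The dummy context (and its window) can now be destroyed.
    wglDeleteContext(dummyRc);
    return realRc;
}
```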

If someone could please take a look at this, it would be greatly appreciated.

Thank you.


P.S.

Just in case, this is what GLIntercept 1.0.2 shows:
GL Intercept Log. Version : 1.02 Compile Date: Nov 5 2011 Run on: Tue Nov 08 11:46:38 2011

===================================================
GL ERROR - Function glGetString(GL_VENDOR) generated error GL_INVALID_ENUM
===================================================
Log End.
And the full log:
===============================================================================
GLIntercept version 1.02 Log generated on: Tue Nov 08 11:46:38 2011

===============================================================================

wglChoosePixelFormat(58012B14,0019FC08)=2
wglSetPixelFormat(58012B14,2,0019FC08)=true
wglCreateContext(58012B14)=00010000
wglMakeCurrent(58012B14,00010000)=true
wglGetProcAddress("wglChoosePixelFormatARB")=10025340
wglGetProcAddress("wglCreateContextAttribsAR...")=10025370
wglChoosePixelFormatARB(1C012D15,0019FABC,00000000,1,0019FAB0,0019FAA4)=true
wglSetPixelFormat(1C012D15,2,0019FA74)=true
wglCreateContextAttribsARB(1C012D15,00000000,0019FA48)
----->wglCreateLayerContext(1C012D15,0)=00010001 =00010001
wglMakeCurrent(1C012D15,00010001)=true
wglDeleteContext(00010000)=true
glGetString(GL_VENDOR)="ATI Technologies Inc." glGetError() = GL_INVALID_ENUM
glGetError()=GL_INVALID_ENUM
glGetString(GL_RENDERER)="AMD Radeon HD 6800 Series"
glGetError()=GL_NO_ERROR
glGetString(GL_VERSION)="3.3.11079 Core Profile Fo..."
glGetError()=GL_NO_ERROR
glGetString(GL_SHADING_LANGUAGE_VERSION)="4.10"
glGetError()=GL_NO_ERROR
wglMakeCurrent(1C012D15,00000000)=true
wglDeleteContext(00010001)=true
What is really strange is that the app itself doesn't receive that error when launched separately, only when run under GLIntercept.
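One thing worth keeping in mind when reading such logs: glGetError() returns one error flag per call, and a flag stays set until it is read. So an error raised internally by an injected tool (GLIntercept wraps every GL call) can surface on the application's next check even though the application's own calls were fine. A hedged habit that makes this visible is to drain the error queue in a loop rather than check once. The helper below is my own sketch, written against a function object so it compiles without GL headers; in a real app you would pass ::glGetError (GL_NO_ERROR is 0 per the spec):

```cpp
#include <functional>
#include <vector>

// GL_NO_ERROR is 0 in the OpenGL spec; defined here so this sketch
// builds without GL headers.
constexpr unsigned int kGlNoError = 0;

// Read error flags until none remain. A single glGetError() call can
// miss flags left behind by earlier calls (or by an injected tool).
std::vector<unsigned int> drainErrors(const std::function<unsigned int()>& getError)
{
    std::vector<unsigned int> errors;
    for (unsigned int e = getError(); e != kGlNoError; e = getError())
        errors.push_back(e);
    return errors;
}
```

In the sample's situation, draining right after wglMakeCurrent (and again after each glGetString) would show whether any stale flags were already queued before the first application-level check.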

I have actually run into the same thing. I am not sure if it is worth stressing over, but I would like to know if I am doing something wrong. I get a GL error right after the window is created, before I've done any drawing of any kind.
