OpenGL Problem with GLEW and GL Contexts 3.2 and Above


So I'm using SDL 1.3 with OpenGL and GLEW.

My video update code basically looks like this:

glBindFramebuffer(GL_FRAMEBUFFER, 0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);


My initialization code (with a bunch of unrelated stuff omitted) looks like this:

sdlFlags |= SDL_WINDOW_OPENGL;

SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);

m_window = SDL_CreateWindow("", SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, m_screenWidth, m_screenHeight, sdlFlags);

m_glContext = SDL_GL_CreateContext(m_window);

if(glewInit() != GLEW_OK) {
    //this branch never runs; glewInit() always reports success no matter the context
}
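For what it's worth, when asking for a 3.2+ context it is common to request the profile explicitly as well. A minimal sketch, assuming an SDL build that exposes `SDL_GL_CONTEXT_PROFILE_MASK` (present in SDL 1.3/2.0; adjust if your headers differ):

```c
/* Sketch: request an explicit core profile alongside the version.
   These attribute names are the SDL 1.3/2.0 ones; set them before
   SDL_CreateWindow()/SDL_GL_CreateContext(). */
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 3);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);
```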


When I create an OpenGL context of version 3.1 or below, it works.

When I try anything 3.2 or above, my application crashes as soon as I call glBindFramebuffer(). The function pointer appears to be NULL, since the crash reports "at address 0x000000000".

I can call glClear() with no problem, though. So it seems that GLEW isn't initializing the right function pointers for the context.
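As a diagnosis aid: GLEW exposes each GL entry point as a function pointer, so an unfetched function really is a NULL pointer, and calling it jumps to address 0. This tiny self-contained sketch mimics that failure mode with a hypothetical mini-loader (not GLEW's actual code):

```c
#include <stddef.h>
#include <stdio.h>

typedef void (*PFNGLBINDFRAMEBUFFER)(unsigned int target, unsigned int fbo);

/* Hypothetical loader state mimicking GLEW's design: every GL function is
   a pointer variable that stays NULL until the loader fetches it. */
static PFNGLBINDFRAMEBUFFER p_glBindFramebuffer = NULL;

/* Guarded call: report instead of jumping through a NULL pointer. */
static int try_bind_framebuffer(unsigned int target, unsigned int fbo) {
    if (p_glBindFramebuffer == NULL) {
        fprintf(stderr, "glBindFramebuffer not loaded (pointer is NULL)\n");
        return 0;  /* tell the caller the call did not happen */
    }
    p_glBindFramebuffer(target, fbo);
    return 1;
}
```

A NULL check like this (or printing the pointer's value) confirms whether the loader fetched the function before you blame the driver.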

Running glewinfo, I get this at the top:

GLEW version 1.7.0
Reporting capabilities of pixelformat 1
Running on a GeForce GTX 480/PCI/SSE2 from NVIDIA Corporation
OpenGL version 4.2.0 is supported

I also checked the line that reports the status of glBindFramebuffer, and it says OK.

I can't seem to find any info on this anywhere. I'm not sure whether this is a problem with GLEW or something I'm doing wrong.

Ah thanks.


GLEW's problem is that it calls glGetString(GL_EXTENSIONS), which raises GL_INVALID_ENUM on a GL 3.2 core context as soon as glewInit() is called, and it then fails to fetch the core function pointers. The real solution would be for GLEW to use glGetStringi instead. The current version of GLEW is 1.7.0 and they still haven't corrected it; the only workaround for now is to use glewExperimental:

Changed my code to this:

glewExperimental = GL_TRUE;
if(glewInit() != GLEW_OK) {
    //Log fatal error
}


My thoughts are basically, "WHYYYY doesn't the GLEW website at least acknowledge this and tell people to set glewExperimental = GL_TRUE before calling glewInit() when using a GL 3.2 or higher context?!"

That would have saved me hours of agony.
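Putting the pieces together, a fuller init sketch looks roughly like this. One extra detail worth knowing: since glewInit() itself calls glGetString(GL_EXTENSIONS), it can leave a GL_INVALID_ENUM queued in the GL error state on a core context even when the workaround succeeds, so it's worth draining the error queue afterwards (this is a sketch for GLEW 1.x; it assumes a current GL context and `<stdio.h>`):

```c
/* Sketch of a GLEW init sequence for a 3.2+ core context (GLEW 1.x). */
glewExperimental = GL_TRUE;          /* fetch function pointers even though the
                                        extension-string check fails on core */
GLenum err = glewInit();
if (err != GLEW_OK) {
    fprintf(stderr, "glewInit failed: %s\n", glewGetErrorString(err));
}

/* glewInit() called glGetString(GL_EXTENSIONS), which queues a
   GL_INVALID_ENUM on core contexts; drain it so later glGetError()
   checks aren't polluted. */
while (glGetError() != GL_NO_ERROR) { }
```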

I'd never heard of that unofficial OpenGL development kit, but I might check it out some time. I'm already using glm, which according to them is part of it.

They answered the reason on the GLEW mailing list:


> Is there any particular reason why GLEW still uses glGetString() to get the list of extensions available?

Some technical reasons.

My main reluctance to deal with this is that I don't have any non-trivial code using core GL contexts, so it's a blind spot from a testing point of view.

One issue is a desire to cache the lengthy extension string, rather than issue a long sequence of queries. We're stuck with glGetStringi for core contexts; I guess we'd want to cache those results too.

> I use a GL 3.2 forward-compatible context in my application, and have been trying to figure out why I get null pointers for functions such as glGenRenderbuffers. After debug tracing, I found GLEW's use of glGetString rather than glGetStringi to be the cause.
>
> I know I can use glewExperimental to bypass the issue, but a real fix would be nice as forward-compatible contexts become more common.

It's becoming a bit of a frequently-asked question. I really expected to be dealing more with core contexts by now, but I'm not.

I also have reservations about the assumptions built into GLEW about core GL entry points being available via ARB extensions. I'm not sure all those assumptions are valid for core contexts.
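For reference, the indexed query that the mailing list says GLEW should use is the core-profile way of enumerating extensions. A sketch, assuming a current GL 3.0+ context and `<stdio.h>`:

```c
/* Enumerate extensions one at a time via the indexed query (GL 3.0+).
   Unlike glGetString(GL_EXTENSIONS), this is valid on core contexts. */
GLint count = 0;
glGetIntegerv(GL_NUM_EXTENSIONS, &count);
for (GLint i = 0; i < count; ++i) {
    const GLubyte *ext = glGetStringi(GL_EXTENSIONS, (GLuint)i);
    printf("%s\n", (const char *)ext);
}
```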

For my own projects, I decided to switch to flextGL rather than deal with this issue.
