Transitioning to OpenGL 3.2

Hi, my question is basically: how do I move to the 3.2 core profile, coming from the NeHe tutorials (which I've used and mostly understood)? The projects I've previously written have been based on the NeHe tutorials, and now that I feel reasonably comfortable with that code I thought moving to more modern code would be a good thing. However, I keep getting stumped when I try to find tutorials or guides that explain how to set up a 3.2 context (the highest version supported by my graphics card). Are there any great tutorials out there on this? Or rather, how should I start? I'm lost again :) Thanks, Marcus Axelsson (goes back to reading the specification with a worried expression)
I have been learning the OpenGL 3.2 core profile for about a month. There are some tutorials that are very useful, for example:

http://www.opengl.org/wiki/Category:Tutorials
http://sites.google.com/site/opengltutorialsbyaks/introduction-to-opengl-3-2---tutorial-01
http://nopper.tv/opengl_3_2.html
http://www.g-truc.net/post-0204.html

By reading these materials, it was easy to set up my first demo with Phong lighting and texture mapping.

However, the most troublesome problem for me is the immature support from related development libraries. In particular, the GUI libraries like freeglut, SDL and Qt are still far from stable support for the core profile. I have posted some threads in this forum showing the freeglut bugs, but no solution has come of it. I hope all these problems can be solved as soon as possible.
I've checked out two of those links (the sample program links) before and they're OK. I hope to understand them better after having read through the specification. The other two links seem alright; I'll have a closer look at those.

The libraries you mention will probably have support soon enough. Or maybe they're working on 4.0-compatible code instead? It'd be great to have some 3.x-compatible stuff, though, since I'm not really in a position to buy a new graphics card right now :)

Thanks for the links.

Anyone else got something? All references and guides, tutorials and code samples are welcome.

Thanks,
Marcus Axelsson
Quote:Original post by ZHAO Peng
In particular, the GUI libraries like freeglut, SDL and Qt are still far from stable support for the core profile.
It's even worse: not even the most basic libraries like GLEW and GLee do, which is ironic since they explicitly claim to support all the new versions and extensions. Nevertheless, both libraries use the 1.x mechanism for querying extensions. Which, in principle, is fine, since the 3.x way totally sucks (especially if you want to support both 1.x/2.x and 3.x codepaths), whereas the 1.x way works perfectly well.

However, the 1.x extension query mechanism is deprecated in 3.0 and removed in 3.1, and it is indeed unimplemented (or rather disabled; it returns null) on recent drivers. I've even had that happen on a compatibility profile (where it's supposed to work!) on an NVIDIA development driver not long ago, though that was probably a driver bug.
Sadly, the Khronos Group seems to have put a lot of effort into making migration to OpenGL 3.x while retaining a legacy path as troublesome as possible (GLSL versions being another such thing), and you have to wonder why. The plan behind it is probably something like "move on, no way back".

In reply to the OP:
You must use either wglCreateContextAttribsARB or its glX counterpart, depending on whether you're on Windows or Unix.
These functions take a zero-terminated list of name/value integer pairs. At first sight this may be confusing, but once you get it, it is actually quite nice, flexible and easy. Don't forget the zero at the end!
Also, note that you must create a fake context first to get the function pointer to that function; you cannot just call it directly.
The links posted by ZHAO Peng show how to do this.
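
Very roughly, the Windows path looks something like this. This is only a sketch: it assumes you already have a window, its HDC and a pixel format set up the usual way, that wglext.h is available for the WGL_CONTEXT_* tokens and the function pointer typedef, and createGL32Context is just a made-up helper name.

/* Sketch only: hDC belongs to a window whose pixel format is already set. */
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>   /* WGL_CONTEXT_* tokens and PFNWGLCREATECONTEXTATTRIBSARBPROC */

HGLRC createGL32Context(HDC hDC)
{
    /* 1. Create a legacy ("fake") context so wglGetProcAddress can be used. */
    HGLRC tempContext = wglCreateContext(hDC);
    wglMakeCurrent(hDC, tempContext);

    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");
    if (!wglCreateContextAttribsARB)
        return tempContext;   /* extension not available; keep the old-style context */

    /* 2. Zero-terminated list of name/value integer pairs describing the context. */
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 2,
        WGL_CONTEXT_PROFILE_MASK_ARB,  WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
        0   /* don't forget the terminating zero! */
    };
    HGLRC gl32Context = wglCreateContextAttribsARB(hDC, 0, attribs);

    /* 3. Make the real context current and throw the fake one away. */
    wglMakeCurrent(NULL, NULL);
    wglDeleteContext(tempContext);
    wglMakeCurrent(hDC, gl32Context);
    return gl32Context;
}

The glX version works the same way with glXCreateContextAttribsARB and the GLX_CONTEXT_* tokens.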

Also, you must use glGetStringi instead of glGetString to query for extensions. This involves many more driver calls than "the old way" did, and it makes caching the info a lot more tedious too, but alas, that's how it is. If you structure your program properly so that all caps are queried at startup and stored in a few global variables, it makes no real difference at runtime, though startup might take half a second longer.
Actually, to be correct, you must get a function pointer for glGetStringi too, since it is not an OpenGL 1.3 function (it only appeared in 3.0). Strangely enough, it works either way on my system, so it seems that glGetStringi actually has an entry in the GL import library. I don't think that is technically correct.
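
For illustration, an extension check with the new mechanism might look roughly like this. Again, just a sketch: hasExtension and myGetStringi are made-up names, and the pointer would be fetched once at startup with wglGetProcAddress/glXGetProcAddress as described above.

#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* GL_NUM_EXTENSIONS and PFNGLGETSTRINGIPROC */

static PFNGLGETSTRINGIPROC myGetStringi;   /* fetched once at startup */

int hasExtension(const char *name)
{
    GLint count = 0, i;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (i = 0; i < count; ++i) {
        /* one driver call per extension instead of one big string */
        const char *ext = (const char *)myGetStringi(GL_EXTENSIONS, i);
        if (ext && strcmp(ext, name) == 0)
            return 1;
    }
    return 0;
}

In practice you would walk this list once at startup and cache the results, rather than calling it every time you need to know about an extension.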
Sorry! After a lot of tries, I have to admit that freeglut's core profile support is not so bad. The problems I ran into were caused by GLEW: "glewExperimental" has to be set to "GL_TRUE" for the drivers NVIDIA-Linux-x86_64-190.53 and NVIDIA-Linux-x86_64-195.36.07.04.

God! I have wasted several days on this "trap".
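
For anyone who runs into the same thing, the initialisation order that works for me looks roughly like this (a sketch only; the point is that glewExperimental must be set before glewInit(), and glewInit() must be called after the core profile context has been made current):

#include <GL/glew.h>
#include <stdio.h>

int initGlew(void)
{
    /* Let GLEW fetch core-profile entry points even when the old-style
       extension string query fails on these drivers. */
    glewExperimental = GL_TRUE;
    GLenum err = glewInit();   /* requires a current GL context */
    if (err != GLEW_OK) {
        fprintf(stderr, "glewInit failed: %s\n", (const char *)glewGetErrorString(err));
        return 0;
    }
    return 1;
}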
Quote:Original post by ZHAO Peng
Sorry! After a lot of tries, I have to admit that freeglut's core profile support is not so bad. The problems I ran into were caused by GLEW: "glewExperimental" has to be set to "GL_TRUE" for the drivers NVIDIA-Linux-x86_64-190.53 and NVIDIA-Linux-x86_64-195.36.07.04.
God! I have wasted several days on this "trap".

I've not tried it yet, but someone's written an OpenGL 3/4 core profile extractor (see here). It looks a bit like a GLEW for core profiles.

Looks like it might be useful if your IDE does function name completion and you don't want it to keep suggesting deprecated functions.
Quote:Original post by dave j
Quote:Original post by ZHAO Peng
Sorry! After a lot of tries, I have to admit that freeglut's core profile support is not so bad. The problems I ran into were caused by GLEW: "glewExperimental" has to be set to "GL_TRUE" for the drivers NVIDIA-Linux-x86_64-190.53 and NVIDIA-Linux-x86_64-195.36.07.04.
God! I have wasted several days on this "trap".

I've not tried it yet, but someone's written an OpenGL 3/4 core profile extractor (see here). It looks a bit like a GLEW for core profiles.

Looks like it might be useful if your IDE does function name completion and you don't want it to keep suggesting deprecated functions.


Yes, gl3w looks like a new and easy-to-use library similar to GLEW. It's good news for programmers to have one more choice. Thanks!
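
For what it's worth, once the Python script has generated gl3w.h and gl3w.c, using it is apparently meant to look roughly like this (a sketch based on the project's readme, not something I have run myself; the header path and exact function names may differ between versions):

#include <GL3/gl3w.h>   /* header path may vary by gl3w version */
#include <stdio.h>

int main(void)
{
    /* ...create the window and the 3.2 core context first, then: */
    if (gl3wInit()) {                 /* 0 on success, non-zero on failure */
        fprintf(stderr, "failed to initialise gl3w\n");
        return 1;
    }
    if (!gl3wIsSupported(3, 2)) {
        fprintf(stderr, "OpenGL 3.2 core profile not available\n");
        return 1;
    }
    printf("gl3w initialised, OpenGL 3.2 entry points loaded\n");
    return 0;
}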
Hm. gl3w seemed really useful. But I can't run the script at all. I get an "invalid syntax" error when I try to run it (python.exe "gl3w.py"). So... too bad :/

With GLUT and this gl3w I think it'd be a breeze to get the ogl 3.2 context up and running. Which would be great :)

Thanks for keeping the thread alive, guys.
Quote:Original post by tre
Hm. gl3w seemed really useful. But I can't run the script at all. I get an "invalid syntax" error when I try to run it (python.exe "gl3w.py"). So... too bad :/

With GLUT and this gl3w I think it'd be a breeze to get the ogl 3.2 context up and running. Which would be great :)

Thanks for keeping the thread alive, guys.

Which version of Python are you using? The script is for 2.6.
Alright. Got that going. Now on to getting a project going :)

