help: OpenGL/GLSL v2.10/v1.10 -- to -- v3.20/v1.50

idinev:
Just out of curiosity, why is the technique you mention easier than adding new code to the glee.h and glee.c files? In fact, I suppose we could just create our own gl_extensions.h and gl_extensions.c files, then add #include statements in glee.h and glee.c to add the new functions we need.

Or am I missing something about how this works?

Also, the way you do it, doesn't your whole OpenGL program need to be filled with function-pointer syntax instead of normal C function-call syntax? Or not? The way GLEE works, my application simply calls all OpenGL functions the same way it calls all other functions.

When I searched for and visited the GLEE and GLEW websites, they say they are only updated to OpenGL v3.00. I have a feeling they don't know how to handle the new "compatibility" capabilities, and didn't want to create two separate versions (core and back-compatible). Just a guess. However, I suppose it is possible the GLEW hosted at the www.opengl.org website might be updated to v3.20 by the OpenGL folks, so I'll check (though I doubt it).

-----

STATUS : OpenGL v3.20
I still can't call the new WGL functions, but fortunately the old context-create function creates a "v3.20 compatibility" context... sorta, anyway. I say "sorta" because it didn't recognize some v3.20 functions. For example, I had to put glVertexAttribIPointerEXT() instead of glVertexAttribIPointer() in my code. But at least the function is available in this way, so I was able to change some of those vertex attributes that should be integers to integer types.
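For reference, a minimal sketch of the kind of change I mean, assuming an interleaved vertex struct and an already-bound VBO; the vertex_t layout, attribute index 3, and the helper name are illustrative only, not my actual code:

#include <stddef.h>    // offsetof
#include "glee.h"      // or whatever GL loader/header the project already uses

typedef struct { float position[3]; unsigned int flags; } vertex_t;  // illustrative layout

void set_integer_flags_attrib (void)   // assumes the VBO holding vertex_t data is bound
{
  const GLuint flags_attrib = 3;       // illustrative attribute index
  glEnableVertexAttribArray (flags_attrib);
  // integer attributes need the "I" entry point so the uint is not converted to float;
  // on my driver only the EXT-suffixed name resolved, the core name did not
  glVertexAttribIPointerEXT (flags_attrib, 1, GL_UNSIGNED_INT,
                             sizeof(vertex_t), (const GLvoid*) offsetof(vertex_t, flags));
}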

STATUS : GLSL v1.50
I changed my GLSL v1.10 shader code to GLSL v1.50 code, and that worked just fine (given "#version 150 compatibility" in both shaders). I haven't removed the "compatibility" yet (and replaced it with "core"), but I doubt that will give me any hassle.
Mildly put, glee and glew are bloatware when it comes to GL 3.2. And no, they also use func-ptrs. Look in the header and source of glee:
#ifndef GLEE_H_DEFINED_glBindFramebuffer
#define GLEE_H_DEFINED_glBindFramebuffer
  typedef void (APIENTRYP GLEEPFNGLBINDFRAMEBUFFERPROC) (GLenum target, GLuint framebuffer);
  GLEE_EXTERN GLEEPFNGLBINDFRAMEBUFFERPROC GLeeFuncPtr_glBindFramebuffer;
  #define glBindFramebuffer GLeeFuncPtr_glBindFramebuffer
#endif

#ifndef GLEE_C_DEFINED_glBindFramebuffer
#define GLEE_C_DEFINED_glBindFramebuffer
  void __stdcall GLee_Lazy_glBindFramebuffer(GLenum target, GLuint framebuffer)
  { if (GLeeInit()) glBindFramebuffer(target, framebuffer); }
  GLEEPFNGLBINDFRAMEBUFFERPROC GLeeFuncPtr_glBindFramebuffer = GLee_Lazy_glBindFramebuffer;
#endif

With my approach, when you want another proc, you just paste its name into the tiny .h.
I guess my other question is, to switch to your technique from GLEE or GLEW, don't you need to define dozens if not hundreds of functions and constants in your file (all those functions not in the super-oldie-and-moldie OpenGL v1.1 or whatever macroshaft supports in their ancient opengl.lib file)?

I mean, declaring functions in a .h file is one thing, but making sure function calls invoke actual (new) functions is something else, right?

I have a feeling I'm missing something obvious and important.

PS: Not trying to be obstinate. I'm just about convinced to attempt your technique, but just a bit gun shy (knowledge short).
Look in glext.h. Symbols of the form PFN%procname%PROC are defined there. Those symbols are meant to be loaded via wglGetProcAddress(), right after you create the GL context.
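Concretely, here is a minimal sketch of what such a tiny .h/.c pair can look like; the pglGenBuffers pointer name and the load_gl_procs helper are made-up names for illustration, not from any actual library:

// gl_procs.h -- one entry per extension proc you actually use (illustrative)
#include <windows.h>
#include <GL/gl.h>
#include "glext.h"                        // the PFN...PROC typedefs live here

extern PFNGLGENBUFFERSPROC pglGenBuffers;
#define glGenBuffers pglGenBuffers        // so calls keep normal C function-call syntax

// gl_procs.c -- load once, right after the GL context is created and made current
PFNGLGENBUFFERSPROC pglGenBuffers = 0;

int load_gl_procs (void)
{
  pglGenBuffers = (PFNGLGENBUFFERSPROC) wglGetProcAddress ("glGenBuffers");
  return (pglGenBuffers != 0);            // 0 means the driver does not export it
}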

Let's look at the difference in the asm code, by comparing calls to glGenTextures and glGenBuffers: one is defined in the DLL, the other we manually load.

:0040F71A 53              push ebx
:0040F71B 6A01            push 00000001
:0040F71D FF158C114100    call dword ptr [0041118C] ; glGenTextures
:0040F723 53              push ebx
:0040F724 6A01            push 00000001
:0040F726 FF15B8574100    call dword ptr [004157B8] ; glGenBuffers

Identical! They both use the "FF,15" instruction.

Older compilers would generate something worse for the DLL path:
push ebx
push 1
call __imp_glGenTextures
....
__imp_glGenTextures:
  jmp [offset overwritten by OS on module load]


Quote:
STATUS : GLSL v1.50
I changed my GLSL v1.10 shader code to GLSL v1.50 code, and that worked just fine (given "#version 150 compatibility" in both shaders). I haven't removed the "compatibility" yet (and replaced it with "core"), but I doubt that will give me any hassle.


Were there major differences between your 1.10 version and the 1.50? Are the two versions essentially identical? I am asking because I have yet to get a GL 3.x-ready graphics card. My card only handles up to GLSL 1.20.
Quote:Original post by andy_boy
Were there major differences between your 1.10 version and the 1.50? Are the two versions essentially identical? I am asking because I have yet to get a GL 3.x-ready graphics card. My card only handles up to GLSL 1.20.
Yes, the versions are essentially identical - meaning I just changed syntax from v1.10 to v1.50 but didn't add or remove any functionality. I did change a vertex attribute from float to uint, because that attribute is just a flag the shader tests to decide whether to normal-map or not, and/or texture or not, and so forth.
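To give you a feel for the change, here is a rough sketch (with made-up names, not my actual shader) of a trivial v1.50 vertex shader, with comments noting the old v1.10 spellings:

#version 150 compatibility
// illustrative only - not my actual shader code
in  vec3 position;   // was "attribute vec3 position;" in v1.10
in  uint flags;      // was a float attribute in v1.10; integer attributes need 1.30+
out vec3 color;      // was "varying vec3 color;" in v1.10

void main ()
{
  color       = (flags == 0u) ? vec3(1.0) : vec3(0.5);               // the flag test
  gl_Position = gl_ModelViewProjectionMatrix * vec4(position, 1.0);  // legal under "compatibility"
}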
idinev:
You convinced me to change to your way... someday... but as luck would have it, GLEW just released a new version for OpenGL v3.20, so I adopted that instead for the short run. My program compiles and runs now, but has a strange problem that perhaps you will recognize.

The problem is, when I create a new-style context with wglCreateContextAttribsARB(hdc,0,attribs), almost nothing displays. When I uncomment the line "// error = 1;" below - to adopt the old-style context via wglCreateContext() - the program displays all 3D objects correctly (as it did before I switched from GLEE to GLEW and created the new-style context).

Any ideas why this would happen? Actually, I do see a few line objects on some frames once in a while, but no objects created from triangles (of which there are dozens). Any ideas?

//
// if no OpenGL context has been created yet,
// GLEW has not been initialized and we cannot call new WGL functions
// like wglCreateContextAttribsARB(), so we must do it the old way first
//
  error = 0;
  if (igstate.context_initialized == 0) {
    xrc = wglCreateContext (hdc);                        // temporary old-style OpenGL context
    success = wglMakeCurrent (hdc, xrc);                 // make OpenGL render context active
    if (xrc == 0) { return (CORE_ERROR_INTERNAL); }      // WGL or OpenGL is busted
    if (success == 0) { return (CORE_ERROR_INTERNAL); }  // WGL or OpenGL is busted
//
// initialize GLEW or GLEE
//   --- an OpenGL context must exist before GLEW is initialized
//   --- then we can call "new" functions like wglCreateContextAttribsARB() etc.
//
#ifndef NOGLEE
#ifdef GLEW_STATIC
    error = glewInit();
    if (error != GLEW_OK) {
      fprintf (stderr, "Error: %s\n", glewGetErrorString(error));  // report error
    }
    fprintf (stdout, "status: GLEW initialized %s\n", glewGetString(GLEW_VERSION));
#endif
#endif
  }
//
// set desired attributes of new-style OpenGL context
//
  int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 2,
    WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
//  WGL_CONTEXT_FLAGS_ARB, 0,
    0, 0
  };
//
// create new-style OpenGL context
//
  const c08* report;
//error = 1;
  if (error == GLEW_OK) {  // we initialized GLEW so we can now create a new-style context
    hrc = wglCreateContextAttribsARB (hdc, 0, attribs);  // create OpenGL v3.20 render context
    if (hrc) {
      wglDeleteContext (xrc);               // delete temporary old-style OpenGL context
      success = wglMakeCurrent (hdc, hrc);  // make OpenGL render context active
    } else {
      hrc = xrc;  // cannot create new-style OpenGL context, so try the old-style context
      fprintf (stdout, "status: could not create new-style OpenGL context\n");  // report problem
    }
  } else {
    hrc = xrc;    // cannot create new-style OpenGL context, so try old-style context
  }
//
// report OpenGL version the first time an OpenGL context is created
//
  cpu major = 0;
  cpu minor = 0;
  if (igstate.context_initialized == 0) {
    igstate.context_initialized = 1;           // an OpenGL context has been initialized
    report = glGetString (GL_VERSION);         // get OpenGL version string (check)
    glGetIntegerv (GL_MAJOR_VERSION, &major);  // get OpenGL version : major
    glGetIntegerv (GL_MINOR_VERSION, &minor);  // get OpenGL version : minor
    fprintf (stdout, "version string == \"%s\" ::: major.minor == %d.%d\n", report, major, minor);
  }
idinev:
I found a way to fix the problem I described above, by including one more attribute value-pair in the attribs[] array (one that you didn't have in your example code). The new line is:

WGL_CONTEXT_PROFILE_MASK_ARB, WGL_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB,

in the following:
  int attribs[] = {
    WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
    WGL_CONTEXT_MINOR_VERSION_ARB, 2,
    WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_FORWARD_COMPATIBLE_BIT_ARB,
    WGL_CONTEXT_PROFILE_MASK_ARB, WGL_CONTEXT_COMPATIBILITY_PROFILE_BIT_ARB,
    0, 0
  };

I'm not sure why I/we/they need both of those compatibility bits. Do you?
Quote:Yes, the versions are essentially identical - meaning I just changed syntax from v1.10 to v1.50 but didn't add or remove any functionality. I did change a vertex attribute from float to uint, because that attribute is just a flag the shader tests to decide whether to normal-map or not, and/or texture or not, and so forth.


Glad to hear. By the way, are there any books, tutorials, or web sites that you would recommend for a beginner to learn GLSL programming?
I just didn't need to set that bit :). I only need the core profile (moving my 2.1 stuff to 3.1 was quick and painless; moving to 3.2 took 5 minutes).


There's "forward compatibility" and "backward compatibility". Probably you don't need the forward-compatibility, replace it with 0.

