Geometrian

My Properly-Created OpenGL Context Is Lying To Me (OpenGL)


Hi,
 
I have created an OpenGL context in what I believe is the proper way, but on Linux, basic calls such as `glGetString(...)` and `glGetIntegerv(...)` return bogus values or act as no-ops. Despite this, the context seems to render everything fine. On Windows, I follow the same algorithm and it works perfectly.
 
Fortunately, since I've been having this problem for such a long time, I've had the opportunity to make some very pretty debug output. In case it isn't obvious, in the following, red is a frame/window, cyan is an OpenGL context, yellow is an API pointer, and violet is the display server handle.
 
Here is the output on Windows:
[image: Windows debug trace (gallery image 7659)]
The algorithm is as follows:

  • (line 1): Create a basic context on a default frame
  • (lines 3-5): Set the basic context as current, load its API pointer for `wglCreateContextAttribsARB(...)`, unset context.
  • (lines 7-9): Set the basic context as current, call the API pointer to create an attribute context
  • (lines 11-13): Set the attribute context as current, and set it up (including loading its API pointer, which happens to be the same; we don't use it ever, though).
  • (line 15): Set attribute context as current in preparation for main loop.
  • (line 15.5): [Main loop]
  • (line 16): Unset the attribute context.
  • (lines 18-19): Cleanup.
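For reference, in raw WGL calls the flow above looks roughly like this (a sketch only: `hdc` is assumed to come from an already-created window, the version numbers in `attribs` are just an example, and error checking is omitted):

```cpp
#include <windows.h>
#include <GL/gl.h>
#include <GL/wglext.h>  // PFNWGLCREATECONTEXTATTRIBSARBPROC, WGL_CONTEXT_* tokens

void create_contexts(HDC hdc) {
    // (line 1): create a basic context on the default frame
    HGLRC basic = wglCreateContext(hdc);

    // (lines 3-5): bind the basic context, load its API pointer, unbind
    wglMakeCurrent(hdc, basic);
    PFNWGLCREATECONTEXTATTRIBSARBPROC createContextAttribs =
        (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");
    wglMakeCurrent(hdc, NULL);

    // (lines 7-9): rebind the basic context, create the attribute context, unbind
    const int attribs[] = {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 3,
        0
    };
    wglMakeCurrent(hdc, basic);
    HGLRC attrib = createContextAttribs(hdc, NULL, attribs);
    wglMakeCurrent(hdc, NULL);

    // (lines 11-13): bind the attribute context and set it up
    wglMakeCurrent(hdc, attrib);
    /* ...load its own API pointers, query state... */
    wglMakeCurrent(hdc, NULL);

    // (lines 15-16): bind for the main loop, then unbind
    wglMakeCurrent(hdc, attrib);
    /* ...main loop... */
    wglMakeCurrent(hdc, NULL);

    // (lines 18-19): cleanup
    wglDeleteContext(attrib);
    wglDeleteContext(basic);
}
```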

Now, I try to do something very similar on Linux:
[image: Linux debug trace (gallery image 7658)]
Unfortunately, it doesn't work. Notice the error after line 12. At that point, I called `glGetString(...)` and it returned null. This should not be possible: a context is bound (line 11). Crucially, there is no GL error, yet the only case in which the documentation says null is returned is when an error has occurred. In fact, no OpenGL error occurs at all, anywhere!
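To make the failure concrete, here is roughly the check that goes wrong, as a sketch (`dpy`, `win`, and `ctx` stand in for the real display, window, and context handles; the function name is mine):

```cpp
#include <GL/glx.h>  // also pulls in GL/gl.h

// Mirrors lines 11-12 of the Linux trace above.
bool context_sanity_check(Display* dpy, GLXDrawable win, GLXContext ctx) {
    glXMakeCurrent(dpy, win, ctx);                     // line 11: bind the context

    const GLubyte* version = glGetString(GL_VERSION);  // line 12: returns NULL here...
    GLenum         err     = glGetError();             // ...yet this is GL_NO_ERROR

    return version != NULL && err == GL_NO_ERROR;      // observed: false, with no error
}
```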
 
Basically, I want to know why this happens, and how to prevent it. Did I screw up the context creation somehow? Why wouldn't it raise an error? Is this OpenGL driver just terrible?
 
---
 
One other potentially relevant fact: in AMD CodeXL, I get the following output on Windows:
 

Debug String: CodeXL warning: The debugged process asked for an extension function pointer (wglCreateContextAttribsARB) from one render context, but called this function pointer in another render context (context #1)

This should not be possible either; as you can see, that function is only ever called when the basic context is bound, using the pointer loaded while the basic context was bound. Additionally, at the time this message appears, only one context had been created, so . . .

 
Thanks,
-G

_Silence_

Do you have GLX 1.3? That is the minimum requirement for this to work. Also, I don't see any framebuffer configs. If you show your code, people can help you more easily.
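Something along these lines, for instance (a sketch; the attribute values are only an example):

```cpp
#include <GL/glx.h>

// Returns a framebuffer config, or NULL on failure; dpy is an open X display.
GLXFBConfig choose_fbconfig(Display* dpy) {
    int major = 0, minor = 0;
    glXQueryVersion(dpy, &major, &minor);
    if (major == 1 && minor < 3)
        return NULL;                                 // FBConfigs require GLX >= 1.3

    static const int attribs[] = {
        GLX_X_RENDERABLE,  True,
        GLX_RENDER_TYPE,   GLX_RGBA_BIT,
        GLX_DOUBLEBUFFER,  True,
        GLX_RED_SIZE, 8, GLX_GREEN_SIZE, 8, GLX_BLUE_SIZE, 8,
        None
    };
    int count = 0;
    GLXFBConfig* configs =
        glXChooseFBConfig(dpy, DefaultScreen(dpy), attribs, &count);
    if (configs == NULL || count == 0)
        return NULL;

    GLXFBConfig chosen = configs[0];                 // a real app would score candidates
    XFree(configs);
    return chosen;
}
```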

 

Also, you might be interested in reading this.

 

Finally, nowadays people generally tend to use existing libraries for this (e.g. SDL, SFML, GLFW, ...). These are helpful.
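For instance, with GLFW the whole dance collapses to a few calls (a sketch; the version hints are just an example):

```cpp
#include <GLFW/glfw3.h>  // also pulls in the GL headers

int main() {
    if (!glfwInit())
        return -1;
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);

    GLFWwindow* window = glfwCreateWindow(800, 600, "Example", NULL, NULL);
    if (window == NULL) {
        glfwTerminate();
        return -1;
    }
    glfwMakeContextCurrent(window);  // context created and bound in one step

    const GLubyte* version = glGetString(GL_VERSION);  // works here
    (void)version;

    glfwTerminate();
    return 0;
}
```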

NumberXaero

Why unset the context on line 5, only to set it again on line 7? And why get two pointers to CreateContextAttribs? It seems odd; or are these sanity checks?

 

Generally you do something like:

```cpp
// Assuming a valid device context `hdc` and an attribute list `attribs`.
HGLRC old = wglCreateContext(hdc);               // CreateContext(old)
wglMakeCurrent(hdc, old);                        // MakeCurrent(old)

// load CreateContextAttribs (get ptr)
PFNWGLCREATECONTEXTATTRIBSARBPROC createContextAttribs =
    (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");

// call CreateContextAttribs (get the new context)
HGLRC newCtx = createContextAttribs(hdc, NULL, attribs);

wglMakeCurrent(hdc, NULL);                       // done with old
wglDeleteContext(old);                           // delete old
wglMakeCurrent(hdc, newCtx);
// off you go
```

Geometrian

@_Silence_: The GLX version on this system is 1.4. The FB-config load is pretty standard, using glXChooseFBConfig, and the result is queried for validity. The code is open-source, although the latest version is not online. If you (or someone else) would like to see it, I can update the repo. The reason I didn't lead with that is that the code is quite lengthy, since much control and additional functionality needs to be exposed. That is also the reason I moved away from using existing context/windowing libraries (I've worked extensively with wx, Qt, SDL, and GLUT previously, and am somewhat familiar with GLFW).

@NumberXaero: Basically, what's happening is that there is a "Context" object which sets the context at the beginning of its constructor and unsets it at the end. This, along with some other logic in there, ensures that the bound context is the same before and after making the "Context" object. The constructors are lines 1-5 and lines 7-13. Each context loads its own pointers automatically, which is why the second context gets its own pointer even though it didn't have to.
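In other words, something in the spirit of this (a hypothetical GLX-flavored sketch; the real class is much larger and exposes far more):

```cpp
#include <GL/glx.h>

// Hypothetical sketch of the "Context" pattern described above.
class Context {
public:
    Context(Display* dpy, GLXDrawable win, GLXContext ctx)
        : _dpy(dpy), _win(win), _ctx(ctx)
    {
        glXMakeCurrent(_dpy, _win, _ctx);  // set at the beginning of construction
        /* ...load this context's own API pointers, query state, etc.... */
        glXMakeCurrent(_dpy, None, NULL);  // unset at the end of construction
    }

    void makeCurrent()   { glXMakeCurrent(_dpy, _win, _ctx); }
    void unmakeCurrent() { glXMakeCurrent(_dpy, None, NULL); }

private:
    Display*    _dpy;
    GLXDrawable _win;
    GLXContext  _ctx;
};
```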

 

I thought it best to present the problem as simply as possible, with just the raw API commands. Perhaps more commands would be helpful? Or perhaps someone wants to dig through the source (it's actually very readable; just long)? Suggestions?

_Silence_

Basically, does a copy/paste of the example given in the link I posted previously work?

 

Plus, you didn't say what you passed to glGetString. For example, GL_EXTENSIONS will return null on OpenGL 3.2 and above (and this is normal), whereas it returned the full list of extensions prior to that version.
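On newer contexts the extension list is meant to be enumerated entry by entry instead (a sketch; assumes a bound 3.0+ context and a GL loader header that provides `glGetIntegerv`/`glGetStringi`):

```cpp
#include <cstdio>

// Print every extension, one per line, via the indexed query.
void print_extensions() {
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i)
        std::printf("%s\n", (const char*)glGetStringi(GL_EXTENSIONS, i));
}
```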
