OpenGL Unrecognized OpenGL version


I am using Qt (+OpenGL) to implement a program, and I need to test it on different operating systems. It fails on Fedora 12 and Ubuntu 9.10: the terminal shows "Unrecognized OpenGL version" and the display freezes. I checked the OpenGL version on each OS; the systems that work have a lower OpenGL version. For example:

Working — OpenSUSE 10.3 32-bit:
server glx version string: 1.2
client glx version string: 1.4
OpenGL version string: 1.4 (2.1 Mesa 7.0.1)

Not working — Ubuntu 9.10 32-bit:
server glx version string: 1.2
client glx version string: 1.4
OpenGL version string: 2.1 Mesa 7.6
OpenGL shading language version string: 1.20

Not working — Fedora 12 32-bit:
server glx version string: 1.2
client glx version string: 1.4
OpenGL version string: 2.1 Mesa 7.7-devel
OpenGL shading language version string: 1.20

My guess is that the higher OpenGL version doesn't support my program. Am I right? Does anyone know a solution? Thanks in advance.

I found a hint on the OpenGL website:
--------------
23.040 How can I code for different versions of OpenGL?

Just because a feature or extension is available in the OpenGL development environment you use for building your app doesn't mean it will be available on your end user's system. Your code must avoid making feature or extension calls when those features and extensions aren't available.

When your program initializes, it must query the OpenGL library for information on the OpenGL version and available extensions, and surround version- and extension-specific code with the appropriate conditionals based on the results of that query. For example:

#include <stdlib.h>
...
int gl12Supported;
gl12Supported = atof(glGetString(GL_VERSION)) >= 1.2;
...
if (gl12Supported)
{
    // Use OpenGL 1.2 functionality
}
--------------

What is the difference between OpenGL version 1.4 and 2.1?

The main difference between them is that OpenGL 2.1 has buffer objects and shaders and is much more up to date. It is also what most of the hardware in the wild supports.

It is fully backwards compatible with version 1.4, unlike the more recent versions such as 3.3 and 4.0. Those newer versions add features that are only found in recent GPUs and drop functionality that is no longer supported in hardware.
Versions 3.0 and 3.1 are "in between": they still support the old features, but consider them deprecated.

Still, I have a problem with my application.

I compiled it on OpenSUSE 10.3 and executed it on Ubuntu 9.10. The problem appears only on Ubuntu 9.10, and the only difference I can find is the OpenGL version. If OpenGL 2.1 is backwards compatible, I don't understand where the problem comes from....

That message means Qt couldn't parse the version string from OpenGL (glGetString(GL_VERSION)), probably because it's malformed, and that would be a driver bug.

You can check how that message is triggered here

Are the appropriate drivers installed? What is the vendor string? The "Mesa" in the output you posted makes me believe you are using the software rasterizer. I use Qt for my OpenGL applications on Ubuntu 9.04 and 9.10 and had no problems with the NVidia drivers installed.

Qt seems to expect the version string to look like, for example, "1.4", but not like "1.4 (2.1 Mesa 7.0.1)". The NVidia and ATI drivers actually return the string in the expected format.

One thing I forgot to mention: I am running the OS in a VM.

But I tested a 'real' installation of Ubuntu 9.10 as well, and got an even stranger result: under one account the OpenGL version is 1.4 (my app works), under another it is 2.1 (my app doesn't work). The hardware is the same. How can this happen? Do I need to install additional libraries?

With the NVidia drivers installed, my colleague's machine also works.

So is a driver a must for my app? What about in a VM?

Quote:
Original post by stella1016
With the NVidia drivers installed, my colleague's machine also works.

So is a driver a must for my app? What about in a VM?


VMs have limited support for hardware-accelerated graphics; VirtualBox, for example, has some DX/GL drivers for Windows (as guest OS).
OpenGL is an API for making use of the GPU, so when no hardware acceleration is available, why would you want to use OpenGL? Well, if needed there is Mesa, a software implementation (much slower than hardware). You could try making Qt parse the GL version string that Mesa returns.

But why does this differ between versions of Ubuntu (all running in a VM)? The other OSs don't have real hardware support either, but they work, and this Ubuntu 9.10 doesn't.

I am thinking it might be some initialization error in my app....

Quote:
Original post by stella1016
But why does this differ between versions of Ubuntu (all running in a VM)?

Maybe different versions of Mesa. What does the version string look like when it works?

If Qt has an issue, then update it or don't use it. I use SDL myself, and I check the GL version myself. GL 1.4 didn't have VBOs; VBOs were added in 1.5, 2.0 got shaders, and 2.1 got PBOs. I am citing that from memory; check the docs if you want.

Why your code doesn't work on some Ubuntu with 2.1... I don't know. You have the source code.

Quote:
Original post by stella1016
But I tested a 'real' installation of Ubuntu 9.10 as well, and got an even stranger result: under one account the OpenGL version is 1.4 (my app works), under another it is 2.1


Make sure both accounts are in "video" group.

Quote:
Original post by EngineCoder
Quote:
Original post by stella1016
But I tested a 'real' installation of Ubuntu 9.10 as well, and got an even stranger result: under one account the OpenGL version is 1.4 (my app works), under another it is 2.1


Make sure both accounts are in "video" group.


Sorry it has taken me so long to reply to those of you who kindly tried to help.

What do you mean by the "video" group? How can I check this?

Here are some more details about my app.

My application uses Qt plugin modules, which means it can load detected *.dll files automatically.

Among them I have two modules: module1 uses QGraphicsView (no direct GL calls) and module2 uses QGLWidget (direct use of GL functions). A flag in module1, read from a file, decides whether to use OpenGL to accelerate rendering (I'm not sure about the details; that part was done by other people). Module2 is the one with the problem.

On a normal OS, the display works regardless of whether the flag is 0 or 1.

On Ubuntu 9.10, if the flag is 0, module1 works but module2 doesn't; if the flag is 1, module1 doesn't work but module2 does.

The only difference in module1 between the flag being on or off is one call:
if (b_opengl_monitor)
{
    this->setViewport(new QGLWidget());
}

"this" inherits from QGraphicsView.

Now I am confused: why does this flag affect my module2?

