Problems with vsync on Linux


Hi everyone. I've got some trouble with OpenGL on Linux in my graphical application: I cannot turn vsync on. glXSwapInterval returns GLX_BAD_CONTEXT. So I took the Linux code from the NeHe tutorial as a simple example, but the situation is the same. I tried calling the function both after glXMakeCurrent and before, but either way it returns the error. Probably there's no need to show you the code, because it is available on the NeHe website, and you can try it yourself. Does anyone have a suggestion as to what the reason might be and how to fix it?
P.S. There are a few glXSwapInterval functions, for example glXSwapIntervalSGI. I'm running Linux in a virtual machine with a simple video card, so the only one available there is glXSwapIntervalMESA. Maybe that's the reason :) and the other functions would work on a real PC. But I'm not sure.


First off,


there's no need to show you the code, because it is available on the NeHe website, and you can try it yourself.

This is ALWAYS the wrong answer. If you are asking others for help, you need to supply enough code to show the problem, along with any error messages that are thrown. In this case, supplying that information is even more important because NeHe's tutorials are out of date.

As far as the bad context situation goes, "GLX_BAD_CONTEXT" is thrown for multiple reasons, but they all come down to your context not being a valid GLX context. With no code, this tells me that your context creation failed, which can itself happen for any number of reasons. For starters, I would try running natively and not in a virtual machine. If you request context creation parameters that the virtual machine cannot meet, context creation will fail, leaving you with a bad context, and finally 'GLX_BAD_CONTEXT' gets thrown. If you want more help than that, post the code and errors.
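To illustrate what I mean, here is a minimal sketch (function and variable names are only illustrative, not taken from your code) of the legacy glXCreateContext path with each step checked, so a failure shows up at the step where it actually happens instead of later as GLX_BAD_CONTEXT:


#include <GL/glx.h>
#include <cstdio>
 
// Sketch: create a legacy GLX context and report which step fails.
bool CreateContextChecked(Display *display, Window window)
{
    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 24, None };
 
    // Returns NULL if the requested visual isn't available (common in VMs).
    XVisualInfo *visual = glXChooseVisual(display, DefaultScreen(display), attribs);
    if (!visual)
    {
        std::fprintf(stderr, "glXChooseVisual failed\n");
        return false;
    }
 
    // A NULL context here is what later shows up as GLX_BAD_CONTEXT.
    GLXContext context = glXCreateContext(display, visual, NULL, True);
    XFree(visual);
    if (!context)
    {
        std::fprintf(stderr, "glXCreateContext failed\n");
        return false;
    }
 
    // Don't call any swap-interval function before this succeeds.
    if (!glXMakeCurrent(display, window, context))
    {
        std::fprintf(stderr, "glXMakeCurrent failed\n");
        return false;
    }
 
    return true;
}
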

EDIT: Adding to the whole extension mess, if you are using extensions, make sure they are actually supported and loaded -- especially the vendor-specific ones. Depending on usage, that could also cause these issues.
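For example, a sketch of what that check can look like (the helper name HasGLXExtension is mine, not from any library): query the GLX extension string first, only then load the vendor-specific pointer, and null-check it before calling it:


#include <GL/glx.h>
#include <GL/glxext.h>
#include <cstring>
 
// Naive substring check against the GLX extension string; good enough for a sketch.
static bool HasGLXExtension(Display *display, const char *name)
{
    const char *extensions = glXQueryExtensionsString(display, DefaultScreen(display));
    return extensions && std::strstr(extensions, name) != NULL;
}
 
// Try GLX_SGI_swap_control first, then fall back to GLX_MESA_swap_control.
bool EnableVSync(Display *display)
{
    if (HasGLXExtension(display, "GLX_SGI_swap_control"))
    {
        PFNGLXSWAPINTERVALSGIPROC swapIntervalSGI = (PFNGLXSWAPINTERVALSGIPROC)
            glXGetProcAddress(reinterpret_cast<const GLubyte*>("glXSwapIntervalSGI"));
        if (swapIntervalSGI)
            return swapIntervalSGI(1) == 0;   // 0 means success
    }
 
    if (HasGLXExtension(display, "GLX_MESA_swap_control"))
    {
        PFNGLXSWAPINTERVALMESAPROC swapIntervalMESA = (PFNGLXSWAPINTERVALMESAPROC)
            glXGetProcAddress(reinterpret_cast<const GLubyte*>("glXSwapIntervalMESA"));
        if (swapIntervalMESA)
            return swapIntervalMESA(1) == 0;
    }
 
    return false;   // no known swap-control extension advertised
}
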

Byte

"The code you write when you learn a new language is shit.
You either already know that and you are wise, or you don’t realize it for many years and you are an idiot. Either way, your learning code is objectively shit." - L. Spiro

"This is called programming. The art of typing shit into an editor/IDE is not programming, it's basically data entry. The part that makes a programmer a programmer is their problem solving skills." - Serapth

"The 'friend' relationship in c++ is the tightest coupling you can give two objects. Friends can reach out and touch your privates." - frob

3D acceleration is the one thing that VMs actually don't do very well, so you'd do well to run your OpenGL experiments on actual hardware rather than a virtual machine. It will save you time and frustration in the long run.

“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”

Thanks for the reply :) I followed your advice, and over the past few days some friends who have Linux helped me test different applications. VSync works fine, so it looks like it was a problem with the virtual machine. But there's another problem, with extensions. There is a lot of similar code, so I'll just show pseudocode, a few lines per file, to show how it is structured. Something like this:

File GLXExtensions.h


class GLXExtensions
{
public:
    GLXExtensions(void);
    ...
private:
    bool Initialize(void);
    ...
};


File GLXExtensions.cpp


#include "GLXExtensions.h"
 
static PFNGLXSWAPINTERVALSGIPROC glXSwapIntervalSGI = NULL;
 
GLXExtensions::GLXExtensions(void)
{
    Initialize();
}

...

bool GLXExtensions::Initialize(void)
{
    glXSwapIntervalSGI = (PFNGLXSWAPINTERVALSGIPROC)glXGetProcAddress(reinterpret_cast<const GLubyte*>("glXSwapIntervalSGI"));
 
    // CASE 1
 
    return true;
}


File GLExtensions.h


extern PFNGLXSWAPINTERVALSGIPROC glXSwapIntervalSGI;


File GLRenderDevice.h


#include "GLExtensions.h"
#include "GLXExtensions.h"
 
class GLRenderDevice
{
...
    bool initialize(void);
...
    GLXExtensions *glxExt;
};


File GLRenderDevice.cpp


#include "GLRenderDevice.h"
 
bool GLRenderDevice::initialize(void)
{
    glxExt = new GLXExtensions();
 
    // CASE 2
 
    return true;
}


The problem is: if I call glXSwapIntervalSGI at the "CASE 1" line, everything is fine, but if I call it at the "CASE 2" line, the program crashes, even though the pointer is not null. Also, if the pointer is declared without the "static" attribute, just as "PFNGLXSWAPINTERVALSGIPROC glXSwapIntervalSGI = NULL;", then the program crashes even in "CASE 1". The same thing happens with other extensions, for example with VBOs. So the question is: how should these pointers be initialized properly? :) By the way, everything works fine in the virtual machine; it crashes only on real Linux. The same code with wglGetProcAddress also works on Windows, even without "static". Does anyone have suggestions? :)
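For reference, one layout that is commonly used for this kind of pointer looks like the sketch below. It is only an illustration (the names pglXSwapIntervalSGI and LoadSwapControl are made up for this sketch), not a confirmed fix for the crash described above: one extern declaration in a header, exactly one non-static definition in a .cpp, a prefixed name so it cannot collide with a glXSwapIntervalSGI symbol that libGL itself may export, and loading only after glXMakeCurrent has succeeded.


// GLXExtensions.h (sketch) -- single extern declaration shared by all files
#include <GL/glx.h>
#include <GL/glxext.h>
 
extern PFNGLXSWAPINTERVALSGIPROC pglXSwapIntervalSGI;   // prefixed to avoid libGL's own symbol
 
// GLXExtensions.cpp (sketch) -- exactly one definition, no "static"
PFNGLXSWAPINTERVALSGIPROC pglXSwapIntervalSGI = NULL;
 
// Call this only after glXMakeCurrent has succeeded.
bool LoadSwapControl(void)
{
    pglXSwapIntervalSGI = (PFNGLXSWAPINTERVALSGIPROC)
        glXGetProcAddress(reinterpret_cast<const GLubyte*>("glXSwapIntervalSGI"));
    return pglXSwapIntervalSGI != NULL;
}
 
// Any file that includes GLXExtensions.h then checks before calling:
//     if (pglXSwapIntervalSGI) pglXSwapIntervalSGI(1);
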

This topic is closed to new replies.
