XServer and OpenGL


Recommended Posts

Hi, I am porting my 3D editor project from Windows to Linux (SUSE 10.0). I have installed the NVIDIA driver, which runs fine, and I have compiled and installed wxWidgets 2.6.3 with OpenGL enabled. Here is what I get when executing the program. I have searched Google, but found no resource explaining what this means.

./uni3d
----------------------------
- init file system
----------------------------
test
- writedir: /home/basiror/uni3d/uni3d/Release/map_source/
- basedir: /home/basiror/uni3d/uni3d/Release/
- search path: assets.pak
- search path: tic01.pak
- search path: ze.pak
X Error of failed request: BadMatch (invalid parameter attributes)
  Major opcode of failed request: 129 (GLX)
  Minor opcode of failed request: 5 (X_GLXMakeCurrent)
  Serial number of failed request: 714
  Current serial number in output stream: 714

Thanks for any help in advance.

The next step is to use your debugger, in this case gdb.

Step through the program to see what function call causes the problem. Once you know where it is crashing, check your variables and make sure you set up all the preconditions correctly.
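One wrinkle when debugging X problems: protocol errors are reported asynchronously, so by default the process exits some time after the bad request was issued and the gdb backtrace points at the wrong place. A minimal sketch, assuming you can reach the Xlib Display* your toolkit opened (how you obtain it from wxWidgets is not shown here), that makes the connection synchronous and gives you a function to break on:

```cpp
#include <X11/Xlib.h>
#include <cstdio>

// Custom error handler: called at the failing request once the
// connection is synchronous. Set a breakpoint here in gdb with
// `break OnXError` to get a backtrace into the real call site.
static int OnXError(Display *dpy, XErrorEvent *ev)
{
    char text[256];
    XGetErrorText(dpy, ev->error_code, text, sizeof(text));
    std::fprintf(stderr, "X error: %s (request %d, minor %d)\n",
                 text, ev->request_code, ev->minor_code);
    return 0;
}

void InstallXErrorTrap(Display *dpy)
{
    XSynchronize(dpy, True);    // flush every request immediately
    XSetErrorHandler(OnXError); // invoked at the offending request
}
```

With a synchronous connection the serial numbers in the error report also match the call that actually failed, instead of trailing behind it.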

Some information on X protocol errors:

http://www.rahul.net/kenton/perrors.html

Quote:
BadMatch errors occur when only specific values are acceptable, but another value is provided. The valid values may be a small set of enumerated integers or they may be a relation between other arguments, e.g., a graphics context in a drawing request must have the same depth as the drawing window. There is rarely more than one possible BadMatch error for any particular request type, so identifying the problem is usually straight forward. In my experience, most BadMatch errors are related to drawable depths. Make sure your windows, pixmaps, visual types, colormaps, etc. have the correct depths in your X requests.


You could be passing invalid data to glXMakeCurrent(). Make sure calls to XOpenDisplay(), XCreateWindow() and glXCreateContext() are not failing for any reason.
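For reference, here is a sketch of those checks in a plain Xlib/GLX setup; wxGLCanvas does the equivalent internally, so this is only an illustration of where each failure would surface, not the toolkit's actual code:

```cpp
#include <X11/Xlib.h>
#include <GL/glx.h>
#include <cstdio>

int main()
{
    Display *dpy = XOpenDisplay(nullptr);
    if (!dpy) { std::fprintf(stderr, "XOpenDisplay failed\n"); return 1; }

    // Boolean attributes (GLX_RGBA, GLX_DOUBLEBUFFER) take no value;
    // the list is terminated with None.
    int attribs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 24, None };
    XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
    if (!vi) { std::fprintf(stderr, "no matching visual\n"); return 1; }

    GLXContext ctx = glXCreateContext(dpy, vi, nullptr, True);
    if (!ctx) { std::fprintf(stderr, "glXCreateContext failed\n"); return 1; }

    // ...create the window from vi->visual and vi->depth, then call
    // glXMakeCurrent(dpy, window, ctx). A BadMatch here usually means
    // the window was not created with the same visual as the context.
    return 0;
}
```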

Hi, the line where it exits is the wxGLCanvas constructor call.

These are the parameters you pass so that wxGLCanvas can determine the pixel format/visual. From my understanding the code is correct, so it must be the parameters; but passing a null pointer, or an empty array containing only a {0} element, still results in a similar exit with a different request id, also in X_GLXMakeCurrent.


int32 attr[] =
{
    WX_GL_RGBA,
    WX_GL_DOUBLEBUFFER,
    WX_GL_DEPTH_SIZE, 24,
    0 // terminator
};

glview::glview(uint32 uiGLViewID, wxWindow *pParent)
    : wxGLCanvas(pParent, static_cast<wxWindowID>(uiGLViewID),
                 wxDefaultPosition, wxDefaultSize, 0,
                 wxGLCanvasName, attr, wxNullPalette)
{
    cout << "test6" << endl;
    m_uiGLView = uiGLViewID;

    InitOpenGL();
}



Here's an excerpt of the glxinfo output showing the visuals available:
visual x bf lv rg d st colorbuffer ax dp st accumbuffer ms cav
id dep cl sp sz l ci b ro r g b a bf th cl r g b a ns b eat
----------------------------------------------------------------------
0xea 16 tc 0 16 0 r y . 5 6 5 0 4 16 0 16 16 16 16 0 0 None
0xeb 16 dc 0 16 0 r y . 5 6 5 0 4 16 0 16 16 16 16 0 0 None
0xec 16 tc 0 16 0 r . . 5 6 5 0 4 16 0 16 16 16 16 0 0 None
0xed 16 tc 0 16 0 r y . 5 6 5 0 4 24 0 16 16 16 16 0 0 None
0xee 16 tc 0 16 0 r . . 5 6 5 0 4 24 0 16 16 16 16 0 0 None
0xef 16 tc 0 16 0 r y . 5 6 5 0 4 24 8 16 16 16 16 0 0 None
0xf0 16 tc 0 16 0 r . . 5 6 5 0 4 24 8 16 16 16 16 0 0 None
0xf1 16 tc 0 16 0 r y . 5 6 5 0 4 0 0 16 16 16 16 0 0 None
0xf2 16 tc 0 16 0 r . . 5 6 5 0 4 0 0 16 16 16 16 0 0 None
0xf3 16 tc 0 16 0 r y . 5 6 5 0 4 16 0 16 16 16 16 2 1 Ncon
0xf4 16 tc 0 16 0 r y . 5 6 5 0 4 16 0 16 16 16 16 4 1 Ncon
0xf5 16 tc 0 16 0 r . . 5 6 5 0 4 16 0 16 16 16 16 2 1 Ncon
0xf6 16 tc 0 16 0 r . . 5 6 5 0 4 16 0 16 16 16 16 4 1 Ncon
0xf7 16 tc 0 16 0 r y . 5 6 5 0 4 24 0 16 16 16 16 2 1 Ncon
0xf8 16 tc 0 16 0 r y . 5 6 5 0 4 24 0 16 16 16 16 4 1 Ncon
0xf9 16 tc 0 16 0 r . . 5 6 5 0 4 24 0 16 16 16 16 2 1 Ncon
0xfa 16 tc 0 16 0 r . . 5 6 5 0 4 24 0 16 16 16 16 4 1 Ncon
0xfb 16 tc 0 16 0 r y . 5 6 5 0 4 24 8 16 16 16 16 2 1 Ncon
0xfc 16 tc 0 16 0 r y . 5 6 5 0 4 24 8 16 16 16 16 4 1 Ncon
0xfd 16 tc 0 16 0 r . . 5 6 5 0 4 24 8 16 16 16 16 2 1 Ncon
0xfe 16 tc 0 16 0 r . . 5 6 5 0 4 24 8 16 16 16 16 4 1 Ncon
0xff 16 dc 0 16 0 r . . 5 6 5 0 4 16 0 16 16 16 16 0 0 None
0x100 16 dc 0 16 0 r y . 5 6 5 0 4 24 0 16 16 16 16 0 0 None
0x101 16 dc 0 16 0 r . . 5 6 5 0 4 24 0 16 16 16 16 0 0 None
0x102 16 dc 0 16 0 r y . 5 6 5 0 4 24 8 16 16 16 16 0 0 None
0x103 16 dc 0 16 0 r . . 5 6 5 0 4 24 8 16 16 16 16 0 0 None
0x104 16 dc 0 16 0 r y . 5 6 5 0 4 0 0 16 16 16 16 0 0 None
0x105 16 dc 0 16 0 r . . 5 6 5 0 4 0 0 16 16 16 16 0 0 None
0x106 16 dc 0 16 0 r y . 5 6 5 0 4 16 0 16 16 16 16 2 1 Ncon
0x107 16 dc 0 16 0 r y . 5 6 5 0 4 16 0 16 16 16 16 4 1 Ncon
0x108 16 dc 0 16 0 r . . 5 6 5 0 4 16 0 16 16 16 16 2 1 Ncon
0x109 16 dc 0 16 0 r . . 5 6 5 0 4 16 0 16 16 16 16 4 1 Ncon
0x10a 16 dc 0 16 0 r y . 5 6 5 0 4 24 0 16 16 16 16 2 1 Ncon
0x10b 16 dc 0 16 0 r y . 5 6 5 0 4 24 0 16 16 16 16 4 1 Ncon
0x10c 16 dc 0 16 0 r . . 5 6 5 0 4 24 0 16 16 16 16 2 1 Ncon
0x10d 16 dc 0 16 0 r . . 5 6 5 0 4 24 0 16 16 16 16 4 1 Ncon
0x10e 16 dc 0 16 0 r y . 5 6 5 0 4 24 8 16 16 16 16 2 1 Ncon
0x10f 16 dc 0 16 0 r y . 5 6 5 0 4 24 8 16 16 16 16 4 1 Ncon
0x110 16 dc 0 16 0 r . . 5 6 5 0 4 24 8 16 16 16 16 2 1 Ncon
0x111 16 dc 0 16 0 r . . 5 6 5 0 4 24 8 16 16 16 16 4 1 Ncon

UPDATE:
I removed WX_GL_RGBA from the attribute list and my app now starts, but the OpenGL drawing doesn't appear; the canvas looks grey, with no buffer swapping taking place.

Any idea what could be wrong there?

Some further reading reveals:

Quote:
Bool glXMakeCurrent( Display *dpy, GLXDrawable drawable, GLXContext ctx)


BadMatch is generated if drawable was not created with the same X screen and visual as ctx. It is also generated if drawable is None and ctx is not NULL.


So my guess is that specifying an attribute list for an X visual that cannot be met (leaving drawable as None) is the cause of the BadMatch errors.

The glxinfo output you posted does not show any visuals with a 32-bit buffer (they are all 16-bit), so assuming the WX_GL_RGBA attribute requests a 32-bit buffer, the BadMatch error it causes backs up this analysis.
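You can also confirm this from code. As a sketch (plain Xlib, nothing wx-specific): the default depth of the screen is the depth the X server was started with, and a 16-bit default would explain why only 16-bit visuals are exported:

```cpp
#include <X11/Xlib.h>
#include <cstdio>

int main()
{
    Display *dpy = XOpenDisplay(nullptr);
    if (!dpy) return 1;

    // DefaultDepth reports the root depth of the screen; if this
    // prints 16, the server offers no 24/32-bit visuals.
    std::printf("default depth: %d\n",
                DefaultDepth(dpy, DefaultScreen(dpy)));
    XCloseDisplay(dpy);
    return 0;
}
```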

You could try explicitly setting the buffer size to 16 with the WX_GL_BUFFER_SIZE attribute instead.
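For example, the attribute list might look like this (a sketch against the wxWidgets 2.6 attribute names; the exact combination is an assumption, not tested code):

```cpp
int32 attr[] =
{
    WX_GL_BUFFER_SIZE, 16,  // match the 16-bit visuals glxinfo reports
    WX_GL_DOUBLEBUFFER,
    WX_GL_DEPTH_SIZE, 24,   // 24-bit depth buffers are listed above
    0                       // terminator
};
```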

I hope this helps...

As an aside, invoking "glxinfo -t" gives a much more readable output.

OK, I got it to work. Thanks.

My X server was running in 16-bit mode *cough*. I switched to 24-bit, and the 32-bit visuals became available.

Thanks, you'll get some credit in the final product.
