Listing supported bit-depths & resolutions

Hey all,
I recently got the notion to try cross-platform game development. I've been an XNA/DirectX user for a few years now, so it goes without saying that I'm experiencing a bit of a shift in how things are done. I've picked up a few good tools and libraries to assist me in this cross-platform quest, but I'm having a bit of an issue on the Linux end of things.

On Windows I'm able to request a 32-bit depth buffer, but the same request fails under Linux. If I manually set it to 24 bits, everything works fine. I'd like this to happen without my direct intervention, though, so that my game(s) will work on as many systems as possible. Is there a way (preferably cross-platform) to get a list of all the bit depths and resolutions the graphics device supports? I know that in SDL I can get a list of resolutions and check whether a particular display bit depth is valid, but as far as I know I can't validate the depth buffer's bit depth. I'm thinking something along the lines of the Mesa command-line utility glxinfo would be handy.
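For reference, this is roughly what my setup code looks like right now (a simplified sketch using the SDL 1.2-style API; the exact values and error handling are placeholders, not my real code):

#include <SDL.h>
#include <stdio.h>

/* Requesting 32 depth bits here works on Windows but fails on my Linux
 * machine unless I drop the value to 24 by hand. */
int init_video(int width, int height)
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return -1;
    }

    SDL_GL_SetAttribute(SDL_GL_DOUBLEBUFFER, 1);
    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 32);   /* fails under Linux */

    if (SDL_SetVideoMode(width, height, 32, SDL_OPENGL) == NULL) {
        fprintf(stderr, "SDL_SetVideoMode failed: %s\n", SDL_GetError());
        return -1;
    }
    return 0;
}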

Any ideas?
Thanks.
You can find the source of glxinfo here. As you can see, it calls the low-level glX functions to loop through the available GLX visuals.
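The gist of it boils down to something like this (a stripped-down sketch, not the actual glxinfo code; it assumes an already-open X display and omits error handling):

#include <X11/Xlib.h>
#include <GL/glx.h>
#include <stdio.h>

/* Walk the X visuals on the default screen and print the GLX depth buffer
 * size of every visual that supports OpenGL rendering. */
void list_depth_sizes(Display *dpy)
{
    XVisualInfo template_, *visuals;
    int i, nvisuals = 0;

    template_.screen = DefaultScreen(dpy);
    visuals = XGetVisualInfo(dpy, VisualScreenMask, &template_, &nvisuals);

    for (i = 0; i < nvisuals; ++i) {
        int use_gl = 0, depth_size = 0, doublebuffer = 0;

        glXGetConfig(dpy, &visuals[i], GLX_USE_GL, &use_gl);
        if (!use_gl)
            continue;   /* this visual has no GLX support */

        glXGetConfig(dpy, &visuals[i], GLX_DEPTH_SIZE, &depth_size);
        glXGetConfig(dpy, &visuals[i], GLX_DOUBLEBUFFER, &doublebuffer);

        printf("visual 0x%02lx: %d depth bits, double buffered: %s\n",
               visuals[i].visualid, depth_size, doublebuffer ? "yes" : "no");
    }

    XFree(visuals);
}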

It's been a while since I looked into OpenGL context creation, since it's the kind of fire-and-forget code you write once and leave alone. I thought SDL automatically chooses a supported depth buffer/stencil buffer combination quite reliably? Why do you need to override that?
Widelands - laid back, free software strategy
Thanks for the reply, Perfect.
I guess I'm just used to the way DirectX handles things, where you have more or less direct control over what you want to set. It does appear that SDL picks a depth buffer size for you if you don't specify one via SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, depthSize). It seems odd to me that SDL won't fall back to the nearest supported depth size if what you requested isn't available, though; instead it just fails. Perhaps I should just not worry about the backbuffer it is setting and trust that it's doing its best?
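At least it looks like I can ask SDL what it actually ended up with once the context exists; something like this (just a sketch):

/* After SDL_SetVideoMode has succeeded, query the attributes the driver
 * actually granted rather than the ones that were requested. */
int actual_depth = 0, actual_stencil = 0;
SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &actual_depth);
SDL_GL_GetAttribute(SDL_GL_STENCIL_SIZE, &actual_stencil);
printf("got %d depth bits, %d stencil bits\n", actual_depth, actual_stencil);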
I believe the point is that SDL lets you request a minimum bit depth. Think of it this way: if you ask for a 24-bit depth buffer but get 32 bits, you're most likely not going to complain. But if you ask for a 32-bit depth buffer and only get 24 bits, that may be a problem if you really need those 32 bits of precision. So it makes sense that SDL fails in that case rather than silently handing you less than you asked for.
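If the goal is simply getting the game to start on as many systems as possible, a fallback chain on your side may be enough. A rough sketch (the candidate sizes and window parameters are placeholders, not a recommendation):

#include <SDL.h>
#include <stdio.h>

/* Try progressively smaller depth buffer sizes until one is accepted. */
SDL_Surface *create_window_with_best_depth(void)
{
    static const int candidates[] = { 32, 24, 16 };
    SDL_Surface *screen = NULL;
    int i;

    for (i = 0; i < 3 && screen == NULL; ++i) {
        SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, candidates[i]);
        screen = SDL_SetVideoMode(800, 600, 32, SDL_OPENGL);
    }

    if (screen == NULL)
        fprintf(stderr, "no usable depth buffer size: %s\n", SDL_GetError());

    return screen;
}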

If you want full control, you need to call the relevant GLX functions directly.
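For example, something along these lines (just a sketch; the attribute list is only an illustration):

#include <X11/Xlib.h>
#include <GL/glx.h>

/* Ask GLX directly for a visual matching a set of minimum requirements.
 * glXChooseVisual returns NULL if nothing on this screen matches. */
XVisualInfo *pick_visual(Display *dpy)
{
    int attribs[] = {
        GLX_RGBA,
        GLX_DOUBLEBUFFER,
        GLX_DEPTH_SIZE, 24,   /* means "at least 24 bits" in GLX */
        None
    };
    return glXChooseVisual(dpy, DefaultScreen(dpy), attribs);
}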
Widelands - laid back, free software strategy
Ahh, thinking about it that way does make sense. Thanks for clearing up my confusion.
