Shawn Myers

Listing supported bit-depths & resolutions


Hey all,
I recently got the notion to begin cross-platform game development. I've been an XNA/DirectX user for a few years now, so it goes without saying that I'm experiencing a bit of a shift in how things are done. I've picked up a few good tools and libraries to assist me in this cross-platform quest, but I'm having an issue on the Linux end of things. In Windows I am able to request a 32-bit depth buffer, but the same request fails under Linux; if I manually set it to 24, everything works fine. I would like this to happen without my direct intervention, however, so that my game(s) will work on as many systems as possible. Is there a way (preferably cross-platform) to get a list of all the bit depths and resolutions the graphics device supports? I know that in SDL I can get a list of resolutions and check whether a particular color depth is valid, but as far as I know I can't validate the depth buffer's bit depth. I'm thinking something similar to the Mesa command-line utility glxinfo would be handy.
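For reference, a minimal sketch of the SDL facilities mentioned above, assuming the SDL 1.2 API (`SDL_ListModes` and `SDL_VideoModeOK`). Note that these report display color depths, not the GL depth buffer size being asked about:

```c
/* Sketch, assuming SDL 1.2. Enumerates fullscreen resolutions and checks
 * whether a given color depth is usable. These are color depths, not the
 * GL depth buffer size. */
#include <SDL/SDL.h>
#include <stdio.h>

int main(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    /* List fullscreen resolutions for the current pixel format.
     * (SDL_Rect **)-1 means "any resolution is fine". */
    SDL_Rect **modes = SDL_ListModes(NULL, SDL_FULLSCREEN | SDL_OPENGL);
    if (modes != NULL && modes != (SDL_Rect **)-1) {
        for (int i = 0; modes[i] != NULL; ++i)
            printf("%dx%d\n", modes[i]->w, modes[i]->h);
    }

    /* Returns the closest supported bpp, or 0 if the mode is unsupported. */
    int bpp = SDL_VideoModeOK(1024, 768, 32, SDL_FULLSCREEN | SDL_OPENGL);
    printf("1024x768@32 -> %d bpp\n", bpp);

    SDL_Quit();
    return 0;
}
```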

Any ideas?
Thanks.

You can find the source of glxinfo here. As you can see, it calls the low-level glX functions to loop through the available GLX visuals.

It's been a while since I looked into OpenGL context creation, since it's fire-and-forget code for the most part. I thought SDL automatically chose a supported depth buffer/stencil buffer combination quite reliably? Why do you need to override that?
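A rough sketch of what glxinfo's loop looks like, assuming Xlib and GLX headers are available: walk the X visuals on the display and query each GL-capable one's depth and stencil sizes with `glXGetConfig`:

```c
/* Minimal sketch of glxinfo's visual enumeration: list GLX visuals and
 * their depth/stencil buffer sizes. Requires an X display to run. */
#include <stdio.h>
#include <X11/Xlib.h>
#include <GL/glx.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    XVisualInfo tmpl;
    tmpl.screen = DefaultScreen(dpy);
    int count = 0;
    XVisualInfo *visuals = XGetVisualInfo(dpy, VisualScreenMask, &tmpl, &count);

    for (int i = 0; i < count; ++i) {
        int use_gl = 0, depth = 0, stencil = 0;
        glXGetConfig(dpy, &visuals[i], GLX_USE_GL, &use_gl);
        if (!use_gl)
            continue;  /* skip visuals without GL support */
        glXGetConfig(dpy, &visuals[i], GLX_DEPTH_SIZE, &depth);
        glXGetConfig(dpy, &visuals[i], GLX_STENCIL_SIZE, &stencil);
        printf("visual 0x%lx: depth %d, stencil %d\n",
               visuals[i].visualid, depth, stencil);
    }

    XFree(visuals);
    XCloseDisplay(dpy);
    return 0;
}
```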

Thanks for the reply, Perfect.
I guess I'm just used to the way DirectX handles things, where you have more or less direct control over what you set. It does appear that SDL picks a depth buffer for you if you don't specify one with SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, depthSize). It seems odd to me that SDL won't fall back to the nearest supported depth size when the requested one isn't available, though; instead it just fails. Perhaps I should just not worry about the depth buffer it's setting and trust that it's doing its best?
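One way to see what SDL actually chose is to read the attribute back after creating the GL surface. A sketch, assuming SDL 1.2 (`SDL_SetVideoMode`); requires a working display to run:

```c
/* Sketch, SDL 1.2: request a depth size, create the GL surface, then read
 * back what the driver actually granted. */
#include <SDL/SDL.h>
#include <stdio.h>

int main(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, 24);  /* a request, not a guarantee */
    if (SDL_SetVideoMode(640, 480, 0, SDL_OPENGL) == NULL) {
        SDL_Quit();
        return 1;
    }

    int granted = 0;
    SDL_GL_GetAttribute(SDL_GL_DEPTH_SIZE, &granted);
    printf("depth buffer: %d bits\n", granted);

    SDL_Quit();
    return 0;
}
```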

I believe the point is that SDL lets you request a minimum bit depth. Think of it this way: if you ask for a 24-bit depth buffer and get 32 bits, you're most likely not going to complain. But if you ask for a 32-bit depth buffer and only get 24 bits, that may be a problem if you really need those 32 bits of precision. So it makes sense for SDL to fail in that case.

If you want full control, you need to call the relevant GLX functions directly.
