16 bit mode questions

Started by GalaxyQuest · 10 comments, last by GalaxyQuest 22 years, 4 months ago
I remember when I was working on a demo in DirectX 5 in 16-bit modes, I had to test the computer's graphics display settings: some cards use 565 and others use 555 RGB 16-bit color. "OpenGL Game Programming" doesn't mention whether this is an issue.

1) Does OpenGL detect AND use the correct 16-bit RGB format, eliminating my need to manually check this and handle it correctly?

2) If I do use 16-bit mode and my TEXTURES are loaded as 24-bit, will I have problems? Does it matter?
1) Yes.
2) No, you won't have problems.

------------------------------
Trent (ShiningKnight)
E-mail me
"Whoso is a man, is a nonconformist"
--Ralph Waldo Emerson
Also, in the pixel format descriptor you can set the color depth and z-buffer depth to, let's say, 32-bit, and the driver will choose the highest it can go...
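For illustration, here's a minimal sketch of that kind of setup, assuming a Win32/WGL context (the field values are just examples; ChoosePixelFormat returns the closest match the driver supports, which may be lower than what you asked for):

```c
#include <windows.h>

/* Sketch: request a pixel format; the driver picks the closest
   supported match. hdc is assumed to be a valid device context
   for your window. */
int setup_pixel_format(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;   /* ask high; the driver drops to what it can do */
    pfd.cDepthBits = 24;   /* z-buffer depth request */

    int format = ChoosePixelFormat(hdc, &pfd);  /* closest supported match */
    if (format == 0)
        return 0;
    return SetPixelFormat(hdc, format, &pfd);
}
```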
quote: Original post by GalaxyQuest
I remember when I was working on a demo in DirectX 5 in 16-bit modes, I had to test the computer's graphics display settings: some cards use 565 and others use 555 RGB 16-bit color. "OpenGL Game Programming" doesn't mention whether this is an issue.

1) Does OpenGL detect AND use the correct 16-bit RGB format, eliminating my need to manually check this and handle it correctly?

2) If I do use 16-bit mode and my TEXTURES are loaded as 24-bit, will I have problems? Does it matter?


1) Doesn't really matter; I think it picks 4444 16-bit though, so you can do alpha channels.

2) Doesn't make a difference.
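To illustrate: when you upload a 24-bit image, the driver converts it to whatever it stores natively for the current mode. A sketch, where `pixels`, `width`, and `height` are assumed to come from your own image loader:

```c
#include <GL/gl.h>

/* Sketch: upload tightly packed 24-bit RGB pixels to the currently
   bound 2D texture. The generic GL_RGB internal format leaves the
   storage choice to the driver (e.g. 565 in a 16-bit mode); pass a
   sized format like GL_RGB8 or GL_RGB5 to hint a specific precision. */
void upload_rgb24(const unsigned char *pixels, int width, int height)
{
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);  /* rows are byte-aligned */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
}
```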
You can set the colour depth to anything within reason (i.e. 2, 4, 8, 16, 24, 32, 48) and OpenGL will handle it for you.
But like I mentioned, it will only go as high as it can...

If you ask for 32-bit and your card can only do 16, it will drop down to 16-bit...
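If you're curious what the driver actually gave you, you can query the live context; a small sketch:

```c
#include <GL/gl.h>
#include <stdio.h>

/* Sketch: query the bit depths the driver actually granted, once a
   GL context is current. Prints e.g. "color 5/6/5" on a 565
   16-bit framebuffer. */
void print_actual_depths(void)
{
    GLint r, g, b, depth;
    glGetIntegerv(GL_RED_BITS,   &r);
    glGetIntegerv(GL_GREEN_BITS, &g);
    glGetIntegerv(GL_BLUE_BITS,  &b);
    glGetIntegerv(GL_DEPTH_BITS, &depth);
    printf("color %d/%d/%d, z-buffer %d bits\n", r, g, b, depth);
}
```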
How nice of OpenGL to do this for me... feels good!

Is there a performance HIT with any of the different bpp formats? For example, if someone with a slower machine needs a boost, should my configuration lower the bpp to something like 16 (no lower, though), as compared to 24- or 32-bit modes (if the card allows them)?

Oh, and thanks for responses.
The very nature of OpenGL is to abstract the hardware so you don't have to worry about whether your card does 5-5-5 or 5-6-5 or whatever else. In fact, you don't have to worry about the color depth or much of anything else, except when setting up the frame buffer (but that's a window-system thing, not OpenGL). One way or another you should specify your colors in floating-point format, which is pretty much color-depth neutral. Yes, floats are bigger than bytes, but all APIs use floating point internally. And someday we'll have floating-point frame buffers.
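For instance, colors given as floats work unchanged whether the frame buffer ends up 555, 565, or 888; a trivial immediate-mode sketch:

```c
#include <GL/gl.h>

/* Sketch: colors as floats in [0,1] are color-depth neutral; the
   driver quantizes to whatever the framebuffer actually stores. */
void draw_triangle(void)
{
    glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-1.0f, -1.0f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 1.0f, -1.0f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  1.0f);
    glEnd();
}
```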

There is usually a "big" performance hit going from 16 to 24 or 32 bpp, but 32 bpp is usually barely worse than 24. It really depends on the card, though. Historically, most cards other than, say, NVIDIA's would take huge performance hits; the GeForce series takes a comparatively minor one. But it's not quite so bad these days across the board. Regardless, it's always a good idea to let the user set the color depth, if not the entire display mode.

Yes, OGL will handle it for you. Also, in the display settings for most cards you can tell the driver what values to default to. For example, my card lets me set the color depth to the current desktop value, to 16-bit, or to 32-bit. So let's say I put 32-bit for color depth in my pixel format descriptor, had the switch set to use the desktop value, and my app did not change the screen resolution (i.e. width, height, bits per pixel): the driver would use the values set for my desktop. In my case it doesn't matter because I'm in 32-bit mode anyway :p But you get the point. Also, for OGL, the screen resolution doesn't matter as long as your monitor can support it; of course, the higher the resolution, the more fill rate it takes and the slower the rendering...
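For completeness, a sketch of what changing the screen mode yourself looks like on Win32 (the values are illustrative; calling ChangeDisplaySettings(NULL, 0) afterwards restores the desktop mode):

```c
#include <windows.h>

/* Sketch: switch the display mode so the app controls
   width/height/bpp instead of inheriting the desktop's settings.
   Returns nonzero on success. */
int set_display_mode(int width, int height, int bpp)
{
    DEVMODE dm = {0};
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = width;
    dm.dmPelsHeight = height;
    dm.dmBitsPerPel = bpp;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;
    return ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL;
}
```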

Not too sure about NVIDIA, but ATI has always focused on optimizing 32-bit, so their cards are actually faster in 32-bit modes than in 24 or 16.

Edited by - ANSI2000 on December 13, 2001 4:52:58 PM
32-bit is the native OpenGL colour depth; changing to anything else will slow it down a small amount. For games and such it isn't noticeable. Moving up to 48-bit may slow it down more, but my card doesn't go that high so I haven't tried.

