16 bit mode questions



#1 GalaxyQuest   Members   -  Reputation: 122


Posted 12 December 2001 - 01:15 PM

I remember that when I was working on a demo with DirectX 5 in 16-bit modes, I had to test the computer's graphics display settings. Some cards use 565 and others 555 RGB 16-bit color. "OpenGL Game Programming" doesn't mention whether this is an issue.

1) Does OpenGL detect AND use the correct 16-bit RGB format, thereby eliminating my need to manually check this and use it correctly?

2) If I do use 16-bit mode and my TEXTURES are loaded as 24-bit, will I have problems? Does it matter?


#2 mittens   Moderators   -  Reputation: 1315


Posted 12 December 2001 - 02:21 PM

A) Yes
B) No, you won't have problems.

------------------------------
Trent (ShiningKnight)
E-mail me
"Whoso is a man, is a nonconformist"
--Ralph Waldo Emerson


#3 ANSI2000   Members   -  Reputation: 122


Posted 12 December 2001 - 03:47 PM

Also, in the pixel format descriptor you can set the color depth and z-buffer depth to, let's say, 32-bit, and the driver will choose the highest it can go...
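
A minimal sketch of that on Win32, assuming an existing device context hdc (the helper name setup_pixel_format is just for illustration): you ask for 32-bit color and a 24-bit z-buffer, and ChoosePixelFormat returns the closest format the driver actually supports.

    #include <windows.h>

    int setup_pixel_format(HDC hdc)
    {
        PIXELFORMATDESCRIPTOR pfd = {0};
        pfd.nSize      = sizeof(PIXELFORMATDESCRIPTOR);
        pfd.nVersion   = 1;
        pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
        pfd.iPixelType = PFD_TYPE_RGBA;
        pfd.cColorBits = 32;   /* requested color depth; the driver may substitute a lower one */
        pfd.cDepthBits = 24;   /* requested z-buffer depth */

        int format = ChoosePixelFormat(hdc, &pfd);   /* closest supported match */
        if (format == 0)
            return 0;
        return SetPixelFormat(hdc, format, &pfd);
    }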

#4 Ultima7   Members   -  Reputation: 122


Posted 12 December 2001 - 06:23 PM

quote:
Original post by GalaxyQuest
I remember that when I was working on a demo with DirectX 5 in 16-bit modes, I had to test the computer's graphics display settings. Some cards use 565 and others 555 RGB 16-bit color. "OpenGL Game Programming" doesn't mention whether this is an issue.

1) Does OpenGL detect AND use the correct 16-bit RGB format, thereby eliminating my need to manually check this and use it correctly?

2) If I do use 16-bit mode and my TEXTURES are loaded as 24-bit, will I have problems? Does it matter?



1) Doesn't really matter; I think it picks 4444 16-bit though, so you can do alpha channels.

2) Doesn't make a difference.

#5 Anonymous Poster   Guests   -  Reputation:


Posted 13 December 2001 - 05:46 AM

You can set the colour depth to anything (within reason of course, i.e. 2, 4, 8, 16, 24, 32, 48) and OpenGL will run it for you.

#6 ANSI2000   Members   -  Reputation: 122


Posted 13 December 2001 - 08:51 AM

But like I mentioned, it will only go as high as it can...

If you ask for 32-bit and your card can only do 16, it will drop down to 16-bit...
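
A short sketch of how you could confirm what you actually got, assuming hdc is the device context and format is the index returned by ChoosePixelFormat (the helper name is just illustrative): DescribePixelFormat fills in the values the driver granted, which may be lower than what was requested.

    #include <stdio.h>
    #include <windows.h>

    void report_pixel_format(HDC hdc, int format)
    {
        PIXELFORMATDESCRIPTOR granted;
        /* Ask the driver what the chosen format really contains */
        DescribePixelFormat(hdc, format, sizeof(granted), &granted);
        printf("granted color bits: %d, depth bits: %d\n",
               granted.cColorBits, granted.cDepthBits);
    }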

#7 GalaxyQuest   Members   -  Reputation: 122


Posted 13 December 2001 - 08:59 AM

How nice of OpenGL to do this for me... feels good!

Is there a performance HIT with any of the different bpp formats? For example, if someone with a slower machine needs a boost, should my configuration lower the bpp to something like 16 (no lower though) as compared to the 24- or 32-bit modes (if allowed on the card)?

Oh, and thanks for responses.

#8 merlin9x9   Members   -  Reputation: 174


Posted 13 December 2001 - 09:15 AM

The very nature of OpenGL is to abstract the hardware so you don't have to worry about whether your card does 5-5-5 or 5-6-5 or whatever else. In fact, you don't have to worry about the color depth or much of anything else, except when setting up the frame buffer (but that's a window-system thing, not OpenGL). One way or another, you should specify your colors in floating-point form, which is pretty much color-depth neutral. Yes, floats are bigger than bytes, but all APIs use floating point internally. And someday we'll have floating-point frame buffers.
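
For example, a sketch of the usual immediate-mode pattern (era-appropriate glBegin/glEnd code, function name chosen for illustration): the colors are given as floats and OpenGL converts them to whatever the frame buffer happens to use, 5-6-5, 5-5-5 and 8-8-8-8 alike.

    #include <GL/gl.h>

    void draw_colored_triangle(void)
    {
        glBegin(GL_TRIANGLES);
            glColor3f(1.0f, 0.0f, 0.0f);   /* red: color-depth neutral */
            glVertex2f(-0.5f, -0.5f);
            glColor3f(0.0f, 1.0f, 0.0f);   /* green */
            glVertex2f( 0.5f, -0.5f);
            glColor3f(0.0f, 0.0f, 1.0f);   /* blue */
            glVertex2f( 0.0f,  0.5f);
        glEnd();
    }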

There usually is a "big" performance hit from 16 to 24 or 32 bpp, but 32 bpp is usually barely worse than 24. It really depends upon the card, though. Historically, most cards other than, say, NVIDIA's would take huge performance hits; the GeForce series takes a comparatively minor hit. But it's not quite so bad these days across the board. Regardless, it's always a good thing to allow the user to set the color depth, if not the entire display mode.



#9 ANSI2000   Members   -  Reputation: 122


Posted 13 December 2001 - 09:52 AM

Yes, OGL will handle it for you. Also, in your display settings for most cards you can tell the drivers which values to default to. For example, my card allows me to set the color depth to the current desktop value, to 16-bit, or to 32-bit. So let's say in my pixel format descriptor I put 32-bit for the color depth, I had the switch set to use the desktop value, and my app did not change the screen resolution (i.e. width, height, bits per pixel); the driver would then use the values set for my desktop. In my case it doesn't matter because I'm in 32-bit mode anyway :p But you get the point. Also, for OGL the screen resolution doesn't matter as long as your monitor can support it; of course, the higher the resolution, the more fill rate is needed and the slower the rendering...
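
As a sketch of the display-mode side of that (assuming the usual Win32 route; the helper name switch_display_mode is just illustrative), an app that does want to change the resolution and bits per pixel itself would do something like the following before creating its GL context; skip this call and the driver simply keeps the desktop settings.

    #include <windows.h>

    BOOL switch_display_mode(int width, int height, int bpp)
    {
        DEVMODE dm = {0};
        dm.dmSize       = sizeof(DEVMODE);
        dm.dmPelsWidth  = width;
        dm.dmPelsHeight = height;
        dm.dmBitsPerPel = bpp;   /* e.g. 16 or 32 */
        dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;

        /* CDS_FULLSCREEN: temporary mode change, restored when the app exits */
        return ChangeDisplaySettings(&dm, CDS_FULLSCREEN) == DISP_CHANGE_SUCCESSFUL;
    }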

Not too sure about NVIDIA, but ATI has always focused on optimizing 32-bit, so their cards are actually faster in 32-bit modes than in 24 or 16.

Edited by - ANSI2000 on December 13, 2001 4:52:58 PM

#10 Anonymous Poster   Guests   -  Reputation:


Posted 13 December 2001 - 04:44 PM

32-bit is the native OpenGL colour depth; changing to anything else will slow it down a small amount. For games and such it isn't noticeable. Moving up to 48-bit may slow it down, but my card doesn't go that high so I haven't tried.

#11 merlin9x9   Members   -  Reputation: 174


Posted 14 December 2001 - 09:14 AM

No, 128-bit is closer to the truth. According to the OpenGL documentation in MSDN, "current color values are stored in floating-point format, with unspecified mantissa and exponent sizes." This will usually be a float, which is usually 32 bits. OpenGL keeps track of 4 color components, so that means 128 total bits per color. But, again, the floating-point format is neither defined nor exposed by OpenGL. Most frame buffers these days are 24-bit or 32-bit, but this won't be true for much longer, since 24/32-bit still produces noticeable banding artifacts; hopefully, frame buffers will be floating-point with at least 16 bits per channel.
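
A tiny sketch that makes this visible, assuming a current GL context (the function name is just for illustration): set the current color with byte values and read back GL_CURRENT_COLOR, which comes out as four floats regardless of the frame buffer's depth.

    #include <stdio.h>
    #include <GL/gl.h>

    void show_current_color(void)
    {
        GLfloat rgba[4];
        glColor3ub(255, 128, 0);               /* specified as bytes...   */
        glGetFloatv(GL_CURRENT_COLOR, rgba);   /* ...but stored as floats */
        printf("current color: %.3f %.3f %.3f %.3f\n",
               rgba[0], rgba[1], rgba[2], rgba[3]);
    }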

#12 Anonymous Poster   Guests   -  Reputation:


Posted 15 December 2001 - 03:46 AM

Just repeating what was in the OpenGL 1.2 source code from SGI.



