Archived

This topic is now archived and is closed to further replies.

How smart... really?

Recommended Posts

I guess I've been beaten up enough by DirectX when it comes to generating colors (greater than 8-bit color), and it has me thinking. If all the surfaces (textures) I've been generating in OpenGL are sent as 24-bit color values (RGB or RGBA), is OpenGL smart enough to convert those values in memory from 24-bit to 16-bit if the display is only set to 16-bit? I'm curious whether that much intelligence was built into OpenGL.

When I first started playing with OGL I thought, "Gee, could it really be as simple as sending a buffer of RGBs to the texture routine and letting it do all the work?" It was, but now I wonder how efficient it is at conserving memory. I've been so used to jumping through all those hoops in DX (i.e., detecting the display's color mode and converting my colors accordingly) that I'm curious how OpenGL does it so seamlessly. I'm not complaining, just curious whether anyone else has experimented with how efficient OGL is at conserving memory whenever possible.

OpenGL leaves most of its efficiency up to the driver written for your video card. For example, the NVIDIA drivers let the user (not the programmer) choose whether textures are stored in 16-bit or 32-bit format.

That being said, you can still give the driver a per-texture hint about storage precision through the internalFormat parameter of glTexImage2D() (e.g., GL_RGB5 versus GL_RGB8); the driver is free to pick the closest format it actually supports.

PreManDrake
