random_acts

Memory usage of BMPs


This is a silly question that I'm sure I already know the answer to, but I need some peace of mind. Correct me if I'm wrong: say I have a 256x256 texture in both a 16-bit and a 32-bit BMP format. The memory usage for the 16-bit version would be 131,072 bytes (256*256*2) and the 32-bit version would be 262,144 bytes (256*256*4), which is why many games offer two different modes.

I'm almost positive that the above is true, but then think about JPEG, PNG, GIF, etc.: the smaller format is for file storage and transfer, and when loaded by a game it is decompressed into its bitmap representation in memory (although I think many cards can handle GIF and JPEG natively now). So it occurred to me that the card might still represent a 16-bit R5G6B5 bitmap as 32-bit in memory - which couldn't be true, because what would be the point?

I told you it was a silly question; a simple yea or nay to my scenario will suffice. Although if you have any resources in this vein that I could peruse, that would be good too.
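For what it's worth, here is that arithmetic as a minimal sketch. The dimensions and bit depths are just the numbers from the question; real drivers may add row padding, alignment, or mipmap levels on top of this.

```cpp
#include <cstddef>
#include <cstdio>

// Uncompressed texture footprint: width * height * bytes per pixel.
// Sketch only; drivers may add padding or mipmap chains.
static std::size_t textureBytes(std::size_t width, std::size_t height,
                                std::size_t bytesPerPixel)
{
    return width * height * bytesPerPixel;
}

int main()
{
    // 256x256 at 16 bpp (e.g. R5G6B5) vs 32 bpp (e.g. A8R8G8B8)
    std::printf("16-bit: %zu bytes\n", textureBytes(256, 256, 2)); // 131072
    std::printf("32-bit: %zu bytes\n", textureBytes(256, 256, 4)); // 262144
    return 0;
}
```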

Nay. I think most cards use internal formats that trade memory footprint against (de)compression overhead, but they certainly take advantage of a 5-5-5 format over 8-8-8. I doubt any of them has built-in JPG/GIF decompression. The DDS format seems to offer an on-disk size that is equal to the in-memory size, and which is very small.
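As a quick illustration of why a 16-bit format halves the footprint of 8-8-8-8: the channels are simply truncated and packed into a single 16-bit word. This is just a sketch of R5G6B5 packing, not any particular card's internal layout; whether a given card prefers 5-6-5 or 5-5-5 varies by hardware.

```cpp
#include <cstdint>

// Pack an 8-8-8 colour into a 16-bit R5G6B5 word by dropping the low bits.
// Illustration only; the card's internal layout may differ.
static std::uint16_t packR5G6B5(std::uint8_t r, std::uint8_t g, std::uint8_t b)
{
    return static_cast<std::uint16_t>(((r >> 3) << 11) |
                                      ((g >> 2) << 5)  |
                                       (b >> 3));
}
```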

Illco

You are right in your calculations. As for 16-bit graphics being stored as 32-bit, I'm fairly sure that does not happen.

Take a look at this page for a list of what texture formats nVidia cards actually support in OpenGL (and what unsupported formats are converted to).
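To make the link between the file's pixel format and what ends up in video memory concrete: in OpenGL you request an internal format when you upload the texture, and a table like the one linked above documents which sized formats the driver actually honours. A minimal sketch, assuming a GL context and a bound 2D texture, with the 256x256 RGB data already decoded from disk:

```cpp
#include <GL/gl.h>

// Request a 16-bit internal format; GL_RGB8 would request 8 bits per channel.
// The driver is free to substitute a format it actually supports, which is
// exactly what such a table lists for nVidia hardware.
void upload16BitTexture(const void* pixels)  // pixels: 256x256 RGB8 data
{
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB5, 256, 256, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
}
```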

Quote:
Original post by random_acts
I'm almost positive that the above is true, but then think about JPEG, PNG, GIF, etc.: the smaller format is for file storage and transfer, and when loaded by a game it is decompressed into its bitmap representation in memory (although I think many cards can handle GIF and JPEG natively now). So it occurred to me that the card might still represent a 16-bit R5G6B5 bitmap as 32-bit in memory - which couldn't be true, because what would be the point?


There are no consumer graphics cards that support GIF and/or JPEG decompression in hardware. These images are, as you said, decompressed in memory and then preferably transferred to texture memory on board the GeForce card, if they are used by a high-performance application. The GeForce cards do, however, support some types of compressed images, namely those that use S3TC compression. I believe MS uses this in their texture formats. Correct me if I'm wrong.

EDIT:
Both DirectX and OpenGL have support for compressed textures in their APIs.
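To put a number on the S3TC point: DXT1 stores each 4x4 block of texels in 8 bytes (DXT3/DXT5 use 16 bytes per block), so a compressed texture's size on disk in a DDS file and its size in video memory are essentially the same. A rough sketch of the size calculation, ignoring mipmaps and the DDS header:

```cpp
#include <cstddef>
#include <cstdio>

// S3TC block sizes: DXT1 uses 8 bytes per 4x4 block, DXT3/DXT5 use 16.
static std::size_t dxtBytes(std::size_t width, std::size_t height,
                            std::size_t bytesPerBlock)
{
    std::size_t blocksX = (width + 3) / 4;
    std::size_t blocksY = (height + 3) / 4;
    return blocksX * blocksY * bytesPerBlock;
}

int main()
{
    // 256x256 in DXT1: 64 * 64 * 8 = 32768 bytes,
    // a quarter of the 131072 bytes of the uncompressed 16-bit version.
    std::printf("DXT1: %zu bytes\n", dxtBytes(256, 256, 8));
    return 0;
}
```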
