Memory usage of BMPs
This is a silly question that I'm sure I already know the answer to, but I need some damn peace of mind.
Correct me if I'm wrong:
Say I have a 256x256 texture in both a 16-bit and a 32-bit BMP format. Then the memory usage for the 16-bit version would be 131,072 bytes (256*256*2) and the 32-bit version would be 262,144 bytes (256*256*4), which is why many games offer two different color-depth modes.
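The arithmetic above can be sketched as a tiny helper (the function name is just for illustration, not any real API):

```python
# Hypothetical helper: size in bytes of an uncompressed texture,
# given its dimensions and bytes per pixel (2 for 16-bit, 4 for 32-bit).
def texture_size(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

print(texture_size(256, 256, 2))  # 131072 bytes for 16-bit
print(texture_size(256, 256, 4))  # 262144 bytes for 32-bit
```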
I'm almost positive that the above is true, but when you think about jpeg, png, gif, etc... the smaller format is for file storage and transfer, and when loaded by a game it is decompressed into its bitmap representation in memory (although I think many cards can handle gif and jpeg natively now). So it occurred to me that the card might still represent a 16-bit R5G6B5 bitmap as 32-bit in memory, which couldn't be true, because what would be the point...
I told you it was a silly question... a simple yea or nay to my scenario will suffice. Although if you have any resources in this vein that I could peruse, that would be good too.
Nay. I think most cards use internal formats that balance memory footprint against (de)compression overhead, but they surely take advantage of a 5-5-5 format over 8-8-8. I doubt any have built-in JPG/GIF decompression. The DDS format seems to offer a disk size equal to the in-memory size, and a very small one at that.
Illco
You are right in your calculations. As for the 16 bit graphics being stored as 32 bit, I'm fairly sure that does not happen.
Take a look at this page for a list of what texture formats nVidia cards actually support in OpenGL (and what unsupported formats are converted to).
Quote:Original post by random_acts
I'm almost positive that the above is true, but when you think about jpeg, png, gif, etc... the smaller format is for file storage and transfer, and when loaded by a game it is decompressed into its bitmap representation in memory (although I think many cards can handle gif and jpeg natively now). So it occurred to me that the card might still represent a 16-bit R5G6B5 bitmap as 32-bit in memory, which couldn't be true, because what would be the point...
There are no consumer graphics cards that support GIF and/or JPEG decompression in hardware. These images are, as you said, decompressed in memory and then preferably transferred to texture memory on the graphics card, if they are used by a high-performance application. The cards do, however, support some types of compressed images, namely those that use S3TC compression. I believe MS uses this in their texture formats. Correct me if I'm wrong.
EDIT:
Both DirectX and OpenGL have support for compressed textures in their APIs.