8bit Color vs. 16bit. Faster?

Of course I would assume 8-bit color is definitely faster; fewer bytes to work with, etc. But here's my problem: I am coding for a cell phone with a 16-bit display, 555 or 565, can't remember which. I have routines to blt from, say, a sprite to a buffer. Then once I have finished, I lock the screen and blt that buffer to the foreground. All in 16 bits. Would I see a speed improvement if I had 8-bit sprites and an 8-bit buffer, and then blitted that to the 16-bit screen using a palette? Or is the speed going to stay the same, or get slower?
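For reference, a minimal sketch in plain C of the two inner loops being compared (buffer names and sizes are made up; this is just the shape of the work per pixel, not anyone's actual engine code):

#include <stdint.h>
#include <string.h>

/* 16-bit sprite -> 16-bit buffer: a straight copy, no per-pixel work. */
void blit16(uint16_t *dst, const uint16_t *src, int count)
{
    memcpy(dst, src, count * sizeof(uint16_t));
}

/* 8-bit sprite -> 16-bit buffer: one palette lookup per pixel. */
void blit8to16(uint16_t *dst, const uint8_t *src,
               const uint16_t palette[256], int count)
{
    for (int i = 0; i < count; i++)
        dst[i] = palette[src[i]];   /* extra load + table index per pixel */
}
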
Dun mes wit me!
Different format -> conversion needed -> slow(er)
Same format -> direct copy at full speed -> fast(er)
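In practice, a same-format blit of a sprite rectangle boils down to one memcpy per scanline. A sketch, assuming 16-bit pixels and hypothetical pitch/width/height parameter names:

#include <stdint.h>
#include <string.h>

/* Same-format rectangle blit: one memcpy per row, no conversion.
   dstPitch/srcPitch are row lengths in pixels (names are made up). */
void blitRect16(uint16_t *dst, int dstPitch,
                const uint16_t *src, int srcPitch,
                int width, int height)
{
    for (int y = 0; y < height; y++) {
        memcpy(dst, src, width * sizeof(uint16_t));
        dst += dstPitch;
        src += srcPitch;
    }
}
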

Fruny: Ftagn! Ia! Ia! std::time_put_byname! Mglui naflftagn std::codecvt<eY'ha-nthlei!,char,mbstate_t>

Work with 16-bit all the way. Any extra conversion step in the blitting will kill speed. If your phone uses a 16-bit display depth, store all your data in its native format (555 or 565) and blit straight in that format; that way you lose the least performance.
If, however, the target platform (phone) supports 8-bit palettized blitting natively, by all means store your sprites in that format and blit them.

Bottom line: NEVER, ever convert on the fly; this will slow you down tremendously. Convert once at load time instead (see the sketch below).
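A sketch of that load-time conversion, packing 8-bit-per-channel RGB into 565 (swap in the 555 layout if that's what the phone uses; the function name is made up):

#include <stdint.h>

/* Convert one 8-bit-per-channel RGB pixel to 565 -- done once when
   the sprite is loaded, never during the per-frame blit. */
static uint16_t packRGB565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* For a 555 display the equivalent packing would be:
   ((r >> 3) << 10) | ((g >> 3) << 5) | (b >> 3)          */
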
JRA GameDev Website // Bad Maniac

