

8-bit Color vs. 16-bit: Faster?


Recommended Posts

Of course I can assume 8-bit color is generally faster, since there are fewer bytes to work with, etc. But here's my problem: I am coding for a cell phone with a 16-bit display, either 555 or 565, I can't remember which. I have routines to blit from, say, a sprite to a buffer. Once I have finished, I lock the screen and blit that buffer to the foreground, all in 16 bits. Would I see a speed improvement if I had 8-bit sprites and an 8-bit buffer, and then blitted that to the 16-bit screen using a palette? Or would the speed stay the same, or get slower? Dun mes wit me!
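For reference, here is a minimal sketch of the kind of pipeline described above: blit 16-bit sprites into a 16-bit back buffer, then lock the screen and copy the whole buffer across. The buffer sizes, the lock_screen()/unlock_screen() stubs, and RGB565 are all assumptions for illustration, not any particular phone SDK.

```c
#include <stdint.h>
#include <string.h>

#define SCREEN_W 176
#define SCREEN_H 208

/* 16-bit (RGB565 assumed) back buffer; sprites are stored in the same format. */
static uint16_t backbuf[SCREEN_W * SCREEN_H];

/* Stand-ins for the phone's real display surface and lock/unlock calls. */
static uint16_t screenbuf[SCREEN_W * SCREEN_H];
static uint16_t *lock_screen(void)   { return screenbuf; }
static void      unlock_screen(void) { /* nothing to do in this stub */ }

/* Blit a 16-bit sprite into the back buffer, skipping a transparent key color.
 * Clipping is omitted for brevity. */
static void blit_sprite16(const uint16_t *spr, int sw, int sh,
                          int dx, int dy, uint16_t key)
{
    for (int y = 0; y < sh; ++y) {
        uint16_t       *dst = &backbuf[(dy + y) * SCREEN_W + dx];
        const uint16_t *src = &spr[y * sw];
        for (int x = 0; x < sw; ++x)
            if (src[x] != key)          /* color-key transparency */
                dst[x] = src[x];
    }
}

/* Present: lock the screen and copy the whole back buffer in one pass --
 * a straight 16-bit copy with no format conversion. */
static void present(void)
{
    uint16_t *screen = lock_screen();
    memcpy(screen, backbuf, sizeof backbuf);
    unlock_screen();
}

int main(void)
{
    uint16_t dot[4] = { 0xFFFF, 0xFFFF, 0xFFFF, 0xFFFF };  /* 2x2 white sprite */
    blit_sprite16(dot, 2, 2, 10, 10, 0xF81F);              /* 0xF81F = magenta key */
    present();
    return 0;
}
```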

Work with 16-bit all the way. Any extra conversion step in the blitting will kill speed. If your phone uses a 16-bit display depth, store all your data in its native format (555 or 565) and blit straight in that format; that way you lose the least performance.
If, however, the target platform (the phone) supports 8-bit palettized blitting natively, then by all means store your sprites in that format and blit them that way.

Bottom line: never convert on the fly; it will slow you down tremendously.
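To make the cost concrete, here is a rough sketch (illustrative function names, RGB565 assumed): expanding an 8-bit indexed sprite through a palette during the blit adds a table lookup to every pixel on every frame, while converting once at load time keeps the per-frame inner loop a plain 16-bit copy of native-format data.

```c
#include <stdint.h>

/* Slow path: an 8-bit indexed sprite expanded through a 256-entry 16-bit
 * palette at blit time -- an extra lookup for every pixel, every frame. */
void blit_row8_convert(uint16_t *dst, const uint8_t *src, int n,
                       const uint16_t pal[256])
{
    for (int i = 0; i < n; ++i)
        dst[i] = pal[src[i]];       /* per-pixel conversion on the fly */
}

/* Alternative: expand the sprite to the native 16-bit format once, at load
 * time, so the conversion cost is paid a single time... */
void convert_sprite_once(uint16_t *out, const uint8_t *in, int n,
                         const uint16_t pal[256])
{
    for (int i = 0; i < n; ++i)
        out[i] = pal[in[i]];
}

/* ...and every blit after that is a plain 16-bit copy of native data. */
void blit_row16(uint16_t *dst, const uint16_t *src, int n)
{
    for (int i = 0; i < n; ++i)
        dst[i] = src[i];
}
```

If the platform's own blit routine accepts 8-bit palettized sources directly, as noted above, the lookup is handled by the platform rather than by your code, and the 8-bit sprites also halve your sprite memory; otherwise the pre-converted 16-bit copy is the safe choice.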
