15- and 16-bit (555, 565) video cards.

Started by
6 comments, last by VisualLR 23 years, 11 months ago
I have a slight problem setting up my game to work correctly in both modes. I made a sprite file format which is generally saved as 16-bit (565), but this won't work on 15-bit (555) cards. A quick fix is to make a batch of the same files in 15-bit, but that's a lousy solution. My idea is to check the card's mode at load time and convert the files (in memory) from 16-bit to 15-bit if necessary, and otherwise leave them as they are. But I'm not sure if there's a better way, and I don't know how to convert from 16-bit to 15-bit and vice versa. Thanks!

Luis Sempe
visual@guate.net
http://www.geocities.com/SiliconValley/6276


Not to sound too rude, but why are you doing this when routines for converting 24-bit bitmaps to 555 and 565 already exist?
___________________________Freeware development:ruinedsoft.com
Maybe because he wants to save disk space.
Anyway here is a way to convert a 565 pixel to 555:
```c
/* 565 -> 555: blue stays in bits 0-4; red and the top 5 green bits
   shift down one place, discarding green's low bit. */
unsigned short p565to555(unsigned short pixel)
{
    unsigned short retval = pixel & 31;    /* blue: bits 0-4 unchanged */
    retval |= (pixel & 65472) >> 1;        /* 65472 = 0xFFC0: red + top 5 green bits */
    return retval;
}
```

Keep in mind that there are also BGR cards out there, so you have to write an RGB to BGR routine too.
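For the other direction, and for the BGR cards mentioned above, a sketch along the same lines (the function names here are mine, just for illustration):

```c
/* 555 -> 565: blue stays in bits 0-4; red and the 5 green bits move up
   one place, leaving the new green low bit at 0. */
unsigned short p555to565(unsigned short pixel)
{
    unsigned short retval = pixel & 31;    /* blue unchanged */
    retval |= (pixel & 32736) << 1;        /* 32736 = 0x7FE0: green + red */
    return retval;
}

/* Swap red and blue in a 565 pixel, for BGR cards. */
unsigned short swapRB565(unsigned short pixel)
{
    unsigned short b = pixel & 31;
    unsigned short g = pixel & 2016;       /* 2016 = 0x07E0: green stays put */
    unsigned short r = (pixel >> 11) & 31;
    return (unsigned short)((b << 11) | g | r);
}
```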
Well, iwasbiggs, I wrote my own bitmap loading class which doesn't use any Win32 functions. The reason I did this was so I could add more functionality to my code; for example, I can load a bitmap and have it shaded by 50% at load time, among other things. I did get the conversion from 24-bit to 16- or 15-bit working, but I didn't know how to convert from 16-bit to 15-bit.

But now I do, thanks bosjoh!
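As an aside, the shade-by-50%-at-load idea mentioned above can be done in a single operation on a 565 pixel; a sketch (the mask value assumes 565 layout, use 0x7BDE for 555):

```c
/* Shade a 565 pixel to 50% brightness: clear the low bit of each
   channel (mask 0xF7DE), then shift right once to halve red, green,
   and blue simultaneously without bits bleeding between channels. */
unsigned short shade50_565(unsigned short pixel)
{
    return (unsigned short)((pixel & 0xF7DE) >> 1);
}
```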

later!



Luis Sempe
visual@guate.net
http://www.geocities.com/SiliconValley/6276


Just a note I thought you might find useful:

When I wrote my own TGA loader for DirectX I implemented color conversion as part of the loader. This way it only has to do the color conversion once for a run of the same color (in RLE compressed TGAs).

Basically the function loads up any TGA file to the pixel format of the current graphics mode and returns a surface. Works pretty well. I''m not done yet though... still a few bugs.
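n8 doesn't show his loader here, but the once-per-run idea can be sketched for one RLE packet of a 24-bit TGA; everything below (names, buffer handling) is a hypothetical illustration, not his actual code:

```c
#include <stddef.h>

/* Convert one 24-bit RGB color to 565. */
static unsigned short rgb24to565(unsigned char r, unsigned char g, unsigned char b)
{
    return (unsigned short)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

/* Decode one TGA packet from 'src' into 'dst' as 565 pixels.
   For an RLE packet (high bit of the header set) the color is
   converted ONCE and then repeated, instead of once per pixel.
   Returns a pointer past the consumed bytes; *written = pixel count. */
const unsigned char *decodeRLEPacket(const unsigned char *src,
                                     unsigned short *dst, size_t *written)
{
    unsigned char header = *src++;
    size_t count = (size_t)(header & 127) + 1;
    size_t i;
    if (header & 128) {                    /* RLE packet: one color, repeated */
        unsigned short c = rgb24to565(src[2], src[1], src[0]); /* TGA stores BGR */
        src += 3;
        for (i = 0; i < count; i++)
            dst[i] = c;                    /* conversion already done, just copy */
    } else {                               /* raw packet: count literal pixels */
        for (i = 0; i < count; i++) {
            dst[i] = rgb24to565(src[2], src[1], src[0]);
            src += 3;
        }
    }
    *written = count;
    return src;
}
```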

Now I''m mostly trying to figure out color keying and alpha channel stuff.

- n8




nathany.com
Hey nathany, that's some nice advice. You see, when I wrote my bitmap loading class I took that problem into consideration, and it does pretty much what yours does. But when I wrote my sprite class, which uses a custom sprite file format, I made a design error: when I write the sprite file, it gets written in whatever mode the user is in. If you then use that file on someone else's computer that works in the other mode, the colors (logically) get messed up. So now I've got to figure out a way to make the sprite file loader convert the image to the appropriate mode before using it. The best solution I've come up with so far (but haven't implemented) is to make the sprite class always save the files as either 16-bit or 15-bit, so they'd only need to be converted if the computer they're run on isn't in the mode the file was saved in...
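That load-time fix could be sketched like this: store a format flag in the sprite file header and convert the whole pixel buffer in place if the screen disagrees. The FMT_* constants and header field are hypothetical; under DirectDraw the screen's format can be read from a DDPIXELFORMAT (dwGBitMask 0x07E0 means 565, 0x03E0 means 555):

```c
#include <stddef.h>

enum { FMT_555 = 0, FMT_565 = 1 };         /* hypothetical header values */

/* Convert a loaded sprite's pixels in place when the file's format
   doesn't match the screen's. 565 -> 555 drops green's low bit;
   555 -> 565 leaves the new green low bit at 0. */
void convertSpriteBuffer(unsigned short *pixels, size_t count,
                         int fileFmt, int screenFmt)
{
    size_t i;
    if (fileFmt == screenFmt)
        return;                            /* formats match: leave as-is */
    if (fileFmt == FMT_565) {
        for (i = 0; i < count; i++)
            pixels[i] = (pixels[i] & 31) | ((pixels[i] & 65472) >> 1);
    } else {
        for (i = 0; i < count; i++)
            pixels[i] = (pixels[i] & 31) | ((pixels[i] & 32736) << 1);
    }
}
```

The nice property of saving in one known format is that at most one conversion pass runs, once, at load time.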

design errors are the worst.

later!



Luis Sempe
visual@guate.net
http://www.geocities.com/SiliconValley/6276


I've always been slightly worried about my games not running on BGR and 5.5.5 cards, because my game library is only written for the 'standard' 8-, 16-, 24-, and 32-bit representations.

They have worked on every computer I've tried; how common are these cards? And does anybody know any specific ones?

adamm@san.rr.com
My laptop works with 15-bit and my desktop works with 16-bit, so luckily I can test it. I also always send my projects to a friend who tests them, and his video card is also 15-bit...

So that's why I've been very concerned with making it work on all platforms...

later!


Luis Sempe
visual@guate.net
http://www.geocities.com/SiliconValley/6276

