
Video Card Info?


Recommended Posts

Hi,

(I tried searching the forums for this info, but it's not working.)

I've determined that my video cards use this format in 16-bit:

Card:.......First Byte/Second Byte
S3 VirgeGX:..XGG BBBBB/RRRRR GGG
Voodoo2:.....GGG BBBBB/RRRRR GGG

Now, I believe these are BGR cards (the S3 seems to be, in 24-bit anyway, and somebody mentioned in another post of mine that the Banshee was BGR, which is a derivative of the V2?). But I thought BGR would be something like: BBBBB GGG/GGG RRRRR. Am I missing something here? Is there anywhere you can get information like this on video cards (I've tried manufacturer websites), preferably online?

Thanks,
Ro_Akira

Looks like you read the bytes backwards. It should come out as:
RRRRR GGG/GGG BBBBB
for a nice 5/6/5 RGB format, which is pretty standard.

Now that you mention it, it does look as if I've done that (judging by the resulting format)...

I want this info, because I want to know what way I should be ordering the bits in my own graphics format.

// lpd points at the locked 16-bit surface; sizex/sizey are the surface
// dimensions and dyoffset is the per-row pitch adjustment.
for (Y = 0; Y < sizey; Y++)
{
    for (X = 0; X < sizex; X++)
    {
        // Currently RGB

        // First byte of the pixel
        *lpd = 0;
        lpd++;
        byte_offset++;

        // Second byte of the pixel
        *lpd = 224;    // 11100000b
        lpd++;
        byte_offset++;
    }
    lpd += dyoffset;
}


This test code gives pure red (I've also tested pure blue and green). This is where I'm getting the GGG BBBBB/RRRRR GGG format from.

Ro_Akira

In your own graphics format???

Well, if you want your game to work on any video card, then you will probably have to use 24-bit images or 8/16-bit paletted images.

Personally I like to use 24-bit, but I have seen 16-bit palettes used quite effectively.

16-bit palette = 2^16 colours (stored in an index).

There are many different formats used by various cards... the easiest way is to develop an algorithm that checks the device specifications and stores several variables that let you do your RGB 24-bit to RGB 16-bit conversion at runtime.



Regards,
Nekosion
