Archived

This topic is now archived and is closed to further replies.

OoMMMoO

24 bgr-rgb

Recommended Posts

Or...

__asm
{
    mov eax, color   ; eax = 0x00RRGGBB (top byte assumed zero)
    mov ebx, eax
    shr eax, 16      ; eax = 0x000000RR
    rol bx, 8        ; swap the low two bytes: ebx = 0x00RRBBGG
    shl ebx, 16      ; ebx = 0xBBGG0000
    shr ebx, 8       ; ebx = 0x00BBGG00
    or  eax, ebx     ; eax = 0x00BBGGRR
    mov color, eax
}

Something like that, I think...
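For what it's worth, the same red/blue swap can also be written in portable C++ (a sketch; the value is assumed to be 0x00RRGGBB with the top byte unused, as in the asm above):

```cpp
#include <cstdint>

// Swap the red and blue channels of a 0x00RRGGBB value.
uint32_t swap_rb(uint32_t color)
{
    uint32_t r = (color >> 16) & 0xFF;
    uint32_t b = color & 0xFF;
    return (color & 0x0000FF00) | (b << 16) | r;
}
```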

"Paranoia is the belief in a hidden order behind the visible." - Anonymous

Edited by - Staffan on October 31, 2000 2:29:22 PM

BTW. "Normally" 32-bit display format is like:
A mask 0xFF000000
R mask 0x00FF0000
G mask 0x0000FF00
B mask 0x000000FF

Have you seen any video cards which don't support this format? I ask because I'm assuming (to optimize code) this display format in my game engine.
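Rather than hard-coding the layout, the masks can be read at runtime (DirectDraw reports them in the `DDPIXELFORMAT` members `dwRBitMask`, `dwGBitMask`, `dwBBitMask` after a `GetPixelFormat` call) and the shift and width of each channel derived from them. A sketch of the derivation, assuming contiguous masks:

```cpp
#include <cstdint>

// Number of zero bits below a contiguous mask (the channel's shift).
unsigned mask_shift(uint32_t mask)
{
    unsigned s = 0;
    while (mask != 0 && (mask & 1) == 0) { mask >>= 1; ++s; }
    return s;
}

// Number of set bits in a contiguous mask (the channel's width).
unsigned mask_bits(uint32_t mask)
{
    mask >>= mask_shift(mask);
    unsigned n = 0;
    while (mask & 1) { mask >>= 1; ++n; }
    return n;
}

// Scale an 8-bit component into an arbitrary contiguous mask.
uint32_t pack_component(uint32_t mask, uint8_t value)
{
    return ((uint32_t)(value >> (8 - mask_bits(mask))) << mask_shift(mask)) & mask;
}
```

This works for any of the layouts mentioned in the thread (ARGB, BGRA, 5-6-5, ...), at the cost of a few extra shifts per pixel unless the shifts are cached.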

-Jussi

"Don't do that, it's legal!"

Since this topic is related enough to my question, I am going to ask here instead of a new thread...

Could someone tell me how to extract the RGB components from a 16-bit color? I have a macro that makes 16-bit colors, but I don't have one that takes a 16-bit color apart into r, g and b values. I *think* my DD surface is 5r-6g-5b bits.
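Assuming the surface really is 5-6-5 (worth confirming against the masks `GetPixelFormat` reports), the components can be pulled back out like this (a sketch; the `<< 3` / `<< 2` rescale them toward the 0..255 range):

```cpp
#include <cstdint>

// Extract the components of a 5-6-5 pixel (R in the top 5 bits).
#define RGB565_R(c) ((uint8_t)(((c) >> 11) << 3))
#define RGB565_G(c) ((uint8_t)((((c) >> 5) & 0x3F) << 2))
#define RGB565_B(c) ((uint8_t)(((c) & 0x1F) << 3))
```

Note that the low 3 (or 2) bits of each extracted component come back as zero, so white (0xFFFF) extracts as (0xF8, 0xFC, 0xF8) rather than (0xFF, 0xFF, 0xFF).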

--------------------


You are not a real programmer until you end all your sentences with semicolons; (c) 2000 ROAD Programming


You are unique. Just like everybody else.

Yanroy@usa.com

Visit the ROAD Programming Website for more programming help.

quote:
Original post by Selkrank

BTW. "Normally" 32-bit display format is like:
A mask 0xFF000000
R mask 0x00FF0000
G mask 0x0000FF00
B mask 0x000000FF

Have you seen any video cards which don't support this format? I ask because I'm assuming (to optimize code) this display format in my game engine.

-Jussi

"Don't do that, it's legal!"



You can't assume this. Some cards are ARGB, some are RGBA, others are BGRA, still others are ABGR. Normally, your API will provide you with an interface for setting pixel colors that is video-card independent.

quote:
Original post by FordPrefect

You can't assume this. Some cards are ARGB, some are RGBA, others are BGRA, still others are ABGR. Normally, your api will provide you with an interface for setting pixel-color that is video-card independent.


Sigh. Here are my color structure and pixel plotting function. Any clever ideas to make them multiple-format compatible?

        
struct RGBA
{
    union
    {
        unsigned long data;
        struct  // anonymous struct in a union is an MSVC extension
        {
            // On a little-endian machine r is the low byte, so data = 0xAABBGGRR.
            unsigned char r;
            unsigned char g;
            unsigned char b;
            unsigned char a;
        };
    };
};

inline void DisplayClass::Draw(unsigned long x, unsigned long y, const RGBA& color)
{
    m_SurfacePointer[x + y * m_SurfacePitch] = color.data;
}


-Jussi

"No need to hurry, you've got a whole life ahead. Oops, actually, had."

Edited by - Selkrank on November 2, 2000 5:56:54 AM

quote:
Original post by Sludge

Maybe a good idea: Take over the pixel data that is saved in the sprites.
You can change the pixel format in your sprite loading routine.

I have no problem with sprites, I use DDraw blitting for them. The problem is individual pixels and shapes (circles etc.).

-Jussi

"The fine art of bullshitting"

OK,

maybe you can fill three arrays of 256 elements, like this:

unsigned long red[256], green[256], blue[256];
unsigned short index;
for (index = 0; index < 256; index++)
{
    // mask / 255 is the value of the mask's lowest bit
    // (this assumes each mask covers a full byte, e.g. 0x00FF0000)
    red[index]   = (redmask   / 255) * index;
    green[index] = (greenmask / 255) * index;
    blue[index]  = (bluemask  / 255) * index;
}


Or you can try to use an if-statement in your struct definition.
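A pixel is then three table lookups and two ORs. A self-contained sketch, using the byte-aligned 32-bit masks mentioned earlier in the thread:

```cpp
#include <cstdint>

// Assumed masks; real code would take them from the surface's pixel format.
const uint32_t redmask = 0x00FF0000, greenmask = 0x0000FF00, bluemask = 0x000000FF;
uint32_t red[256], green[256], blue[256];

void init_tables()
{
    for (int i = 0; i < 256; i++) {
        red[i]   = (redmask   / 255) * i;  // mask / 255 == the mask's lowest bit
        green[i] = (greenmask / 255) * i;
        blue[i]  = (bluemask  / 255) * i;
    }
}

// Building a pixel is three lookups and two ORs:
uint32_t make_color(uint8_t r, uint8_t g, uint8_t b)
{
    return red[r] | green[g] | blue[b];
}
```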

Sludge Software
www.sludgesoft.com
Developing a secret of mana style role-playing-game
