OoMMMoO

24 bgr-rgb



Or...

__asm
{
    mov eax,color   ; eax = 00RRGGBB
    mov ebx,eax     ; ebx = 00RRGGBB
    shr eax,16      ; eax = 000000RR
    rol bx,8        ; low word GGBB -> BBGG, so ebx = 00RRBBGG
    shl ebx,16      ; ebx = BBGG0000
    shr ebx,8       ; ebx = 00BBGG00
    or  eax,ebx     ; eax = 00BBGGRR
    mov color,eax
}

Something like that, I think...
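For anyone who would rather skip the inline assembly, the same swap in plain C looks like this (a minimal sketch, assuming the color sits in the low 24 bits as 0x00RRGGBB; SwapRB is a made-up name):

/* Swap the R and B channels: 0x00RRGGBB becomes 0x00BBGGRR. */
unsigned long SwapRB(unsigned long color)
{
    unsigned long r = (color >> 16) & 0xFF;
    unsigned long b = color & 0xFF;
    return (color & 0x0000FF00) | (b << 16) | r;
}

If the top byte carries alpha, mask with 0xFF00FF00 instead of 0x0000FF00 to keep it.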

"Paranoia is the belief in a hidden order behind the visible." - Anonymous

Edited by - Staffan on October 31, 2000 2:29:22 PM

BTW. "Normally" 32-bit display format is like:
A mask 0xFF000000
R mask 0x00FF0000
G mask 0x0000FF00
B mask 0x000000FF

Have you seen any video cards which don't support this format? I ask this, because I'm assuming (to optimize code) this display format in my game engine.

-Jussi

"Don't do that, it's legal!"

Since this topic is related enough to my question, I am going to ask here instead of a new thread...

Could someone tell me how to extract the RGB components from a 16-bit color? I have a macro that makes 16-bit colors, but I don't have one that takes 16-bit colors to make r, g and b values. I *think* my DD surface is 5r-6g-5b bits.

--------------------


You are not a real programmer until you end all your sentences with semicolons; (c) 2000 ROAD Programming


You are unique. Just like everybody else.

Yanroy@usa.com

Visit the ROAD Programming Website for more programming help.

/* 565 layout: 5 bits red (bits 11-15), 6 bits green (bits 5-10), 5 bits blue (bits 0-4) */
#define GETRED16(c)   (((c) & (31 << 11)) >> 11)
#define GETGREEN16(c) (((c) & (63 << 5)) >> 5)
#define GETBLUE16(c)  ((c) & 31)
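And if you need full 0..255 values out of those fields, you can shift them back up and replicate the top bits so full intensity maps to 255 rather than 248/252. A quick usage sketch on top of those macros (Unpack565 and pixel are made-up names; the rescaling is my addition):

/* pixel is a 16-bit 565 value; writes 8-bit channels to the out parameters. */
void Unpack565(unsigned short pixel, unsigned char* r, unsigned char* g, unsigned char* b)
{
    *r = (unsigned char)((GETRED16(pixel)   << 3) | (GETRED16(pixel)   >> 2));
    *g = (unsigned char)((GETGREEN16(pixel) << 2) | (GETGREEN16(pixel) >> 4));
    *b = (unsigned char)((GETBLUE16(pixel)  << 3) | (GETBLUE16(pixel)  >> 2));
}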

Edited by - clonemaker on November 1, 2000 10:49:02 AM

quote:
Original post by Selkrank

BTW. "Normally" 32-bit display format is like:
A mask 0xFF000000
R mask 0x00FF0000
G mask 0x0000FF00
B mask 0x000000FF

Have you seen any video cards which don't support this format? I ask this, because I'm assuming (to optimize code) this display format in my game engine.

-Jussi

"Don't do that, it's legal!"



You can't assume this. Some cards are ARGB, some are RGBA, others are BGRA, still others are ABGR. Normally, your api will provide you with an interface for setting pixel-color that is video-card independent.
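If you do end up writing pixels yourself, one card-independent approach is to build the value from the masks the surface reports (in DirectDraw, the masks come from the DDPIXELFORMAT you get via IDirectDrawSurface::GetPixelFormat). A rough sketch; MaskShift, MaskBits and PackPixel are names I made up:

/* How far up the mask starts, and how many bits it holds. */
static unsigned long MaskShift(unsigned long mask)
{
    unsigned long shift = 0;
    while (mask && !(mask & 1)) { mask >>= 1; ++shift; }
    return shift;
}

static unsigned long MaskBits(unsigned long mask)
{
    unsigned long bits = 0;
    while (mask && !(mask & 1)) mask >>= 1;
    while (mask & 1) { mask >>= 1; ++bits; }
    return bits;
}

/* Pack 8-bit channels into whatever layout the masks describe. */
unsigned long PackPixel(unsigned char r, unsigned char g, unsigned char b,
                        unsigned long rMask, unsigned long gMask, unsigned long bMask)
{
    return ((unsigned long)(r >> (8 - MaskBits(rMask))) << MaskShift(rMask))
         | ((unsigned long)(g >> (8 - MaskBits(gMask))) << MaskShift(gMask))
         | ((unsigned long)(b >> (8 - MaskBits(bMask))) << MaskShift(bMask));
}

You would query the masks once and cache the shifts and bit counts, of course; running those loops per pixel would be slow.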

quote:
Original post by FordPrefect

You can't assume this. Some cards are ARGB, some are RGBA, others are BGRA, still others are ABGR. Normally, your api will provide you with an interface for setting pixel-color that is video-card independent.


Sigh. Here are my color structure and pixel plotting function. Any clever ideas to make them multiple-format compatible?

        
struct RGBA
{
    union
    {
        unsigned long data;     // on a little-endian x86 this reads as 0xAABBGGRR
        struct                  // anonymous struct/union: a compiler extension, but widely supported
        {
            unsigned char r;
            unsigned char g;
            unsigned char b;
            unsigned char a;
        };
    };
};

inline void DisplayClass::Draw(unsigned long x, unsigned long y, const RGBA& color)
{
    m_SurfacePointer[x + y * m_SurfacePitch] = color.data;
}


-Jussi

"No need to hurry, you've got a whole life ahead. Oops, actually, had."

Edited by - Selkrank on November 2, 2000 5:56:54 AM

Maybe a good idea: convert the pixel data stored in the sprites. You can change the pixel format in your sprite loading routine.
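For example, using the R/B swap from earlier in the thread (a rough sketch, assuming the sprite data is stored as 0x00RRGGBB and the target surface wants 0x00BBGGRR; ConvertSprite is a made-up name):

/* Convert a sprite's pixels once at load time so the blitter can copy them unchanged. */
void ConvertSprite(unsigned long* pixels, unsigned long count)
{
    for (unsigned long i = 0; i < count; ++i)
    {
        unsigned long c = pixels[i];
        pixels[i] = (c & 0x0000FF00)           /* green stays put              */
                  | ((c & 0x000000FF) << 16)   /* blue moves up to bits 16-23  */
                  | ((c & 0x00FF0000) >> 16);  /* red moves down to bits 0-7   */
    }
}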

Sludge Software
www.sludgesoft.com
Developing a Secret of Mana-style role-playing game

