RGB Craziness

Started by
13 comments, last by CoolMike 24 years, 4 months ago
I prefer to shift the values:

#define _RGB16BIT(r,g,b) (((r)>>3)<<10 | ((g)>>3)<<5 | ((b)>>3))

But you should change the pointer like this:

typedef unsigned short USHORT;
USHORT *bitmap_buffer = NULL;
bitmap_buffer = (USHORT *)ddsd.lpSurface;
bitmap_buffer[x + ddsd.lPitch*y] = _RGB16BIT(255,255,255);

I don't remember offhand if you have to divide lPitch by two beforehand. Try both ways.
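For what it's worth, lPitch in the DDSURFACEDESC is the length of one scanline in bytes, so one way to sidestep the divide-by-two question is to step down the rows in bytes and only cast to USHORT at the end. A minimal sketch, assuming a 555 surface that has already been Lock()'d and the _RGB16BIT macro above (plot_pixel16 is just a made-up helper name):

// Sketch: plot one 16-bit (555) pixel into a locked DirectDraw surface.
// ddsd is the DDSURFACEDESC filled in by Lock(); lPitch is in bytes.
inline void plot_pixel16(int x, int y, int r, int g, int b,
                         DDSURFACEDESC &ddsd)
{
    // Advance y rows by bytes, then treat that row as 16-bit pixels.
    USHORT *row = (USHORT *)((UCHAR *)ddsd.lpSurface + y * ddsd.lPitch);
    row[x] = _RGB16BIT(r, g, b);
}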

BTW - RGB(0,0,255) should make a bright blue (unless you mean it's adding some green for a teal color).

Remember - when writing 16-bit colors, you should use a 16-bit pointer (USHORT), not a byte pointer (UCHAR).

Jim

I have three notes about this:
1/ Your explanation is very clear, Splat! I have written 16-bit color code before. The point is that we must take the 5 most significant bits of each component and ignore the 3 low bits, so we must use the bit-shift macro.
2/ The pixel format differs between video cards. Some cards are 555 and others are 565 (the S3 Trio 64V+, for example), so you must query the pixel format to pick the correct macro (see the sketch below).
3/ I had a problem on an ATI card too! Sorry, I don't remember which model. But I want you to know that the GDI functions can trash the DirectDraw surface. In my program I used LoadBitmap to load the external bitmap file, got the surface's device context with GetDC(), and BitBlt'd the memory device context onto the surface, and blitting with a source color key then failed. So I had to load the bitmap into the surface manually instead of using the GDI functions, and it works well.
Good luck.
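To expand on 2/, here is a rough sketch of querying the surface's pixel format with GetPixelFormat() to decide between the 555 and 565 macros. lpddsprimary is just a placeholder name for whatever IDirectDrawSurface pointer you have created:

DDPIXELFORMAT ddpf;
ZeroMemory(&ddpf, sizeof(ddpf));
ddpf.dwSize = sizeof(ddpf);

if (SUCCEEDED(lpddsprimary->GetPixelFormat(&ddpf)))
{
    // On a 565 card the green mask has 6 bits (0x07E0);
    // on a 555 card it has 5 (0x03E0).
    if (ddpf.dwGBitMask == 0x07E0)
    {
        // 565 layout: ((r)>>3)<<11 | ((g)>>2)<<5 | ((b)>>3)
    }
    else
    {
        // 555 layout: use the _RGB16BIT macro above
    }
}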

------------------

I am getting nowhere with this. It still didn't work when I changed my UCHARs to USHORTs, and I know my ATI card is 555 because I can already load sprites from bitmaps and display them just fine on the screen. It is only when I plot pixels with my _RGB16BIT macro that it doesn't work right.
BTW, would anyone be willing to look at my project if I emailed it? I would really like someone to look at my entire project (as small as it is) and tell me what's wrong. It is about 70kb and is for Visual C++ 5.0. Thanks in advance to any gracious soul willing to undertake this task!

-Mike

I have two problems: My blits are not transparent when I set the color key of the source surface to RGB(0,0,0), and when I plot a 16-bit pixel, it doesn't display the right color. For example, if I plot an RGB(255,255,255) pixel, it displays a blue pixel, and if I plot a pixel as RGB(255,255,0), it comes out dark green.
Here is the piece of code I use to plot my pixels (RGBred, RGBgreen, and RGBblue are my color values, and bitmap_buffer is a pointer to the locked backbuffer):

bitmap_buffer[x*2 + ddsd.lPitch*y] = RGB(RGBred, RGBgreen, RGBblue);

I have a feeling these two problems are caused by a problem with RGB. Is it my video card being screwy? I have an ATI All-In-Wonder PRO--does it maybe have some strange hardware thing I have to work with? I think I might need to use GetPixelFormat or something like that, but I'm not sure how to do that. Can anybody help me with this?
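For reference, the Win32 RGB macro builds a 24-bit COLORREF laid out as 0x00bbggrr, not a 16-bit pixel value, which would explain the wrong colors when its result is written straight into a 16-bit framebuffer:

// RGB() as defined in <wingdi.h>: it packs a COLORREF (0x00bbggrr),
// so it never matches a 555 or 565 pixel layout.
#define RGB(r,g,b) ((COLORREF)(((BYTE)(r)|((WORD)((BYTE)(g))<<8))|(((DWORD)(BYTE)(b))<<16)))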

Thanks in advance.

Send away! My email should be listed in my profile...

- Splat

