wgpfd bitmap loading

16 comments, last by lpsoftware 24 years, 1 month ago
Hi all,

For those who have "Windows Game Programming for Dummies" by André LaMothe, I have a quick question. In program 10_5 (bitmap loading in 16-bit mode), why do the colors come out very weird, all green and bluish? Will I just figure it out if I keep reading? LaMothe doesn't mention anything about it in that section. Sorry if this was a pointless question.

Martin
______________
Martin Estevao
lpSoftware
I'm pretty sure it's because he doesn't do anything about the 555-versus-565 problem in 16-bit mode. That would cause this effect.

*** Triality ***
There was the exact same post the other day, zerwits. You need to check whether your card's pixel format is 555 or 565: look up DDPIXELFORMAT and
HRESULT GetPixelFormat(LPDDPIXELFORMAT lpDDPixelFormat)
in the DirectX help file.
"I have realised that maths can explain everything. How it can is unimportant, I want to know why." -Me
If you're right, why doesn't LaMothe just cover this problem?

Martin
______________
Martin Estevao
lpSoftware
I am right, because I had the exact same problem, and LaMothe does address it in his latest book, "Tricks of the Windows Game Programming Gurus".
"I have realised that maths can explain everything. How it can is unimportant, I want to know why." -Me
I'm the guy who posted the same question a few days ago, but I'm still confused: what do I do with GetPixelFormat? Does the function just report what format the card uses, or does it actually change something?

Could you help me out and give me an example of the code to write? Thanks.

Edited by - Chrono999 on 3/11/00 6:39:05 PM
GetPixelFormat fills out a DDPIXELFORMAT structure with information about the pixel format of your surfaces. In the following code, primary is an already-initialized primary surface:

   DDPIXELFORMAT ddpf;
   ZeroMemory(&ddpf, sizeof(ddpf));
   ddpf.dwSize = sizeof(ddpf);
   primary->GetPixelFormat(&ddpf);
   color_depth = ddpf.dwRGBBitCount;


So, you would use a switch statement or a series of if...else blocks to act on the value of color_depth. The value will be 8, 15, 16, 24, or 32. Since you're using 16-bit, you are only concerned with the values 15 and 16: 15 indicates a 555 format, while 16 indicates 565.
Thanks a lot for the help; I found out that it's 16. But now what do I do with that information? The book has this macro:
#define _RGB16BIT(r,g,b) ((b%32) + ((g%32) << 5) + ((r%32) << 10))

So I changed that to
#define _RGB16BIT(r,g,b) ((b%32) + ((g%32) << 5) + ((r%32) << 11))

Edited by - Chrono999 on 3/12/00 11:03:23 AM
Change the manipulation of the green value to this:

((g%64) << 5)

and you'll be set. (Green keeps its position at bit 5 but gets six bits in 565; it's the red shift that moves up to 11, as you already did.)
It seems to work, but now there's a new problem with the engine he made. I'm trying to load 16-bit pictures, but when I load them in, the picture is doubled horizontally! I don't know how to fix that. I took out the palette part in the dd_init function. If you don't have knowledge of his engine, then you probably won't be able to help me, though.

This topic is closed to new replies.
