
Archived

This topic is now archived and is closed to further replies.

lpsoftware

wgpfd bitmap loading


Recommended Posts

Hi all. For those who have "Windows Game Programming for Dummies" by André LaMothe, I have a quick question. In program 10_5 (bitmap loading in 16-bit mode), why do the colors come out very weird, all green and bluish? Will I just figure it out if I keep reading? LaMothe doesn't mention anything about it in that section. Sorry if this was a pointless question. Martin

I'm pretty sure it's because he doesn't do anything about the 555-versus-565 problem in 16-bit mode. That would cause this effect.

*** Triality ***

There was the exact same post the other day, from zerwits. You need to check whether your card's pixel format is 555 or 565; look up the DDPIXELFORMAT structure and HRESULT GetPixelFormat(LPDDPIXELFORMAT lpDDPixelFormat) in the DirectX help file.

I am right, because I had the exact same problem, and LaMothe does address it in his latest book, "Tricks of the Windows Game Programming Gurus".

I'm the guy who posted the same question a few days ago, but I'm still confused: what do I do with GetPixelFormat? Does the function just report which format the card uses, or does it actually change something?

Could you help me out and give me an example of the code to write? Thanks.

Edited by - Chrono999 on 3/11/00 6:39:05 PM

GetPixelFormat fills out a DDPIXELFORMAT structure with information about the pixel format of your surfaces. In the following code, primary is an already initialized primary surface:


DDPIXELFORMAT ddpf;
ZeroMemory(&ddpf, sizeof(ddpf));   // clear the structure
ddpf.dwSize = sizeof(ddpf);        // DirectDraw requires dwSize to be set
primary->GetPixelFormat(&ddpf);    // query the surface's pixel format
color_depth = ddpf.dwRGBBitCount;  // bits per pixel


So, you would use a switch statement or a series of if...else blocks on the value of color_depth. The values will be 8, 15, 16, 24, or 32. Since you're using 16-bit mode, you are only concerned with the values 15 and 16: 15 indicates a 555 format, while 16 indicates 565.

Thanks a lot for the help; I found out that it's 16. But now what do I do with that information? The book has a macro:
#define _RGB16BIT(r,g,b) ((b%32) + ((g%32) << 5) + ((r%32) << 10))

So I changed that to
#define _RGB16BIT(r,g,b) ((b%32) + ((g%32) << 5) + ((r%32) << 11))

Edited by - Chrono999 on 3/12/00 11:03:23 AM

It seems to work, but now there's a new problem with the engine he made. I'm trying to load 16-bit pictures, but when I load them in, the picture is doubled horizontally! I don't know how to fix that. I took out the palette part in the dd_init function. If you don't have knowledge of his engine, then you probably won't be able to help me, though.

