SetColorKey and 16bit graphics

Started by
8 comments, last by sunbeam60 23 years, 9 months ago
Hi all,

I'm rather new to the DirectX field at large. Thus, I'm delving into 16-bit graphics and finally abandoning the annoying 8-bit arena. I've set up my primary surface as 640x480x16. I've also set up an offscreen_plain surface loaded with a bitmap (24 bit, automatically converted, right?).

Now I want to set the color key on the offscreen_plain surface, but how is the 16-bit surface laid out? Does it use 5 bits per color channel, and how are they arranged? RRRRR GGGGG BBBBB X?

In my frustration I didn't really care, so I figured I'd just see whether SetColorKey worked at all in 16 bit. Thus, I set up the DDCOLORKEY structure as follows:

colorKey.dwColorSpaceLowValue := $FFFE;
colorKey.dwColorSpaceHighValue := $FFFF;

Yep, that's right: Delphi ... but I'm sure you get the picture. I figured that whatever I set the color-key space to, as long as I kept it rather small, something from my bitmap was sure to be displayed when blitted. I made the call as:

FBitmap.SetColorKey(DDCKEY_COLORSPACE or DDCKEY_SRCBLT, @colorKey);

and did the actual blit with DDBLTFAST_SRCCOLORKEY or DDBLTFAST_WAIT. However, nothing is displayed. Absolutely nothing! Removing all references to color keying and making the blit with DDBLTFAST_NOCOLORKEY copied the bitmap as it should, but of course also *with* the supposedly transparent color.

What am I doing wrong? (Besides using Delphi, I know, I know.)

Regards
Toft
Ah, there's nothing wrong with using Delphi. It's a fast compiler anyway. Well, I'm not sure exactly why it doesn't work right. I try to use either white or black for a color key. It has something to do with those 565, 555, 888 type color codes. I never messed with those, so I don't know much. Try doing what I do to see if it works: just use white or black.

------------------------
Captured Reality.
I think setting dwColorSpaceHighValue to 0 may work, because you're working with a color key, not a chroma key.
-----------------------------------------------"When you're a pawn, the only way out is revolution"
Hi again

I solved the mystery myself, so I figured I might just as well share it with anyone who had an insane amount of free time on their hands to actually read my ramblings.

Here it goes:
First of all, I found out my Viper 550 didn''t support color spaces in setting the colorkey, only one color. I know my Viper isn''t exactly new, but it sure is mainstream, so I''d be insane to demand that a user running my application needed a color-space enabled card; not that many would have one.

The actual querying of capabilities was done through GetCaps and comparing it with dwCKeyCaps and DDCKEYCAPS_SRCBLTCLRSPACE.

Then I reverted back to only using one color in my colorkey, as follows:
//set color key in FBitmap
colorKey.dwColorSpaceLowValue := $F89E;
colorKey.dwColorSpaceHighValue := $F89E;
FBitmap.SetColorKey(DDCKEY_SRCBLT, @colorKey);

At that time, I was still unsure how my 24-bit bitmap got converted to 16-bit format, so I wasn't able to set the correct color as the key. I also didn't know whether the Viper used 5-6-5 or 5-5-5 format to store the color (due to the unlucky fact that 16 bits don't divide equally into 3).

So I locked the bitmap surface (which was where I stored the bitmap with transparency) using the Lock method and cast DDSurfaceDesc2.lpSurface to a pointer to 16-bit values (Word in Delphi, unsigned short in C). Finally I was able to see how my color got converted.

It turned out that the transparent color, which in the original 24-bit format was $FF11F1, got converted into $F89E. When these were written out in binary, I could deduce that my Viper used 5-6-5. However, the conversion from 24 bit to 16 bit was somehow skewed, with a discrepancy of up to 7%.

This makes it rather difficult to set the correct color key, since I can't simply convert the 24-bit value (from my paint program) to 16 bit by hand. The result doesn't match what DirectX produces, and thus: no transparency.

But as the actual writing of the bitmap is performed by the Delphi TBitmap wrapper class, I can at least be sure that the conversion is the same from PC to PC. It would suck to find out that what was transparent on my screen wasn't on my friend's.

So ... I can't imagine anyone would actually read this, let alone understand my disjointed thoughts, but at least it cleared my head.

Regards
Toft
Regarding 16 bit - you'll find some funny/frustrating differences between video cards. They store the values differently. I forget the specifics, but I think they store the colors as either 555 or 565.

Your color key code may work on some computers, and not on others. This is the same thing with other effects, like alpha blending, etc.

I know that NukeDX deals with these things in the background - you can take a look at how it does it. I believe it checks the pixel format before setting the color key, and does an RGB conversion if necessary.

Clay
Clay LarabieLead DeveloperTacendia.com
Once more he bounces back with useful/useless information about this problem.

Okay ... So I managed to find the color that my 24 bit color mapped into in the 16 bit surface. Of course, after doing it the hard way, I learned the easy.

First of all: to query the internal pixel format used for 16-bit graphics, one can use IDirectDraw4::GetDisplayMode and inspect the DDPIXELFORMAT structure's dwRBitMask, dwGBitMask, and dwBBitMask to determine how the bits are laid out. Read the second paragraph in DirectX Foundation / DirectDraw / DirectDraw Essentials / Display Modes / Setting Display Modes in the DirectX SDK for more info.

Also, ddutil2.cpp (DDUtil.pas in Delphi) defines two functions, DDColorMatch and DDSetColorKey, that are *indeed* worth checking out. Oh, by the way: if you actually get them to work, please mail me. I can see exactly what they do and why it should work, but it doesn't work any wonders for me. It must be connected to how the Delphi TBitmap class draws itself onto a surface ...

Regards
Toft

I just read the first pixel in the 16-bit surface and use that value for transparency ... Is that an OK solution?

-------------Ban KalvinB !
You know what? I just figured out a solution for color spaces ... if you want to implement them. What you do is, at load time, replace those ranges of colors with one color. Attach the color key you chose to that one color and you get the same effect. I think I'm gonna use this for video editing.

------------------------
Captured Reality.
Hey Granat! Are you using DDSetColorKey(CLR_INVALID), or are you manually accessing your video memory through the lpSurface pointer?

If you are actually using DDSetColorKey(CLR_INVALID), can't you pass some code along? I haven't been able to get it working and I can't understand why.

Regards
Toft
Hey Toft,

I'm having the exact same problem you are. It even sounds like we have similar graphics cards (TNT w/ 16MB). I'm trying to mask the color white in my 24-bit source art (255,255,255) when I display it on the screen in 16-bit.

For some reason I can mask out red and blue up to values 255, but green up to a value of 101. Any higher value of green will let the key color be displayed on screen (which is pissing me off). Here is the code I'm using:

COLORREF color_to_convert = RGB(255,255,255);
DWORD transparent_key = DDColorMatch(lpddsback, color_to_convert);
Write_Error("transparent_key is %x\n", transparent_key);

// set color key to default color
DDCOLORKEY color_key; // used to set color key
color_key.dwColorSpaceLowValue  = transparent_key;
color_key.dwColorSpaceHighValue = transparent_key;

// now set the color key for source blitting
(bob->images[index])->SetColorKey(DDCKEY_SRCBLT, &color_key);



I suppose I can get away with using another color as my key but this is really irritating not knowing what the cause of the error is. If you find anything out, I would be indebted to you.

Wes



Edited by - NinjaOne on July 4, 2000 4:54:06 PM
-Wes

This topic is closed to new replies.
