Pixel Plotting in DirectDraw

An age-old question, I'm sure. Unfortunately, the search is broken or down or both, and I'm lost. So here goes...

I'm trying to work on some kind of graphics engine, and I need the ability to plot pixels. Ideally, I want to send an RGB value and have my pixel appear correctly (much like I do with the aptly named SetPixel function in GDI... only not in GDI). I'm using TOWGPG because I figure, if I can't copy his code and make it work, I sure can't write my own. So I'm using his code as a placeholder for now, basically just for the pixel plotting routines. This is what he does:

#define RGB16(r,g,b) ((b%32) + ((g%64) << 5) + ((r%32) << 11))

I really don't understand this part. What exactly is the result of this? It's supposed to generate a 16-bit pixel value, but I'm not sure how. I have assumed the << are bit shifts, but even if they are, I don't know what they do very well. I'd appreciate help on this piece.

But anyway, after that, LaMothe writes:

surfaceMemory[(int)x + (int)y * (memPitch >> 1)] = RGB16(red, green, blue);

The variable surfaceMemory is a USHORT pointer to the surface from the DDSURFACEDESC2 structure, and memPitch is an int holding the lPitch value. I use this, call the function, and lo and behold, it plots. The only problem is, my image displays wrong. It is discernible, but the colors are off.

A little background... I wrote the image format myself (kind of). Basically, it reads in a red, green, then blue value, and does that for the entire size of the image. I have double-checked the loading code by reloading an image into my image editor, and there it displays normally (the image editor is written using GDI pixel plotting routines).

Anyone have an idea why I can't get LaMothe's code to work, or perhaps a better idea?

Peon
I've encountered the same problem with Mr. LaMothe's code. The problem is that with this macro he makes certain assumptions about the way your graphics card stores its pixels in 16-bit color mode. While it's fairly safe to do this in 32-bit color mode, in 16-bit color it'll lead to problems like the one you describe.

The solution is to get the pixel format of the primary surface and use the bitmasks to calculate the precision and right shift you have for each color channel. An article on how you're supposed to do this is hosted in the DirectDraw section of this site, so I'm not going to cover it in full here.
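In short, the query looks something like this (a minimal sketch, assuming a DirectX 7 style primary surface called lpddsPrimary; error checking omitted):

// Ask DirectDraw how this surface packs its pixels.
DDPIXELFORMAT ddpf;
ZeroMemory(&ddpf, sizeof(ddpf));
ddpf.dwSize = sizeof(ddpf);
lpddsPrimary->GetPixelFormat(&ddpf);

// ddpf.dwRBitMask, ddpf.dwGBitMask, and ddpf.dwBBitMask now tell you
// where each channel lives: masks of 0xF800/0x07E0/0x001F mean 5-6-5,
// while 0x7C00/0x03E0/0x001F mean 1-5-5-5.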

Good luck,
I'll check back on this post... but my science class is now finished, so I gotta go...

Hi,

there are two 16-bit color modes:

--> 1,5,5,5 = 5 bits used for red, 5 bits for green, and 5 bits for blue, with 1 bit unused or used for alpha (also sometimes called 15-bit color).

--> 5,6,5 = 5 bits used for red, 6 bits used for green, and 5 bits used for blue.

>#define RGB16(r,g,b) ((b%32) + ((g%64) << 5) + ((r%32) << 11))

That code is for the 5,6,5 color mode and expects values in these ranges:

red = 0-31;
green = 0-63;
blue = 0-31;
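To make the bit layout concrete, here is a worked expansion of the macro (my own arithmetic, not from the book):

RGB16(31, 63, 31)                                    // full white
  = (31 % 32) + ((63 % 64) << 5) + ((31 % 32) << 11)
  = 31 + (63 << 5) + (31 << 11)
  = 0x001F + 0x07E0 + 0xF800
  = 0xFFFF

The << are indeed left bit shifts: each one moves a channel's bits up so blue lands in bits 0-4, green in bits 5-10, and red in bits 11-15 of the 16-bit value.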

First, make sure that your DirectX device is set up in the correct 16-bit color mode, then make sure that your image file loader is passing values in the correct range for each pixel.
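If your file stores ordinary 0-255 channel values (a guess on my part, since GDI's RGB() takes 0-255), scale them down before packing. A sketch:

// Hypothetical helper, not from the book: drop the low bits to squeeze
// 8-bit channels into the 5-6-5 ranges RGB16 expects.
// (USHORT comes from <windows.h>.)
USHORT Pack565From888(int r, int g, int b)
{
    return (USHORT)RGB16(r >> 3, g >> 2, b >> 3);  // 0-255 -> 0-31 / 0-63 / 0-31
}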

If that doesn't work, then try this:

surfaceMemory[(int)x + (int)y * (memPitch >> 1)] = RGB16(blue, green, red); // BGR order instead of RGB


good luck,

tp.
#define RGB16BIT555(r,g,b) ((b & 31) + ((g & 31) << 5) + ((r & 31) << 10))
#define RGB16BIT565(r,g,b) ((b & 31) + ((g & 63) << 5) + ((r & 31) << 11))

#define RGB32BIT(a,r,g,b) ((b & 255) + ((g & 255) << 8) + ((r & 255) << 16) + ((a & 255) << 24))


To plot:

// use something to decide which mode you're in...
// I'll assume 565 here; x and y are ints.

// 16-bit:
SurfaceMemory[y * (SurfacePitch >> 1) + x] = RGB16BIT565(0,0,0);

// 32-bit:
SurfaceMemory[y * (SurfacePitch >> 2) + x] = RGB32BIT(0,0,0,0);
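For completeness, SurfaceMemory and SurfacePitch come from locking the surface. A minimal sketch, assuming an IDirectDrawSurface7* named lpdds and with error checking omitted:

DDSURFACEDESC2 ddsd;
ZeroMemory(&ddsd, sizeof(ddsd));
ddsd.dwSize = sizeof(ddsd);

// Lock hands back a pointer to the surface bits plus the pitch in BYTES;
// that is why the 16-bit plot shifts the pitch right by 1 (USHORT steps).
lpdds->Lock(NULL, &ddsd, DDLOCK_WAIT | DDLOCK_SURFACEMEMORYPTR, NULL);

USHORT *SurfaceMemory = (USHORT *)ddsd.lpSurface;  // cast to a 32-bit type in 32-bit mode
int     SurfacePitch  = (int)ddsd.lPitch;

int x = 10, y = 10;                                // some pixel inside the surface
SurfaceMemory[y * (SurfacePitch >> 1) + x] = RGB16BIT565(31, 63, 31);

lpdds->Unlock(NULL);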

hope this helps...
Programmers of the world, UNTIE!
Alright, I just want to say first of all, thanks for the help. I tried some of the suggestions found here, as well as the link to the 16-bit pixel plot article. Unfortunately, I'm still having the same problem (the tutorial was also a bit beyond my level, though I will continue to look at it). I decided to post a picture to show better what the problem is; maybe someone will see the colors and figure out exactly what is wrong.

NOTE: I am using an image anonymizer, so it is possible that the pictures will either load slowly or exceed the bandwidth limit; bear with me, I have no good, free webspace.



(This one is supposed to be a rainbow; note how the red, green, blue colors are correct, but the rest are not)

EDIT: I added the original next to the broken version. On second glance, the green and blue ARE slightly off from the original version... this might be due to fewer shades, though. The original was done with the RGB() macro and the GDI SetPixel function.



A blue gradient, from light to dark. The original is the top right one; the "broken" one is the lower left. Notice how the gradient looks ALMOST right, but seems to be on the wrong "cycle".

Any ideas from these screenshots? It's definitely plotting, but the colors are still off.

Peon

Just a quiet bump... I promise I'll only do it once.

Peon
I still think, after seeing your examples, that your problem has to do with the way your video card masks the color channels in 16-bit mode.

Try writing the values for those masks to a file (they are stored in the ddpfPixelFormat of the primary surface), and if they're not what you expected them to be, you know that's the reason your algorithm doesn't work.
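A rough sketch of that dump (maskToShift and maskToBits are my own hypothetical helpers; ddpf comes from GetPixelFormat on the primary surface, as described earlier in the thread):

#include <stdio.h>
#include <windows.h>
#include <ddraw.h>

int maskToShift(DWORD mask)   // position of the lowest set bit = right shift
{
    int shift = 0;
    while (mask && !(mask & 1)) { mask >>= 1; shift++; }
    return shift;
}

int maskToBits(DWORD mask)    // number of set bits = channel precision
{
    int bits = 0;
    for (; mask; mask >>= 1) bits += (int)(mask & 1);
    return bits;
}

void DumpMasks(const DDPIXELFORMAT *ddpf)
{
    FILE *f = fopen("masks.txt", "w");
    fprintf(f, "R mask %08lx shift %d bits %d\n", ddpf->dwRBitMask,
            maskToShift(ddpf->dwRBitMask), maskToBits(ddpf->dwRBitMask));
    fprintf(f, "G mask %08lx shift %d bits %d\n", ddpf->dwGBitMask,
            maskToShift(ddpf->dwGBitMask), maskToBits(ddpf->dwGBitMask));
    fprintf(f, "B mask %08lx shift %d bits %d\n", ddpf->dwBBitMask,
            maskToShift(ddpf->dwBBitMask), maskToBits(ddpf->dwBBitMask));
    fclose(f);
}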

I myself am the "proud" owner of a GeForce2 MX, and that quite common card doesn't mask the colors in 16-bit mode the same way the RGB16 macro does.

Good luck! Pixel plotting can be tough... but the satisfaction you get from solving it is very high too...
