Converting a color image to a grayscale image

10 comments, last by Scptre 21 years, 5 months ago
OK, this is apparently more complicated than I had first thought (in Direct3D, at least). The reason? Direct3D's dotproduct3 operation first converts its input values into signed values in the range [-1,1], when they were in fact in the range [0,1] to begin with. When doing normal mapping this is useful, because normals can point in any direction. For our purposes, however, we want unsigned values, since we're dealing in color space and not a geometric space. I'll mention this on the directxdev list in a little while; I've been generally unimpressed with the flexibility of the D3D texture stage states.

Moving on, the solution: make a texture (8x8 pixels or 1x1, whatever you want) filled with the color RGB(166,203,143). Those are the luminance weights 0.30, 0.59, 0.11 remapped from the signed [-1,1] range into [0,1] storage; in other words, (color*0.5) + 0.5. The color is a kind of light puke-green. Now that you have that texture, it's time for some trickery. Put the value 0x80ffffff into your texture factor state (0.5 alpha, 1.0 RGB) and set up the first stage to do D3DTOP_BLENDFACTORALPHA with TEXTURE and TFACTOR arguments, where the texture is whichever texture you want rendered in grayscale. The second stage is then D3DTOP_DOTPRODUCT3 with TEXTURE and CURRENT as arguments, where the texture is the special one filled with the magic color. Hey presto, grayscale rendering!
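In code, the setup above would look roughly like this (a sketch, not tested; pDevice, pSceneTex, and pDotTex are made-up names for your device, the texture to be grayed out, and the 8x8 texture filled with RGB(166,203,143)):

```cpp
// TFACTOR = 0.5 alpha, 1.0 RGB
pDevice->SetRenderState(D3DRS_TEXTUREFACTOR, 0x80ffffff);

// Stage 0: out = tfactor.a * texture + (1 - tfactor.a) * tfactor.rgb
//              = 0.5*texture + 0.5, biasing the color into [0.5, 1]
pDevice->SetTexture(0, pSceneTex);
pDevice->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_BLENDFACTORALPHA);
pDevice->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
pDevice->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_TFACTOR);

// Stage 1: signed dot product of the biased color with the magic constant
pDevice->SetTexture(1, pDotTex);
pDevice->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
pDevice->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
pDevice->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

// Stage 2: end of the chain
pDevice->SetTextureStageState(2, D3DTSS_COLOROP, D3DTOP_DISABLE);
```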

What's happening? The blendfactoralpha operation computes
alpha*arg1 + (1-alpha)*arg2
so with the specified tfactor and arguments you get
0.5*texture + 0.5*1.0, or (texture*0.5) + 0.5, which is exactly what we needed above to move the color into the right range for the signed dot product. The dot product then does its thing to produce a grayscale result, operating on signed values that both happen to lie in the [0,1] subset of the [-1,1] range being used. The main side effect is that you will see noticeable banding on smooth gradients, since only 128 color levels are used instead of 256.

If you could invoke an unsigned dotproduct operation, you could do away with this whole mess and do it in a single stage, but whoever designed the direct3d texture stage interface was being pretty closed-minded. OpenGL lets you do it with nVidia register combiner extensions quite easily.

Hope this helps.

Addendum: you can actually replace the custom texture with a vertex color if you're doing the render-to-texture, then-draw-the-texture-to-the-backbuffer-on-a-big-quad thing: simply fill the diffuse element of the vertex data with the same color the texture would contain, then use DIFFUSE instead of TEXTURE in the second stage when doing the dot3.
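A sketch of that variant (names and vertex layout are illustrative, not from the post): the fullscreen quad carries the magic constant in its diffuse component, and stage 1 reads it from DIFFUSE instead of a second texture.

```cpp
// Pre-transformed quad vertex: D3DFVF_XYZRHW | D3DFVF_DIFFUSE | D3DFVF_TEX1
struct QuadVertex {
    float x, y, z, rhw;
    DWORD diffuse;
    float u, v;
};

// Same color the special texture would have held.
const DWORD kDotColor = D3DCOLOR_XRGB(166, 203, 143);
// ... fill all four quad vertices with kDotColor in 'diffuse' ...

// Stage 1 now takes the constant from the vertex color:
pDevice->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_DOTPRODUCT3);
pDevice->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_DIFFUSE);
pDevice->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);
```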

[edited by - Assassin on November 21, 2002 8:37:48 PM]
Assassin, aka RedBeard. andyc.org
After actually installing the SDK,
I now have a so-called "reference device" up and running.

And using this reference device, my dotproduct3 and
LOD bias are working as they should!!
(veeery slooow and of absolutely no use, of course)

My HAL caps reported "no" for both of them

So now I see what you meant in your latest reply, Assassin.
The image is greyscale all right, but somewhat dark
and inverted-looking...

I want a new Geforce 4 card for christmas :-)

Again thank you

CGameProgrammer... you are somewhat missing our point here.
We are able to write a function that converts to greyscale,
but that is not what we want to do at all.
We want to use texture stages to convert our image
in real time, as this would be much faster!

And the dotproduct3 colorop is the right way to go
(if your card supports it)...

