I have written a point cloud viewer using SlimDX and DirectX 9.
We use it to visualize and render images from some content creation software we're building here.
Currently my renderer can display points and polys. The main purpose is to render EXR files with color and z-depth (floating-point values greater than 1) stored in the alpha channel.
My render target is a 32-bit float format, and I dump the framebuffer to an EXR file.
When rendering polys this works perfectly. I just calculate the color in the pixel shader like this (HLSL):

color = Input.color;
color.w = distance(Input.Position, cameraPosition);

and I get values in real-world units away from the camera. So if an object is 1000 units from the camera, the raw alpha value in the EXR is 1000 for that pixel.
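For context, here is a sketch of roughly what that pixel shader looks like; the struct layout, semantic choices, and variable names (PS_INPUT, cameraPosition, etc.) are my assumptions, not necessarily the exact code:

```hlsl
// Assumed globals, set from the application each frame.
float3 cameraPosition;

struct PS_INPUT
{
    float4 Color    : COLOR0;
    float3 Position : TEXCOORD0; // world-space position passed from the vertex shader
};

float4 PS(PS_INPUT Input) : COLOR0
{
    float4 color = Input.Color;
    // Raw world-unit distance written straight into the alpha
    // of the 32-bit float render target (can be >> 1).
    color.w = distance(Input.Position, cameraPosition);
    return color;
}
```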
Now I'm trying to render just the vertices of the point clouds, doing the same calculation in a vertex shader instead. But the depth values clamp themselves to 1.
If I change the code to

color = Input.color;
color.w = (distance(Input.Position, cameraPosition) - near) / far;

(near and far are my camera clipping distances), the depth values are scaled into the 0-1 range and output correctly, but I need greater precision and real-world units!
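A sketch of the point-cloud vertex shader in question, under the same caveat that the names and struct layout are illustrative assumptions (I've written the clip distances as nearClip/farClip here):

```hlsl
// Assumed globals, set from the application.
float4x4 worldViewProj;
float3   cameraPosition;
float    nearClip, farClip;     // the camera clipping distances ("near"/"far" above)

struct VS_INPUT
{
    float3 Position : POSITION;
    float4 Color    : COLOR0;
};

struct VS_OUTPUT
{
    float4 Position : POSITION;
    float4 Color    : COLOR0;
};

VS_OUTPUT VS(VS_INPUT Input)
{
    VS_OUTPUT Out;
    Out.Position = mul(float4(Input.Position, 1.0f), worldViewProj);
    Out.Color    = Input.Color;

    // The raw world-unit distance arrives in the framebuffer clamped to 1,
    // while this normalized 0-1 version comes through as expected:
    Out.Color.w  = (distance(Input.Position, cameraPosition) - nearClip) / farClip;
    return Out;
}
```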
Is there a limitation on vertex shaders that prevents these kinds of floating-point operations?