

dssannyasi

Member Since 22 Jul 2010
Offline Last Active Jan 16 2013 02:51 AM

Topics I've Started

Vertex Shader Clamps 0-1... Need float!

15 January 2013 - 06:34 PM

I have written a point cloud viewer using SlimDX and DirectX 9.

We use it to visualize and render images from some content creation software we're building here.

Currently my renderer can display points and polys. The main purpose is to render EXR files with color and zDepth information (floating-point values > 1) in the alpha channel.

 

My render target is 32-bit float, and I dump the framebuffer to an EXR file.

When rendering polys it works perfectly; I just calculate the color in the pixel shader like this (HLSL):

 

color = Input.color;
color.w = distance(Input.Position, cameraPosition); // camera distance written into alpha

 

and I get values in real-world units away from the camera. So if an object is 1000 units away, the alpha of the EXR shows 1000 as that pixel's raw value.
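
For context, here's roughly what that pixel shader looks like in full (just a sketch; the struct layout and constant names are approximations of my real code, and the world-space position comes in through a TEXCOORD):

float3 cameraPosition;           // set from the application each frame

struct PS_INPUT
{
    float4 color    : COLOR0;
    float3 worldPos : TEXCOORD0; // world-space position passed down from the VS
};

float4 PS_Main(PS_INPUT Input) : COLOR0
{
    float4 color = Input.color;
    // raw camera distance in world units, written into alpha so it ends up
    // in the 32-bit float render target
    color.w = distance(Input.worldPos, cameraPosition);
    return color;
}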

Now I'm trying to render just the vertices of point clouds using a vertex shader with the same calculation as above, but my depth values get clamped to 1.

 

If I change the above code to:

 

color = Input.color;
color.w = (distance(Input.Position, cameraPosition) - near) / far;

(near and far are my camera clipping distances.) This scales the depth values into the 0-1 range and they do come through that way, but I need greater precision and real-world units!
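
To make the setup concrete, here's roughly what the point-cloud vertex shader looks like (a sketch; the struct layout and constant names approximate my real code, positions are assumed to already be in world space, and near/far are renamed nearClip/farClip here):

float4x4 worldViewProj;
float3   cameraPosition;
float    nearClip;   // camera near clipping distance (the "near" above)
float    farClip;    // camera far clipping distance (the "far" above)

struct VS_INPUT
{
    float4 Position : POSITION;
    float4 color    : COLOR0;
};

struct VS_OUTPUT
{
    float4 Position : POSITION;
    float4 color    : COLOR0;   // the depth I write here arrives clamped to 1
};

VS_OUTPUT VS_Main(VS_INPUT Input)
{
    VS_OUTPUT Output;
    Output.Position = mul(Input.Position, worldViewProj);

    Output.color = Input.color;
    // what I actually want: raw world-unit distance (this is what gets clamped)
    //Output.color.w = distance(Input.Position.xyz, cameraPosition);

    // 0-1 workaround that does come through, but loses the real units
    Output.color.w = (distance(Input.Position.xyz, cameraPosition) - nearClip) / farClip;

    return Output;
}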

 

Is there a limitation on vertex shaders that won't allow these kinds of floating point operations?

 

Thanks


SlimDX - No depth sorting on a render target

25 October 2010 - 08:46 PM

It has been a while since my first post on this subject (been working on some other projects)... but it's been 3 months with no response to what I would think is a simple problem :(

Here is the original post

Z Depth Error (order vs distance)

The subject really says it all. I've got a GUI where the 3D view works perfectly, but as soon as I switch to a floating-point render target my depth sorting stops working.

With the exception of the floating-point format of the renderTarget, there is literally no code-wise difference between rendering to the viewport and to the renderTarget.

Any assistance would be appreciated.

Z Depth Error (order vs distance)

22 July 2010 - 08:23 AM

Using:
DirectX9 via SlimDX (Feb 2010)


I'm creating an application to visualize point cloud data, and I'm trying to render out floating point images, specifically depth.

Problem is:
In the viewport everything looks fine, but when I switch the render target to a different Surface and save the resulting image to disk, my objects aren't rendered in the correct z order. What appears to be happening is that the surface draws the primitives in the order it receives them, so a cube it receives last gets rendered on top even if it's behind other objects.

On a side note, I would prefer to output this data in EXR format (currently DDS is the only float format that works). Does anyone know of something for C# that I may be able to use, or a way to get an array out of the surface object?
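
For reference, this is roughly how I imagine reading the pixels back into an array would look (just a sketch; ReadSurfacePixels is a made-up helper name, and it assumes the surface is not multisampled, since GetRenderTargetData can't read multisampled surfaces directly):

using System.IO;
using SlimDX;
using SlimDX.Direct3D9;

// Copies an A32B32G32R32F render target into a float array on the CPU.
static float[] ReadSurfacePixels(Device device, Surface renderTarget)
{
    SurfaceDescription desc = renderTarget.Description;

    // System-memory copy of the render target that the CPU is allowed to lock.
    using (Surface staging = Surface.CreateOffscreenPlain(
        device, desc.Width, desc.Height, desc.Format, Pool.SystemMemory))
    {
        device.GetRenderTargetData(renderTarget, staging);

        DataRectangle rect = staging.LockRectangle(LockFlags.ReadOnly);
        try
        {
            int floatsPerRow = desc.Width * 4;   // RGBA, one 32-bit float per channel
            float[] pixels = new float[floatsPerRow * desc.Height];

            for (int y = 0; y < desc.Height; y++)
            {
                // Pitch is in bytes and may be larger than Width * 16.
                rect.Data.Seek((long)y * rect.Pitch, SeekOrigin.Begin);
                float[] row = rect.Data.ReadRange<float>(floatsPerRow);
                row.CopyTo(pixels, y * floatsPerRow);
            }
            return pixels;
        }
        finally
        {
            staging.UnlockRectangle();
        }
    }
}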

Any help... please.

Here's the basics of my code, paraphrased a bit:


using SlimDX;
using SlimDX.Direct3D9;
using SlimDX.Windows;

static void Main()
{
    UI ui = new UI();                       // my WinForms window with the viewer panel
    PresentParameters pp = new PresentParameters();
    Direct3D d3d = new Direct3D();
    Device device = new Device(d3d, 0, DeviceType.Hardware, ui.viewer.Handle,
                               CreateFlags.HardwareVertexProcessing, pp);

    Surface windowSurface = device.GetRenderTarget(0);
    Surface renderSurface = Surface.CreateRenderTarget(device,
        pp.BackBufferWidth, pp.BackBufferHeight,
        Format.A32B32G32R32F, MultisampleType.FourSamples, 4, false);

    bool renderFrame = false;               // set to true when a frame should be saved to disk

    MessagePump.Run(ui, () =>
    {
        if (renderFrame)
            device.SetRenderTarget(0, renderSurface);     // switch to the off-screen surface

        device.SetRenderState(RenderState.ZEnable, true); // this doesn't help

        Draw();  // regardless of surface, the draw code is always the same

        if (renderFrame)
        {
            Surface.ToFile(renderSurface, "render.dds", ImageFileFormat.Dds); // save file (filename here is just illustrative)
            device.SetRenderTarget(0, windowSurface);     // set target back to the UI
            renderFrame = false;
        }
    });
}
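
I'm also wondering whether the off-screen render target needs its own depth-stencil surface with matching multisampling. A sketch of what I mean (the D24X8 format and the exact settings here are assumptions, not what's currently in my code):

// Depth-stencil surface sized and multisampled to match the off-screen render target.
Surface depthSurface = Surface.CreateDepthStencil(device,
    pp.BackBufferWidth, pp.BackBufferHeight,
    Format.D24X8,                     // a common depth format; needs to be one the card supports
    MultisampleType.FourSamples, 4,   // must match the render target's multisampling
    true);

// Before drawing into the off-screen target:
device.SetRenderTarget(0, renderSurface);
device.DepthStencilSurface = depthSurface;
device.SetRenderState(RenderState.ZEnable, true);
device.SetRenderState(RenderState.ZWriteEnable, true);

// Clear both color and depth every frame so stale depth values
// don't make the draw order look wrong.
device.Clear(ClearFlags.Target | ClearFlags.ZBuffer, new Color4(0f, 0f, 0f, 0f), 1.0f, 0);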
