# volume rendering errors at boundary of object of interest


## Recommended Posts

Say I have a simple volume dataset defined as follows. Every point within a distance of 0.25 from the center of the volume cube has grayscale value 128; all other points have grayscale value 0. In other words, the volume is just a sphere. Now say I have a transfer function that maps 0 to rgba=(0,0,0,0) and 128 to (1,1,1,1). The problem I am seeing is that for voxels near the boundary of the sphere, the hardware trilinear interpolation produces grayscale values between 0 and 128. For the sake of example, suppose the interpolated value is 64, and that the transfer function maps 64 to (0,0,1,1). This causes a bluish noise to appear on the boundary of the volume. The problem, of course, is the linear interpolation across a discontinuity. However, I don't want to use point filtering because that gives me a blocky appearance everywhere. How is this problem typically solved with volume rendering?

##### Share on other sites

>This causes a bluish noise to appear on the boundary of the volume.

Well, it is not noise; it is the accurate result of the arrangement you've described.

>The problem, of course is the linear interpolation across a discontinuity.

I really fail to see the "problem"; you get back a result consistent with your setup.

>However, I don't want to use point filtering because that gives me a blocky appearance everywhere.

The prescription is quite apparent: to reduce the "blocky" appearance you need to go to a higher order of interpolation; the higher the order, the less "blocky" it will look. This is not very practical advice, since anything beyond tricubic is clearly impractical. For your arrangement, tricubic will definitely make it look smoother, but it will still look quite "blocky".

> How is this problem typically solved with volume rendering?

There is nothing typical about volume rendering unless it is a crappy one.

There is a way to make sphere iso-surface look clean and smooth by means of volume rendering without going to insanely high order of interpolations:

The recommendations below hold for the following setup:
- Interpolation -> Classification (IC) ray casting is applied
- Gradients are computed as a symmetric central difference relative to the sampling point
- The super-sampling level is at least 8 samples per cell, to obtain an artifact-free iso-surface.

1) Change the 3D scalar field: increase the density quadratically from the periphery to the center. Changing it to a linear dependency will show why the quadratic dependency gives a better result.

2) Give the opacity some slope to ensure that the ray has several interactions before it terminates; this makes the surface look softer, since its appearance is formed via multiple ray/scalar-field interactions.

3) Apply Phong lighting to emphasize the surface perfection or its imperfection ;o)

##### Share on other sites
Quote:
 For the sake of example, suppose it is 64, and that the transfer function maps 64 to (0,0,1,1).
Shouldn't it be (0.5,0.5,0.5,0.5)?

Quote:
 How is this problem typically solved with volume rendering?
If you want a clear edge, you'll have to use a threshold function. A simple one would be
return n < 64 ? 0 : 128;

In case you want a clear edge but not a sharp one, you might try some more advanced thresholding, like
low = 0;
high = 128;
tmp = (1 - cos(n / 128.0 * PI)) / 2;
return low * (1 - tmp) + high * tmp;

##### Share on other sites
Quote:
 Well, it is not a noise, it is the accurate result of the arrangement you've described.

Right, it is not noise but the result looks noisy.

Quote:
 I really fail to see the "problem", you get-back the result consistent with your setup.

The problem is that there is a discontinuity between the sphere and its complement in the volume cube. And it is using linear interpolation to make it continuous, which is not right for the setup I want, because I do want a discontinuity. I guess the problem is with my transfer function.

Quote:
 - the super-sampling level should be at least 8 samples per cell to have artifact-free iso-surface.

Can you explain what you mean by supersampling here? Is this referring to CPU ray tracing where you cast multiple rays per pixel? What is the GPU analog (other than rendering to a larger render target and downsampling?)

##### Share on other sites
>The problem is that there is a discontinuity between the sphere and
>its complement in the volume cube. And it is using linear
>interpolation to make it continuous, which is not right for
>the setup I want, because I do want a discontinuity. I guess
>the problem is with my transfer function.

Well, apparently it is a matter of semantics: you use the word "problem" for the arrangement you set up, and within the frame of that arrangement you are getting a consistent result. If you are reluctant to reconsider the setup, then the only option for visualizing a smooth iso-surface with your data layout is to go to an impractically high order of interpolation.

The other options arise only if you change the problem definition; for example, do not show an iso-surface at all, but rather a "cloudy" transition from transparent to opaque, which is a composition of contributions from multiple iso-surfaces. Still, in the case of linear interpolation you have only a single layer of cells with values between 0 and 128, and this layer is the only "blanket" you may use as a "cloudy" transition to reduce the "cubical" appearance. In this case, set an opacity ramp/slope from 0 to 128. To ensure an equal contribution from each iso-surface, the ramp shape should not be linear but logarithmic (the base depends on the size of the opacity quanta). With this arrangement the "issue" will be less pronounced, but still quite blocky...

Another way is to redefine the data itself (see my prev post)

>Can you explain what you mean by supersampling here? Is this referring
>to CPU ray tracing where you cast multiple rays per pixel?

No, in volumetric ray casting "supersampling" usually refers to the sampling density along each ray, i.e. the number of samples per cell along the ray. This measure increases the accuracy of the rendering integral.

Increasing the number of rays per pixel of the projection plane obviously cannot improve the accuracy of the rendering integral... ;o) An excessive number of rays per pixel just helps to smooth edges in the 2D image (known as anti-aliasing, which is a lame term anyway).

>to CPU ray tracing where you

You could say GPU here just as well; it makes no difference from a math point of view, but it does make a huge practical difference in quality and performance.

--sb
