Why does texture-sampling only work at (0,0,0)?

Started by
7 comments, last by ferhua 7 years ago

I want to save the color of every point of one object into a RWTexture3D (UAV resource) and read it back for another object in its shader.

I made a test with two shaders; both snippets below are in the pixel shader.

In the first shader I write:


RWTexture3D<float4> gUAVColor;
gUAVColor[uint3(0,0,0)] = float4(1.0f,0.0f,0.0f,1.0f);

In the second shader:


Texture3D<float4> gVoxelList;
float4 output = gVoxelList.SampleLevel(Filter, uint3(0,0,0),0);

The result is correct: I get red.

But when I change the code, the texture can't be sampled correctly.

In the first shader I write:


RWTexture3D<float4> gUAVColor;
gUAVColor[uint3(1,0,0)] = float4(1.0f,0.0f,0.0f,1.0f);

In second shader:


Texture3D<float4> gVoxelList;
float4 output = gVoxelList.SampleLevel(Filter, uint3(1,0,0),0);

I only changed the position that stores the red color from uint3(0,0,0) to uint3(1,0,0), but the result changed to black, which is incorrect.

If I use gVoxelList[uint3(1,0,0)].xyz, it works.

Does anyone have an idea where the problem might be?

Besides, what's the difference between gVoxelList[pos] and gVoxelList.SampleLevel(Filter, uint3(1,0,0), 0)? Both return the color as a float4, right?

I have tried the Load function:


float4 output = gVoxelList.Load(int4(1, 0, 0, 0));   // x, y, z, mip level

It also works well.

So the Load function and operator[] both work; why does only Sample fail?

Sample/SampleLevel/SampleGrad etc. all take normalised UVW coordinates in the range 0.0f to 1.0f. Exceeding 0-1 in either direction reads out of bounds and invokes the behaviour specified in the sampler for that case. This includes "Clamp", "Mirror", "Wrap", border colour, etc.

If you don't need filtering, then using the [] operator or .Load is the best approach.

Your coordinate uint3(1,0,0) is converted to float3(1.0f, 0.0, 0.0f) which is reading on the extreme "right" side of the 3D texture.
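
As a minimal sketch of that (not from the original post; the 256³ texture size is an assumption based on later replies), the integer texel coordinate needs to be normalized, ideally to the texel centre, before calling SampleLevel:

uint3 texel = uint3(1, 0, 0);

// SampleLevel expects normalized UVW; (texel + 0.5) / dimension lands on the texel centre.
float3 uvw = (float3(texel) + 0.5f) / 256.0f;
float4 output = gVoxelList.SampleLevel(Filter, uvw, 0);

// Passing uint3(1, 0, 0) directly is implicitly converted to float3(1.0f, 0.0f, 0.0f),
// i.e. u = 1.0, which addresses the far edge of the texture in X.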

Adam Miles - Principal Software Development Engineer - Microsoft Xbox Advanced Technology Group

This is because the bracket operator requires a texel location in [(0,0,0)..(width-1,height-1,depth-1)], while a sampler requires a normalized texture coordinate: the centre of the first texel is then (0.5/width, 0.5/height, 0.5/depth) and the centre of the last is (1-0.5/width, 1-0.5/height, 1-0.5/depth).

DX11 doesn't have the half-pixel offset anymore, thankfully, so the first pixel should be (0, 0, 0) and the last pixel should be (1, 1, 1).

No, galop1n is correct. The left edge of the left-most texel is at u=0 and the right edge of the right-most texel is at u=1. The center of the leftmost texel is at u=0.5/width and the center of the rightmost texel is at u=(width-0.5)/width (aka u=1-0.5/width). If you use linear filtering and wrapped coordinates, then u=0 and u=1 are actually the same coordinate, and both result in a 50% mix of the left-most and right-most texels. When doing precise sampling of pixels, you need to get these coordinates right, or alternatively, use the new Load instructions that let you use integer pixel addressing :D
These statements are true in D3D11, D3D9 and GL (and practically everything else).

D3D9's silly half-pixel offset bug (feature?) is to do with the definition of screen pixel coordinates, not texture coordinates. D3D9 made the mistake of declaring the center of the left-most pixels in the rasterization grid to be at xNDC=-1... instead of the sane thing to do, which is the left edge of the leftmost pixels being at xNDC=-1. This choice meant that screen coords and texture coords didn't match up without some manual shader trickery (or non-intuitive tex coord generation in your buffers) for any screen-aligned passes (such as post processing). It also meant that MSAA was straight up broken for pixels on the left and top of the screen, as half your samples would be outside the viewport :o
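
A small sketch of the integer-addressing alternative mentioned above (gVoxelList is reused from earlier in the thread; no filtering or normalization is involved):

// Load takes integer coordinates; for a Texture3D the fourth component selects the mip level.
float4 a = gVoxelList.Load(int4(1, 0, 0, 0));

// The bracket operator is the equivalent integer access for mip 0.
float4 b = gVoxelList[uint3(1, 0, 0)];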

Yes, Good update Hodgman :)

This one is fixed, but there's still a problem... please read the next post.

Thank you guys, I have a new problem with sampling while doing voxelization.

I only return the following color in the pixel shader.

The pictures are visualizations of the voxelization. The first one looks fine; I get it using:


// vin.index runs over all 256^3 voxels; recover the integer position,
// with x, y, z each in [0, 256).
uint VoxelDim = 256;
uint sliceNum = VoxelDim * VoxelDim;

uint z = vin.index / sliceNum;
uint temp = vin.index % sliceNum;
uint y = temp / VoxelDim;
uint x = temp % VoxelDim;
uint3 pos = uint3(x, y, z);

float3 color = gVoxelList[pos].xyz;
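
For reference (an assumption about how the voxels were written, inferred from the decode above), the matching flatten step would be:

// Hypothetical encode matching the decode above: x varies fastest, then y, then z.
uint index = x + y * VoxelDim + z * VoxelDim * VoxelDim;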

Next, I changed operator[] to the sampling function:


float3 color = gVoxelList.SampleLevel(SVOFilter, pos / 256.0f, 1).xyz;

This gives the second picture; you can see it has very strange "triangle-like" pixels...

I have no idea where the problem is... Is it a precision problem in the voxelization?

I have checked the positions: every point in the visualization (every point I need to read from the 3D texture) is in the right order and corresponds to the point in the voxelization (every point I save into the RWTexture3D).

visualie.PNG

sample.PNG

OK, I am sorry: in one .fx it works, but in the other one, which needs to transform from world coordinates to voxel coordinates, it doesn't...

The following code transforms pixels from world position to voxel position.


float3 world_to_svo(float3 posW, float voxel_size, float3 offset)
{
    float3 pos = posW;
    pos = (pos + offset) / voxel_size;
    return pos;
}

So in voxelization.fx, I do the following to save the voxel position.


svoPos = world_to_svo(posW.xyz, gVoxelSize, gVoxelOffset);
gUAVColor[svoPos] = float4(litColor, 1.0f);
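
One detail worth noting here (general HLSL behaviour, not something stated in the post): svoPos is a float3, so indexing the UAV with it relies on an implicit float-to-uint conversion. A sketch with the truncation made explicit:

// Explicitly snap the voxel-space position to an integer texel before writing.
// world_to_svo, gVoxelSize, gVoxelOffset and litColor are taken from the post above.
uint3 svoTexel = (uint3)floor(world_to_svo(posW.xyz, gVoxelSize, gVoxelOffset));
gUAVColor[svoTexel] = float4(litColor, 1.0f);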

And in the other .fx, I want to get back the color that I saved in the texture:


float3 pos_svo = world_to_svo(posW, gVoxelSize, gVoxelOffset);
color = gVoxelList.SampleLevel(SVOFilter, pos_svo / gDim + 0.5f / 256.0f, 0);
color = gVoxelList[pos_svo];

Neither of the two methods works, and I get a wrong picture like the one I posted before...

It may be a small offset in the sampling texture coordinate, because when I set gDim to 64 it works, but when I set it to 128 or 256 it is wrong...

I use the same method so I don't know how to fix this bug.
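
If it really is a half-texel offset, one possible fix (a sketch only; gDim, SVOFilter and world_to_svo are taken from the posts above, and whether this matches the actual bug is an assumption) is to snap to the voxel that was written and then sample its centre:

// Snap to the voxel written during voxelization, then sample its centre in 0..1 space.
float3 pos_svo = world_to_svo(posW, gVoxelSize, gVoxelOffset);
float3 uvw = (floor(pos_svo) + 0.5f) / gDim;
float4 color = gVoxelList.SampleLevel(SVOFilter, uvw, 0);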

Anyone have some idea? Thank you very much!

The first picture is at a resolution of 128 and the second at 256; you can see from the comparison that the higher the resolution, the more imprecise it gets.

128.PNG

256.PNG

This topic is closed to new replies.
