Interval mapping problem

So I created a shader based on linear + binary search steps, but now I need to turn it into an interval mapping shader (intersecting the view ray with the line between the two bounds instead of just bisecting). However, I'm getting strange artifacts and I don't know why. Please help me.



for (int i = 0; i < binarySearchSteps; i++)
{
    // Line through the two bounding samples; x = distance along the ray, y = depth.
    float line_slope = (bound_A.y - bound_B.y) / (bound_A.x - bound_B.x);
    float line_inter = bound_A.y - (line_slope * bound_A.x);

    // Intersect that line with the view ray y = (viewDir.y / viewDir.x) * x.
    float dem = (IN.viewDir.y / IN.viewDir.x) - line_slope; // denominator of the solve
    float inter_pt = line_inter / dem;
    float best_depth;

    // NOTE: this must be a float2, not a float (it is a 2D texcoord offset).
    float2 tex_coords_offset2D = inter_pt * float2(IN.viewDir.y, -IN.viewDir.x);
    float int_depth = (IN.viewDir.y / IN.viewDir.x) * inter_pt; // ray depth at the intersection
    float pixel_color = tex2Dlod(_DispMap0, float4(tex_coords_offset2D + IN.uv_Splat0, 0, 0)).a;

    if (pixel_color < int_depth) // surface lies above the ray point here: new upper bound
    {
        bound_A.y = pixel_color;
        bound_A.x = inter_pt;
        best_depth = bound_A.y;
    }
    else // surface lies at or below the ray point: new lower bound
    {
        bound_B.y = pixel_color;
        bound_B.x = inter_pt;
        best_depth = bound_B.y;
    }

    // Convert the best depth back to a texcoord offset and accumulate into p
    // (p is assumed to be declared earlier as the running texture coordinate).
    tex_coords_offset2D = ((1.0f / (IN.viewDir.y / IN.viewDir.x)) * best_depth) * float2(IN.viewDir.y, -IN.viewDir.x);
    p.xy += tex_coords_offset2D;
}
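
For anyone trying to reproduce this: the loop above assumes a prologue (not shown in the post) that runs the initial linear search and brackets the surface crossing in bound_A/bound_B as (ray distance, sampled depth) pairs. Here is a minimal sketch of what that prologue might look like, matching the conventions the loop uses; the linearSearchSteps count, the fallback initialization, and the uniform step size are my assumptions, not code from the original shader:

// Hypothetical linear-search prologue (assumed, not from the original post):
// march along the view ray in uniform steps and record the two samples that
// bracket the first surface crossing as (distance, depth) pairs.
float ray_slope = IN.viewDir.y / IN.viewDir.x;      // ray depth gained per unit distance
float step_size = 1.0f / linearSearchSteps;         // linearSearchSteps is assumed

float2 bound_B = float2(0.0f, tex2Dlod(_DispMap0, float4(IN.uv_Splat0, 0, 0)).a);
float2 bound_A = float2(1.0f, 1.0f);                // fallback if no crossing is found

for (int j = 1; j <= linearSearchSteps; j++)
{
    float dist = j * step_size;                     // distance along the ray
    float ray_depth = ray_slope * dist;             // ray depth at this distance
    float2 uv = dist * float2(IN.viewDir.y, -IN.viewDir.x) + IN.uv_Splat0;
    float map_depth = tex2Dlod(_DispMap0, float4(uv, 0, 0)).a;

    if (map_depth < ray_depth)  // ray has passed below the surface: upper bound
    {
        bound_A = float2(dist, map_depth);
        break;
    }
    bound_B = float2(dist, map_depth);              // still above the surface: advance lower bound
}

One thing worth double-checking either way: p.xy += tex_coords_offset2D; accumulates an offset on every refinement iteration, so unless p is reset from the original texture coordinate each pass, the offsets compound across iterations, which could plausibly produce the kind of artifacts described.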
