[Software Rasterizer] Texture Mapping help

5 comments, last by lipsryme 10 years, 4 months ago

I've gotten some progress done on my software rasterizer, and I'm now trying to implement texture mapping.

Though it seems like I'm doing something wrong.

Have a look at some of the relevant code snippets first:

I interpolate the UVs using my three normalized [0,1] barycentric weights:


Vector2 interpolatedTexCoord(v0.TexCoord.X() * lambda0 +
                             v1.TexCoord.X() * lambda1 +
                             v2.TexCoord.X() * lambda2,
                             v0.TexCoord.Y() * lambda0 +
                             v1.TexCoord.Y() * lambda1 +
                             v2.TexCoord.Y() * lambda2);

Then pass these to my pixel shader which samples the Vector4 color value:


// Implement pixel shading here...
Vector4 PS(const PSInput &input)
{
	Vector4 output(0, 0, 0, 1);			
				
	// Sample texture
	output = SampleTex2DPoint(this->textures[0], input.TexCoord);


	return output;
}

And last the actual sampling function (where I think the error is):


static Vector4 SampleTex2DPoint(Texture2D* tex, const Vector2 &uv)
{
	unsigned int sampleX = static_cast<unsigned int>(uv.X() * (tex->width - 1));
	unsigned int sampleY = static_cast<unsigned int>(uv.Y() * (tex->height - 1));

	Rgb pixelValue = tex->pixels[sampleX + sampleY * tex->width];
	return Vector4(pixelValue.r, pixelValue.g, pixelValue.b, 1);
}
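As an aside, a point sampler along these lines usually wants its coordinates clamped (or wrapped) before indexing: a uv even slightly below zero would wrap around to a huge unsigned index after the cast. A sketch with clamp-to-edge addressing, using plain structs in place of the rasterizer's own types (the names here are illustrative, not the actual engine API):

```cpp
struct Rgb { float r, g, b; };
struct Texture2D { int width, height; const Rgb* pixels; };

// Point (nearest-neighbour) sampling with clamp-to-edge addressing.
Rgb SampleTex2DPointClamped(const Texture2D& tex, float u, float v)
{
    // Clamp to [0,1] so out-of-range coordinates can't index out of bounds.
    if (u < 0.0f) u = 0.0f; else if (u > 1.0f) u = 1.0f;
    if (v < 0.0f) v = 0.0f; else if (v > 1.0f) v = 1.0f;

    int x = static_cast<int>(u * (tex.width  - 1));
    int y = static_cast<int>(v * (tex.height - 1));
    return tex.pixels[x + y * tex.width];
}
```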

Does anyone have an idea what's going wrong there?

The output I get looks like this:

screen: http://d.pr/i/TlLX

video:

In the video you can see that it's also warping during rotation. Does that mean the interpolation isn't perspective-correct? I thought using these barycentric coordinates would already take care of that.


Yeah, you're not doing perspective-correct texture mapping. Barycentric coordinates won't help with perspective correction on their own. All barycentric coordinates are is a representation of a point as a linear combination of other points. The problem is that your texture coordinates are not linear in screen space, so you can't simply interpolate them with screen-space barycentric weights. You'll want to divide your texture coordinates by the vertex's z coordinate, interpolate those u/z and v/z values (along with 1/z) across the polygon, then divide the interpolated u/z and v/z by the interpolated 1/z to recover the "real" texture coordinates, which will be perspective-correct.

Here's a really good article (plus some code, linked at the bottom) which I think explains the issue extremely well: http://www.lysator.liu.se/~mikaelk/doc/perspectivetexture/
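The scheme described above can be sketched like this, using a plain struct in place of the rasterizer's own vector types (the function and parameter names are illustrative; z0..z2 are the view-space depths of the three vertices):

```cpp
struct Vec2 { float x, y; };

// Perspective-correct UV interpolation across a triangle.
// lambda0..2 are the screen-space barycentric weights.
Vec2 InterpolateUV(Vec2 uv0, Vec2 uv1, Vec2 uv2,
                   float z0, float z1, float z2,
                   float lambda0, float lambda1, float lambda2)
{
    // u/z, v/z and 1/z ARE linear in screen space, so interpolate those...
    float invZ   = lambda0 / z0          + lambda1 / z1          + lambda2 / z2;
    float uOverZ = lambda0 * uv0.x / z0  + lambda1 * uv1.x / z1  + lambda2 * uv2.x / z2;
    float vOverZ = lambda0 * uv0.y / z0  + lambda1 * uv1.y / z1  + lambda2 * uv2.y / z2;

    // ...then divide by the interpolated 1/z to recover the true UV.
    return { uOverZ / invZ, vOverZ / invZ };
}
```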

Thanks for the info and the link, but it didn't seem to fix anything. Any idea why?

Update: I managed to fix the UV problem, but it's still not perspective-correct, hmm...

screen: http://d.pr/i/qPLg

The z position I'm dividing by is the vertex z position after the perspective divide (is that wrong?)

Update2: I've managed to get it working, but it's not Z that I'm dividing by, it's the homogeneous w coordinate.

I've read something about that being the same as the view-space z? Is that correct?

Updated code (the earlier version, dividing by the post-divide Z, which doesn't work):


// Perspective correct interpolate UV's 
Vector3 v0_perspective_UV = Vector3(v0.TexCoord / v0.Position.Z(), 1.0f / v0.Position.Z());
Vector3 v1_perspective_UV = Vector3(v1.TexCoord / v1.Position.Z(), 1.0f / v1.Position.Z());
Vector3 v2_perspective_UV = Vector3(v2.TexCoord / v2.Position.Z(), 1.0f / v2.Position.Z());

// Interpolate
Vector2 interpolatedTexCoord(v0_perspective_UV.X() * lambda0 +
                             v1_perspective_UV.X() * lambda1 +
                             v2_perspective_UV.X() * lambda2,
                             v0_perspective_UV.Y() * lambda0 +
                             v1_perspective_UV.Y() * lambda1 +
                             v2_perspective_UV.Y() * lambda2);

float interpolated1OverZ = v0_perspective_UV.Z() * lambda0 +
                           v1_perspective_UV.Z() * lambda1 +
                           v2_perspective_UV.Z() * lambda2;
						
// Divide by 1/z
interpolatedTexCoord = interpolatedTexCoord / interpolated1OverZ;


Update2: I've managed to get it working, but it's not Z that I'm dividing by, it's the homogeneous w coordinate.
I've read something about that being the same as the view-space z? Is that correct?

That's correct! I should have been clearer about that. The Z you divide by is the view-space Z, which gets stuffed into the W component by the projection transformation.
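For reference, here's why clip-space w equals view-space z: it falls straight out of a typical perspective projection matrix. A minimal sketch, assuming a left-handed D3D-style projection (only the terms relevant to this discussion are kept; f would be cot(fov/2), and the aspect ratio is omitted):

```cpp
struct Vec4 { float x, y, z, w; };

// Project a view-space point (x, y, z, 1) with a typical perspective
// matrix. The key point: the output w is the view-space z itself, so
// dividing attributes by clip-space w is dividing by view-space z.
Vec4 ProjectW(Vec4 v, float f, float zNear, float zFar)
{
    float q = zFar / (zFar - zNear);
    return { v.x * f, v.y * f, (v.z - zNear) * q, v.z };  // w = view-space z
}
```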

To dig further into perspective interpolation... I couldn't really figure this out, but do I need to do this for every vertex attribute (normals, tangents, ... custom attributes like world-space position)?

Thinking in GPU shader terms, everything I output from the VS is interpolated like this (if I don't mark it as nointerpolation or similar, that is)?

I read that the depth z/w can be interpolated linearly in screen space, so that means I don't need to do it for the depth, right?


To dig further into perspective interpolation... I couldn't really figure this out, but do I need to do this for every vertex attribute (normals, tangents, ... custom attributes like world-space position)?

Yes. In DX/GL there are linear interpolation modes without perspective correction, but for all the 'regular' attributes you need perspective correction.


Thinking in GPU shader terms, everything I output from the VS is interpolated like this (if I don't mark it as nointerpolation or similar, that is)?

There is also constant interpolation, which uses the leading vertex's attributes. But in general that's true - the default is linear with perspective correction.


I read that the depth z/w can be interpolated linearly in screen space, so that means I don't need to do it for the depth, right?

Depth doesn't require perspective correction; z/w is already linear in screen space.

Alright, thanks for clearing that up!

This might not be exactly related, but do you have an idea what's going wrong with my normal mapping?

The normals look fine, but both the tangent and bitangent seem to be wrong. As soon as I use the bump normal, my mesh has some kind of lighting seam and further dark spots.

I suspect the problem occurs after the TBN transform...

The vertex inputs should be fine; I've reused the same mesh from my engine.

VS:


// Transform normal to world space
Vec4_SSE_Transform(this->cbTransforms.World, input.normal);

// Transform tangent to world space
Vec4_SSE_Transform(this->cbTransforms.World, input.tangent);
				
// Transform biTangent to world space
Vec4_SSE_Transform(this->cbTransforms.World, input.biTangent);

I transform normal, tangent, bitangent to world space in the vertex shader

Then interpolate all three vectors:

http://d.pr/i/zplz (sorry, the code formatting is a mess; a screenshot was a lot faster)

And in the PS:


Vector4 Main(PSInput &input)
{			
       // Sample Diffuse Map
       Vector4 diffColor = SampleTex2DLinear(this->textures[0], input.TexCoord);

	// Sample Normal Map
	Vector4 normalMapColor = SampleTex2DLinear(this->textures[1], input.TexCoord);
	normalMapColor = normalMapColor * 2.0f - 1.0f;
	Vector4_SSE bumpNormal(normalMapColor.X(), 
                               normalMapColor.Y(), 
                               normalMapColor.Z(),
                               1.0f);

	Vec4_SSE_Normalize(bumpNormal);
				
	// Normalize normals, tangent, bitangent
	Vec4_SSE_Normalize(input.Normal);
	Vec4_SSE_Normalize(input.Tangent);
	Vec4_SSE_Normalize(input.BiTangent);
	
	Matrix4x4_SSE TBN(input.Tangent,
                          input.BiTangent,
                          input.Normal,
                          Vector4_SSE(0, 0, 0, 1));

	Vec4_SSE_Transform(TBN, bumpNormal);
	Vec4_SSE_Normalize(bumpNormal);
				
	// Lambertian diffuse
	float cosTheta = Vec4_SSE_Dot(bumpNormal, L);
	if(cosTheta < 0) cosTheta = 0;

	// Calculate V
	Vector4_SSE H = input.PosWS;
	Vec4_SSE_Sub(H, this->cbTransform.CamPosition);

	Vec4_SSE_Add(H, L);
	Vec4_SSE_Normalize(H);

	float NdotH = Vec4_SSE_Dot(input.Normal, H);
	if(NdotH < 0) NdotH = 0;

	float blinnPhong_ndf = pow(NdotH, 120);
				
				
	return Vector4(diffColor.X() * cosTheta + blinnPhong_ndf,
		       diffColor.Y() * cosTheta + blinnPhong_ndf,
                       diffColor.Z() * cosTheta + blinnPhong_ndf,
		       1.0f);
}
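A few things in the PS above look suspect, independent of the tangent data itself: the normal-map vector carries w = 1.0f into a 4-component normalize (which skews the xyz direction); the view vector is computed as PosWS - CamPosition, i.e. pointing from the camera toward the surface instead of the other way round; and the specular NdotH uses input.Normal rather than the bump normal. A sketch of the corrected flow with plain 3-component math (the SSE types are replaced by a simple struct here, so this is an illustration of the logic rather than a drop-in fix):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

Vec3  operator+(Vec3 a, Vec3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
Vec3  operator-(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
Vec3  operator*(Vec3 a, float s) { return { a.x * s, a.y * s, a.z * s }; }
float Dot(Vec3 a, Vec3 b)       { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3  Normalize(Vec3 v)         { return v * (1.0f / std::sqrt(Dot(v, v))); }

// Corrected specular flow: w never enters the direction math, the view
// vector points toward the camera, and NdotH uses the bump normal.
float ShadeSpecular(Vec3 tangent, Vec3 bitangent, Vec3 normal,
                    Vec3 normalMapSample,  // already remapped to [-1,1]
                    Vec3 posWS, Vec3 camPos, Vec3 L)
{
    // TBN transform: the basis vectors are re-normalized after interpolation.
    Vec3 t = Normalize(tangent), b = Normalize(bitangent), n = Normalize(normal);
    Vec3 bump = Normalize(t * normalMapSample.x +
                          b * normalMapSample.y +
                          n * normalMapSample.z);

    // View vector points from the surface TOWARD the camera.
    Vec3 V = Normalize(camPos - posWS);
    Vec3 H = Normalize(V + L);

    float NdotH = Dot(bump, H);  // bump normal, not the vertex normal
    if (NdotH < 0.0f) NdotH = 0.0f;
    return std::pow(NdotH, 120.0f);
}
```

The same reasoning applies in the VS: tangent/bitangent/normal are directions, so their w component should be 0 when transformed by the world matrix, or translation will leak into them.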

This topic is closed to new replies.
