I've made some progress on my software rasterizer and I'm now trying to implement texture mapping, but it seems like I'm doing something wrong.
Have a look at the relevant code snippets first:
I interpolate the UVs using my three normalized [0,1] barycentric weights:
Vector2 interpolatedTexCoord(
    v0.TexCoord.X() * lambda0 + v1.TexCoord.X() * lambda1 + v2.TexCoord.X() * lambda2,
    v0.TexCoord.Y() * lambda0 + v1.TexCoord.Y() * lambda1 + v2.TexCoord.Y() * lambda2);
Then I pass these to my pixel shader, which samples the Vector4 color value:
// Pixel shader: samples the bound texture at the interpolated UVs.
Vector4 PS(const PSInput &input)
{
    Vector4 output = SampleTex2DPoint(this->textures[0], input.TexCoord);
    return output;
}
And last, the actual sampling function (where I think the error is):
static Vector4 SampleTex2DPoint(Texture2D* tex, const Vector2 &uv)
{
    unsigned int sampleX = static_cast<unsigned int>(uv.X() * (tex->width - 1));
    unsigned int sampleY = static_cast<unsigned int>(uv.Y() * (tex->height - 1));
    Rgb pixelValue = tex->pixels[sampleX + sampleY * tex->width];
    return Vector4(pixelValue.r, pixelValue.g, pixelValue.b, 1);
}
Does anyone have an idea what's going wrong here?
The output I get looks like this:
screen: http://d.pr/i/TlLX
In the video you can see that it also warps during rotation. Does that mean the interpolation isn't perspective-correct? I thought using these barycentric coordinates would already take care of that.