irreversible

Projective texturing: doing it manually


In short: I need to map the texture coordinates of my vertices (I'm using triangles) into an atlas texture, and later access the individual face textures stored in the atlas for image editing, taking 3D world coordinates as input. I need to do this myself, programmatically.

In more detail: I've generated the geometry, I've generated the atlas and I've mapped the atlas to the geometry. However, I'm experiencing confusion regarding updating the face textures based on world coordinates. The whole process is a lot like baking decals into textures: if a bullet hits a wall, I need to update the atlas texture for that face based on the point of impact (which I initially know in world space). It's also like painting a model in Blender or Photoshop, which lets you draw directly on the model while the software updates the underlying 2D texture automatically (provided UV coords have been assigned).

Problems arise when creating and interpreting UV coordinates for individual faces: firstly, how do I decide the proper vertex/UV coord rotation for faces with different orientations (based on their major planes?) so that all face textures are stored in a proper and predictable way? Secondly, how precisely do I go about projecting a world space coordinate into localized texture space? I can't figure out how to construct the "texture matrix" (is that what it's called?). I've done my share of searching, so if anyone could explain the thought process behind it or share some useful resources that explain the transformation process in more detail, I'd appreciate it a lot!

Okay, it appears yesterday was a long day and I massively overcomplicated the problem for myself. Figuring out how to store UV coords, and how to extract them, isn't that tough after all. What is turning out to be somewhat more challenging, however, is establishing rotational consistency when generating the UV coords and updating the stretched atlas properly.

The first problem is consistently generating texture coordinates from world space points and later accessing them. Right now I have a brute-force approach that:

- receives a number of triangles as input
- computes the min and max extent of all the points (vMin, vMax)
- for each vertex then, the atlas UV coordinates are computed as:


//world space bounding box for the geometry
IVector3D vExt = vMax - vMin;

//vLocal is the relative position of a vertex in bounded local space
//(computed for each vertex i in the mesh)
IVector3D vLocal = (mesh->vertices[i] - vMin) / vExt;

//division by zero is not handled explicitly, hence this hack
if(vExt.x < EPSILON) vLocal.x = 0.0f;
if(vExt.y < EPSILON) vLocal.y = 0.0f;
if(vExt.z < EPSILON) vLocal.z = 0.0f;

//this is the iffy bit - projection: move two of the major axis components to the X and Y axes and zero out the Z axis
if(vExt.x > vExt.y && vExt.z > vExt.y) { vLocal.y = vLocal.z; }
if(vExt.y > vExt.x && vExt.z > vExt.x) { vLocal.x = vLocal.z; }
vLocal.z = 0;

//calculate the atlas UV coordinates from the (projected) world position and the atlas rectangle (expressed as a minimum UV and the size of the UV rect - its extents)
IVector3D vAtlas = vLocal * vAtlasUVExt + vAtlasUVMin;


- this generates UV coords that are usable, BUT either this code or my extraction code is not consistent, as is shown by problem number 2

The second problem has to do with properly updating the atlas given a 3D point on a face, more specifically computing the scale coefficient that is required when editing the atlas directly. Consider a rectangular atlas with a non-rectangular face texture placed at an arbitrary position. The edge ratio of the face texture is proportional to the geometric edge ratio inherent in the actual vertex positions. Now, while the texture coordinates of the entire atlas run from [0, 0] to [1, 1], let's say the texture coordinates of the face texture within it run from [0.2, 0.15] to [0.4, 0.20], making the size of the face texture [0.2, 0.05]. Now, let's say I want to draw a circle with radius X on the face. When I access the atlas, I need to account for the fact that the face's texture is not square. As such, the shader code that would draw the circle looks something like this:



//tc is the current texture coordinate from the vertex shader
varying vec2 tc;

//these are assumed to be supplied by the application
uniform vec2 vCircle;         //circle centre in atlas UV space
uniform vec2 vScalingFactor;  //compensates for the face texture's aspect ratio
uniform float fCircleRadius;
uniform vec4 someColor;

void main()
{
    vec3 dist;
    dist.x = (tc.x - vCircle.x) / vScalingFactor.x;
    dist.y = (tc.y - vCircle.y) / vScalingFactor.y;
    dist.z = 0.0;

    float fLen = length(dist);
    //fCircleRadius is effectively expressed in texture coordinate units
    if(fLen <= fCircleRadius)
        gl_FragData[0] = someColor;
    else
        discard;
}



This raises the question of how vCircle and vScalingFactor should be calculated so that the circle remains the same size and unslanted regardless of the shape of the face texture region it is being drawn on.

The center of the circle in UV coords is relatively straightforward to compute:


//calculate the relative local space coordinate from a world space collision point
IVector3D uv = (vCollision[k] - vMin) / vExt;

//deal with division by zero
if(vExt.x < EPSILON) uv.x = 0.0f;
if(vExt.y < EPSILON) uv.y = 0.0f;
if(vExt.z < EPSILON) uv.z = 0.0f;

//project. once again - very iffy!
if(vExt.x > vExt.y && vExt.z > vExt.y) { uv.y = uv.z; }
if(vExt.y > vExt.x && vExt.z > vExt.x) { uv.x = uv.z; }

//convert from local space to atlas-based face texture UV space
uv *= vAtlasUVExt;
uv += vAtlasUVMin;

//flip the y-axis within the face texture region
uv.y = vAtlasUVMax.y - (uv.y - vAtlasUVMin.y);



This bit up to here works most of the time, but it harks back to the first problem: some faces are not oriented consistently. When I update the atlas on the GPU, some face textures come out correct, but some seem to be rotated 90 degrees. The bad bit is that I can't really tell which ones or why.
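As an aside, one way to pin the rotation down (a sketch of my own, though it is essentially what the final solution further down settles on) is to derive the projection axes from the dominant component of the face normal instead of from the AABB extents, so that all faces lying in the same plane get the same (u, v) assignment:


//sketch only: pick the dominant axis of the face normal (IMath::VecAbs is the
//per-component absolute value used later in the thread; faceNormal is assumed available)
IVector3D na = IMath::VecAbs(faceNormal);

int iNormalAxis = 0;                               //normal mostly along X
if(na.y > na.x && na.y > na.z) iNormalAxis = 1;    //mostly along Y
if(na.z > na.x && na.z > na.y) iNormalAxis = 2;    //mostly along Z

//iNormalAxis then selects a fixed in-plane axis pair for (u, v) - e.g. (x, z)
//for a Y-dominant normal - independent of how the face's AABB happens to be shaped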

The other problem is calculating the scaling factor: I'm simply not sure how to do that.
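For what it's worth, here is one plausible derivation of the scaling factor (a sketch under my own assumptions, reusing the names from the code above): a UV offset of vAtlasUVExt across the face texture corresponds to a world-space offset of vExt along the projected axes, so their ratio converts between the two spaces.


//sketch only, not from the original post: UV units per world unit along each
//projected axis. Dividing a UV delta by this factor (as the shader above does)
//yields a world-space distance, which keeps the circle round; note that with
//this convention fCircleRadius is given in world units rather than texture units
IVector3D vScalingFactor;
vScalingFactor.x = vAtlasUVExt.x / vExt.x;  //vExt.x: the world extent mapped to u
vScalingFactor.y = vAtlasUVExt.y / vExt.y;  //vExt.y: the world extent mapped to v
//the same division-by-zero and axis-swap caveats as in the projection code apply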

I'd appreciate some constructive critique! :)

[quote name='irreversible' timestamp='1327459393' post='4905979']The first problem is consistently generating texture coordinates from world space points and later accessing them. Right now I have a brute-force approach that:

- receives a number of triangles as input
- computes the min and max extent of all the points (vMin, vMax)
- for each vertex then, the atlas UV coordinates are computed as:
...[/quote]

I'm afraid I don't understand. Can't you just transform each vertex again with respect to the light source's MVP?


[quote name='irreversible' timestamp='1327459393' post='4905979']The first problem is consistently generating texture coordinates from world space points and later accessing them. Right now I have a brute-force approach that:

- receives a number of triangles as input
- computes the min and max extent of all the points (vMin, vMax)
- for each vertex then, the atlas UV coordinates are computed as:
...

I'm afraid I don't understand. Can't you just transform each vertex again with respect to the light source's MVP?
[/quote]

There are no light, view, or projection matrices involved. I'm dealing with direct mapping of vertices onto a texture from world model coordinates - I'm not trying to project the texture into the scene. Doesn't that come across clearly in my description?

[quote name='irreversible' timestamp='1327487369' post='4906061']There are no light, view, or projection matrices involved. I'm dealing with [b]direct mapping of vertices onto a texture[/b] from world model coordinates - I'm not trying to project the texture into the scene. Doesn't that come across clearly in my description?[/quote]
That's a minor difference as far as I understand. Projections are projections; call it a "light source projection" or some other linear function. Obviously not; something is still unclear. And anyway, even after this, I still don't understand why you can't just do the projection.

The bolded part is exactly what projective texturing does... it maps vertices onto a texture. And both of us know this perfectly well, as it's the topic's title.

Are you asking how to build the matrix?
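(For reference, a minimal sketch of the matrix this question alludes to - the standard projective-texture transform - assuming OpenGL-style column-major matrices and an illustrative Matrix4 type; none of this code is from the thread:)


//classic projective texturing: transform the vertex by the projector's
//("light's") view and projection, then remap clip space [-1, 1] to
//texture space [0, 1] with a bias matrix
Matrix4 bias(0.5f, 0.0f, 0.0f, 0.0f,
             0.0f, 0.5f, 0.0f, 0.0f,
             0.0f, 0.0f, 0.5f, 0.0f,
             0.5f, 0.5f, 0.5f, 1.0f);   //scale by 0.5, then translate by 0.5

Matrix4 textureMatrix = bias * lightProjection * lightView * modelMatrix;
//in the shader: vec4 uv = textureMatrix * vertexPosition; divide by uv.w before sampling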


[quote name='irreversible' timestamp='1327487369' post='4906061']There are no light, view, or projection matrices involved. I'm dealing with [b]direct mapping of vertices onto a texture[/b] from world model coordinates - I'm not trying to project the texture into the scene. Doesn't that come across clearly in my description?
That's a minor difference as far as I understand. Projections are projections; call it a "light source projection" or some other linear function. Obviously not; something is still unclear. And anyway, even after this, I still don't understand why you can't just do the projection.

The bolded part is exactly what projective texturing does... it maps vertices onto a texture. And both of us know this perfectly well, as it's the topic's title.

Are you asking how to build the matrix?
[/quote]

No, not really. I'm not even sure there's any need for matrices. The projection part actually takes on a new angle because the textures are inside an atlas, and there's a rescaling and remapping stage that needs to be accounted for. After some brain-racking I took a step back, rethought the entire problem, and the solution now works in world space instead of texture space, which is IMO considerably more straightforward.

I'll post the solution soon - I think I have it pretty much 98% working, with (as far as I can tell) one minor mathematical problem still to figure out.

Okay then. I think I have it working. I'll post what I did in the hope that someone in the future will find it useful. This is essentially a lightmapper without shadowing.

STEP1: build the texture atlas. Use whatever method you like here. The only thing that matters is to generate properly aligned texture quads within the atlas.
STEP2: generate new texture coordinates that map the atlas onto the geometry. This is slightly involved, but generally straightforward. The following code is unoptimized and takes two unindexed triangles that are placed within the atlas texture (if you wish to store more geometry, you'll need to adapt the algorithm):


//the rectangle of the texture in the atlas, in pixel coordinates
IRect rect = node->rect;

//indices of the two triangles that form a quad; their vertices are stored consecutively in groups of three
int iFace0 = node->iFace0;
int iFace1 = node->iFace1;


IVector3D vMin = IVector3D( 1000000, 1000000, 1000000);
IVector3D vMax = IVector3D(-1000000, -1000000, -1000000);

//do some hack to find out what the four corners are - this is much
//easier when vertices are indexed. Assume there's no rule to vertex
//ordering for either triangle. The first triangle is used straight-up
IVector3D v0, v1, v2, v3;
v0 = mesh->vertices[iFace0 * 3 + 0];
v1 = mesh->vertices[iFace0 * 3 + 1];
v2 = mesh->vertices[iFace0 * 3 + 2];

//the second triangle shares 2 vertices with the first, so we'll need to find the odd vertex. Proper indexing would help a lot here.
for(int i = 0; i < 3; i++)
{
    if( VecDistanceSquared(mesh->vertices[iFace1 * 3 + i], v0) > EPSILON &&
        VecDistanceSquared(mesh->vertices[iFace1 * 3 + i], v1) > EPSILON &&
        VecDistanceSquared(mesh->vertices[iFace1 * 3 + i], v2) > EPSILON)
    {
        v3 = mesh->vertices[iFace1 * 3 + i];
        break;
    }
}

//calculate face AABB. These functions are per-component
vMin = IMath::VecMin(v0, vMin);
vMin = IMath::VecMin(v1, vMin);
vMin = IMath::VecMin(v2, vMin);
vMin = IMath::VecMin(v3, vMin);

vMax = IMath::VecMax(v0, vMax);
vMax = IMath::VecMax(v1, vMax);
vMax = IMath::VecMax(v2, vMax);
vMax = IMath::VecMax(v3, vMax);

IVector3D vExt = vMax - vMin;

//and UV extents. iRequiredWidth and iRequiredHeight are atlas extents. iBorder is the number of pixels kept around each face's actual coverage
IVector3D vAtlasUVMin = IVector3D((rect.left + iBorder) / (TReal)iRequiredWidth, 1 - (rect.top + iBorder) / (TReal)iRequiredHeight, 0.0f);
IVector3D vAtlasUVMax = IVector3D((rect.right - iBorder) / (TReal)iRequiredWidth, 1 - (rect.bottom - iBorder) / (TReal)iRequiredHeight, 0.0f);
IVector3D vAtlasUVExt = vAtlasUVMax - vAtlasUVMin;

//get the absolute values of the face normal (currently the same as the vertex normals)
IVector3D na = IMath::VecAbs(mesh->vnormals[iFace0 * 3]);

int iNormalAxis = 0;
if(na.y > na.x && na.y > na.z) iNormalAxis = 1;
if(na.z > na.x && na.z > na.y) iNormalAxis = 2;

IVector3D uv0, uv1, uv2, uv3;
//calculate UVs for the four distinct vertices
uv0 = CalcUV(v0, vMin, vExt, vAtlasUVMin, vAtlasUVExt, iNormalAxis);
uv1 = CalcUV(v1, vMin, vExt, vAtlasUVMin, vAtlasUVExt, iNormalAxis);
uv2 = CalcUV(v2, vMin, vExt, vAtlasUVMin, vAtlasUVExt, iNormalAxis);
uv3 = CalcUV(v3, vMin, vExt, vAtlasUVMin, vAtlasUVExt, iNormalAxis);

//now map the coordinates to the actual vertices. The y-coordinate needs to be flipped.
uv0.y = vAtlasUVMax.y - (uv0.y - vAtlasUVMin.y);
uv1.y = vAtlasUVMax.y - (uv1.y - vAtlasUVMin.y);
uv2.y = vAtlasUVMax.y - (uv2.y - vAtlasUVMin.y);
uv3.y = vAtlasUVMax.y - (uv3.y - vAtlasUVMin.y);

//finally store the coordinates
mesh->vtexcrd1[iFace0 * 3 + 0] = uv0;
mesh->vtexcrd1[iFace0 * 3 + 1] = uv1;
mesh->vtexcrd1[iFace0 * 3 + 2] = uv2;

for(int i = 0; i < 3; i++)
{
    if(VecDistanceSquared(mesh->vertices[iFace1 * 3 + i], v0) < EPSILON)
        mesh->vtexcrd1[iFace1 * 3 + i] = uv0;
    else if(VecDistanceSquared(mesh->vertices[iFace1 * 3 + i], v1) < EPSILON)
        mesh->vtexcrd1[iFace1 * 3 + i] = uv1;
    else if(VecDistanceSquared(mesh->vertices[iFace1 * 3 + i], v2) < EPSILON)
        mesh->vtexcrd1[iFace1 * 3 + i] = uv2;
    else
        mesh->vtexcrd1[iFace1 * 3 + i] = uv3;
}

//the helper function that calculates the actual UV coordinates

IVector3D CalcUV(IVector3D v, IVector3D vMin, IVector3D vExt, IVector3D vAtlasUVMin, IVector3D vAtlasUVExt, int iNormalAxis)
{
    IVector3D vLocal = (v - vMin) / vExt;

    //avoid division by zero
    if(vExt.x < EPSILON) vLocal.x = 0.0f;
    if(vExt.y < EPSILON) vLocal.y = 0.0f;
    if(vExt.z < EPSILON) vLocal.z = 0.0f;

    //map the two in-plane axes to (u, v); for iNormalAxis == 2 the x and y
    //components already map to u and v, so nothing needs to be swapped
    if(iNormalAxis == 1) { vLocal.y = vLocal.z; }
    if(iNormalAxis == 0) { vLocal.x = vLocal.y; vLocal.y = vLocal.z; }
    vLocal.z = 0;

    return vLocal * vAtlasUVExt + vAtlasUVMin;
}


At this point the atlas can be rendered directly onto the geometry contained within it.
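(As a hedged illustration of that step, assuming a legacy client-array render path, which the thread doesn't actually show; the texture handle and vertex count below are made up, and IVector3D is assumed to be tightly packed floats:)


//bind the atlas and draw the geometry using the generated UV set
glBindTexture(GL_TEXTURE_2D, atlasTexture);       //atlasTexture: illustrative handle
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(3, GL_FLOAT, sizeof(IVector3D), mesh->vertices);
glTexCoordPointer(2, GL_FLOAT, sizeof(IVector3D), mesh->vtexcrd1);
glDrawArrays(GL_TRIANGLES, 0, iNumVertices);      //iNumVertices: illustrative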

Next up is updating the atlas based on a 3D point in world space. Let's assume there's a sphere that moves around and paints onto the geometry in the atlas with radius fSplatRadius.
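For STEP3 below, a minimal sphere-vs-AABB overlap test would do as the collision check; this is a sketch of my own, not code from the original post (vMin/vMax are the face's world-space AABB as before):


//clamp the sphere centre to the box, then compare the squared distance to
//that closest point against the squared radius (Arvo's method)
bool SphereIntersectsAABB(const IVector3D& vCenter, float fRadius,
                          const IVector3D& vMin, const IVector3D& vMax)
{
    float d2 = 0.0f;
    if(vCenter.x < vMin.x) d2 += (vMin.x - vCenter.x) * (vMin.x - vCenter.x);
    if(vCenter.x > vMax.x) d2 += (vCenter.x - vMax.x) * (vCenter.x - vMax.x);
    if(vCenter.y < vMin.y) d2 += (vMin.y - vCenter.y) * (vMin.y - vCenter.y);
    if(vCenter.y > vMax.y) d2 += (vCenter.y - vMax.y) * (vCenter.y - vMax.y);
    if(vCenter.z < vMin.z) d2 += (vMin.z - vCenter.z) * (vMin.z - vCenter.z);
    if(vCenter.z > vMax.z) d2 += (vCenter.z - vMax.z) * (vCenter.z - vMax.z);
    return d2 <= fRadius * fRadius;
}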

STEP3: find the affected faces. Use your favourite collision code to determine this (e.g. the sphere-vs-AABB sketch above).
STEP4: client-side (CPU) code for the updater, which binds the atlas as an FBO output.
- find the affected face's vMin and vMax plus vUVMin, vUVMax
- the rest of the required properties are calculated as above
- for each affected face do:


IRect rcScissor = IRect((int)(vUVMin.x * iRequiredWidth), (int)(vUVMin.y * iRequiredHeight),
                        (int)(vUVMax.x * iRequiredWidth), (int)(vUVMax.y * iRequiredHeight));
//a point in world space
shdPaint->SetUniform3fv("vSplatPosition", 1, vCOLLPOINTS[0]);
//radius in world space units
shdPaint->SetUniform1f("fSplatRadius", fSplatRadius);
//world space extents
shdPaint->SetUniform3fv("vWSMin", 1, vMin);
shdPaint->SetUniform3fv("vWSMax", 1, vMax);
//texture space extents within the atlas
shdPaint->SetUniform2fv("vUVMin", 1, vUVMin);
shdPaint->SetUniform2fv("vUVExt", 1, vUVExt);
//the face's normal
shdPaint->SetUniform3fv("vNormal", 1, normal);
shdPaint->SetUniform1i("iNormalAxis", iNormalAxis);

//don't forget to set up scissoring and draw a full-screen quad
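
//(sketch, not from the original post: one plausible scissor setup, assuming
//IRect stores left/top/right/bottom in pixels and the FBO uses a matching origin;
//glScissor takes x, y, width, height)
glEnable(GL_SCISSOR_TEST);
glScissor(rcScissor.left, rcScissor.top,
          rcScissor.right - rcScissor.left, rcScissor.bottom - rcScissor.top);
//...draw the full-screen quad into the atlas FBO here...
glDisable(GL_SCISSOR_TEST);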


On the server side (GPU) run a fragment shader:


//current texture coordinate from the vertex shader
varying vec2 tc;

uniform vec3 vWSMax;
uniform vec3 vWSMin;

uniform vec2 vUVExt;
uniform vec2 vUVMin;
uniform vec3 vNormal;
uniform int iNormalAxis;

uniform vec3 vSplatPosition;
uniform float fSplatRadius;

void main()
{
    vec3 vTexelWS;
    //calculate the extents of the polygon (the size of its AABB)
    vec3 vWSExt = vWSMax - vWSMin;
    //vWSMin and vWSMax are per-component, so they don't necessarily lie on the
    //plane; however, their midpoint (average) does
    vec3 vPointOnPlane = (vWSMax - vWSMin) * 0.5 + vWSMin;

    //rescale the texture coordinate to local face coordinates
    vec2 vUVScale = (tc - vUVMin) / vUVExt;
    //calculate the D component of the plane equation dot(N, P) + D = 0
    float D = -dot(vPointOnPlane, vNormal);

    //unproject the texel coordinate into world space: reconstruct the two
    //in-plane components from the UVs, then solve the plane equation for the third
    if(iNormalAxis == 1)
    {
        vTexelWS.x = vUVScale.x * vWSExt.x + vWSMin.x;
        vTexelWS.z = vUVScale.y * vWSExt.z + vWSMin.z;
        vTexelWS.y = -(vNormal.x * vTexelWS.x + vNormal.z * vTexelWS.z + D) / vNormal.y;
    }
    if(iNormalAxis == 0)
    {
        vTexelWS.y = vUVScale.x * vWSExt.y + vWSMin.y;
        vTexelWS.z = vUVScale.y * vWSExt.z + vWSMin.z;
        vTexelWS.x = -(vNormal.y * vTexelWS.y + vNormal.z * vTexelWS.z + D) / vNormal.x;
    }
    if(iNormalAxis == 2)
    {
        vTexelWS.x = vUVScale.x * vWSExt.x + vWSMin.x;
        vTexelWS.y = vUVScale.y * vWSExt.y + vWSMin.y;
        vTexelWS.z = -(vNormal.x * vTexelWS.x + vNormal.y * vTexelWS.y + D) / vNormal.z;
    }

    //make all texels that lie within the sphere white
    if(distance(vTexelWS, vSplatPosition) <= fSplatRadius)
        gl_FragData[0] = vec4(1.0, 1.0, 1.0, 1.0);
    else
        discard;
}


So there - this works for me and hopefully someone will find it useful as well!


[quote name='irreversible' timestamp='1327591835' post='4906442']Okay then. I think I have it working. I'll post what I did in the hope that someone in the future will find it useful. This is essentially a lightmapper without shadowing.[/quote]
You know, I could have bet on this. Mind explaining what the deal is with the "sphere painting"?

Share this post


Link to post
Share on other sites

[quote name='irreversible' timestamp='1327591835' post='4906442']
Okay then. I think I have it working. I'll post what I did in the hope that someone in the future will find it useful. This is essentially a lightmapper without shadowing.
You know, I could have bet on this. Mind explaining what the deal is with the "sphere painting"?
[/quote]

It's just an example: it uses a spherical volume for texel intersection testing.

PS - it's my fault for not classifying this as a lightmapper earlier. I guess I was too hooked on my implementation, which has nothing to do with lightmaps as such, but rather with texturing. It's a matter of wording.
