MAP Texture Coordinates

Started by
6 comments, last by Krohm 12 years, 9 months ago
Hey guys,

Here is an example of a face (parsed from a Quake 3 map file). The face has one 256x256 texture, and the texture is set to fit the face:


( 128 64 64 ) ( 128 -64 64 ) ( 0 64 64 ) common/1 0 128 0 0.5 0.5 0 0 0
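For readers unfamiliar with the layout, the fields follow the standard Quake-style brush face line: three plane points, texture name, s/t offsets, rotation in degrees, s/t scales, and some trailing flags. A parsing sketch (struct and function names are my own, not from the poster's code):

```cpp
#include <cstdio>
#include <cstring>
#include <cassert>

// Sketch: one .MAP brush face line, standard Quake 3 field order.
struct MapFace {
    float p[3][3];          // three points defining the plane
    char  texture[64];      // e.g. "common/1"
    float sOff, tOff;       // texture offsets (0, 128 in the example)
    float rot;              // rotation in degrees
    float sScale, tScale;   // stretch (0.5, 0.5 in the example)
};

bool parseFace(const char* line, MapFace& f) {
    int n = std::sscanf(line,
        " ( %f %f %f ) ( %f %f %f ) ( %f %f %f ) %63s %f %f %f %f %f",
        &f.p[0][0], &f.p[0][1], &f.p[0][2],
        &f.p[1][0], &f.p[1][1], &f.p[1][2],
        &f.p[2][0], &f.p[2][1], &f.p[2][2],
        f.texture, &f.sOff, &f.tOff, &f.rot, &f.sScale, &f.tScale);
    return n == 15; // the trailing content/surface flags are ignored here
}
```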


I have already worked out how to create the 4 vertices from the 3 points of the plane, so I can now draw the face perfectly well.
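For anyone else reading along: a brush vertex shared by three planes can be found by intersecting them, for example with the triple-product form of Cramer's rule. A minimal sketch (my own code, not the poster's):

```cpp
#include <cmath>
#include <cassert>

struct Vec3 { float x, y, z; };

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static float dot(const Vec3& a, const Vec3& b) {
    return a.x*b.x + a.y*b.y + a.z*b.z;
}

// Each plane is stored as n.p = d. Returns false when the three planes
// do not meet in a single point (two or more nearly parallel).
bool intersect3(const Vec3& n1, float d1, const Vec3& n2, float d2,
                const Vec3& n3, float d3, Vec3& out) {
    const float denom = dot(n1, cross(n2, n3));
    if (std::fabs(denom) < 1e-6f) return false;
    const Vec3 c23 = cross(n2, n3), c31 = cross(n3, n1), c12 = cross(n1, n2);
    out = { (d1*c23.x + d2*c31.x + d3*c12.x) / denom,
            (d1*c23.y + d2*c31.y + d3*c12.y) / denom,
            (d1*c23.z + d2*c31.z + d3*c12.z) / denom };
    return true;
}
```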

However, I am having trouble generating the texture coordinates. I am expecting the following kind of values:


0, 0
0, 1
1, 1
1, 0


I simply cannot find a way to generate them. Could anyone please post a simple code snippet or explain the process if they have the time?

I also have to take the stretch, rotation, etc. into consideration. I have all of these values at hand in my application.

Thanks in advance.
First, why are you taking three points and creating four? Second, if you know the texture coords are supposed to be (0, .5), (.5, 0), (0, 0), why would you change that to (0, 0), (0, 1), (1, 1) and think that is the right thing to do? I am asking because it doesn't make sense to do this, and I am wondering if you have some ideas mixed up.

From what you have posted, you should try manually creating some type of shape in your engine, like a cube. Manually do the texture coords and normals to get a handle on what everything means. Then try moving it, scaling it, and rotating it, and do the same to the texture coords. After that, you should have a better understanding of how to make this work. I can appreciate your enthusiasm, but you must crawl before you can walk.
Wisdom is knowing when to shut up, so try it.
--Game Development http://nolimitsdesigns.com: Reliable UDP library, Threading library, Math Library, UI Library. Take a look, its all free.
Hey, firstly thanks for reading my post,

The face data I have pasted is a single face of a cube created using GTK Radiant. Those 3 points that the map file gives you are not vertex positions; they describe a plane (I found the actual vertices by intersecting more than 2 planes).

In the MAP format the texture stretch (0.5) is dependent on the texture size, whereas in OpenGL, as you know, a scale of 1 means that regardless of the texture size it will repeat the texture exactly once across the face. This is not true in the MAP format, where a stretch of 1 literally means repeating the texture with no stretch as many times as it will fit over the face.
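To make that difference concrete: converting the MAP-style texel measurement into an OpenGL UV means dividing through by the texture dimensions. A tiny sketch (function and parameter names are mine):

```cpp
#include <cassert>

// A MAP scale of 0.5 with a 256px texture means each texel covers 0.5
// world units, so a 128-unit-wide face spans 128 / 0.5 = 256 texels,
// i.e. exactly one repeat -> OpenGL u runs 0..1 across the face.
float mapToGlU(float worldDistance, float scale, float texSizePixels) {
    float texels = worldDistance / scale;   // distance measured in texels
    return texels / texSizePixels;          // normalize to OpenGL's 0..1 range
}
```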

As you can see there is more to the MAP format than meets the eye and that is why I am having difficulty deriving the texture coordinates.



If you're using GTK Radiant, why don't you compile the map file to BSP, then parse it as the Quake 3 BSP format?

http://www.mralligator.com/q3/
The BSP format would give me the same problems (plus the fact that it is a binary format); it seems like overkill for what I am trying to do.

I already have it drawing the map, but unfortunately the textures aren't scaled correctly. Starting over with the BSP format seems counterproductive when I already have so much done with the text-based format (.MAP).

Using GTK Radiant doesn't mean the developer is tied to the compiled BSP format, luckily.

I thought that, when compiled to BSP, the map's texture coordinates already had stretch accounted for. Now I'm not sure, am I wrong?
Perhaps, but it will be best to just use the plain text .MAP format for now.

Ok, let's see what we have here.
We have generated a plane and, by intersecting it with the other brep planes, figured out it's going to have n points p0...pn-1.
I assume you already mangled the whole brep and solidified the points.
What we need to find are two vectors s, t generators for texture space on the plane.
Recall that Quake performs "cubic" mapping, where each face is mapped using the basis plane giving the least distortion.
So the first step is comparing the plane normal against the canonical generators. Loop over an array like this:

const float baseAxis[6][3][3] = {
{ { 0.0f, 1.0f, 0.0f }, { 1.0f, 0.0f, 0.0f }, { 0.0f, 0.0f, 1.0f } },
{ { 0.0f, -1.0f, 0.0f }, { 1.0f, 0.0f, 0.0f }, { 0.0f, 0.0f, 1.0f } },
{ { 1.0f, 0.0f, 0.0f }, { 0.0f, 0.0f, 1.0f }, { 0.0f, 1.0f, 0.0f } },
{ {-1.0f, 0.0f, 0.0f }, { 0.0f, 0.0f, 1.0f }, { 0.0f, 1.0f, 0.0f } },
{ { 0.0f, 0.0f, 1.0f }, { 1.0f, 0.0f, 0.0f }, { 0.0f, 1.0f, 0.0f } },
{ { 0.0f, 0.0f, -1.0f }, { 1.0f, 0.0f, 0.0f }, { 0.0f, 1.0f, 0.0f } }
};
Keep in mind that I had to fiddle with generators a bit as I'm using Direct3D/Renderman texture coord convention. You probably want to check GtkRadiant for generators which will likely fit you better.
Use the plane having the max(dot(polyNormal, baseAxis[i][0])) over all i. Let this be plane c. Then (sa = baseAxis[c][1], ta = baseAxis[c][2]) will be a canonical basis for texture coordinate generation.
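The axis pick can be sketched like this, using the same table (the wrapper names are mine; check GtkRadiant for the exact convention you need):

```cpp
#include <cassert>

// Same table as above: first vector is the candidate normal, the next
// two are the s and t generators for that face orientation.
static const float baseAxis[6][3][3] = {
    { { 0, 1, 0 }, { 1, 0, 0 }, { 0, 0, 1 } },
    { { 0,-1, 0 }, { 1, 0, 0 }, { 0, 0, 1 } },
    { { 1, 0, 0 }, { 0, 0, 1 }, { 0, 1, 0 } },
    { {-1, 0, 0 }, { 0, 0, 1 }, { 0, 1, 0 } },
    { { 0, 0, 1 }, { 1, 0, 0 }, { 0, 1, 0 } },
    { { 0, 0,-1 }, { 1, 0, 0 }, { 0, 1, 0 } }
};

// Returns the index c whose first vector best matches the face normal;
// baseAxis[c][1] and baseAxis[c][2] are then sa and ta.
int bestAxis(const float normal[3]) {
    int best = 0;
    float bestDot = -1e9f;
    for (int i = 0; i < 6; ++i) {
        const float d = normal[0]*baseAxis[i][0][0]
                      + normal[1]*baseAxis[i][0][1]
                      + normal[2]*baseAxis[i][0][2];
        if (d > bestDot) { bestDot = d; best = i; }
    }
    return best;
}
```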

Now adapt the canonical generators sa and ta to the scale and rotation. In principle you need to perform a 2D transform here, but since values such as 0, 90, 180, 270 deg are common, I suggest short-circuiting those cases to steer clear of floating-point approximations.
Let sv be the index of the first nonzero component in sa and tv be the similar for ta.
Let sinv = sin(angle), cosv = cos(angle).
Perform the scale and rotation like this:
s[sv] = float((cosv * sa[sv] - sinv * sa[tv]) / plane.texMods.sScale); // rotate sa, then apply the s stretch
s[tv] = float((sinv * sa[sv] + cosv * sa[tv]) / plane.texMods.sScale);
t[sv] = float((cosv * ta[sv] - sinv * ta[tv]) / plane.texMods.tScale); // same for ta with the t stretch
t[tv] = float((sinv * ta[sv] + cosv * ta[tv]) / plane.texMods.tScale);
s[3] = plane.texMods.sOff * plane.texMods.sScale; // offsets ride in the fourth component
t[3] = plane.texMods.tOff * plane.texMods.tScale;
Again, keep in mind that since I have D3D9 convention the matrix might be different for you. You can still check out GtkRadiant.
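Put together, the rotate-and-scale step might look like this as a self-contained function (same math as the snippet above; the function and parameter names are mine, and the convention caveat still applies):

```cpp
#include <cmath>
#include <cassert>

// Rotate the two live 2D components of the canonical generators by
// `degrees`, then divide by the MAP stretch, as in the snippet above.
void rotateScale2D(const float sa[2], const float ta[2], float degrees,
                   float sScale, float tScale,
                   float sOut[2], float tOut[2]) {
    const float rad  = degrees * 3.14159265358979f / 180.0f;
    const float sinv = std::sin(rad);
    const float cosv = std::cos(rad);
    sOut[0] = (cosv * sa[0] - sinv * sa[1]) / sScale;
    sOut[1] = (sinv * sa[0] + cosv * sa[1]) / sScale;
    tOut[0] = (cosv * ta[0] - sinv * ta[1]) / tScale;
    tOut[1] = (sinv * ta[0] + cosv * ta[1]) / tScale;
}
```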
Now you have an (s, t) texture-coordinate generator basis for points on the plane. Those texture coordinates, however, will be generated in world units, so you need to define a "period" for each texture. That's the texture dimensions in pixels... doubled? Or halved? I don't remember. What I do remember is that I put the texture dimensions in the x and y texture axes, but I cannot track the flow of this info right now. Just try.

Loop on all the points and just project them in texture space, like:
qcoords[p * 2 + 0] = Plane::Dot3(s, point);
qcoords[p * 2 + 1] = Plane::Dot3(t, point);
qcoords[p * 2 + 0] /= texSizes; // better: multiply by a precomputed reciprocal, this runs per point
qcoords[p * 2 + 1] /= texSizet;
You're still not done. Loop over all the points and texcoords again and make sure you "center them around the origin". This is especially important for rotating shaders, as they will typically offset by half a texture. If you don't, your fans will go all over the place.
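That recentering pass can be as simple as subtracting the floor of each axis' minimum, so the coords sit near 0 while landing on the same texels (a sketch under that assumption; names are mine):

```cpp
#include <cmath>
#include <cassert>

// Shift interleaved (u, v) texcoords toward the origin: subtracting an
// integer keeps the sampled texels identical (textures repeat with
// period 1) but avoids huge u/v values far from 0, which is what upsets
// rotating shaders pivoting around the texture center.
void recenter(float* uv, int numPoints) {
    for (int axis = 0; axis < 2; ++axis) {
        float mn = uv[axis];
        for (int p = 1; p < numPoints; ++p)
            if (uv[p * 2 + axis] < mn) mn = uv[p * 2 + axis];
        const float shift = std::floor(mn);
        for (int p = 0; p < numPoints; ++p)
            uv[p * 2 + axis] -= shift;
    }
}
```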

[quote]I thought that, when compiled to BSP, the map's texture coordinates already had stretch accounted for. Now I'm not sure, am I wrong?[/quote]
If memory serves it stores the generators explicitly, so a lot of the fuss is gone. But I'm fairly sure it doesn't store the coords explicitly, as that would be fundamentally redundant. Anyway, BSP mangling wreaks havoc on pretty much anything that is not BSP... although the convex hull capability is fairly slick, sometimes the splits are just so "wrong" it's hard to believe.

Previously "Krohm"

