MAP Texture Coordinates

SirDeathKnight
Hey guys,

Here is an example of a face parsed from a Quake 3 .map file. The face has one 256x256 texture, and the texture is set to fit the face:

[code]
( 128 64 64 ) ( 128 -64 64 ) ( 0 64 64 ) common/1 0 128 0 0.5 0.5 0 0 0
[/code]
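
For reference, this is how I currently read the fields after the three plane points (please correct me if I have any of this wrong):

[code]
( 128 64 64 ) ( 128 -64 64 ) ( 0 64 64 )  // three points defining the face's plane
common/1                                  // texture name
0 128                                     // horizontal / vertical texture offset (texels)
0                                         // rotation (degrees)
0.5 0.5                                   // horizontal / vertical texture scale
0 0 0                                     // Quake 3 content flags, surface flags, value
[/code]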

I have already worked out how to create the four vertices from the three points of the plane, so I can now draw the face correctly.

However, I am having trouble generating the texture coordinates. I am expecting values of this kind:

[code]
0, 0
0, 1
1, 1
1, 0
[/code]

I simply cannot find a way to generate them. Could anyone post a simple code snippet or explain the approach if they have the time?

I also have to take the stretch, rotation, and so on into account; all of those values are at hand in my application.

Thanks in advance.

smasherprog
First, why are you taking three points and creating four? Second, if you know the texture coords are supposed to be (0, .5), (.5, 0), (0, 0), why would you change that to (0, 0), (0, 1), (1, 1) and think that is the right thing to do? I am asking these questions because it doesn't make sense to do this, and I am wondering if you have some ideas mixed up.

From what you have posted, you should try manually creating some type of shape in your engine, like a cube. Manually set up the texture coords and normals to get a handle on what everything means. Then try moving it, scaling it, and rotating it, and then do the same to the texture coords. After that, you should have a better understanding of how to make this work. I can appreciate your enthusiasm, but you must crawl before you can walk.
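
For example, one manually-textured face of a cube might look something like this (just a sketch in plain C++ arrays; adapt the layout to whatever your engine expects):

[code]
// One face of a unit cube laid out by hand:
// position (x, y, z), normal (nx, ny, nz), texture coordinate (u, v).
const float face[4][8] = {
    //   x     y     z      nx    ny    nz     u     v
    { 0.0f, 0.0f, 0.0f,  0.0f, 0.0f, 1.0f,  0.0f, 0.0f },
    { 1.0f, 0.0f, 0.0f,  0.0f, 0.0f, 1.0f,  1.0f, 0.0f },
    { 1.0f, 1.0f, 0.0f,  0.0f, 0.0f, 1.0f,  1.0f, 1.0f },
    { 0.0f, 1.0f, 0.0f,  0.0f, 0.0f, 1.0f,  0.0f, 1.0f },
};
[/code]

Once something like that renders correctly, try scaling and rotating just the (u, v) values and watch what happens on screen.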

SirDeathKnight
Hey, firstly thanks for reading my post,

The face data I have pasted is a single face of a cube created using GtkRadiant. Those three points that the map file gives you are not vertex points; they describe a plane (I have found the vertex points by intersecting more than two planes).
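
For completeness, this is roughly how I build the candidate vertices; the Vec3 type and helpers here are just my own sketch, not anything taken from the map format:

[code]
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  Cross(const Vec3& a, const Vec3& b) {
    return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
}
static float Dot(const Vec3& a, const Vec3& b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3  Scale(const Vec3& a, float s)       { return { a.x * s, a.y * s, a.z * s }; }
static Vec3  Add(const Vec3& a, const Vec3& b)   { return { a.x + b.x, a.y + b.y, a.z + b.z }; }

// Intersect three planes written as n . p = d.
// Returns false when the planes do not meet in a single point.
bool IntersectThreePlanes(const Vec3& n1, float d1,
                          const Vec3& n2, float d2,
                          const Vec3& n3, float d3,
                          Vec3& out)
{
    const Vec3  c23   = Cross(n2, n3);
    const float denom = Dot(n1, c23);
    if (std::fabs(denom) < 1e-6f)
        return false; // two of the planes are (nearly) parallel

    const Vec3 num = Add(Add(Scale(c23, d1),
                             Scale(Cross(n3, n1), d2)),
                         Scale(Cross(n1, n2), d3));
    out = Scale(num, 1.0f / denom);
    return true;
}
[/code]

I then discard any intersection point that lies outside the brush, which leaves the face's actual vertices.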

In the map format the texture stretch (0.5) is dependent on the texture size, whereas in OpenGL, as you know, a stretch of 1 means the texture repeats exactly once across the face regardless of its size. That is not true of the map format, where a stretch of 1 literally means "repeat the texture unstretched as many times as it fits across the face".
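
To illustrate with my face above, if I am reading the format right:

[code]
face width      = 128 world units   (the face spans 0..128 on one axis)
texture width   = 256 texels
scale           = 0.5 world units per texel
texels covered  = 128 / 0.5 = 256   ->  256 / 256 = 1.0 repeat in OpenGL terms
[/code]

So the numbers only line up with "the texture fits the face" once the texture size itself is factored in.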

As you can see, there is more to the MAP format than meets the eye, and that is why I am having difficulty deriving the texture coordinates.



gsamour
If you're using GtkRadiant, why don't you compile the map file to BSP and then read it as a Quake 3 BSP?

[url="http://www.mralligator.com/q3/"]http://www.mralligator.com/q3/[/url]

SirDeathKnight
The BSP format would give me the same problems (plus the fact that it is a binary format); it seems like overkill for what I am trying to do.

I already have the map drawing, but unfortunately the textures aren't the right size. Starting over with the BSP format seems counterproductive when I already have so much done with the text-based format (.MAP).

Luckily, using GtkRadiant doesn't tie the developer to the compiled BSP format.

gsamour
I thought that, when compiled to BSP, the map's texture coordinates already had the stretch accounted for. Now I'm not sure; am I wrong?

Krohm
OK, let's see what we have here.
We have generated a plane and, by intersecting it with the other brep planes, we have figured out that it's going to have [i]n[/i] points [i]p[sub]0[/sub][/i]...[i]p[sub]n-1[/sub][/i].
I assume you have already mangled the whole brep and solidified the points.
What we need to find are two vectors [i]s[/i] and [i]t[/i], the generators for texture space on the plane.
Recall that Quake performs "cubic" mapping, where each face is mapped using the basis plane that gives the least distortion.
So the first step is to compare the plane normal against the canonical generators. Loop over an array like this:
[code]
// For each of the six axis-aligned basis planes: { facing axis, s axis, t axis }.
const float baseAxis[6][3][3] = {
{ { 0.0f, 1.0f, 0.0f }, { 1.0f, 0.0f, 0.0f }, { 0.0f, 0.0f, 1.0f } },
{ { 0.0f, -1.0f, 0.0f }, { 1.0f, 0.0f, 0.0f }, { 0.0f, 0.0f, 1.0f } },
{ { 1.0f, 0.0f, 0.0f }, { 0.0f, 0.0f, 1.0f }, { 0.0f, 1.0f, 0.0f } },
{ {-1.0f, 0.0f, 0.0f }, { 0.0f, 0.0f, 1.0f }, { 0.0f, 1.0f, 0.0f } },
{ { 0.0f, 0.0f, 1.0f }, { 1.0f, 0.0f, 0.0f }, { 0.0f, 1.0f, 0.0f } },
{ { 0.0f, 0.0f, -1.0f }, { 1.0f, 0.0f, 0.0f }, { 0.0f, 1.0f, 0.0f } }
};
[/code]Keep in mind that I had to fiddle with the generators a bit, as I'm using the Direct3D/Renderman texture coordinate convention. You probably want to check GtkRadiant for generators that will likely fit you better.
Use the basis plane with the maximum [font="Courier New"]dot(polyNormal, baseAxis[i][0])[/font]. Let this be plane [i]c[/i]. Then [font="Courier New"](sa = baseAxis[c][1], ta = baseAxis[c][2])[/font] will be a canonical basis for texture coordinate generation.
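In code, picking the dominant axis is just a max-dot loop over that table, something along these lines (sketch only; [i]polyNormal[/i] is the face normal you already have):
[code]
// Pick the canonical basis whose facing axis best matches the face normal.
int   bestAxis = 0;
float bestDot  = -1.0f;
for (int i = 0; i < 6; ++i) {
    const float d = polyNormal[0] * baseAxis[i][0][0]
                  + polyNormal[1] * baseAxis[i][0][1]
                  + polyNormal[2] * baseAxis[i][0][2];
    if (d > bestDot) {
        bestDot  = d;
        bestAxis = i;
    }
}
// Canonical texture-space generators for this face.
const float* sa = baseAxis[bestAxis][1];
const float* ta = baseAxis[bestAxis][2];
[/code]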

Now adapt the canonical generators [i]sa[/i] and [i]ta[/i] to the scale and rotation. Conceptually you need to perform a 2D transform here, but since values such as 0, 90, 180 and 270 degrees are common, I suggest short-circuiting those cases to steer clear of floating-point approximations (there is a sketch of that further down).
Let [i]sv[/i] be the index of the first nonzero component of [i]sa[/i], and [i]tv[/i] the same for [i]ta[/i].
Let [i]sinv = sin(angle), cosv = cos(angle)[/i].
Perform the scale and rotation like this:
[code]// Rotate the canonical generators, then apply the inverse of the scale.
s[sv] = float((cosv * sa[sv] - sinv * sa[tv]) / plane.texMods.sScale);
s[tv] = float((sinv * sa[sv] + cosv * sa[tv]) / plane.texMods.sScale);
t[sv] = float((cosv * ta[sv] - sinv * ta[tv]) / plane.texMods.tScale);
t[tv] = float((sinv * ta[sv] + cosv * ta[tv]) / plane.texMods.tScale);
// The fourth component carries the texture offset.
s[3] = plane.texMods.sOff * plane.texMods.sScale;
t[3] = plane.texMods.tOff * plane.texMods.tScale;[/code]Again, keep in mind that since I use the D3D9 convention, the matrix might be different for you. You can still check GtkRadiant.
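
As for short-circuiting the common angles mentioned above, a sketch like this avoids the sin/cos round-off entirely (angle in degrees, as stored in the .map file):
[code]
#include <math.h>

// Exact sin/cos for the rotations brushes use most, general case otherwise.
void RotationSinCos(float angleDeg, float& sinv, float& cosv)
{
    if (angleDeg == 0.0f)        { sinv =  0.0f; cosv =  1.0f; }
    else if (angleDeg == 90.0f)  { sinv =  1.0f; cosv =  0.0f; }
    else if (angleDeg == 180.0f) { sinv =  0.0f; cosv = -1.0f; }
    else if (angleDeg == 270.0f) { sinv = -1.0f; cosv =  0.0f; }
    else {
        const float rad = angleDeg * (3.14159265f / 180.0f);
        sinv = sinf(rad);
        cosv = cosf(rad);
    }
}
[/code]
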
Now you have an (s, t) texture-coordinate generator basis for points on the plane. Those texture coordinates, however, will be generated in world units, so you then need to define a "period" for each texture. That's the texture dimensions in pixels... doubled? Or halved? I don't remember.
What I do remember is that I put the texture dimensions into the x and y texture axes, but I cannot track the flow of that info right now. Just try.

Loop over all the points and project them into texture space, like this:
[code]qcoords[p * 2 + 0] = Plane::Dot3(s, point);
qcoords[p * 2 + 1] = Plane::Dot3(t, point);
qcoords[p * 2 + 0] /= texSizes; // better: multiply by the reciprocal, since this runs per point
qcoords[p * 2 + 1] /= texSizet;[/code]You're still not done. Loop over all the points and texcoords again and make sure you "center them around the origin". This is especially important for rotating shaders, as they typically offset by half a texture; if you don't, your fans will go all over the place.
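
A sketch of that re-centering pass, using [i]qcoords[/i] and the point count from above; the idea is just to subtract the shared whole-number part so the values stay near zero without changing how the texture looks:
[code]
#include <algorithm>
#include <cmath>

// Shift this face's texture coordinates toward the origin.
// Removing whole repeats does not alter the visible mapping.
void RecenterTexCoords(float* qcoords, int numPoints)
{
    float minS = qcoords[0], minT = qcoords[1];
    for (int p = 1; p < numPoints; ++p) {
        minS = std::min(minS, qcoords[p * 2 + 0]);
        minT = std::min(minT, qcoords[p * 2 + 1]);
    }
    const float shiftS = std::floor(minS);
    const float shiftT = std::floor(minT);
    for (int p = 0; p < numPoints; ++p) {
        qcoords[p * 2 + 0] -= shiftS;
        qcoords[p * 2 + 1] -= shiftT;
    }
}
[/code]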

[quote name='gsamour' timestamp='1311005506' post='4836859']
I thought that, when compiled to BSP, the map's texture coordinates already had stretch accounted for. Now I'm not sure, am I wrong?
[/quote]
If memory serves, it stores the generators explicitly, so a lot of the fuss is gone. But I'm fairly sure it doesn't store the coordinates explicitly, as that would be fundamentally redundant. Anyway, BSP mangling wreaks havoc on pretty much anything that is not BSP... although the convex-hull capability is fairly slick, sometimes the splits are just so "wrong" it's hard to believe.
