Math for textures (some smart people needed ;))

Hello,

I've seen lots of 3D applications/demos with oblique faces and WRONG, stretched textures, because it's hard to build into an editor an automatic texture placer that gets it RIGHT, with the option to set an X and Y stretch, so that in every case the texture ends up oriented correctly (without you having to place the texture on every face by hand). Q3Radiant/GTKRadiant has a feature like this (somewhere in their large source code, I don't know where, and it isn't perfect either, at least from what I saw).

So my question is: does anybody have source code for this, or does anyone know HOW to calculate it? Any tips/hints/information/source code/math... is welcome.

(My first idea was to calculate the size of the poly (3 vertices), so that you know the size of the texture on it, and then move it into the 3D texture space, so that a new poly next to it (with the same normal) gets EXACTLY the same texture without any visible seam between them. Only I didn't succeed in making it work.)

I hope some of you want to put some time into this topic, thanks.

Regards,
G.D.k.
Hmmm... Let me clarify first: you are talking about the situation where an oblique surface has a small part of a texture mapped onto it and so looks really stretched? (I've seen this so many times in terrain demos.) I tried to combat this and failed; that was quite a while ago though, and I haven't tried since. I'm about to try again, though...

Death of one is a tragedy, death of a million is just a statistic.
If at first you don't succeed, redefine success.
Could you explain the problem a little more clearly please... perhaps with reference to a simple example? Consider, say, a 3D cube with a texture mapped onto it. I suspect I understand your problem and might be able to help, but I want to make sure I do indeed understand it.

Thanks,

Timkin
Well, I have no screenshots I can show directly, but I'll try to explain it in more detail.

Just imagine this:

I have a 3D map editor using brushes, which are split up into polies.
When I add a default brush with static TU, TV values and then add another brush next to it with the SAME texture, you can see that they are two different brushes, because the texture position isn't the same.

I now do a normal check: when the polies of a brush are facing up or down I put them in one list, and likewise when they are facing left/right or front/back.

That way I just do PolyVertexPos * TextureZoom, and I get the right texture position for all six sides.

Only it doesn't work when I have an oblique side, a side whose normal isn't "1" on the X, Y or Z axis.

In that case I took the texture coordinate as X * normal.x, and likewise for Y and Z, but of course that looks really bad.
That's the part I'm asking for help with.

I was also planning to support texture rotation, but I guess I can leave that out; I'll already be more than happy to get this basic texture positioning working.
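
In code, my current approach looks roughly like this (just a sketch to show the idea; the struct and the names are made up):

    // Rough sketch of the normal check: pick the two texture axes from the
    // axis the face is aligned with and scale by the texture zoom.
    struct Vec3 { float x, y, z; };

    void AxisAlignedUV(const Vec3& pos, const Vec3& normal, float zoom,
                       float& u, float& v)
    {
        if (normal.y == 1.0f || normal.y == -1.0f) {         // floor / ceiling
            u = pos.x * zoom;  v = pos.z * zoom;
        } else if (normal.x == 1.0f || normal.x == -1.0f) {  // left / right
            u = pos.y * zoom;  v = pos.z * zoom;
        } else if (normal.z == 1.0f || normal.z == -1.0f) {  // front / back
            u = pos.x * zoom;  v = pos.y * zoom;
        } else {
            // Oblique face: this is the part that looks wrong; scaling by
            // the normal components just stretches the texture.
            u = pos.x * normal.x * zoom;  v = pos.y * normal.y * zoom;
        }
    }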

Regards,
G.D.k.


Unfortunately I don't know enough about graphics to understand your problem... but that's my fault, not yours. Perhaps someone in the Graphics forum could help? Or maybe there are others in this forum who can help you.

I still suspect I understand the problem, but I could be way off, so I won't waste your time or people's bandwidth.

Good luck,

Timkin
I don't know if this is exactly what you want, but I guess you're searching for an algorithm to calculate good-looking texture coordinates.
I had the same problem some time ago; maybe you can use my solution:

First I define a texture plane described by two perpendicular base vectors (I named them ua and va; don't ask what the 'a' stands for). Once you have these, you can calculate your u and v coordinates as the dot product of your vertex position with the ua or va vector.

And here is how to calculate these ua and va vectors:
First I check for the two special cases where the normal vector's up/down component (in my coordinate system that's y) is 1 or -1, as it is on floors and ceilings. In these two cases I define the ua vector as (normal.y, 0, 0). In all other cases I define ua as (normal.z, 0, -normal.x). Then I normalize the ua vector. The va vector is the cross product of ua and the normal vector.

That's it.
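
In code it comes down to roughly this (only a sketch with made-up helper names, not my actual code):

    #include <math.h>

    struct Vec3 { float x, y, z; };

    static float Dot(const Vec3& a, const Vec3& b)
    {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    static Vec3 Cross(const Vec3& a, const Vec3& b)
    {
        Vec3 r = { a.y * b.z - a.z * b.y,
                   a.z * b.x - a.x * b.z,
                   a.x * b.y - a.y * b.x };
        return r;
    }

    static Vec3 Normalize(Vec3 v)
    {
        float len = sqrtf(v.x * v.x + v.y * v.y + v.z * v.z);
        v.x /= len;  v.y /= len;  v.z /= len;
        return v;
    }

    // Build the texture-plane basis (ua, va) from the face normal, then
    // project the vertex onto it to get (u, v).
    void TexCoordsFromNormal(const Vec3& normal, const Vec3& vertex,
                             float& u, float& v)
    {
        Vec3 ua;
        if (normal.y == 1.0f || normal.y == -1.0f) {
            // Floors and ceilings: use a fixed horizontal axis.
            ua.x = normal.y;  ua.y = 0.0f;  ua.z = 0.0f;
        } else {
            // Everything else: a vector perpendicular to the horizontal
            // part of the normal.
            ua.x = normal.z;  ua.y = 0.0f;  ua.z = -normal.x;
        }
        ua = Normalize(ua);
        Vec3 va = Cross(ua, normal);   // the second texture axis

        u = Dot(vertex, ua);
        v = Dot(vertex, va);
    }

If you want the X and Y stretch from the first post, just divide u and v by the texture scale afterwards.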

Hope this is what you wanted. If not, maybe it helps someone else ;-)

eloso
If python_regious has interpreted your problem correctly, then a possible solution is to automatically determine the direction in which to project the texture, based on the direction of the normal. For example, if the normal of the face is (0,0,1), you project the texture in the (0,0,-1) direction towards the face. And if the face normal points in the (0.707, 0.707, 0) direction, you project along (-0.707, -0.707, 0). The details are quite similar to the method presented above by eloso.

We do this exact thing for texturing underground subway systems in our modeling tool, and depending on the texture you use it works fairly well. But it isn't perfect, and it's difficult, as you say, to get the UVs aligned at the edges of adjacent faces. We happen not to worry about that; our geometry is such that we don't have to. At least you don't have to do manual UVs for each face.
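
Just to illustrate the idea (this is not our actual code; the names, the choice of world up axis, and the scale parameters are invented for the example, with scaleU/scaleV playing the role of the X and Y stretch from the original post):

    #include <math.h>

    struct Vector3 { float x, y, z; };

    static Vector3 Cross(const Vector3& a, const Vector3& b)
    {
        Vector3 r = { a.y * b.z - a.z * b.y,
                      a.z * b.x - a.x * b.z,
                      a.x * b.y - a.y * b.x };
        return r;
    }

    static Vector3 Normalize(Vector3 v)
    {
        float len = sqrtf(v.x * v.x + v.y * v.y + v.z * v.z);
        v.x /= len;  v.y /= len;  v.z /= len;
        return v;
    }

    static float Dot(const Vector3& a, const Vector3& b)
    {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    // Planar projection along the (negated) face normal. A world "up"
    // vector orients the texture axes; if the normal is almost parallel
    // to it, fall back to another axis.
    void ProjectUV(const Vector3& normal, const Vector3& point,
                   float scaleU, float scaleV, float& u, float& v)
    {
        Vector3 up = { 0.0f, 0.0f, 1.0f };
        if (fabsf(normal.z) > 0.999f) {
            up.x = 1.0f;  up.y = 0.0f;  up.z = 0.0f;
        }
        Vector3 uAxis = Normalize(Cross(up, normal));
        Vector3 vAxis = Cross(normal, uAxis);   // already unit length

        u = Dot(point, uAxis) / scaleU;
        v = Dot(point, vAxis) / scaleV;
    }

For the (0.707, 0.707, 0) example above, this gives texture axes of roughly (-0.707, 0.707, 0) and (0, 0, 1), both lying in the plane of the face.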

Big Blue Box has worked on this for their terrain rendering system in Project Ego, and they've had some success. Fortunately, they presented a paper at GDC 2002 with some details on this very subject, and you can find their presentation here (though the details are quite brief and not very technical):

http://www.bigbluebox.com/files/GDCIanMattWeb.pps

Good luck!

Graham Rhodes
Senior Scientist
Applied Research Associates, Inc.

