# Collapsing geometry

Posted by polyfrag, 18 June 2013 · 410 views

I can now collapse a collection of planes defining a convex hull into triangles.

Now working on a way to map texture coordinates onto brush sides.

Convex hulls or "brushes" are how first-person shooters derived from the Quake/Doom architecture define geometry for collision detection and response. Using a set of planes (6 for all sides of a cube) we can define any convex shape. This simplifies collision detection against axis-aligned bounding boxes and spheres because all we have to do is check the farthest and nearest point of the AA box or sphere against each plane/side of the brush. If there's a point on the inside of every plane, there's an overlap with the sphere or AA box, and it's not hard to get the exact distance we have to move to just be touching the surface using the dot product.
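The per-plane test described above can be sketched as follows. This is a minimal Python sketch, not the post's actual code; the plane representation (n, d) with a point "inside" when dot(n, p) - d <= 0, and all the function names, are assumptions.

```python
# Sphere-vs-brush overlap: a brush is a list of planes (n, d), with a
# point p "inside" a plane when dot(n, p) - d <= 0.
# Hypothetical sketch, not the post author's actual code.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sphere_overlaps_brush(center, radius, planes):
    """The sphere overlaps the convex brush if, for every plane, the
    sphere's nearest point toward the inside is not fully outside."""
    for n, d in planes:
        dist = dot(n, center) - d
        if dist > radius:   # sphere entirely on the outside of this plane
            return False
    return True             # inside (or touching) every plane

# Unit cube from -1..1 expressed as six planes with outward normals.
cube = [((1, 0, 0), 1), ((-1, 0, 0), 1), ((0, 1, 0), 1),
        ((0, -1, 0), 1), ((0, 0, 1), 1), ((0, 0, -1), 1)]
```

Note this per-plane test is conservative for spheres: near edges and corners it can report an overlap where there is none, which is the usual trade-off for its simplicity.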

To be drawn, brushes must be broken down into triangles. To do this I loop through each plane/side of the brush, "i". For each side "i" I take another, different side "j" and get the line intersection between them. This is the code I used:

http://devmaster.net/forums/topic/8676-2-plane-intersection/

Then I take another distinct side "k" and intersect the line with it to get a point, and another side "l" to get a second point intersection. I use a for-loop to go through all the sides, and for "l" I start counting at "k+1" so we don't get any repeats (this becomes important later when building a polygon for the brush side). The two intersection points form a side edge of the polygon for the brush side. I store it in an array of STL lists of lines: each brush side has a list of lines. I store the line under side "i" because that is the brush side the edge belongs to and lies along.
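The two geometric primitives this step relies on, plane-plane intersection (giving a line) and line-plane intersection (giving a point), can be sketched like this. A hypothetical Python sketch assuming planes stored as (n, d) with n·p = d; the linked devmaster code is separate from this.

```python
# Plane-plane and line-plane intersection, the two building blocks used
# above. Planes are (n, d) with n . p = d. Hypothetical sketch.

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def plane_plane_line(n1, d1, n2, d2, eps=1e-6):
    """Return (point, direction) for the line where two planes meet,
    or None if the planes are (nearly) parallel."""
    direction = cross(n1, n2)
    denom = dot(direction, direction)
    if denom < eps:
        return None
    w = tuple(d1*b - d2*a for a, b in zip(n1, n2))  # d1*n2 - d2*n1
    p = cross(w, direction)
    return tuple(c / denom for c in p), direction

def line_plane_point(p0, direction, n, d, eps=1e-6):
    """Intersect the line p0 + t*direction with the plane n . p = d."""
    denom = dot(n, direction)
    if abs(denom) < eps:
        return None   # line parallel to plane
    t = (d - dot(n, p0)) / denom
    return tuple(p + t*v for p, v in zip(p0, direction))
```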

Then I loop over the side edges for each brush side, making a "polygon" - basically an outline, with a point for each vertex. I use an epsilon value to check whether two points are within a certain distance, and use the side edge's other vertex as the next point to check for proximity, starting from the first side edge and making sure to exclude the last connecting edge.
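That edge-chaining step might look something like this in Python. A hypothetical sketch: the edge representation (pairs of points), the epsilon value, and the function names are all assumptions.

```python
# Chaining side edges into an outline by proximity. Each edge is a pair
# of points; starting from the first edge we repeatedly look for the edge
# whose endpoint matches the current loose end, within epsilon.
# Hypothetical sketch.

EPS = 1e-4

def close(a, b):
    return sum((x - y)**2 for x, y in zip(a, b)) < EPS * EPS

def chain_edges(edges):
    """Return an ordered list of vertices forming the polygon outline."""
    remaining = list(edges[1:])
    first, cur = edges[0]
    poly = [first, cur]
    while remaining:
        for i, (a, b) in enumerate(remaining):
            if close(cur, a):
                cur = b
            elif close(cur, b):
                cur = a
            else:
                continue
            remaining.pop(i)
            if not close(cur, first):   # exclude the last, closing vertex
                poly.append(cur)
            break
        else:
            break   # open chain; shouldn't happen for a valid brush side
    return poly
```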

Then I check that the polygon is in clockwise order (because that is how I cull my polygons) by taking the normal of the triangle formed by the first three vertices of the polygon and checking whether it or its opposite is closer to the plane normal. If the opposite is closer, I reverse the list of vertices.
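The winding check reduces to the sign of one dot product: if the triangle normal points against the plane normal, the opposite normal is the closer one and the vertex list gets reversed. A minimal sketch under that assumption:

```python
# Winding check/fix: compare the normal of the triangle formed by the
# first three vertices against the plane normal; if the flipped normal
# matches better, reverse the vertex list. Hypothetical sketch.

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def fix_winding(poly, plane_normal):
    tri_n = cross(sub(poly[1], poly[0]), sub(poly[2], poly[0]))
    if dot(tri_n, plane_normal) < 0:   # the opposite normal is closer
        poly = poly[::-1]
    return poly
```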

Oh, before I make the polygon I discard any side edge with at least one point that is not inside or on every plane of the brush. This is necessary to cull away bounding planes that end up outside the brush after the other planes have been moved. Later I remove these planes that have fewer than 3 side edges, the minimum needed to form a triangle.

Next I allocate (v-2) triangles where "v" is the number of vertices in the side's polygon. I construct the triangles in a fan pattern.
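The fan pattern is a one-liner once the polygon is ordered: every triangle shares vertex 0, giving exactly (v - 2) triangles. A minimal sketch (function name is an assumption):

```python
# Fan triangulation: a convex polygon with v ordered vertices yields
# (v - 2) triangles, all sharing vertex 0. Hypothetical sketch.

def triangulate_fan(poly):
    return [(poly[0], poly[i], poly[i + 1]) for i in range(1, len(poly) - 1)]
```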

There are probably some improvements that can be made, like storing shared edges so they don't have to be reconnected by distance checks, which I will probably learn as I follow in the footsteps of q3map and other Quake/Doom games' source code.

[edit2] By "nearest point to the plane" I mean nearest to the "inside" side of the plane, according to the normal. For an AA box we just check the signedness of each axis of the normal and use the min or max on each axis to get the innermost or outermost point of the 8 points of the AA box.
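That per-axis min/max selection can be sketched like this (a hypothetical Python sketch; the box representation as min/max corner tuples is an assumption):

```python
# Picking the AA box corners nearest/farthest along a plane normal: per
# axis, the sign of the normal component chooses between the box's min
# and max extent. Hypothetical sketch.

def support_points(box_min, box_max, normal):
    """Return (farthest, nearest) corners of the box along the normal."""
    far = tuple(mx if n > 0 else mn
                for mn, mx, n in zip(box_min, box_max, normal))
    near = tuple(mn if n > 0 else mx
                 for mn, mx, n in zip(box_min, box_max, normal))
    return far, near
```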

[edit3] And actually, the farthest point has to be the one from before the AA box moved and the "nearest" point has to be from the moved position.

I think there needs to be a little bit more information here.

Ah, very nice. I really like the detail that you have in this entry now!

FWIW (if relevant): there are several ways to do the texture/plane projection, for example:

Quake style (Quake series, Doom 3);

Half-Life style (HL, most source-engine games).

Quake-style tends to define the projection in terms of a 2D space on the nearest unit-plane, and calculate the texture coords from the vertex position.

Half-Life style tends to define the texture projection in terms of a pair of axes; you then effectively calculate the dot-product of the point along these axes (with scaling and translation integrated), so the pair of vectors is something like "[ ux uy uz ud ] [ vx vy vz vd ]" (direction and distance, scaled as appropriate).

brush handling in my engine uses Half-Life style, FWIW (personally I find it a little cleaner).

(granted, yes, at present I am mostly using voxels, nevermind this for a moment, it mostly had to do with my general lack of ability to make good maps, and the engine supports both world styles...).

I don't know what you need the dot product for. I read that you just plug in xyz and add up those terms for u and v. How do you manipulate it meaningfully in the editor then though? And if you flipped a brush side you'd have to adjust the plane equation too. What would be more interesting is if the plane was actually the plane that the texture was aligned to, and this may actually be the case. I was thinking of a solution that didn't require you to adjust the parameters when you flipped a face. You'd specify a rotation of the texture and the scale and offset in world coordinates. Then you'd use the dot product somewhere to see how far along each vertex is along the plane, minus (or plus?) the offset and multiplied by the scale to get the texture coordinates. Let me know how it's actually done in the editor. Thanks.

well, as noted, there are multiple ways to do all this.

in HL-style projection, the U/V vectors essentially define a plane which the texture is aligned with, but can hold a little more information. this need not necessarily be aligned with the plane they are being projected onto.

IIRC, coordinate calculation is usually something like:

s=(x*ux+y*uy+z*uz+ud)/width;

t=(x*vx+y*vy+z*vz+vd)/height;
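Those two formulas, sketched as a runnable helper (a hypothetical sketch; the function name and argument layout are assumptions, the math is exactly the s/t lines above):

```python
# HL-style texture projection: s and t are affine functions of the vertex
# position along the u and v axes, divided by the texture dimensions.
# Hypothetical sketch.

def hl_texcoords(p, u, v, width, height):
    """p is a world-space point; u = (ux, uy, uz, ud), v = (vx, vy, vz, vd)."""
    s = (p[0]*u[0] + p[1]*u[1] + p[2]*u[2] + u[3]) / width
    t = (p[0]*v[0] + p[1]*v[1] + p[2]*v[2] + v[3]) / height
    return s, t
```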

as for needing to modify/recalculate the texture projection if/when the brush plane is altered: yes, this often needs to be done.

in-editor, I have more often handled texture projection say by creating a box, which can be dragged around and rotated and scaled relative to the object. the texture-axes for each face are then matched up with and calculated relative to the faces of this box (with a few different styles of box as well, to represent different types of projections, such as box/plane(x/y/z)/sphere/cylinder/...).

some other editors allow manipulating textures face-by-face. such as by a plane dragged around relative to the face.

in my 3D modeler, I have typically calculated the texture coordinates relative to the box directly, which allows for a little more flexibility in terms of projection.

storage for such a texture box would require a few more numbers, in addition to the style.

one way of representing it is mostly as a 3x4 or 4x4 matrix and a style-type (the matrix would define a texture coordinate space, and the style would indicate the projection strategy to be used).
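A sketch of that matrix-plus-style representation (hypothetical; the row layout and function names are assumptions): the 3x4 matrix maps a world point into texture space, and a planar style simply drops the third coordinate.

```python
# A 3x4 texture matrix as a coordinate space: the rows map a world point
# into (s, t, r); the style flag then picks the projection, e.g. a planar
# style drops r. Hypothetical sketch.

def apply_tex_matrix(m, p):
    """m is 3 rows of 4 numbers; p is a world-space point."""
    return tuple(row[0]*p[0] + row[1]*p[1] + row[2]*p[2] + row[3]
                 for row in m)

def planar_st(m, p):
    s, t, _r = apply_tex_matrix(m, p)
    return s, t
```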
