About 0xffffffff

  1. 0xffffffff

    3D format that supports vertex tangents.

    Geometrian, with real models, UV space is often full of distortion with respect to model space. It isn't the artist's problem to be concerned with the quality of the tangents that your algorithm produces from his UV map.

    You say there's only one way, in practice. That's funny, because I could swear I've seen at least three different ways, in practice. To put that in perspective: out of three different tools commonly used by our art team, each one produced a different tangent basis, and I had three subtly different routines of my own to reproduce the results (and idiosyncrasies) of each. If there were one and only one way to do it, somebody should kindly inform these disparate tool programmers to get them all on the same page. There may be a best way and lesser ways, but let there be no doubt about it: there are different ways to skin the cat.

    To the others: yes, it is possible to maintain tangents all the way from modeling, through baking/sampling, into the engine. The aforementioned FBX format is one possible vehicle for this, and is the one we use on the tools side. For export into the game engine, our data makes a trip through Granny, where again the original tangents produced by the modeler are maintained. The tangents we use at runtime are the original Maya tangents.
  2. 0xffffffff

    3D format that supports vertex tangents.

    Is this U and V basis orthogonal everywhere on the surface of the mesh? No, it isn't. Hence, you probably want to orthogonalize. See, already you must make decisions about how best to calculate the tangent basis, and cross your fingers that you arrive at the same basis as your tools. With orthogonalization in particular, there are three obvious approaches yielding different results! Which one did the tool programmer use? You have to use the same one if you hope to avoid seams.

    Also: in working with this U and V basis in model space, you discover that every face of your mesh incident at a given vertex prescribes a different model-space tangent for it (parallel to the face). But you can store only one tangent at a vertex, and you want it to be smooth where possible. Obviously you must perform some kind of averaging or fitting (what kind? Choices, choices) to arrive at a single, smooth vertex tangent from a set of discrete face-derived tangents. Or maybe you shouldn't make it too smooth, if that would induce too severe an interpolation of the tangent basis during shading; maybe it would be better to create a "tangent seam" to relieve the distortion. Again you have decisions to make--and choose carefully, or you'll end up with something slightly different from the tangent basis your normal map was baked for. And then there's the issue of UV flips/mirroring. And so on, and so on, as mentioned previously.

    Tell me the "single, mathematically correct" solution for the above problems, and also how you know that your tools code arrived at the same one and not an "inferior" calculation. Remember--if the tangent basis differs, the reconstructed normal will also differ.
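To make the ambiguity concrete, here is a minimal sketch (plain Python; the function names are mine, not from any tool) of two of the orthogonalization conventions in question. Feeding both the same interpolated frame yields two different, equally valid tangents:

```python
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return [x - y for x, y in zip(a, b)]
def scale(a, s): return [x * s for x in a]
def normalize(a):
    l = dot(a, a) ** 0.5
    return [x / l for x in a]

def orthogonalize_u_centric(n, t, b):
    """Gram-Schmidt: keep T faithful to the U direction, derive B from it."""
    t2 = normalize(sub(t, scale(n, dot(n, t))))        # T made perpendicular to N
    b2 = normalize(sub(sub(b, scale(n, dot(n, b))),
                       scale(t2, dot(t2, b))))          # B perpendicular to N and T
    return t2, b2

def orthogonalize_v_centric(n, t, b):
    """Same idea, but B (the V direction) is kept faithful instead."""
    b2, t2 = orthogonalize_u_centric(n, b, t)
    return t2, b2
```

Both results are unit length and orthogonal to the normal, yet they disagree with each other--and with whatever a given baking tool happened to pick.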
  3. 0xffffffff

    3D format that supports vertex tangents.

    There is no singularly "correct" pair of tangent basis vectors. There are infinitely many that are "perfectly orthogonal and mathematically true". Take your theoretically perfect tangent vectors. Rotate them any amount around the normal (or reflect in the plane, or flip one axis... etc). Voilà--a different, orthogonal, perfect tangent basis.

    How did you arrive at your one, true tangent basis in the first place? Unless your UV mapping was completely free of distortion, you had to perform, at the bare minimum, a calculation to average the face tangents incident at the vertex (there is no one, true way to do this), and orthogonalize the result (also subject to coder whim). Now, maybe you are a math god and have the best tangent basis calculation by every objective measure--too bad, your normal map baking tool is using an inferior one, and you've got seams anyway.

    If the baking or modeling tool is doing something "fancy", such as introducing discontinuities to minimize tangent distortion (Maya has options to control this very thing), good luck reproducing that at asset load time and arriving at precisely the same result.
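To illustrate the rotation point: spinning a valid frame any amount about the normal yields another frame that is just as orthogonal and unit length. A small sketch using Rodrigues' rotation formula (plain Python; helper names are mine):

```python
import math

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def rotate_about(v, axis, angle):
    """Rodrigues' rotation of v about a unit-length axis."""
    c, s = math.cos(angle), math.sin(angle)
    d = dot(axis, v)
    cr = cross(axis, v)
    return [v[i]*c + cr[i]*s + axis[i]*d*(1.0 - c) for i in range(3)]
```

Rotating both T and B by the same arbitrary angle about N leaves every orthogonality and unit-length property intact--a whole circle of "perfect" bases.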
  4. 0xffffffff

    3D format that supports vertex tangents.

    Not really--when calculating a tangent basis there's plenty of room for creative license that will lead to significantly different results. For example: to orthogonalize or not (probably yes, but using which of at least 3 different approaches: u-centric, v-centric, or unbiased?), to account for UV flips or not, to perform a surface-area weighted average or a simple mean, whether or not to introduce discontinuities to alleviate twist/distortion (and how, using what threshold), etc. I know from experience that different tools make different decisions for all of those points--the result is arbitrarily different, yet still valid, tangent bases from one tool to the next. The odds that your runtime code will make exactly the same decisions as some offline baking tool or modeling package are low, and lighting seams will result.
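As a concrete starting point for the calculation being second-guessed here, this is a sketch of the common per-face tangent derivation from positions and UVs (plain Python; everything downstream of it--averaging, orthogonalization, flip handling--is where tools diverge):

```python
def sub(a, b): return [x - y for x, y in zip(a, b)]

def face_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Model-space directions of +U (tangent) and +V (bitangent) across one face.
    Solves [e1; e2] = [du1 dv1; du2 dv2] * [T; B] for T and B."""
    e1, e2 = sub(p1, p0), sub(p2, p0)
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    r = 1.0 / (du1 * dv2 - du2 * dv1)   # determinant's sign flips on mirrored UVs
    t = [(e1[i] * dv2 - e2[i] * dv1) * r for i in range(3)]
    b = [(e2[i] * du1 - e1[i] * du2) * r for i in range(3)]
    return t, b
```

Note the determinant: its sign flips under UV mirroring, which is exactly one of the per-tool policy decisions listed above.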
  5. 0xffffffff

    3D format that supports vertex tangents.

    I caution against calculating your own tangents if you're using baked normal maps with unwrapped UVs. It's very important that the tangent space used during rendering is the same as the one used during baking. Subtle differences in tangent basis calculation will result in lighting seams coincident with UV-space discontinuities.
  6. Because it ends up calculating lighting fragments "outside" (potentially far in front of) the volume. This can be an advantage for more realistic specular, but it's obviously not as efficient as a two-pass stencil approach that effectively culls all fragments outside the volume.
  7. It's the classic "LookAt" routine for building a camera-to-world transform. It does become unstable when the view vector nearly coincides with the up vector, so it's usually protected by a pitch limit.
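A minimal sketch of such a LookAt routine (plain Python; the z-up convention and handedness here are arbitrary choices for illustration), including a guard for the degenerate forward-parallel-to-up case:

```python
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return [x - y for x, y in zip(a, b)]
def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]
def normalize(a):
    l = dot(a, a) ** 0.5
    return [x / l for x in a]

def look_at(eye, target, up=(0.0, 0.0, 1.0)):
    f = normalize(sub(target, eye))        # forward
    if abs(dot(f, up)) > 0.999:            # nearly parallel: cross products degenerate
        raise ValueError("view direction too close to up; clamp pitch instead")
    r = normalize(cross(f, up))            # right
    u = cross(r, f)                        # orthogonal camera up
    return r, u, f                         # basis of the camera-to-world rotation
```

A pitch limit in the camera controller normally keeps the ValueError branch from ever firing.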
  8. The "quick and dirty" solution (without stencil) is to render only the back faces of the light volume, inverting the z test. This lights every pixel that is in front of the back side of your volume, rather than all the pixels behind the front side (which might get clipped), thus solving the camera inter-penetration problem.
  9. 0xffffffff

    Multilevel render queue

    Yes, a queue shuffles its contents around constantly... it's typically implemented as a heap, so every pop or insertion incurs log(N) swaps. I use an array because it's the simplest, fastest container for this purpose (my bucket indices are known ahead of time, otherwise I might use a hash).
  10. 0xffffffff

    Multilevel render queue

    You mean a priority queue, or a deque? A deque of deques is safe because a deque never moves its contents. On the other hand, a priority queue of priority queues gives me nightmares just thinking about it. Anyway, I wouldn't use either one--just an array of buckets (that is, an array of array pointers, or an array of linked lists). (We use our own containers, for what it's worth.)
  11. 0xffffffff

    Multilevel render queue

    The only thing that seems odd to me is Mesh... specifically that it indirectly owns a world matrix. Typically a mesh is just model definition data (vertices, faces), not an instance of something in the scene. I would have RenderOp point to the Mesh instead of the other way around, and have Mesh hold the vertex/index buffers.

    Aside from that it seems good. Eventually you will want some way to manipulate the rendering order of materials (and their render targets), but this is a good start towards that.

    We have a similar mechanism for bucketing "render ops" by material, but instead of having an inner queue organized by texture, we optionally sort the contents of a material bucket by "signature", which is a hash of relevant parameter settings, including textures. A bucket can alternatively be sorted by depth. It's a bit more flexible this way (accommodating more than one texture and/or other expensive state changes), and the cost of sorting a bucket is tiny.
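A toy version of the bucketing scheme described above (plain Python; the class and field names are illustrative only, not from any real engine):

```python
class RenderQueue:
    """Array of material buckets; each bucket is sorted by a state 'signature'
    at flush time so expensive state changes cluster together."""

    def __init__(self, num_materials):
        self.buckets = [[] for _ in range(num_materials)]  # index known up front

    def add(self, material_id, signature, op):
        # signature: a hash of textures and other costly state for this op
        self.buckets[material_id].append((signature, op))

    def flush(self):
        for bucket in self.buckets:
            bucket.sort(key=lambda e: e[0])   # group ops with identical state
            for _, op in bucket:
                yield op
            bucket.clear()
```

Sorting a single bucket is cheap, and swapping the sort key for depth gives the alternative ordering mentioned above.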
  12. 0xffffffff

    Storage vs Draw calls

    It isn't slow. We draw thousands of flora instances this way, each one individually culled against the view frustum. The time to build the instance buffer each frame is negligible. We did design the per-instance data to be as compact as possible to facilitate this, however.
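For illustration, a sketch of the per-frame instance-buffer build under that scheme (plain Python with struct packing; the 12-byte position-only layout is an assumption--real per-instance data is whatever the shader needs, kept as compact as possible):

```python
import struct

def build_instance_buffer(instances, frustum_contains):
    """Pack one compact record per visible instance, rebuilt each frame."""
    buf = bytearray()
    count = 0
    for pos in instances:
        if frustum_contains(pos):              # per-instance frustum cull
            buf += struct.pack("<3f", *pos)    # hypothetical 12-byte layout
            count += 1
    return bytes(buf), count
```

The resulting blob is what gets uploaded to the instance vertex buffer before the single draw call.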
  13. 0xffffffff

    Storage vs Draw calls

    Use instancing.
  14. You don't need to use barycentric coordinates for texture mapping. Just calculate gradients for every interpolated parameter. Gradients are constant across the whole triangle and tell you how much the interpolated parameter changes for each step in x or in y. Chris Hecker wrote an excellent series on texture mapping for Game Developer in the '90s. Read it if you can find it, otherwise just study the code available here: http://www.gamers.or...ker_texmap.html
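The gradient idea can be sketched in a few lines (plain Python; a hypothetical helper that solves a 2x2 system once per triangle, per interpolated parameter):

```python
def param_gradients(p0, p1, p2, w0, w1, w2):
    """Constant d(w)/dx and d(w)/dy of parameter w across a screen-space
    triangle with 2D vertices p = (x, y)."""
    dx1, dy1 = p1[0] - p0[0], p1[1] - p0[1]
    dx2, dy2 = p2[0] - p0[0], p2[1] - p0[1]
    det = dx1 * dy2 - dx2 * dy1            # twice the signed triangle area
    dw1, dw2 = w1 - w0, w2 - w0
    dwdx = (dw1 * dy2 - dw2 * dy1) / det   # Cramer's rule on the 2x2 system
    dwdy = (dx1 * dw2 - dx2 * dw1) / det
    return dwdx, dwdy
```

At each pixel step right, add dwdx; at each scanline step down, add dwdy--no per-pixel barycentric evaluation needed.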
  15. 0xffffffff

    Normal Maps and Tangent space

    It's actually quite feasible to apply object-space normal maps with deformation to a skinned object. In terms of calculation, it's identical to the per-pixel deformation of tangent-space normals--it all boils down to a 3x3 transform of the bump normal (or the inverse transformation of light vectors into surface-local space, if you're insane).
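A sketch of that calculation (plain Python; rigid bones are assumed here, so the blended 3x3 rotation block can be applied to the sampled object-space normal directly, without an inverse transpose):

```python
def mat3_mul_vec(m, v):
    return [sum(m[r][c] * v[c] for c in range(3)) for r in range(3)]

def blend_rotations(rotations, weights):
    """Weighted sum of the bones' 3x3 rotation blocks (linear-blend skinning)."""
    out = [[0.0] * 3 for _ in range(3)]
    for m, w in zip(rotations, weights):
        for r in range(3):
            for c in range(3):
                out[r][c] += w * m[r][c]
    return out

def skin_normal(sampled_normal, rotations, weights):
    """One 3x3 transform of the object-space bump normal, then renormalize."""
    n = mat3_mul_vec(blend_rotations(rotations, weights), sampled_normal)
    l = sum(x * x for x in n) ** 0.5
    return [x / l for x in n]
```

This is the same shape of work as deforming a tangent-space normal--the only difference is which 3x3 matrix you feed it.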