
silvermace

Calculating Tangent and Binormal in Vertex Shader



hi, i was searching the web on the subject and i noticed that all the algorithms calculate the TBN matrix using a triangle, i.e. a minimum of 3 vertices. so i want to know how vertex programs would go about doing the bump mapping calculations, and even generating the T and B vectors etc. i'm wondering how it would be done in a vertex shader, because AFAIK you can't send extra vectors per vertex using the normal bind->rendermesh->unbind methods. if someone could shed some light on how this might be done, it would save me a lot of confusion. thanks in advance -- Danushka | IRIX | sILverMacE ::.
A GOOD friend will come bail you out of jail... but, a TRUE friend will be sitting next to you saying, "Damn, we fucked up." Ignite 3D Game Engine Home -- Just click it

Since a vertex shader/program only gets to work on one vertex at a time, in isolation (which is what allows parallel processing), you *MUST* store enough information in every vertex to construct the TBN basis. If the basis is orthonormal you can get away with storing just two of the vectors (the normal can be the same one used for traditional vertex lighting) and using a cross product to get the third.

At a minimum, you're going to need to store a single vector plus some description of the slope of the mapping in the U and V directions of the polygon in each vertex. You may be able to get away with something like a unit quaternion packed into a single UBYTE4.

To be honest though, given the number of instructions you'll spend turning that into a form that can transform vectors like the light direction, I'd just bite the bullet and store the full TBN, or just the normal and tangent, per vertex.

thanks for that. just to clarify, how should i pass these vectors to the vertex shader? encode them into a texture unit's texcoords or something similar?

thanks

- danushka | IRIX | sILverMacE ::.

i'm also wondering how to send them. with directx it should be pretty simple, but opengl doesn't seem to allow multiple normals to be set? (not too sure)

OpenGL has generic vertex attributes for the programmable pipeline; that's one way to send them. Or you can send them pretending they are texcoords. The first solution is the better one (as in the most elegant and logical).
Both APIs run on the same hardware, so it's obvious that they both allow access to most if not all of the hardware's capabilities. (Besides, D3D started off by mimicking OpenGL.)
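A sketch of the generic-attribute route on the shader side (GLSL 1.10-era syntax for readability; the attribute and uniform names are my own, and ARB vertex programs of the period would use numbered attributes instead):

```glsl
attribute vec4 a_tangent;   // xyz = tangent, w = handedness sign (+1/-1)

uniform vec3 u_lightObj;    // light direction in object space

varying vec3 v_lightTS;     // light direction in tangent space, for the
                            // fragment stage to dot against the normal map

void main() {
    vec3 n = normalize(gl_Normal);
    vec3 t = normalize(a_tangent.xyz);
    vec3 b = cross(n, t) * a_tangent.w;  // rebuild the bitangent

    // The rows of the TBN matrix take object space -> tangent space.
    v_lightTS = vec3(dot(u_lightObj, t),
                     dot(u_lightObj, b),
                     dot(u_lightObj, n));

    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
```

On the CPU side the tangent array is fed through the generic-attribute entry points (glVertexAttribPointer and friends, or their ARB-suffixed equivalents in this era) rather than through a second normal array, which is why the "multiple normals" question above has no direct answer in fixed-function OpenGL.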


-* So many things to do, so little time to spend. *-
