thmfrnk

Texture Coord Generation in Shader



Hi,

 

I'm looking for a way to give my users the option to use automatically generated UV coordinates instead of the UVs delivered with imported meshes. So, something similar to Cinema 4D, where you can choose between UVW, Cubic, Cylindrical, Spherical, etc. mapping. Ultimately I want to generate the UVs from vertex positions and normals. I think this should be possible in the vertex or geometry shader, but I have no idea how to approach it. I know the result won't always be perfect for non-convex meshes, but most of my cases are simple boxes, spheres, or planes.

 

Any help, links, or code is appreciated.

 

Thanks,

Thomas


You might want to search for "projective texture mapping". You can assign UVs in the vertex shader based on the local-space or world-space position values.
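As a minimal CPU-side sketch of that idea (Python stand-in for the vertex-shader code; the function name and the `scale`/`offset` parameters are my own, for illustration):

```python
def planar_uv(position, scale=(1.0, 1.0), offset=(0.0, 0.0)):
    """Planar projection along the Z axis: take the object-space XY
    position and apply a tiling scale and an offset to get a UV pair."""
    x, y, _z = position
    return (x * scale[0] + offset[0], y * scale[1] + offset[1])

# A vertex at (2, 3, 5) projected along Z with 0.5 tiling:
print(planar_uv((2.0, 3.0, 5.0), scale=(0.5, 0.5)))  # (1.0, 1.5)
```

In a shader this is the same two multiply-adds on the interpolated position; the choice of which two axes you keep picks the projection direction.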

 

If you project along three axes (just by applying a scale/offset to the xz, xy, and yz positions), you can generate three sets of UVs, and then blend between three texture samples in the pixel shader depending on the vertex normal. This technique is often called triplanar mapping. It's an easy way to handle arbitrary meshes; the downside is that it requires three texture samples in the pixel shader. I saw a talk at PAXDev where a team was using this a lot in their game (I forget which company), and it was a big time-saver for them in terms of artist effort.
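The three projections and the normal-based blend above can be sketched like this (Python stand-in for the shader math; the `sharpness` exponent is an assumption — shaders often raise the weights to a power to tighten the blend):

```python
def triplanar_uvs(position, scale=1.0):
    """Three UV sets from one object-space position: projections onto
    the YZ, XZ, and XY planes (i.e. along the X, Y, and Z axes)."""
    x, y, z = position
    return [(y * scale, z * scale),   # projection along X
            (x * scale, z * scale),   # projection along Y
            (x * scale, y * scale)]   # projection along Z

def triplanar_weights(normal, sharpness=1.0):
    """Blend weights from the vertex normal: the absolute value of each
    component (optionally sharpened), normalized so the weights sum to 1."""
    w = [abs(c) ** sharpness for c in normal]
    s = sum(w)
    return [c / s for c in w]
```

In the pixel shader you would sample the texture once per UV set and sum the samples multiplied by these weights; a face pointing straight up gets 100% of the along-Y projection, and tilted faces get a blend.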

 

I use it in one of my own projects to texture things like rocks. The following rock uses projected textures:

 

https://mtnphil.files.wordpress.com/2016/03/projtexture.jpg

 

If you look closely, you can see the blurring caused by blending between different projections. That's another downside.

 

Of course, for simple objects like cubes or planes, it's pretty straightforward.

Edited by phil_t


It's possible, but it requires some additional data passed to the shaders, depending on how you want to map things. The simplest approach is to generate the coordinates in the vertex shader while the vertices are still in object space. That keeps everything centered around the model origin instead of world space, which makes the math easier.

For some shapes, like a cube, you need to know the extents of the mesh before generating the coordinates: the minimum and maximum positions become 0 and 1. From there you can project out like a cubemap and generate 2D coordinates for each face of the map based on the vertex position.

The same idea applies to the other shapes. You're basically assuming a fixed mapping of coordinates over the shape (a sphere uses spherical coordinates, a cube uses cube mapping with six 2D faces, a cylinder uses two caps plus a single map wrapped around the side), then finding where the vertex would hit that shape and using the coordinate at that point.
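Two of those mappings can be sketched as follows (Python stand-ins for the object-space vertex-shader math; function names are my own):

```python
import math

def box_normalized(position, vmin, vmax):
    """Remap an object-space position so the mesh extents (vmin/vmax,
    computed ahead of time on the CPU) map to [0, 1] on each axis."""
    return tuple((p - lo) / (hi - lo)
                 for p, lo, hi in zip(position, vmin, vmax))

def sphere_uv(position):
    """Spherical mapping: treat the direction from the object origin as a
    point on a sphere; longitude becomes U, latitude becomes V."""
    x, y, z = position
    r = math.sqrt(x * x + y * y + z * z)
    u = 0.5 + math.atan2(z, x) / (2.0 * math.pi)
    v = 0.5 - math.asin(y / r) / math.pi
    return (u, v)
```

`box_normalized` gives you the 0-to-1 coordinates mentioned above, from which a cubemap-style face selection can pick two axes as the 2D UV; `sphere_uv` is the fixed spherical parameterization for sphere-like meshes. Note that `atan2` has a seam where the angle wraps around, which shows up as a visible discontinuity unless handled.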

 

W00t, phil_t beat me to it and with the proper term that I couldn't, for the life of me, remember.

Edited by xycsoscyx
