Ganoosh_

Multi-polygon texture mapping


Recommended Posts

This probably isn't something asked very often, but whenever you see a 3D game or engine, a model always loads a texture (an image), and that image is mapped across multiple polygons. This confuses me a little. I know how to map a single texture to a single polygon (loading an image and mapping it to one polygon). I have a texture class which holds the color data for the texture in an array, and the polygon uses that one array of data.

How would a single image be mapped onto multiple polygons? Would the image be cut up and loaded into multiple textures (data arrays in the class), with each polygon having its own texture to reference? Or would it just load into one texture (class), with multiple polygons referencing the same data array and each getting different coordinates in the texture based on the polygon's vertices? For example, vertices 1 - 4 of a polygon (the vertices themselves, not indices) would get coordinates 1 - 4 in the texture, and vertices 5 - 8 would get coordinates 5 - 8. Or is there another way it may commonly be done? Anyone know? And which would be the best way overall? Any help would be appreciated. Thanks in advance.

[Edited by - Ganoosh_ on June 7, 2005 8:26:38 PM]

Assuming a 2D texture, each vertex gets a 2D coordinate saying where in the texture its values are (generally a pair of floats ranging from 0 to 1). Then, given the three vertices' coordinates, you just linearly interpolate the coordinates across the face of the poly.
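To make that concrete, here's a rough sketch in C (the struct and function names are just made up for illustration). Each vertex carries its own UV pair, and the rasterizer linearly interpolates those pairs, with barycentric weights, at every point it fills inside the triangle:

#include <stdio.h>

/* One vertex: a 3D position plus a 2D texture coordinate (u, v in 0..1). */
typedef struct {
    float x, y, z;   /* position */
    float u, v;      /* texture coordinate */
} Vertex;

/* Linearly interpolate the UVs of a triangle's three vertices using
   barycentric weights w0, w1, w2 (they sum to 1).  A rasterizer does
   exactly this for every pixel it fills inside the triangle. */
static void interpolate_uv(const Vertex *a, const Vertex *b, const Vertex *c,
                           float w0, float w1, float w2,
                           float *u, float *v)
{
    *u = w0 * a->u + w1 * b->u + w2 * c->u;
    *v = w0 * a->v + w1 * b->v + w2 * c->v;
}

int main(void)
{
    Vertex a = { 0, 0, 0,  0.0f, 0.0f };
    Vertex b = { 1, 0, 0,  1.0f, 0.0f };
    Vertex c = { 0, 1, 0,  0.0f, 1.0f };

    float u, v;
    interpolate_uv(&a, &b, &c, 1.0f/3, 1.0f/3, 1.0f/3, &u, &v);  /* centroid */
    printf("UV at centroid: (%f, %f)\n", u, v);
    return 0;
}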

-me

The term you're looking for is "UV mapping". Google will find you a lot of information about various techniques.

Usually when making a model using an editor like 3DS or Maya you'll map each polygon to a different part of the texture. This gives each vertex a set of UV (texture) coordinates which will then be used by your rendering code. For example, in simple OpenGL you'd feed the UV coordinates to glTexCoord2f() before each glVertex3f().
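For example, a minimal immediate-mode OpenGL sketch (texture creation and glEnable(GL_TEXTURE_2D) omitted, and the function name is just made up), showing the glTexCoord2f()/glVertex3f() pairing. Both triangles sample the same bound texture; they just use different UVs:

#include <GL/gl.h>

/* Draw two triangles forming a quad, both sampling the same texture.
   Each vertex's UV is set with glTexCoord2f() before its glVertex3f(). */
void draw_textured_quad(GLuint texture)
{
    glBindTexture(GL_TEXTURE_2D, texture);
    glBegin(GL_TRIANGLES);
        /* upper-left half of the quad */
        glTexCoord2f(0.0f, 0.0f); glVertex3f(-1.0f,  1.0f, 0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f,  1.0f, 0.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f, -1.0f, 0.0f);

        /* lower-right half, same texture, different corner UVs */
        glTexCoord2f(1.0f, 0.0f); glVertex3f( 1.0f,  1.0f, 0.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex3f( 1.0f, -1.0f, 0.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex3f(-1.0f, -1.0f, 0.0f);
    glEnd();
}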

A texture is stretched across a polygon based on a 2D texture coordinate (two numbers) at each vertex. Like this:

0,0_________1,0
   |\       |
   | \      |
   |  \     |
0,1|___\____|1,1

The numbers at the corners of the quad represent the 2D coordinates at each vertex. (1,0) means (100%, 0%). It means that a pixel at that vertex should come from the far right of the texture. (1,1) means the pixel should come from the lower right corner of the texture. These numbers are interpolated across the quad so the computer can figure out which pixel in the texture to get at each point on the quad.
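And tying that back to the original question: there's just one texture (one array of color data) shared by every polygon; each polygon's vertices simply carry their own UVs into it. A rough sketch of the lookup, assuming a texture class that stores its pixels in a single RGB array (the names here are made up, not any particular API):

#include <stddef.h>

/* One texture: a single array of RGB color data shared by every polygon. */
typedef struct {
    int width, height;
    unsigned char *rgb;   /* width * height * 3 bytes, row by row */
} Texture;

/* Return the texel an interpolated (u, v) pair refers to.
   u and v are in 0..1; (1, 1) is the lower-right corner of the image.
   (No wrapping or filtering here; nearest texel only.) */
static const unsigned char *sample(const Texture *t, float u, float v)
{
    int x = (int)(u * (t->width  - 1));
    int y = (int)(v * (t->height - 1));
    return &t->rgb[(size_t)(y * t->width + x) * 3];
}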

Hope this helped!

