How to solve a texture-mapping dilemma?

<noob_mode>I am trying to create a new model class for my DirectX game. How do I implement polygon-based texture mapping?</noob_mode> DirectX ties texture coords, normals, vertex colors etc. to vertices, right? How do I give each poly its own texture coords without cluttering the model with tons of additional vertices? I'd consider the following setup optimal: http://santex-heimtextil.bei.t-online.de/spike/pictures/coords.jpg Is there some kind of common solution to this problem? Thanks

This may be possible with instancing, but that is a new feature, so it's probably not what you're after.

Simple answer: you can't do it, so don't worry about it. Clutter away.

What are you trying to do?
Why can't you just use a skin that contains all the texturing info for the whole object, and index into it with your texture coordinates as usual?

Texture coordinates are really part of the vertex. Unique texture coordinates mean unique vertices. You can get around this somewhat with vertex streams, allowing a little more position/normal data reuse, but you'll end up making so many DrawPrimitive calls that it's not worth it.
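
To make that concrete, here's roughly what a D3D9 vertex with its own texture coordinates looks like. The struct name, FVF flags and the example values are just mine for illustration, not anything from your code:

#include <d3d9.h>

// Illustrative sketch only: a typical D3D9 vertex layout where the UVs
// live inside the vertex itself.
struct MeshVertex
{
    float x, y, z;     // position
    float nx, ny, nz;  // normal
    float u, v;        // texture coordinates -- part of the vertex
};

// Matching flexible vertex format
const DWORD MESH_VERTEX_FVF = D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX1;

// The same position with two different UV pairs means two vertices:
MeshVertex seamA = { 1.0f, 0.0f, 0.0f,   0.0f, 1.0f, 0.0f,   1.0f, 0.0f };
MeshVertex seamB = { 1.0f, 0.0f, 0.0f,   0.0f, 1.0f, 0.0f,   0.0f, 0.0f };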

Does the mesh use several textures? If so, you'll want to be able to render each of the subsets separately anyway, so you want the vertices on the seams to exist once for each texture. A rough sketch of what that looks like is below.
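
All the names here are made up, and I'm assuming the seam vertices are already duplicated and the vertex/index buffers and FVF are set before the call:

#include <d3d9.h>
#include <vector>

// Illustrative only: one group of triangles per texture.
struct Subset
{
    IDirect3DTexture9* texture;  // texture used by this group
    UINT startIndex;             // first index in the shared index buffer
    UINT primitiveCount;         // number of triangles in this subset
};

void RenderMesh(IDirect3DDevice9* device,
                const std::vector<Subset>& subsets,
                UINT numVertices)
{
    // Assumes SetStreamSource / SetIndices / SetFVF were already called.
    for (size_t i = 0; i < subsets.size(); ++i)
    {
        device->SetTexture(0, subsets[i].texture);
        device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, numVertices,
                                     subsets[i].startIndex,
                                     subsets[i].primitiveCount);
    }
}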

If you're not using multiple textures, what are the coordinates like in each quad? If they repeat (0 to 1, 0 to 1), you can get the same effect by just letting the coordinates keep counting up (0 to 1, 1 to 2). You need to make sure clamp mode is off, though.
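
If it helps, turning clamping off just means setting the address mode to wrap, something like this (sampler stage 0 assumed):

#include <d3d9.h>

// Let UVs past 1.0 repeat the texture instead of clamping at the edge.
void EnableWrapAddressing(IDirect3DDevice9* device)
{
    device->SetSamplerState(0, D3DSAMP_ADDRESSU, D3DTADDRESS_WRAP);
    device->SetSamplerState(0, D3DSAMP_ADDRESSV, D3DTADDRESS_WRAP);
}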

If you're not actually in a specific scenario and are just in the process of creating a generic routine, you shouldn't worry. In most situations, separated texture subsets are separated for a good reason. Terrain-type situations are the only exception I've encountered, but they require so much special attention that it's not worth making them fit into your generic mesh setup.

Thanks for the quick help so far.

I think Jiia's proposed solution fits my problem best. I want to use multiple textures / effects on my game objects, so each texture can have its own texture coords.

Anyway, this brings me to another problem: how do I align the shading normals on the seams between the textures / subsets?

Right now I'm using the average of the adjacent polygon normals to calculate the vertex normals. Unfortunately, this approach won't work anymore if I use co-positioned vertices.

I'm planning to make heavy use of specular lighting, so the resulting seams would be especially visible. Any ideas?

Oh, never mind. I figured it out myself. You just have to average the normals of the co-positioned vertices to get a smooth, seamless transition. Problem solved.
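
In case anyone finds this thread later, here's a simplified sketch of what I mean (no DirectX types, all names are just for illustration): compute each vertex normal from its adjacent faces as before, then give every group of co-positioned vertices the same re-normalized average normal.

#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Minimal vertex for the sketch; UVs and everything else are omitted.
struct Vertex
{
    Vec3 pos;
    Vec3 normal;  // already averaged from the adjacent polygons
};

static bool SamePosition(const Vec3& a, const Vec3& b, float eps = 1e-5f)
{
    return std::fabs(a.x - b.x) < eps &&
           std::fabs(a.y - b.y) < eps &&
           std::fabs(a.z - b.z) < eps;
}

// Give all vertices that share a position the same averaged normal,
// so the duplicated seam vertices shade identically.
void SmoothSeamNormals(std::vector<Vertex>& verts)
{
    for (size_t i = 0; i < verts.size(); ++i)
    {
        Vec3 sum = verts[i].normal;
        std::vector<size_t> group(1, i);

        for (size_t j = i + 1; j < verts.size(); ++j)
        {
            if (SamePosition(verts[i].pos, verts[j].pos))
            {
                sum.x += verts[j].normal.x;
                sum.y += verts[j].normal.y;
                sum.z += verts[j].normal.z;
                group.push_back(j);
            }
        }

        float len = std::sqrt(sum.x * sum.x + sum.y * sum.y + sum.z * sum.z);
        if (len > 0.0f)
        {
            sum.x /= len;
            sum.y /= len;
            sum.z /= len;
        }

        for (size_t k = 0; k < group.size(); ++k)
            verts[group[k]].normal = sum;
    }
}

It's O(n^2) the way it's written, but it only runs once at load time, so that's good enough for me.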

Thanks people.
