Slaaitjuh

Texture matrix from triangle


Hello,

 

I was wondering if anyone can help me with the following idea. I am currently working on a CSG library. I wrote a function to import models; the models get converted to planes and are later rebuilt into vertices. For material editing I am using texture matrices per plane. I would like to keep that approach, but I am facing a problem:

 

When reading an existing model, I can get the position, normal, and texture coordinates for a given vertex/triangle. Since I have the three vertices and texture coordinates of a triangle, I was wondering whether there is a way to construct a matrix from those parameters that recreates the texture coordinates after the CSG process.

In other words, the matrix would calculate texture coordinates by transforming the vertex position.
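To illustrate the kind of per-triangle mapping I mean, here is a rough sketch in plain Python (I would port it to C#; all helper names like `tri_uv_map` are my own invention, not from any library). The idea: the triangle's two edges plus its normal form a 3D basis; inverting that basis gives coefficients along each edge, and those coefficients blend the stored UVs.

```python
# Sketch: build a per-triangle map from position back to UV.
# Assumes the triangle is non-degenerate (non-zero area).

def sub(a, b): return [a[i] - b[i] for i in range(3)]

def cross(a, b):
    return [a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0]]

def invert3(m):
    # Inverse of a 3x3 matrix given as rows, via the adjugate.
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)
    return [[(e*i - f*h)/det, (c*h - b*i)/det, (b*f - c*e)/det],
            [(f*g - d*i)/det, (a*i - c*g)/det, (c*d - a*f)/det],
            [(d*h - e*g)/det, (b*g - a*h)/det, (a*e - b*d)/det]]

def tri_uv_map(p0, p1, p2, t0, t1, t2):
    e1, e2 = sub(p1, p0), sub(p2, p0)
    n = cross(e1, e2)
    # Matrix whose columns are e1, e2, n; its inverse maps a
    # position offset to coefficients along those basis vectors.
    basis = [[e1[0], e2[0], n[0]],
             [e1[1], e2[1], n[1]],
             [e1[2], e2[2], n[2]]]
    inv = invert3(basis)
    def uv(p):
        d = sub(p, p0)
        a = sum(inv[0][k] * d[k] for k in range(3))  # along e1
        b = sum(inv[1][k] * d[k] for k in range(3))  # along e2
        u = t0[0] + a * (t1[0] - t0[0]) + b * (t2[0] - t0[0])
        v = t0[1] + a * (t1[1] - t0[1]) + b * (t2[1] - t0[1])
        return (u, v)
    return uv
```

This map is exact for any point in the triangle's plane (not just inside the triangle), which is what I would want per CSG plane.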

 

The reason I would like this to work is that I could then reuse the texture coordinates already in the models, making the initial model look much better.

 

I would prefer examples in C# if I have a choice, but any language will do :-) Ideas are very welcome as well, and I can provide any clarification needed.

 

Thanks,


I'm not completely clear on your intended use case, but I'll give it a shot anyway :)  So you have a process that converts a 3D mesh (a list of vertices plus triangle connectivity information) into a new representation: a list of planes, plus the locations within those planes that define the vertex positions - is that correct?

 

Then if you want a single transformation that produces the texture coordinates from the positions, there are some strict requirements that I think would make this impractical.  You would need a uniform mapping from mesh surface area to texture space, which is typically not the case.  In addition, the original texture coordinates would have to respect a consistent mapping to and from position information, which is also not normally the case, since texture information is usually just packed in as efficiently as possible.

 

So if I have understood what you want properly, then I don't think there is a generic way to map between positions and texture coordinates unless you impose this mapping on the original input models.  Have you considered just storing the texture data along with your planar representation of the positions?



 

I am sorry for replying this late; I did not notice there was a reply. Thanks for that.

 

Your first assumption is correct. The problem is that we perform CSG on these planes, which creates sub-vertices and so on. My idea for recalculating the texture coordinates was to produce a uniform texture matrix per plane that somehow preserves the original intention of the texture map. I guess I have to agree with you that this would be impractical, maybe even impossible.

 

I considered barycentric coordinates (remember which plane had which vertices/UVs and do the math), yet the results were often not spot on: a lot of vertices ended up with slightly offset texture coordinates, which looks unnatural.
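For reference, the barycentric route I tried looks roughly like this (a plain Python sketch; helper names are my own). The clamp-and-renormalize step at the end is one way I could try to tame the small offsets from floating-point drift, though it would not remove the error entirely:

```python
# Sketch: barycentric weights of a point against a triangle in 3D,
# then blend the stored UVs with those weights.

def dot(a, b): return sum(x * y for x, y in zip(a, b))
def sub(a, b): return [x - y for x, y in zip(a, b)]

def barycentric_uv(p, p0, p1, p2, t0, t1, t2):
    v0, v1, v2 = sub(p1, p0), sub(p2, p0), sub(p, p0)
    d00, d01, d11 = dot(v0, v0), dot(v0, v1), dot(v1, v1)
    d20, d21 = dot(v2, v0), dot(v2, v1)
    denom = d00 * d11 - d01 * d01
    b = (d11 * d20 - d01 * d21) / denom
    c = (d00 * d21 - d01 * d20) / denom
    a = 1.0 - b - c
    # Clamp tiny negative weights from floating-point drift,
    # then renormalize so the weights still sum to one.
    a, b, c = (max(0.0, w) for w in (a, b, c))
    s = a + b + c
    a, b, c = a / s, b / s, c / s
    u = a * t0[0] + b * t1[0] + c * t2[0]
    v = a * t0[1] + b * t1[1] + c * t2[1]
    return (u, v)
```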

 

Another way would be planar world mapping, but I wanted to avoid that since it only looks nice on a box; mapping a sphere or cone this way is not pretty.
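For completeness, the planar world mapping I mean is roughly this (a Python sketch with my own names): pick the dominant axis of the plane normal and project the position onto the remaining two axes, scaled by some texel density. It is trivial and lookup-free, which is why it is tempting, but it smears badly on curved surfaces:

```python
# Sketch: axis-aligned planar projection of a world position to UV.

def planar_uv(position, normal, scale=1.0):
    x, y, z = (abs(c) for c in normal)
    if x >= y and x >= z:        # normal mostly along X: use YZ plane
        u, v = position[1], position[2]
    elif y >= z:                 # normal mostly along Y: use XZ plane
        u, v = position[0], position[2]
    else:                        # normal mostly along Z: use XY plane
        u, v = position[0], position[1]
    return (u * scale, v * scale)
```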

 

The previous version I worked on created the planes, looked up the texture coordinates, and interpolated them along with the vertex data. The problem with this is that it requires a lookup table from the original model to the CSG model, which is a weak link I would like to avoid.

 

Do you have any tips/ideas I did not try yet?

 

Thanks again for your clear explanation.
