I'm not completely clear about your intended use case, but I'll give it a shot anyway. So you have a process that converts a 3D mesh (a list of vertices plus triangle connectivity information) into a new representation consisting of a list of planes, plus the locations within those planes that define the vertex positions - is that correct?
If you want a single transformation that produces the texture coordinates from the positions, it would impose some strict limitations that I think make this impractical. You would need a uniform mapping from 3D mesh surface area to texture space, which is typically not the case. In addition, the original texture coordinates would have to respect the mapping to and from position information, which is also not normally the case, since texture information is usually just packed in as efficiently as possible.
So if I have understood you correctly, I don't think there is a generic way to map between positions and texture coordinates unless you impose such a mapping on the original input models. Have you considered just storing the texture data along with your planar representation of the positions?
Sorry for replying this late - I did not notice there was a response. Thanks for that.
Your first assumption is correct. The problem is that we are performing CSG on these planes, which creates new sub-vertices and so on. My idea for recalculating the texture coordinates was to produce a uniform texture matrix per plane that somehow preserves the original intent of the texture map. I guess I have to agree with you that it would be impractical, maybe even impossible.
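For what it's worth, the per-plane matrix idea does work in the special case where every original triangle on a plane shares the same affine position-to-UV map (a flat face covered by one unwrapped chart). Here is a rough sketch of that, in Python with all names my own; the map is derived from a single original triangle on the plane and can then be applied to any new CSG vertex lying on that plane:

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])
def normalize(a):
    l = math.sqrt(dot(a, a))
    return tuple(x / l for x in a)

def texture_matrix(tri, uvs):
    """Build a per-plane affine map from one original triangle:
    world point on the plane -> (u, v).  Assumes all triangles on
    the plane share this map (one chart per flat face)."""
    a, b, c = tri
    n = normalize(cross(sub(b, a), sub(c, a)))
    e1 = normalize(sub(b, a))   # in-plane basis axis 1
    e2 = cross(n, e1)           # in-plane basis axis 2
    def to2d(p):
        d = sub(p, a)
        return (dot(d, e1), dot(d, e2))
    B, C = to2d(b), to2d(c)     # vertex a sits at the 2D origin
    det = B[0]*C[1] - B[1]*C[0]
    # inverse of [[Bx, Cx], [By, Cy]]: 2D point -> edge weights (s, t)
    inv = ((C[1]/det, -C[0]/det), (-B[1]/det, B[0]/det))
    du = sub(uvs[1], uvs[0])
    dv = sub(uvs[2], uvs[0])
    def uv_of(p):
        x, y = to2d(p)
        s = inv[0][0]*x + inv[0][1]*y
        t = inv[1][0]*x + inv[1][1]*y
        return (uvs[0][0] + s*du[0] + t*dv[0],
                uvs[0][1] + s*du[1] + t*dv[1])
    return uv_of
```

The map is exact for any point on the plane (including extrapolation outside the original triangle), which is exactly what new CSG sub-vertices need; it only breaks down when the original face had a seam, i.e. two triangles on the same plane with different UV charts.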
I considered barycentric coordinates (remember which plane had which vertices/UVs, and do the math), but the results were often _not spot on_: many vertices ended up with slightly offset texture coordinates, which looks unnatural.
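In case it helps to compare against your version, this is the standard barycentric formulation I would use (Python, names are mine). Offsets like the ones you describe often come from evaluating against a sliver triangle or from matching a new vertex to the wrong source triangle, rather than from the math itself:

```python
def barycentric_uv(p, tri, uvs):
    """Interpolate (u, v) for point p from a source triangle's
    3D positions and UVs, using barycentric coordinates."""
    a, b, c = tri
    def sub(x, y): return tuple(i - j for i, j in zip(x, y))
    def dot(x, y): return sum(i * j for i, j in zip(x, y))
    v0, v1, v2 = sub(b, a), sub(c, a), sub(p, a)
    d00, d01, d11 = dot(v0, v0), dot(v0, v1), dot(v1, v1)
    d20, d21 = dot(v2, v0), dot(v2, v1)
    denom = d00 * d11 - d01 * d01     # ~0 for degenerate (sliver) triangles
    t = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    s = 1.0 - t - w
    return (s * uvs[0][0] + t * uvs[1][0] + w * uvs[2][0],
            s * uvs[0][1] + t * uvs[1][1] + w * uvs[2][1])
```

One thing worth checking: if you snap the new CSG vertices to the plane before computing barycentrics, the source triangle's vertices should be snapped to the same plane too, otherwise each lookup carries a small consistent bias.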
Another option was planar world mapping, but I wanted to avoid that since it only gives nice results on a box. Mapping a sphere or cone this way is not pretty.
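For reference, this is the kind of planar world mapping I mean - just projecting the world position along the dominant axis of the face normal (a hypothetical sketch; the stretching on faces at steep angles is exactly why spheres and cones look bad with it):

```python
def planar_uv(p, n, scale=1.0):
    """Box-style planar mapping: drop the dominant component of the
    face normal and use the remaining two world axes as (u, v)."""
    ax = max(range(3), key=lambda i: abs(n[i]))   # dominant normal axis
    u_axis, v_axis = [(1, 2), (0, 2), (0, 1)][ax]
    return (p[u_axis] * scale, p[v_axis] * scale)
```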
The previous version I was working on involved creating the planes, looking up the texture coordinates, and interpolating them along with the vertex data. The problem I have with this is that it requires a lookup table from the original model to the CSG model. That is a weak link I'd like to avoid.
Do you have any tips/ideas I did not try yet?
Thanks again for your clear explanation.