As part of my development I am trying to generate procedural planets for my game. I want to use a subdivided icosahedron-based sphere, and so far that has worked well.
I'm now working on procedural textures. The UV mapping itself has been simple, since it just uses the same subdivision as the 3D vertices.
I'm just curious about texturing techniques. I am trying to use 10 procedurally generated square textures and am having trouble with the coordinates: essentially a Dymaxion map split into 10 sets of two triangles.
Since each square actually represents a diamond made up of two equilateral triangles, I am having trouble converting the coordinates of each texture pixel to either lat/long coordinates or 3D coordinates on the sphere's surface. All the textures need to fit together seamlessly, so I need a formula that gives real-world coordinates from texture coordinates.
tl;dr: How do I accurately convert 2D Dymaxion world texture coordinates to 3D coordinates or lat/long coordinates?
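To make the setup concrete, here is a sketch of the kind of mapping I mean. The diamond corner layout (triangle 1 = a, b, c; triangle 2 = d, c, b, sharing edge b–c) is an assumption for illustration, not something fixed by my net:

```python
import numpy as np

def diamond_uv_to_sphere(u, v, a, b, c, d):
    """Map (u, v) in [0, 1]^2 on one diamond of the net to a
    point on the unit sphere.  a, b, c, d are the diamond's
    corner vertices as unit 3D vectors; triangle 1 is (a, b, c),
    triangle 2 is (d, c, b), sharing the edge b-c.
    (Hypothetical corner layout, for illustration only.)"""
    if u + v <= 1.0:
        # first triangle: plain barycentric interpolation
        p = a + u * (b - a) + v * (c - a)
    else:
        # mirror the square coordinates into the second triangle
        p = d + (1.0 - u) * (c - d) + (1.0 - v) * (b - d)
    # project the flat interpolated point back onto the sphere
    return p / np.linalg.norm(p)
```

Shared edges interpolate identically from both sides, so adjacent diamonds meet seamlessly, but this projection is not area-true (texels stretch toward triangle centres). Lat/long then follows from the 3D point as `lat = asin(z)`, `lon = atan2(y, x)`.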
Basically, instead of generating a bunch of 2D textures and trying to project them onto the icosahedron, you construct a 3-dimensional function, evaluated at a point (x, y, z), that returns texture information for that point. Then you rasterize the triangles in the UV map, and for each triangle you interpolate the 3D positions of its vertices and pass the interpolated values into the 3D function. An example:
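A minimal sketch of that baking loop, assuming a placeholder `solid_value` function standing in for whatever 3D texture function you use:

```python
import numpy as np

def solid_value(p):
    # placeholder 3D "solid texture"; any f(x, y, z) works here
    x, y, z = p
    return 0.5 + 0.5 * np.sin(7 * x) * np.sin(7 * y) * np.sin(7 * z)

def bake_triangle(img, uv, pos):
    """Rasterize one UV triangle into img, sampling the 3D
    function at interpolated sphere positions.
    uv:  three (u, v) pixel coordinates.
    pos: 3x3 array of the matching 3D vertex positions."""
    (u0, v0), (u1, v1), (u2, v2) = uv
    xmin, xmax = int(min(u0, u1, u2)), int(max(u0, u1, u2))
    ymin, ymax = int(min(v0, v1, v2)), int(max(v0, v1, v2))
    det = (v1 - v2) * (u0 - u2) + (u2 - u1) * (v0 - v2)
    for y in range(ymin, ymax + 1):
        for x in range(xmin, xmax + 1):
            # barycentric coordinates of this pixel in UV space
            w0 = ((v1 - v2) * (x - u2) + (u2 - u1) * (y - v2)) / det
            w1 = ((v2 - v0) * (x - u2) + (u0 - u2) * (y - v2)) / det
            w2 = 1.0 - w0 - w1
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                # interpolate 3D position, snap it to the sphere,
                # then sample the solid texture there
                p = w0 * pos[0] + w1 * pos[1] + w2 * pos[2]
                p = p / np.linalg.norm(p)
                img[y, x] = solid_value(p)
```

Because every pixel samples the same continuous 3D function, triangles that share an edge in 3D get matching texels regardless of where they sit in the UV layout, which is what makes the result seam-free.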
An icosphere with a crude UV mapping.
A basic F2-F1 cellular 3D function baked to the texture map.
The icosphere shown with the UV texture mapped on it.
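The F2−F1 cellular value used above is just the difference between the distances to the two nearest feature points. A minimal 3D version, using a brute-force global point list rather than the usual per-grid-cell hashing, could look like:

```python
import numpy as np

def cellular_f2_f1(p, feature_points):
    """F2 - F1 cellular (Worley) noise at 3D point p.
    feature_points is an (N, 3) array of scattered points; a
    production version would hash feature points per grid cell
    instead of scanning one global list."""
    d = np.sort(np.linalg.norm(feature_points - p, axis=1))
    return d[1] - d[0]  # second-nearest distance minus nearest
```

Feeding this the normalized sphere positions from the baking loop gives the crack-like cell boundaries seen in the texture, with no seams at triangle edges.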
This way, instead of having to manage 10 separate textures, you simply "bake" the data directly from one solid 3D texture into a single UV texture. If you create your UV mapping more intelligently than I did for this example, you will waste less texture space.