Texturing a procedural Icosahedron


Old topic!
Guest, the last post of this topic is over 60 days old and at this point you may not reply in this topic. If you wish to continue this conversation start a new topic.

  • You cannot reply to this topic
1 reply to this topic

#1 technotheist   Members   -  Reputation: 110


Posted 18 October 2012 - 04:38 AM

Hi,
So as part of my development I am trying to generate procedural planets in my game. I really want to use a subdivided icosahedron-based sphere and have had a lot of success.
I'm working on creating procedural textures, and the UV mapping has been simple, since it just uses the same subdivision as the 3D vertices.

I'm just curious about texturing techniques. I am trying to use 10 square textures that are procedurally generated and am having trouble with coordinates. It's essentially a dymaxion map split into 10 sets of two triangles.

Since each square actually represents a diamond made up of two equilateral triangles, I am having trouble converting the coordinates of each pixel in a texture to either lat/long coordinates or 3D coordinates on the sphere surface. I need all the textures to fit together seamlessly, so I need a formula that gets real-world coordinates from texture coordinates.

tl;dr: How do I accurately convert 2D dymaxion world texture coordinates to 3D coordinates or lat/long coordinates?
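(For reference, one way to approach the conversion the poster asks about: each texel inside one triangle of the dymaxion layout can be interpolated across that icosahedron face's three corner vertices, projected onto the sphere, and then converted to lat/long. A minimal sketch, assuming the face's corner vertices on the unit sphere are already known; `face_point_to_latlon` is a hypothetical helper name:)

```python
import math

def face_point_to_latlon(v0, v1, v2, u, v):
    """Map coords (u, v) inside one icosahedron face to lat/long
    on the unit sphere.

    v0, v1, v2: the face's corner vertices on the unit sphere (x, y, z).
    u, v: texel coordinates within the triangle, u >= 0, v >= 0, u + v <= 1.
    """
    w = 1.0 - u - v
    # Barycentric interpolation across the flat triangle face.
    x = w * v0[0] + u * v1[0] + v * v2[0]
    y = w * v0[1] + u * v1[1] + v * v2[1]
    z = w * v0[2] + u * v1[2] + v * v2[2]
    # Project the interpolated point back onto the sphere surface.
    length = math.sqrt(x * x + y * y + z * z)
    x, y, z = x / length, y / length, z / length
    lat = math.degrees(math.asin(z))      # -90 .. 90
    lon = math.degrees(math.atan2(y, x))  # -180 .. 180
    return lat, lon
```

Note that this flat interpolation plus renormalization is not equal-area, so texel density varies slightly across each face; that may or may not matter depending on how seamless the result needs to be.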


#2 JTippetts   Moderators   -  Reputation: 8493


Posted 18 October 2012 - 07:46 AM

Have you looked into procedural solid texturing?

Basically, instead of generating a bunch of 2D textures and trying to project them onto the icosahedron, you would construct a 3-dimensional function, evaluated at a point (x,y,z), that returns texture information for that input point. Then you could rasterize triangles in the UV map, and for each triangle you would interpolate the 3D positions of the vertices and pass the interpolated values into the 3D function. An example:

[Image: An icosphere with a crude UV mapping.]

[Image: A basic F2-F1 cellular 3D function baked to the texture map.]

[Image: The icosphere shown with the UV texture mapped on it.]

This way, instead of having to manage 10 separate textures, you simply "bake" the data directly from one solid texture to a single UV texture. If you create your UV mapping more intelligently than I did for this example, you get less texture waste.
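(The baking step described above can be sketched roughly as follows. This is a toy illustration, not JTippetts' actual code: the cellular function uses a small fixed list of feature points rather than the hashed lattice a real Worley/cellular implementation would use, and `bake_texel` shows only the per-texel sampling, not the full triangle rasterization:)

```python
import math

# Hypothetical feature points for a toy F2-F1 cellular (Worley) function;
# a real implementation would generate feature points per lattice cell.
FEATURES = [(0.3, 0.7, 0.2), (0.9, 0.1, 0.5), (0.4, 0.4, 0.8),
            (0.1, 0.9, 0.9), (0.7, 0.3, 0.1)]

def cellular_f2_minus_f1(x, y, z):
    """Evaluate a toy F2-F1 cellular function at a 3D point: the distance
    to the second-nearest feature minus the distance to the nearest."""
    dists = sorted(math.dist((x, y, z), f) for f in FEATURES)
    return dists[1] - dists[0]

def bake_texel(p0, p1, p2, u, v, solid_fn):
    """Bake one texel of a UV triangle: interpolate the triangle's 3D
    corner positions at (u, v), then sample the solid texture function
    at that interpolated 3D point."""
    w = 1.0 - u - v
    pos = tuple(w * a + u * b + v * c for a, b, c in zip(p0, p1, p2))
    return solid_fn(*pos)
```

Because the function is evaluated in 3D space rather than in the 2D map, texels that are neighbours on the sphere get consistent values even when they land in different UV islands, which is what makes the seams disappear.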



