Per-Vertex Color to Texture


I have a .obj mesh which has per-vertex color information stored in it. There is no texture associated with this mesh.

Does someone know of a method/source code/algorithm which can help me generate a texture image from per-vertex colors?


Assuming the mesh has unique texture coordinates (i.e. every triangle is unwrapped to a non-overlapping part of the UV-space):
*Make a vertex shader that outputs the vertex's texture coordinate as the position. This will draw the mesh in UV-space. Make sure backface culling is disabled.

*Make a pixel shader that outputs the (interpolated) vertex colour as the output colour.

*Create a render-target with the desired resolution, draw the mesh to this render-target using these shaders.
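
A minimal HLSL sketch of those two shaders, assuming D3D-style semantics (struct, function and semantic names here are only illustrative):

```hlsl
// Vertex shader: place each vertex at its texture coordinate instead of its
// 3D position, so the mesh is rasterized directly into UV space.
struct VSIn
{
    float2 uv    : TEXCOORD0; // unique, non-overlapping unwrap
    float4 color : COLOR0;    // per-vertex color from the .obj
};

struct VSOut
{
    float4 pos   : SV_Position;
    float4 color : COLOR0;
};

VSOut BakeVS(VSIn i)
{
    VSOut o;
    // Remap UV [0,1] to clip space [-1,1]; V is flipped here so row 0 of the
    // render target corresponds to v = 1 (adjust for your API's convention).
    o.pos   = float4(i.uv.x * 2.0 - 1.0, 1.0 - i.uv.y * 2.0, 0.0, 1.0);
    o.color = i.color;
    return o;
}

// Pixel shader: write the interpolated vertex color straight to the target.
float4 BakePS(VSOut i) : SV_Target
{
    return i.color;
}
```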

and don't forget:

*Store the contents of the render target to file for use later on.

Like Hodgeman said, this requires unique and continuous texture coordinates for all vertices (i.e. no vertices can be at the same location with different colors and/or texture coordinates!) so it may or may not be correct to do this...

Thanks Hodgeman, Jason. However, the problem is that the mesh doesn't have texture coordinates.

How do you want to use that texture, then? If you need it for drawing, you need texture coordinates anyway. Maybe give a bit more info on the use case?

He possibly just misunderstood how the rendering works and thinks you can only display something that has a texture, when he could just as easily put position + color in each vertex and write that same color out in the shader, instead of having position + texture coordinate in the vertex and reading the color from a texture before writing it out in the shader.
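
For comparison, a minimal HLSL sketch of such a pass-through shader, with no texture fetch at all (the constant buffer and names are just placeholders):

```hlsl
cbuffer PerObject
{
    // Assumed transform supplied by the application; adjust the mul order
    // to match whichever row/column convention you upload the matrix in.
    float4x4 worldViewProj;
};

struct VSIn
{
    float3 pos   : POSITION;
    float4 color : COLOR0;
};

struct VSOut
{
    float4 pos   : SV_Position;
    float4 color : COLOR0;
};

VSOut ColorVS(VSIn i)
{
    VSOut o;
    o.pos   = mul(float4(i.pos, 1.0), worldViewProj);
    o.color = i.color; // carry the vertex color through
    return o;
}

float4 ColorPS(VSOut i) : SV_Target
{
    return i.color; // interpolated vertex color, no texture needed
}
```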

As for generating texture coordinates: if the mesh isn't too complicated, he could try approximating it with how a simple object like a box, cylinder or sphere would be handled, project the points of the mesh onto that shape and use the corresponding coordinates from it. Then he could interpolate the colors between those points to generate a texture, but that is just wasted effort that unnecessarily blows up the size of the data for the small convenience of not having to switch to another shader.
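
As an illustration of the sphere case, the projection is just a conversion to spherical angles. This would normally be done offline when the texture coordinates are generated; it is shown here as a small HLSL-style function only to make the math concrete (positions are assumed to be recentered so the mesh's bounding-sphere center sits at the origin):

```hlsl
static const float PI = 3.14159265;

// Spherical projection: map a point around the mesh's center to a UV in [0,1]^2.
float2 SphericalUV(float3 p)
{
    float3 d = normalize(p);
    float u = atan2(d.z, d.x) / (2.0 * PI) + 0.5; // longitude
    float v = asin(d.y) / PI + 0.5;               // latitude
    return float2(u, v);
}
```

The wrap line of the atan2 and the two poles produce UV discontinuities, which is part of why this is more trouble than it is worth here.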

I understand that a mesh can be rendered based on vertex colors alone; however, we were having problems loading a .obj file with per-vertex colors (and no texture map) into the Unity game engine. There were a couple of other reasons we needed to bake the vertex colors into a texture programmatically.

You can use a program called xNormal. It is free, but you will need to unwrap your final game-topology model into UV space. Then you use a high-poly mesh with the vertex colors, a low-poly target mesh with the UVs, and a third mesh ballooned outwards along all the surface normals, called a cage, which sets the boundaries for the baking process. You can read more about it elsewhere.

It can generate all kinds of maps besides vertex colors as well, such as normal maps, displacement, ambient occlusion, etc.

