Fundamental procedural texture generation problem

Started by
2 comments, last by pernyblom 14 years, 11 months ago
I am writing a procedural texture/sprite generator that should output a color, a normal, and a height for each pixel, and the plan is to use OpenGL to accelerate the rendering. I am currently using a triangle mesh to represent the texture, with one vertex per pixel, because I thought it would be easy to transform parts of the mesh with all attributes.

This approach seems to work well if you stick to the same mesh all the time, but I want to be able to project other, possibly deformed, "submeshes" on top of a regularly spaced mesh and update the attributes accordingly. For example, first generate a stone texture with normals and all, and then map it onto a half cylinder while keeping all the attributes. As far as I can tell, this boils down to rendering the triangles of the submesh onto the regular mesh/grid, which is very similar to the standard rasterization of triangles done by OpenGL, but with an important difference: I cannot find a way to extract the interpolated normals from the OpenGL rendering. (Yes, I could use the height data, but I want to use the original normals.)

So, I am just wondering if my thinking is approximately correct/sane, and whether I have one of the following alternatives:

* Use another texture generation approach
* Program a shader that retains the interpolated normals
* Program/use a software triangle renderer

I have looked at Genetica and MaPZone as well, but I would like to be able to generate textures within the game engine (currently written in Java). Any help and other suggestions would be greatly appreciated!
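The software-triangle-renderer alternative can be sketched roughly like this: rasterize each submesh triangle onto the regular pixel grid and interpolate the original per-vertex normals with barycentric weights. This is purely illustrative Java, not from any engine; all names (Vertex, rasterize, ...) are made up for the example.

```java
// Sketch of a software rasterizer that carries per-vertex normals onto a grid.
public class NormalRasterizer {
    // x, y in grid (pixel) space; nx, ny, nz is the attribute to interpolate.
    static class Vertex {
        double x, y, nx, ny, nz;
        Vertex(double x, double y, double nx, double ny, double nz) {
            this.x = x; this.y = y; this.nx = nx; this.ny = ny; this.nz = nz;
        }
    }

    // Signed edge function; its ratio to the triangle area gives a barycentric weight.
    static double edge(Vertex a, Vertex b, double px, double py) {
        return (b.x - a.x) * (py - a.y) - (b.y - a.y) * (px - a.x);
    }

    // Writes interpolated, renormalized normals into normals[y][x][0..2].
    static void rasterize(Vertex a, Vertex b, Vertex c, double[][][] normals) {
        int h = normals.length, w = normals[0].length;
        double area = edge(a, b, c.x, c.y);
        if (area == 0) return; // degenerate triangle
        for (int y = 0; y < h; y++) {
            for (int x = 0; x < w; x++) {
                double px = x + 0.5, py = y + 0.5; // sample at the pixel center
                double w0 = edge(b, c, px, py) / area;
                double w1 = edge(c, a, px, py) / area;
                double w2 = edge(a, b, px, py) / area;
                if (w0 < 0 || w1 < 0 || w2 < 0) continue; // outside the triangle
                double nx = w0 * a.nx + w1 * b.nx + w2 * c.nx;
                double ny = w0 * a.ny + w1 * b.ny + w2 * c.ny;
                double nz = w0 * a.nz + w1 * b.nz + w2 * c.nz;
                double len = Math.sqrt(nx * nx + ny * ny + nz * nz);
                normals[y][x][0] = nx / len; // renormalize after interpolation
                normals[y][x][1] = ny / len;
                normals[y][x][2] = nz / len;
            }
        }
    }

    public static void main(String[] args) {
        double[][][] normals = new double[8][8][3];
        // One triangle covering the lower-left half of the grid, with normals
        // tilting from +Z at (0,0) toward +X at (8,0).
        rasterize(new Vertex(0, 0, 0, 0, 1),
                  new Vertex(8, 0, 1, 0, 0),
                  new Vertex(0, 8, 0, 0, 1),
                  normals);
        System.out.printf("%.3f%n", normals[0][0][2]);
    }
}
```

The same loop could also carry color and height; a real version would additionally need clipping and a fill rule for shared edges.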
Does anyone know how to generate a normal map from a highly detailed model with the help of OpenGL?

I know that there are tools that can do this (such as a Blender plugin), but I want to bake it into the game engine with as few hardware requirements as possible.
You can just emit normals in your fragment shader into a framebuffer.

Vertex shader:
varying vec3 normal;

void main()
{
    gl_Position = ftransform();
    normal = gl_NormalMatrix * gl_Normal;
}


Fragment shader:
varying vec3 normal;

void main()
{
    // convert from [-1, 1] to [0, 1]
    gl_FragColor = vec4(normalize(normal) * 0.5 + 0.5, 1.0);
}
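On the engine side (Java, per the original post), reading the baked texels back means undoing that packing. A minimal sketch, assuming 8-bit RGB channels; the class and method names are illustrative only:

```java
// Sketch of decoding a baked normal texel back into [-1, 1] on the CPU side.
public class NormalDecode {
    // r, g, b are 8-bit channel values (0..255) packed as n * 0.5 + 0.5.
    static double[] decode(int r, int g, int b) {
        double nx = r / 255.0 * 2.0 - 1.0;
        double ny = g / 255.0 * 2.0 - 1.0;
        double nz = b / 255.0 * 2.0 - 1.0;
        // Renormalize to undo the 8-bit quantization error.
        double len = Math.sqrt(nx * nx + ny * ny + nz * nz);
        return new double[] { nx / len, ny / len, nz / len };
    }

    public static void main(String[] args) {
        // A flat "up" normal (0, 0, 1) bakes to roughly (128, 128, 255).
        double[] n = decode(128, 128, 255);
        System.out.printf("%.2f %.2f %.2f%n", n[0], n[1], n[2]);
    }
}
```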


You can also use draw buffers to render color, normal, and height at the same time to three render targets (color attachments of a single framebuffer object). But if it's too complicated, don't bother. You can just render three times with different shaders.
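The single-pass draw-buffers version of the fragment shader looks roughly like this (attach three textures to one FBO, select them with glDrawBuffers, and write to gl_FragData[i]; the `height` varying is an assumption about how height would be passed from the vertex shader):

```glsl
varying vec3 normal;
varying float height; // assumed to be written by the vertex shader

void main()
{
    gl_FragData[0] = vec4(1.0);                                // color target
    gl_FragData[1] = vec4(normalize(normal) * 0.5 + 0.5, 1.0); // normal target
    gl_FragData[2] = vec4(height, height, height, 1.0);        // height target
}
```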
Thank you very much!

I guess it is time for me to learn that magical shading language :) I've been avoiding it too long.

This topic is closed to new replies.
