pernyblom

OpenGL Fundamental procedural texture generation problem

I am writing a procedural texture/sprite generator that should output a color, normal, and height for each pixel, and I plan to use OpenGL to accelerate the rendering. I currently represent the texture as a triangle mesh with one vertex per pixel, because I thought that would make it easy to transform parts of the mesh along with all their attributes. This approach works well as long as you stick to the same mesh, but I want to be able to project other, possibly deformed, "submeshes" on top of a regular-spaced mesh and update the attributes accordingly. For example: first generate a stone texture, normals and all, and then map it onto a half cylinder while keeping all the attributes.

As far as I can tell, this boils down to rendering the triangles of the submesh onto the regular mesh/grid, which is very similar to the standard triangle rasterization OpenGL performs, but with one important difference: I cannot find a way to extract the interpolated normals from the OpenGL rendering. (Yes, I could derive them from the height data, but I want to use the original normals.)

So I am wondering whether my thinking is approximately correct/sane, and whether I have one of the following alternatives:

  • Use another texture generation approach
  • Write a shader that retains the interpolated normals
  • Write or use a software triangle renderer

I have looked at Genetica and MaPZone as well, but I would like to be able to generate textures within the game engine (currently written in Java). Any help and other suggestions would be greatly appreciated!
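Projecting a deformed submesh onto a regular pixel grid while carrying the normals along is essentially software rasterization with attribute interpolation: for each covered pixel, compute the barycentric coordinates inside the triangle and blend the three vertex normals with those weights. A minimal sketch of that core step in Java (the class and method names are hypothetical, not from any existing engine):

```java
// Sketch: barycentric interpolation of per-vertex normals, the core step
// of rasterizing a submesh triangle onto a regular pixel grid.
// All names here are hypothetical illustrations, not engine code.
public class NormalInterpolation {

    // Barycentric weights of point (px, py) with respect to
    // triangle A=(ax,ay), B=(bx,by), C=(cx,cy).
    static double[] barycentric(double px, double py,
                                double ax, double ay,
                                double bx, double by,
                                double cx, double cy) {
        double det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy);
        double wa = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det;
        double wb = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det;
        return new double[] { wa, wb, 1.0 - wa - wb };
    }

    // Blend the three vertex normals with the weights, then renormalize,
    // exactly as fixed-function interpolation would (before normalization).
    static double[] interpolateNormal(double[] w,
                                      double[] na, double[] nb, double[] nc) {
        double[] n = new double[3];
        for (int i = 0; i < 3; i++) {
            n[i] = w[0] * na[i] + w[1] * nb[i] + w[2] * nc[i];
        }
        double len = Math.sqrt(n[0] * n[0] + n[1] * n[1] + n[2] * n[2]);
        for (int i = 0; i < 3; i++) n[i] /= len;
        return n;
    }

    public static void main(String[] args) {
        // A pixel at (0.25, 0.25) inside the triangle (0,0)-(1,0)-(0,1).
        double[] w = barycentric(0.25, 0.25, 0, 0, 1, 0, 0, 1);
        double[] n = interpolateNormal(w,
                new double[] { 0, 0, 1 },   // normal at vertex A
                new double[] { 1, 0, 0 },   // normal at vertex B
                new double[] { 0, 1, 0 });  // normal at vertex C
        System.out.printf("w = (%.2f, %.2f, %.2f)%n", w[0], w[1], w[2]);
        System.out.printf("n = (%.3f, %.3f, %.3f)%n", n[0], n[1], n[2]);
    }
}
```

The same weights can interpolate color and height, so one inner loop handles all attributes of the submesh.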

Does anyone know how to generate a normal map from a highly detailed model with help from OpenGL?

I know there are tools that can do this (such as a Blender plugin), but I want to bake it into the game engine with the lowest hardware requirements possible.

You can just emit the normals from your fragment shader into a framebuffer.

Vertex shader:


varying vec3 normal;

void main()
{
    gl_Position = ftransform();
    normal = gl_NormalMatrix * gl_Normal;
}
Fragment shader:

varying vec3 normal;

void main()
{
    // map the normal from [-1, 1] to [0, 1] so it can be stored in a color buffer
    gl_FragColor = vec4(normalize(normal) * 0.5 + 0.5, 1.0);
}
You can also use draw buffers (multiple render targets) to write color, normal, and height in a single pass to three color attachments. But if that's too complicated, don't bother: you can just render three times with different shaders.
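If you do go the draw-buffers route, the fragment shader writes to gl_FragData instead of gl_FragColor, one element per color attachment. A minimal sketch (GLSL 1.20; the color and height varyings, and how height is packed, are assumptions for illustration):

```glsl
varying vec4 color;   // assumed passed from the vertex shader
varying vec3 normal;
varying float height; // assumed passed from the vertex shader

void main()
{
    gl_FragData[0] = color;                                    // attachment 0: color
    gl_FragData[1] = vec4(normalize(normal) * 0.5 + 0.5, 1.0); // attachment 1: packed normal
    gl_FragData[2] = vec4(height);                             // attachment 2: height in all channels
}
```

On the application side you would bind a framebuffer object with three color attachments and select them with glDrawBuffers before rendering.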

Thank you very much!

I guess it is time for me to learn that magical shading language :) I've been avoiding it too long.
