Using bytes for normals

Started by Freeworld3D · 2 comments, last by Digitalfragment 17 years, 4 months ago
I'm trying to store the normals for a mesh as bytes, ranging from 0 to 255. In the vertex shader I then take the incoming normal, which I assume has been scaled to the 0 to 1 range, and do the following:

vec3 normal = gl_Normal.xyz * 2.0 - 1.0;

Now the normal ranges from -1 to 1. I pass this normal on to the pixel shader and color the pixel with vec4(normal.xyz, 1.0), but every pixel comes out blue. When I create and initialize the normals for the mesh, I set each triplet to 128, 255, 128, so I 'should' only be seeing green, yet instead I get blue.

Does anybody know if my assumption is correct that the normal has been scaled to 0 to 1 by the time it reaches the vertex shader? Is what I am trying to do possible?
Author: Freeworld3D (http://www.freeworld3d.org)
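For concreteness, here is a minimal sketch (not the poster's actual code) of the shader pair being described, assuming the byte normals really do reach gl_Normal scaled to the 0 to 1 range; the varying name v_normal is invented for this example:

    // Vertex shader: expects gl_Normal in [0, 1] and remaps it to [-1, 1].
    varying vec3 v_normal;
    void main()
    {
        v_normal    = gl_Normal.xyz * 2.0 - 1.0;
        gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    }

    // Fragment shader: visualize the normal as a color. Negative components
    // clamp to 0 in the framebuffer, so a triplet of (128, 255, 128) should
    // come out as roughly (0, 1, 0), i.e. pure green.
    varying vec3 v_normal;
    void main()
    {
        gl_FragColor = vec4(v_normal, 1.0);
    }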
Content removed. Fair enough (see the reply below), although I don't see any mention of the range of the char in your formula.
I already know how to pack floats into bytes. What I want to do is send the normals to the graphics card as bytes and then read them back as normals in the vertex shader.
Author: Freeworld3D (http://www.freeworld3d.org)
You can upload your normal stream as a signed-byte array (-128 to 127) flagged as normalized, in which case the GPU will automatically convert it to the -1 to 1 range before your vertex shader sees it.
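To illustrate the point above: with the client-side array set up as normalized signed bytes (for the fixed-function gl_Normal path this means glNormalPointer(GL_BYTE, ...), which only accepts signed types; for a generic attribute, glVertexAttribPointer(..., GL_BYTE, GL_TRUE, ...)), the values already arrive in the shader in roughly the -1 to 1 range, so the * 2.0 - 1.0 remap from the first post is no longer needed. A minimal vertex-shader sketch under that assumption:

    // Vertex shader when the normal stream is uploaded as normalized signed
    // bytes: gl_Normal already arrives near [-1, 1], so no remap is required.
    varying vec3 v_normal;
    void main()
    {
        v_normal    = normalize(gl_Normal);   // renormalize to absorb quantization error
        gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    }

Storing, say, (0, 127, 0) in the signed-byte array would then show up as roughly (0, 1, 0) after the automatic conversion.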

