using bytes for normals

I'm trying to store the normals for a mesh as bytes, ranging from 0 to 255. In the vertex shader I take the incoming normal, which I assume has been scaled to the 0-1 range, and do the following:

vec3 normal = gl_Normal.xyz * 2.0 - 1.0;

Now the normal should range from -1 to 1. I pass it on to the pixel shader and output vec4(normal.xyz, 1.0) as the color, but every pixel comes out blue. When I create and initialize the normals for the mesh, I set each triplet to 128, 255, 128, so I 'should' only be seeing green, but instead I get blue.

Does anybody know whether my assumption that the normal has been scaled to 0-1 by the time it reaches the vertex shader is correct? Is what I am trying to do possible?
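For reference, the arithmetic the post describes can be checked outside the shader. This is a small Python sketch, assuming the driver exposes each unsigned byte to the shader as byte/255 (the poster's assumption, which is what the question is really about):

```python
def decode_normal(byte_triplet):
    """Mimic the assumed pipeline: byte -> byte/255 -> * 2 - 1."""
    return tuple(b / 255.0 * 2.0 - 1.0 for b in byte_triplet)

n = decode_normal((128, 255, 128))
# n is approximately (0.004, 1.0, 0.004), i.e. green when used as a color
```

So under that assumption the 128, 255, 128 triplet should indeed decode to roughly (0, 1, 0) and render green; the blue output suggests the bytes are not reaching the shader the way the poster expects.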

I already know how to pack floats into bytes. But what I want to do is send the normals to the graphics card as bytes and then read them back as normals in the vertex shader.
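As a sanity check on the byte format itself (this sketch does not cover the GL upload side), the full encode/decode round trip for a unit normal under the 0-255 convention described above can be written as:

```python
def encode_normal(n):
    """Map each component from [-1, 1] to an unsigned byte in [0, 255]."""
    return bytes(round((c + 1.0) / 2.0 * 255.0) for c in n)

def decode_normal(raw):
    """Reverse mapping, as the vertex shader would do it: b/255 * 2 - 1."""
    return tuple(b / 255.0 * 2.0 - 1.0 for b in raw)

buf = encode_normal((0.0, 1.0, 0.0))  # the bytes 128, 255, 128 from the post
roundtrip = decode_normal(buf)        # close to (0, 1, 0), with quantization error
```

Note the round trip is not exact: with only 256 levels per component, 0.0 encodes to 128 and decodes to about 0.004, which is the usual precision trade-off of byte-packed normals.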

