DaMuzza

OpenGL signed LUMINANCE8_ALPHA8



Under Direct3D I store my normal maps in the format V8U8. This is a signed format, so in the pixel shader I can do:

    normal.xy = texture.RG;
    normal.z  = (computed...)

Under OpenGL, Nvidia recommend using the format GL_LUMINANCE8_ALPHA8. However, I'm having some trouble getting correct results from it. I *think* my problem lies with the texture generation rather than the pixel shader. For reference, the only difference in the pixel shader is that instead of

    normal.xy = texture.RG;

it becomes

    normal.xy = texture.GA;

My results are 'close' to correct, and at first glance appeared to be fine, but on closer examination the normal values are not right. I create my texture data like this:

    glTexImage2D( GL_TEXTURE_2D,
                  miplevel,
                  GL_LUMINANCE8_ALPHA8,
                  Width, Height,
                  0,                    // border
                  GL_LUMINANCE_ALPHA,
                  GL_BYTE,              // signed data
                  data );

There are two things I'm not sure about. Firstly, I cannot find anywhere whether GL_LUMINANCE8_ALPHA8 is a signed format. Will my pixel shader (in Cg) read the texture as -1 to 1, or as 0 to 1? From observation it appears to be signed, but it would be nice to find documentation to support that.

Secondly, before calling glTexImage2D, I suspect I may need to use glPixelTransfer to tell OpenGL exactly how I want the texture data formatted. I've experimented with various values that I thought made sense, but with no luck.

Can anyone help? Alternatively, if anyone knows of a demo that generates a texture in the GL_LUMINANCE8_ALPHA8 format, that could be useful too.

Thanks
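For reference, the glPixelTransfer route mentioned above would look roughly like the sketch below. It assumes the scale/bias pair is applied after the signed bytes are converted to floats and before the result is clamped into the unsigned GL_LUMINANCE8_ALPHA8 store, so the signed range ends up packed into 0..1 and the shader expands it back afterwards; the buffer name normalData is made up, the other names follow the post.

    /* Sketch only: pack signed byte data into the unsigned 0..1 range at
     * upload time via pixel-transfer scale/bias.  'normalData' is a
     * hypothetical pointer to the interleaved luminance/alpha byte pairs. */

    /* luminance travels through the red component of the transfer pipeline */
    glPixelTransferf(GL_RED_SCALE,   0.5f);
    glPixelTransferf(GL_RED_BIAS,    0.5f);
    glPixelTransferf(GL_ALPHA_SCALE, 0.5f);
    glPixelTransferf(GL_ALPHA_BIAS,  0.5f);

    glTexImage2D(GL_TEXTURE_2D, miplevel, GL_LUMINANCE8_ALPHA8,
                 Width, Height, 0,
                 GL_LUMINANCE_ALPHA, GL_BYTE, normalData);

    /* restore defaults so later uploads are left alone */
    glPixelTransferf(GL_RED_SCALE,   1.0f);
    glPixelTransferf(GL_RED_BIAS,    0.0f);
    glPixelTransferf(GL_ALPHA_SCALE, 1.0f);
    glPixelTransferf(GL_ALPHA_BIAS,  0.0f);

    /* matching Cg read: normal.xy = tex2D(normalMap, uv).ga * 2.0 - 1.0; */

Restoring the defaults matters because the scale/bias state is global and would otherwise leak into every subsequent glTexImage2D or glDrawPixels call.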

To add to this, I've discovered that glTexImage2D is discarding the sign bit on all the data that I pass in.
If I use GL_SIGNED_LUMINANCE8_ALPHA8_NV, rather than GL_LUMINANCE8_ALPHA8, then everything works fine.
I can find no equivalent of GL_SIGNED_LUMINANCE8_ALPHA8_NV for ATI cards, though.

Any ideas?

Thanks
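For completeness, a vendor-neutral fallback (a sketch under the assumption that only the plain unsigned GL_LUMINANCE8_ALPHA8 format is available, not something taken from the thread) is to bias the signed bytes into unsigned bytes on the CPU, upload them with GL_UNSIGNED_BYTE, and undo the bias in the shader; src is a hypothetical source pointer, the other names follow the original post.

    #include <stdlib.h>
    #include <GL/gl.h>

    /* Sketch: portable path when GL_SIGNED_LUMINANCE8_ALPHA8_NV is not
     * available.  Remap signed bytes (-128..127) to unsigned (0..255) and
     * upload as an ordinary unsigned texture.  'src' is hypothetical.     */
    void upload_biased_la8(const signed char *src,
                           int Width, int Height, int miplevel)
    {
        int count = Width * Height * 2;        /* two bytes per texel: L, A */
        unsigned char *biased = malloc((size_t)count);

        for (int i = 0; i < count; ++i)
            biased[i] = (unsigned char)(src[i] + 128);  /* -128..127 -> 0..255 */

        glTexImage2D(GL_TEXTURE_2D, miplevel, GL_LUMINANCE8_ALPHA8,
                     Width, Height, 0,
                     GL_LUMINANCE_ALPHA, GL_UNSIGNED_BYTE, biased);
        free(biased);
    }

    /* Cg side: normal.xy = tex2D(normalMap, uv).ga * 2.0 - 1.0; */

The cost is one extra multiply-add per fragment plus a small loss of precision from the remap, which is normally invisible in a normal map.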

