gbook2

OpenGL reduced colors in GLSL fragment shader



I'm trying to do some operations on a 16-bit texture in a fragment shader, but the resulting colors are all wrong. The data I'm working with is 12-bit, and I've been using a multiplier to bring it up to the range of a 16-bit number. That method has worked fine in regular OpenGL before trying it in the shader, and the colors all look correct (shades of gray). In the shader I use the same multiplier, but the number of resulting colors is reduced: instead of 256 shades of gray I get about 8.

Here's the shader code:

----------------
uniform sampler2D theTexture;

void main()
{
    // Scale the 12-bit values up to the 16-bit range
    float mult = 65535.0 / 4095.0;
    vec3 color = vec3(texture2D(theTexture, gl_TexCoord[0].st)) * mult;
    gl_FragColor = vec4(color, 1.0);
}
-----------------

How are textures represented in shaders? What format does gl_FragColor take? Is it 0 to 1 or some other range, or is the range based on the current texture data type, such as GL_UNSIGNED_SHORT?

-Greg
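For reference, here is a minimal sketch (not from the post itself) of the kind of fixed-function upload path described above, assuming the 12-bit samples are packed into 16-bit unsigned shorts and uploaded as a luminance texture; the function and variable names (uploadTexture, pixels, width, height) are placeholders:

----------------
#include <GL/gl.h>

/* Sketch only: assumes 12-bit data stored in unsigned shorts.
   Requesting GL_LUMINANCE16 asks the driver for a 16-bit internal
   format; with a plain GL_LUMINANCE request the driver may keep only
   8 bits of precision. In the shader, texture2D() returns values
   normalized to the 0.0-1.0 range regardless of the storage type. */
void uploadTexture(const unsigned short *pixels, int width, int height)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE16, width, height, 0,
                 GL_LUMINANCE, GL_UNSIGNED_SHORT, pixels);
}
-----------------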
