GLSL problem using the alpha channel (RGBA)

1 comment, last by hayden 15 years, 3 months ago
Hi again :P In my application using OpenGL, GLSL and FBOs, in my first render-to-texture pass I'm trying to put the depth information in the alpha channel. I allocate the texture as 16-bit RGBA:

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB16F_ARB, width, height, 0, GL_RGBA, GL_FLOAT, NULL);

But in my next shader, when I sample that first rendered texture, the alpha channel always gives me the value 1.0. I'm kind of a noob at this and don't know what I could possibly be doing wrong. Should I configure OpenGL in a different way? This is what I have:

glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);

Thanks in advance! :)
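Roughly, my second-pass shader reads the texture like this (a simplified sketch; the sampler name sceneTex is just a placeholder for whatever the first render target is bound to):

// Second-pass fragment shader (simplified). sceneTex is bound to the
// texture rendered in the first pass.
uniform sampler2D sceneTex;

void main()
{
    vec4 texel = texture2D(sceneTex, gl_TexCoord[0].st);
    float depth = texel.a;   // this always comes back as 1.0
    gl_FragColor = vec4(vec3(depth), 1.0);
}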
GL_RGB16F_ARB doesn't have an alpha channel, so by default alpha = 1.0.

Use GL_RGBA16F_ARB
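
Something like this should give you a texture that actually stores alpha (same width/height as in your original call):

/* Allocate a 16-bit float RGBA texture so the alpha channel is stored;
   with GL_RGB16F_ARB there is no alpha, so sampling it returns 1.0. */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F_ARB, width, height, 0,
             GL_RGBA, GL_FLOAT, NULL);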
Sig: http://glhlib.sourceforge.net
An open source GLU replacement library, much more modern than GLU.
float matrix[16], inverse_matrix[16];
glhLoadIdentityf2(matrix);
glhTranslatef2(matrix, 0.0, 0.0, 5.0);
glhRotateAboutXf2(matrix, angleInRadians);
glhScalef2(matrix, 1.0, 1.0, -1.0);
glhQuickInvertMatrixf2(matrix, inverse_matrix);
glUniformMatrix4fv(uniformLocation1, 1, GL_FALSE, matrix);
glUniformMatrix4fv(uniformLocation2, 1, GL_FALSE, inverse_matrix);
Oops, how did I not see that?... lol

Sorry about the dumb question, but thanks! Without your reply I'd probably still have no idea what the problem was.. :S

This topic is closed to new replies.
