Problem with PNG files and DevIL

Started by
3 comments, last by Serge K 16 years, 3 months ago
Hi. I have a PNG file with an alpha channel, but I can't use the alpha channel in OpenGL; in other words, all of the transparent pixels are shown dark. This is my code, where is my problem?

ilGenImages(1, &ImgId);
ilOriginFunc(IL_ORIGIN_LOWER_LEFT);
ilEnable(IL_ORIGIN_SET);
ilBindImage(ImgId);
ilutGLLoadImage("C:/kge.png");
Width = ilGetInteger(IL_IMAGE_WIDTH);
Height = ilGetInteger(IL_IMAGE_HEIGHT);
ilConvertImage(IL_RGBA, IL_UNSIGNED_INT);
sourcedata = ilGetData();
ilDeleteImages(1, &ImgId);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glRasterPos2i(0, 0);
glDrawPixels(Width, Height, GL_RGBA, GL_UNSIGNED_INT, sourcedata);

Also, when I enable blending, nothing is shown at all.
It depends on what values are present in the alpha channel, so have a look at sourcedata with a debugger.
Why do you use ilConvertImage(IL_RGBA,IL_UNSIGNED_INT)
instead of just using unsigned bytes?
Hi,
I use ilConvertImage to get all of the components in the PNG file (by default DevIL uses RGB).
unsigned int --> 8 bits for every component (RGBA) => 32 bits (UNSIGNED_INT)
unsigned int means that every component will be 32 bits, so each pixel will be 128 bits with RGBA.
Quote: Original post by keltar
So each pixel will be 128-bit sized with RGBA.

Exactly.
Besides, why glDrawPixels? It's definitely not the fastest way to do it…
You can draw this image using the texture prepared by ilutGLLoadImage.
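A minimal sketch of that suggestion, assuming a legacy fixed-function OpenGL context; the filename and the Width/Height variables are taken from the original post, and draw_png_quad is an illustrative name:

```c
#include <GL/gl.h>
#include <IL/ilut.h>

/* Sketch only: ilutGLLoadImage() returns an OpenGL texture name, so the
 * image can be drawn as a textured quad with blending enabled instead of
 * using glDrawPixels. Assumes a fixed-function GL context is current. */
void draw_png_quad(int Width, int Height) {
    GLuint tex = ilutGLLoadImage("C:/kge.png");  /* PNG -> GL texture */

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);

    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2i(0,     0);
        glTexCoord2f(1.0f, 0.0f); glVertex2i(Width, 0);
        glTexCoord2f(1.0f, 1.0f); glVertex2i(Width, Height);
        glTexCoord2f(0.0f, 1.0f); glVertex2i(0,     Height);
    glEnd();

    glDisable(GL_BLEND);
    glDisable(GL_TEXTURE_2D);
}
```

Note that blending must actually be enabled with glEnable(GL_BLEND); calling glBlendFunc alone (as in the original code) has no effect.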

This topic is closed to new replies.
