
Blending not working with non-power-of-2 textures?



Hello :) Today I started writing a TGA loader; it's a rewrite of the one from GameTutorials. At first I thought it couldn't load TGAs with alpha channels for some reason (I know Adobe also saves them incorrectly), so I tried the NeHe TGA textures, which didn't work either. I believe I enabled blending the right way:
	glEnable(GL_BLEND);
	glEnable(GL_CULL_FACE);
	glCullFace(GL_FRONT);
	glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

Now the question is: do alpha-channel textures for some reason not work on GL_TEXTURE_RECTANGLE_NV or GL_TEXTURE_RECTANGLE_ARB textures?

>>Now the question is: do alpha-channel textures for some reason not work on GL_TEXTURE_RECTANGLE_NV or GL_TEXTURE_RECTANGLE_ARB textures?<<

There should be no difference.

glGetError doesn't report any errors, and blending just WON'T work. Is it the GameTutorials TGA loader, or am I using OpenGL the wrong way :| ?

It also doesn't work with power-of-2 textures :| Okay, so that isn't the problem, but what could it be?

Here's my initialization code. TextureType is GL_TEXTURE_RECTANGLE_NV on NVIDIA cards and GL_TEXTURE_RECTANGLE_ARB on other cards :) I thought it might have a speed advantage.


glShadeModel(GL_SMOOTH);
glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
glClearDepth(1.0f);
glEnable(GL_DEPTH_TEST);
glEnable(TextureType);
glEnable(GL_BLEND);
glEnable(GL_CULL_FACE);
glCullFace(GL_FRONT);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthFunc(GL_LEQUAL);
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);





Render function:

/*==============================================*/
//
// REF_DrawImage - Draw an image
//
/*==============================================*/
int REF_DrawImage(PXImage *Image, int X, int Y, int Width, int Height)
{
    glLoadIdentity();
    glTranslatef(X, Y, 0.0f);
    glRotatef(Image->RotDeg, 0.0f, 0.0f, 1.0f);

    // Set the modulate color (with the image's alpha) and bind the texture
    glColor4ub(255, 255, 255, Image->Alpha);
    glBindTexture(TextureType, gl_Textures[Image->imgnr]);

    glBegin(GL_QUADS);
    glTexCoord2i(0, Height);     glVertex3i(0, 0, 0);
    glTexCoord2i(Width, Height); glVertex3i(Width, 0, 0);
    glTexCoord2i(Width, 0);      glVertex3i(Width, Height, 0);
    glTexCoord2i(0, 0);          glVertex3i(0, Height, 0);
    glEnd();

    return 0;
}





Now I really can't figure out what's wrong there. If anybody knows, please help :) Thank you.

Quote:

TextureType is GL_TEXTURE_RECTANGLE_NV on nvidia cards and GL_TEXTURE_RECTANGLE_ARB on non nvidia cards :) I thought it may have a speed advantage.


Actually, those constants should resolve to the same number, so it doesn't make a difference :)

As for your problem, I'd like to have a look at the part with your glTexImage2D call.

Well, you may call this messy or not :P but it works fine, since I don't want to convert anything in the loading code :)


if (Channels == 4)
{
    if (Type != TGA_RLE)
    {
        glTexImage2D(TextureType, 0, 3, ImageTGA->sizeX, ImageTGA->sizeY, 0, GL_RGBA, GL_UNSIGNED_BYTE, ImageTGA->data);
    }
    else
    {
        glTexImage2D(TextureType, 0, 3, ImageTGA->sizeX, ImageTGA->sizeY, 0, GL_RGBA, GL_UNSIGNED_BYTE, ImageTGA->data);
    }
}
else
{
    if (Type != TGA_RLE)
    {
        glTexImage2D(TextureType, 0, 3, ImageTGA->sizeX, ImageTGA->sizeY, 0, GL_BGR_EXT, GL_UNSIGNED_BYTE, ImageTGA->data);
    }
    else
    {
        glTexImage2D(TextureType, 0, 3, ImageTGA->sizeX, ImageTGA->sizeY, 0, GL_RGB, GL_UNSIGNED_BYTE, ImageTGA->data);
    }
}

Here's your problem:
Quote:

if (Channels == 4 && Type != TGA_RLE)
glTexImage2D(TextureType, 0, 3, ImageTGA->sizeX, ImageTGA->sizeY, 0, GL_RGBA, GL_UNSIGNED_BYTE, ImageTGA->data);


The third parameter to glTexImage2D ('3' here) is the internal format of the texture. What you've used is the legacy way of specifying it as a number of components: you're telling GL the texture has only R, G and B components, even though your data has RGB and A. GL will happily drop the alpha channel to honor that request ;) (and since a component count of 3 is perfectly legal, glGetError stays silent).

You'll want to change that parameter to 4, or, if you prefer the symbolic constants from the GL specification, GL_RGBA.

