
Erim

Blending problem.



Hello. I've got a little problem concerning blending. I have coded a little landscape engine, and now I'm adding trees to it. Each tree is a simple texture drawn on two quads aligned in a cross, to reproduce the look of a real tree. Currently I can add about 40000 trees without frame rate problems, but I just can't seem to get the bitmaps to be transparent.

If I use blending, I have to disable depth testing, right? And if I do that, the trees that should be hidden are drawn anyway, and whatever blend function I use (I use a mask map and a texture map) the trees glow or are semi-transparent. What I really wonder is: is there any function that simply skips drawing the black pixels of my bitmap, so that I can drop the masking map and still keep depth testing? Or do I need to depth sort every tree, which would reduce my frame rate quite a lot? I would be really grateful if someone could help me with this stupid problem.

[edited by - Erim on December 3, 2002 8:38:39 AM]
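For reference, each tree is built roughly like this (hypothetical immediate-mode code illustrating the crossed-quad idea, not the engine's actual drawing code; x, y, z is the tree position on the terrain, w its half-width and h its height):

glBindTexture(GL_TEXTURE_2D, treeTexture);   // treeTexture: the tree bitmap
glBegin(GL_QUADS);
    // first quad, lying in the X/Y plane
    glTexCoord2f(0, 0); glVertex3f(x - w, y,     z);
    glTexCoord2f(1, 0); glVertex3f(x + w, y,     z);
    glTexCoord2f(1, 1); glVertex3f(x + w, y + h, z);
    glTexCoord2f(0, 1); glVertex3f(x - w, y + h, z);
    // second quad, rotated 90 degrees, lying in the Z/Y plane
    glTexCoord2f(0, 0); glVertex3f(x, y,     z - w);
    glTexCoord2f(1, 0); glVertex3f(x, y,     z + w);
    glTexCoord2f(1, 1); glVertex3f(x, y + h, z + w);
    glTexCoord2f(0, 1); glVertex3f(x, y + h, z - w);
glEnd();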

Erim,

The blending function you are looking for is
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA)

As for drawing the objects in the correct order, there is no way around drawing them back to front (where blending is concerned).

The correct way to render translucent objects is this (a quick sketch of the order follows the list):

1. Draw the opaque objects.
2. Make the depth buffer read-only using glDepthMask(GL_FALSE).
3. Draw the translucent objects one by one, starting from the farthest up to the closest (yes, you will have to sort them).
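A rough sketch of that order, assuming drawTerrain(), sortTreesBackToFront() and drawTrees() stand in for your own engine calls (they are placeholders, not real API functions):

// 1. Opaque geometry first, with normal depth writes
drawTerrain();

// 2. Keep the depth test on, but make the depth buffer read-only
glDepthMask(GL_FALSE);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

// 3. Draw the translucent trees farthest-to-nearest
sortTreesBackToFront(cameraPosition);
drawTrees();

// Restore state for the next frame
glDisable(GL_BLEND);
glDepthMask(GL_TRUE);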

Hope this helps

Regards,
HellSpawn

There is an intermediate solution that works without blending: it is called alpha testing.
First of all, you need an alpha channel for your picture. Since you already have the RGB channels, you can easily compute the alpha channel yourself using the following formula:
Alpha = (Red + Green + Blue) / 3

Then activate alpha testing by calling:
glEnable(GL_ALPHA_TEST); // Enable alpha testing
glAlphaFunc(GL_NOTEQUAL, 0.0f); // Discard pixels whose alpha component is zero

Pros and cons:
- You don't need to sort trees by depth, and you don't need tricks such as enabling depth reading while disabling depth writing.
- You can't blend trees. Like the GIF picture format, your texture will be rendered in two states: show the pixel or don't show the pixel. There is no level between them (there is no 50%).

If you want finer detail, you should try glAlphaFunc(GL_GREATER, 0.5f), or replace 0.5 with any value between 0 and 1.
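For illustration, a tree pass with alpha testing could look like this (treeTexture and drawAllTrees() are placeholders for your own texture object and drawing code):

glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.5f);             // keep fragments with alpha > 0.5, discard the rest
glBindTexture(GL_TEXTURE_2D, treeTexture);

drawAllTrees();                            // depth test stays enabled, no sorting needed

glDisable(GL_ALPHA_TEST);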

Hope this helps.

Hmm, thank you both. I think the method vincoof posted works best, but where do I put the alpha information in the texture?
I'm using my own BMP loader, but I am thinking of switching to TGA so that I can use alpha too. How do I tell OpenGL to use the alpha channel?

You simply tell it when you call glTexImage2D. Could you please post the line of code where you call glTexImage2D?

Guest Anonymous Poster
Well, this is how it looks. I'm really grateful you're helping me!

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, tex->width, tex->height, 0, GL_RGB, GL_UNSIGNED_BYTE, tex->data);

OK, you have to set up an alpha channel, and then call this line:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, tex->width, tex->height, 0, GL_RGBA, GL_UNSIGNED_BYTE, tex->data);
(Note the GL_RGBA instead of GL_RGB.)

If the pixels in tex->data are RGB, you can easily set up the fourth channel with the following code snippet:

GLubyte* rgba_data;
GLubyte r, g, b, a;
int nb_pixels = tex->width * tex->height;
rgba_data = (GLubyte*)malloc(4 * nb_pixels * sizeof(GLubyte)); // Allocate pixels

for (int i = 0; i < nb_pixels; i++)
{
    r = tex->data[i*3+0]; // Get Red from RGB texture
    g = tex->data[i*3+1]; // Get Green from RGB texture
    b = tex->data[i*3+2]; // Get Blue from RGB texture
    a = r/3 + g/3 + b/3;  // Compute Alpha

    rgba_data[i*4+0] = r; // Set Red in RGBA texture
    rgba_data[i*4+1] = g; // Set Green in RGBA texture
    rgba_data[i*4+2] = b; // Set Blue in RGBA texture
    rgba_data[i*4+3] = a; // Set Alpha in RGBA texture
}
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, tex->width, tex->height, 0, GL_RGBA, GL_UNSIGNED_BYTE, rgba_data);
free(rgba_data); // Free pixels

You can optimize it easily, but I've tried to be clear instead.
If you prefer C++, you may call new/delete instead of malloc/free:

rgba_data = new GLubyte[4*nb_pixels]; // Allocate pixels

delete[] rgba_data; // Free pixels

And please note that you should NOT replace the alpha computation with this line:

a = (r+g+b)/3;

because the r+g+b sum can reach 765, which does not fit in an unsigned byte (8 bits); if that sum is ever held in 8-bit precision it wraps around and yields wrong alpha values.
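If you do want the exact average in a single expression, one variation (not from the original post) is to force the sum into an int first:

a = (GLubyte)(((int)r + (int)g + (int)b) / 3); // the sum is at most 765, which easily fits in an int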

Whee, thanks!
It works really well now; I wrote a TGA loader so that OpenGL loads the alpha from the TGA.
Thank you very much for your help!
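For anyone reading this later, a minimal sketch of such a loader (not the poster's actual code; it assumes an uncompressed 32-bit TGA, image type 2, and skips all error checking):

#include <stdio.h>
#include <stdlib.h>
#include <GL/gl.h>

GLuint loadTGA32(const char* filename)
{
    FILE* f = fopen(filename, "rb");
    if (!f) return 0;

    unsigned char header[18];
    fread(header, 1, 18, f);                  // fixed-size TGA header

    int width  = header[12] | (header[13] << 8);
    int height = header[14] | (header[15] << 8);

    unsigned char* pixels = (unsigned char*)malloc(width * height * 4);
    fseek(f, header[0], SEEK_CUR);            // skip the optional image ID field
    fread(pixels, 1, width * height * 4, f);  // BGRA pixel data
    fclose(f);

    // TGA stores pixels as BGRA; swap to RGBA before uploading
    for (int i = 0; i < width * height; i++)
    {
        unsigned char tmp = pixels[i*4+0];
        pixels[i*4+0] = pixels[i*4+2];
        pixels[i*4+2] = tmp;
    }

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);

    free(pixels);
    return tex;
}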
