
Archived

This topic is now archived and is closed to further replies.

FireSlash

Texture transparencies and timerGetTime


Recommended Posts

Two questions (wee!!!)

1) I linked libWinMM.a (Dev-C++), yet the compiler still refuses to acknowledge timerGetTime as a function. grrr...

2) Masks aren't ideal for what I am doing; is there any way to simply define a color (black works) as transparent?

Here's the nifty part you're looking for:



    
glEnable(GL_ALPHA_TEST);
// anything with alpha greater than 0.0 (fully transparent) passes
glAlphaFunc(GL_GREATER, 0.0f);
// draw ocelot here
glDisable(GL_ALPHA_TEST);


Code untested

Regarding the timerGetTime() function: use the performance counter (check out NeHe's tutorials or search the Web), build a wrapper class around it, and you're off to work.
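A minimal sketch of such a wrapper class - the name and interface here are made up for illustration, and std::chrono's monotonic clock stands in for the Win32 performance counter so it compiles anywhere:

```cpp
#include <chrono>

// Hypothetical wrapper: milliseconds elapsed since construction or reset(),
// in the spirit of timeGetTime(), but backed by a monotonic clock.
class Timer
{
public:
    Timer() : start_(std::chrono::steady_clock::now()) {}

    void reset() { start_ = std::chrono::steady_clock::now(); }

    double milliseconds() const
    {
        std::chrono::duration<double, std::milli> elapsed =
            std::chrono::steady_clock::now() - start_;
        return elapsed.count();
    }

private:
    std::chrono::steady_clock::time_point start_;
};
```

A real game-loop version would wrap QueryPerformanceCounter/QueryPerformanceFrequency as NeHe's tutorial does; the interface would look the same.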

Crispy

EDIT: forgot to add - as the name GL_ALPHA_TEST suggests, OpenGL uses the alpha channel for this, so your textures need to have an alpha channel. How to accomplish this? If you're loading plain bitmaps as your textures, there is no alpha channel, so there really is no other way than to create one yourself. For example, you could reserve some color in the original image to act as the transparency key (e.g. pitch black): before creating your textures, add a fourth channel to the image data that is 255 wherever the pixel is not the key color and 0 wherever it is. Easy!
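As a sketch of that color-key expansion, assuming a tightly packed 8-bit RGB buffer (the function name is made up for illustration):

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Expand an 8-bit RGB buffer to RGBA, treating pitch black as the
// transparency key: alpha = 0 for black pixels, 255 for everything else.
std::vector<std::uint8_t> addAlphaFromColorKey(const std::vector<std::uint8_t>& rgb)
{
    std::vector<std::uint8_t> rgba;
    rgba.reserve(rgb.size() / 3 * 4);
    for (std::size_t i = 0; i + 2 < rgb.size(); i += 3)
    {
        std::uint8_t r = rgb[i], g = rgb[i + 1], b = rgb[i + 2];
        rgba.push_back(r);
        rgba.push_back(g);
        rgba.push_back(b);
        rgba.push_back((r == 0 && g == 0 && b == 0) ? 0 : 255);
    }
    return rgba;
}
```

The resulting buffer can then be uploaded with GL_RGBA and the alpha test above will clip the keyed pixels.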

[edited by - crispy on January 29, 2003 4:11:24 PM]

Well, maybe it doesn't recognize it because it's called 'timeGetTime' and not 'timerGetTime'?

I have the same problem. This code doesn't work for me. You can see my thread (Textures: Transparent Color) that I created yesterday. Can anyone tell us what the issue is? I have an alpha channel on my texture, but it doesn't seem to have any effect with GL_ALPHA_TEST enabled and whatnot.

I was already loading TGAs, so I eventually figured out a similar working piece of code. On the plus side, using yours instead fixed another problem with glPrint :D

Still no luck on the timer. I'm using the code from NeHe's tutorials and still getting the same error. (Yes, I realise it's timeGetTime().)

Here's a timer class I wrote about a year ago (okay, okay - I've no idea when I wrote it - a long time ago anyway) based on NeHe's code. It has a mighty disclaimer in it which allows you to do anything with it, but I urge everyone to understand it rather than copy-paste it. I know it's something you don't have to know off the top of your head, but... you know...

Anyway - instructions are enclosed; hope this helps anyone out struggling with timeGetTime().

Crispy

Disabling blending still doesn't work. Perhaps this code will help enlighten someone who knows more than I do:


  
/* Texture generation: */
glGenTextures(1, &texture[0]); // Create The Texture
glBindTexture(GL_TEXTURE_2D, texture[0]);
// internal format must be GL_RGBA (not 3, i.e. GL_RGB) or the alpha channel is dropped
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, tga1->sizex, tga1->sizey, 0, GL_RGBA, GL_UNSIGNED_BYTE, tga1->data);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

/* InitGL: */
glEnable(GL_TEXTURE_2D);
glShadeModel(GL_SMOOTH);
glClearColor(1.0f, 1.0f, 1.0f, 0.5f);
glClearDepth(1.0f);
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);
glDisable(GL_BLEND);
glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);

/* Draw Scene: */
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();
glTranslatef(0.0f,0.0f,-5.0f);
glEnable(GL_ALPHA_TEST);
glAlphaFunc(GL_GREATER, 0.0f);
glBindTexture(GL_TEXTURE_2D, texture[0]);

glBegin(GL_QUADS);
/* draw stuff... */
glEnd();

glDisable(GL_ALPHA_TEST);
/* end */



Any ideas? I can't figure this out. btw, tga1 is a class that I made to store targa image data (very cool because I can mess with it on its own), and it loads RGBA values that are stored in a flat array like data = {R,G,B,A, R,G,B,A, ...}, only not coded that way.
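For illustration, addressing a pixel in such a flat, tightly packed RGBA array looks like this (the helper name is hypothetical):

```cpp
#include <cstddef>
#include <cstdint>

// Index into a tightly packed RGBA buffer: 4 bytes per pixel, row-major order.
// channel: 0 = R, 1 = G, 2 = B, 3 = A
inline std::uint8_t channelAt(const std::uint8_t* data, std::size_t width,
                              std::size_t x, std::size_t y, std::size_t channel)
{
    return data[(y * width + x) * 4 + channel];
}
```

Checking the alpha bytes (every fourth value) of the loaded buffer this way is a quick sanity test that the loader actually filled them in.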

[edited by - g3ck0blu3 on January 30, 2003 6:22:10 PM]

Have a look at the targa's alpha channel (in some image editor). If it's ALL black you won't be able to see anything at all in your game. The fact that a TGA has an alpha channel doesn't mean that it has to contain meaningful data (though, alas, I have never dealt with TGAs). Other than that, your code looks ok. You could try posting a link to the TGA file on this forum so everyone can see what you're trying to load.

Crispy

