Texture transparencies and timerGetTime

10 comments, last by FireSlash 21 years, 2 months ago
Two questions (wee!!!) 1) I included libWinMM.a (Dev-C++), yet the compiler still refuses to acknowledge timerGetTime as a function. grrr... 2) Masks aren't ideal for what I am doing; is there any way to simply define a color (black works) as transparent?
Here's the nifty part you're looking for:



    glEnable(GL_ALPHA_TEST);
    // anything greater than 0 (pitch black) passes
    glAlphaFunc(GL_GREATER, 0);
    // --- draw ocelot here ---
    glDisable(GL_ALPHA_TEST);


Code untested

Regarding the timerGetTime() function - use the performance counter (check out NeHe's tutorials or search the Web), build a wrapper class, and you're off to work.
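The wrapper Crispy describes can be sketched roughly like this. The thread is about wrapping QueryPerformanceCounter()/QueryPerformanceFrequency() on Windows; std::chrono::steady_clock is used below as a portable stand-in with the same start/elapsed shape, and the class name Timer is illustrative, not taken from Crispy's actual code.

```cpp
#include <chrono>

// Minimal timer-wrapper sketch (assumed shape, not the original class).
// On Windows the same interface would wrap QueryPerformanceCounter()
// and divide by QueryPerformanceFrequency() to get seconds.
class Timer
{
public:
    // Record the current time as the reference point.
    void start()
    {
        startTime = std::chrono::steady_clock::now();
    }

    // Seconds elapsed since the last call to start().
    double elapsedSeconds() const
    {
        std::chrono::duration<double> d =
            std::chrono::steady_clock::now() - startTime;
        return d.count();
    }

private:
    std::chrono::steady_clock::time_point startTime;
};
```

Usage would be: call start() once per frame (or once at startup) and read elapsedSeconds() to scale movement, instead of relying on timeGetTime()'s millisecond resolution.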

Crispy

EDIT: forgot to add - as the name GL_ALPHA_TEST suggests, OpenGL uses the alpha channel for this, so your textures need to have an alpha channel. How to accomplish this? If you're loading bitmaps as your textures, there is no alpha channel, so there really is no other way than to create one yourself. For example, you could reserve some color in the original image to act as the alpha mask (eg pitch black): before creating your textures, you simply add a fourth channel to the image data that is opaque wherever the image color is non-black and 0 wherever the image is black. Easy!
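The color-key idea above can be sketched as a small conversion routine: expand RGB pixel data to RGBA, writing alpha 0 where the pixel is pitch black and 255 elsewhere. The function name addColorKeyAlpha is illustrative, not from the original post.

```cpp
#include <cstdint>
#include <vector>

// Sketch: build an RGBA buffer from an RGB one, using pitch black
// (0,0,0) as the transparent color key.
std::vector<std::uint8_t> addColorKeyAlpha(const std::vector<std::uint8_t>& rgb)
{
    std::vector<std::uint8_t> rgba;
    rgba.reserve(rgb.size() / 3 * 4);
    for (std::size_t i = 0; i + 2 < rgb.size(); i += 3)
    {
        std::uint8_t r = rgb[i], g = rgb[i + 1], b = rgb[i + 2];
        rgba.push_back(r);
        rgba.push_back(g);
        rgba.push_back(b);
        // pitch black becomes fully transparent; everything else opaque
        rgba.push_back((r == 0 && g == 0 && b == 0) ? 0 : 255);
    }
    return rgba;
}
```

The resulting buffer is then uploaded with glTexImage2D using GL_RGBA, so the alpha test has something to test against.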

[edited by - crispy on January 29, 2003 4:11:24 PM]
"Literally, it means that Bob is everything you can think of, but not dead; i.e., Bob is a purple-spotted, yellow-striped bumblebee/dragon/pterodactyl hybrid with a voracious addiction to Twix candy bars, but not dead."- kSquared
Well, maybe it doesn't recognize it because it's called 'timeGetTime' and not 'timerGetTime'?
I have the same problem. This code doesn't work for me. You can see my thread (Textures: Transparent Color) that I created yesterday. Can anyone tell us what the issue is? I have an alpha channel on my texture, but it doesn't seem to have any effect with GL_ALPHA_TEST enabled and whatnot.
Try disabling blending.
I was already loading TGAs, so I eventually figured out a similar working piece of code. On the plus side, using yours instead fixed another problem with glPrint :D

Still no luck on the timer. I'm using the code from NeHe's tutorials, and still getting the same error. (Yes, I realise it's timeGetTime().)
Here's a timer class I wrote about a year ago (okay, okay - I've no idea when I wrote it - a long time ago anyway) based on NeHe's code. It has a mighty disclaimer in it which allows you to do anything with it, but I urge everyone to understand it rather than copy-paste it. I know it's something you don't have to know off the top of your head, but... you know...

Anyway - instructions are enclosed, hope this helps anyone out struggling with timeGetTime().

Crispy
Disabling blending still doesn't work. Perhaps this code will help enlighten someone who knows more than I do:


  /* Texture generation: */
  glGenTextures(1, &texture[0]);   // Create The Texture
  glBindTexture(GL_TEXTURE_2D, texture[0]);
  glTexImage2D(GL_TEXTURE_2D, 0, 3, tga1->sizex, tga1->sizey, 0, GL_RGBA, GL_UNSIGNED_BYTE, tga1->data);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

  /* InitGL: */
  glEnable(GL_TEXTURE_2D);
  glShadeModel(GL_SMOOTH);
  glClearColor(1.0f, 1.0f, 1.0f, 0.5f);
  glClearDepth(1.0f);
  glEnable(GL_DEPTH_TEST);
  glDepthFunc(GL_LEQUAL);
  glDisable(GL_BLEND);
  glHint(GL_PERSPECTIVE_CORRECTION_HINT, GL_NICEST);

  /* Draw Scene: */
  glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
  glLoadIdentity();
  glTranslatef(0.0f, 0.0f, -5.0f);
  glEnable(GL_ALPHA_TEST);
  glAlphaFunc(GL_GREATER, 0);
  glBindTexture(GL_TEXTURE_2D, texture[0]);
  glBegin(GL_QUADS);
  /* draw stuff... */
  glEnd();
  glDisable(GL_ALPHA_TEST);
  /* end */


Any ideas? I can't figure this out. By the way, tga1 is a class that I made to store a targa image's data (very cool because I can mess with it on its own), and it loads RGBA values stored in a flat array like data = {R,G,B,A, R,G,B,A, ...}, only not coded that way.
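For reference, the flat layout described above stores pixel (x, y) of a row-major image at byte offset (y * width + x) * 4. A hypothetical pair of accessors (these names are illustrative, not from the poster's class) might look like:

```cpp
#include <cstddef>
#include <cstdint>

// Byte offset of pixel (x, y) in a flat, row-major RGBA buffer:
// data = {R,G,B,A, R,G,B,A, ...}, 4 bytes per pixel.
inline std::size_t pixelOffset(std::size_t x, std::size_t y, std::size_t width)
{
    return (y * width + x) * 4;
}

// The alpha component is the 4th byte of each pixel.
inline std::uint8_t pixelAlpha(const std::uint8_t* data,
                               std::size_t x, std::size_t y, std::size_t width)
{
    return data[pixelOffset(x, y, width) + 3];
}
```

Dumping a few alpha values this way is a quick check that the loader actually kept the channel before blaming OpenGL.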

[edited by - g3ck0blu3 on January 30, 2003 6:22:10 PM]
Bueller? Bueller? Anyone know what the problem is here?
Have a look at the targa's alpha channel (in some image editor). If it's ALL black you won't be able to see anything at all in your game. The fact that a tga has an alpha channel doesn't mean that it has to contain meaningful data (though, alas, I have never dealt with tga's). Other than that, your code looks ok. You could try posting a link to the tga file on this forum so everyone can see what you're trying to load.

Crispy

This topic is closed to new replies.
