ATI vs NVidia: glCopyTexImage2D (pics)

So I've run into an interesting, seemingly driver-related issue. On all ATI video cards my code works fine, but on all NVidia video cards the alpha component doesn't get copied when using glCopyTexImage2D. What gives?

Basically, I print textured text on the screen, then save it as a texture which I later use (one texture per panel/window for my GUI, so it doesn't get re-drawn every frame). Some of you may ask why I save a textured font into a texture, but I do it because I can swap in a raster font just by calling the constructor differently, and that one works much faster when rendered to a texture and reused.

Here are a pair of screenshots to illustrate what I mean:

[screenshot: ATI Radeon 9800, ATI Radeon 7000]
[screenshot: NVidia GeForce 5700, NVidia GeForce 2 MX, NVidia GeForce 4 Ti 4200]

And here's the code of how I create a blank texture:

// Allocate a zeroed RGBA buffer for the blank texture
unsigned int iSize = (unsigned int)iWidth * iHeight * 4;
unsigned char *pTexture = new unsigned char[iSize];
ZeroMemory(pTexture, iSize);

// Create the texture object and upload the blank image
// (the "4" internal format is the legacy way of saying four components, i.e. RGBA)
glGenTextures(1, &iTexture);
glBindTexture(GL_TEXTURE_2D, iTexture);
glTexImage2D(GL_TEXTURE_2D, 0, 4, iWidth, iHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, pTexture);
// (pTexture can be delete[]d after this call; GL keeps its own copy)

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);


I fill it with:

// Copy the current color buffer (including its alpha channel) into the texture
glBindTexture(GL_TEXTURE_2D, iTexture);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, iTWidth, iTHeight, 0);


Am I doing something wrong, or is this a driver issue?
Try explicitly setting things up with glPixelTransfer[f,i](). I've heard that NVidia cards tend to be less forgiving of "bad input", so while ATI may assume it "knows what you're doing", NVidia may not.

Edit: I don't know if you've used glPixelTransfer (I know I haven't), so here's a reference: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/opengl/glfunc03_881e.asp
Hope that helps.
-Dan
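
For reference, a minimal sketch of that idea (the values shown are just the documented GL defaults, not a known fix; GL_ALPHA_SCALE and GL_ALPHA_BIAS are the pixel-transfer parameters that affect alpha during a copy):

// Explicitly pin the alpha pixel-transfer state before the copy.
// 1.0 / 0.0 are the GL defaults: incoming alpha is multiplied by
// GL_ALPHA_SCALE and offset by GL_ALPHA_BIAS on its way into the texture.
glPixelTransferf(GL_ALPHA_SCALE, 1.0f);
glPixelTransferf(GL_ALPHA_BIAS, 0.0f);

glBindTexture(GL_TEXTURE_2D, iTexture);
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, iTWidth, iTHeight, 0);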
Hmm... Well, setting the alpha scale to 0.5f through glPixelTransfer made the text and the black background around it on NVidia video cards simply have 0.5f alpha... So apparently it does pass the alpha on glCopyTexSubImage2D (using that now), but the alpha is getting set to 100%. Hrrm.

Curiously enough, changing

glClearColor(0, 0, 0, 0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

to...

glClearColor(0, 0, 0, 0.5f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

...produces a visible gray box around my font on ATI, but NVidia remains at its 100% alpha...
Quote:the alpha component doesn't get copied when using glCopyTexImage2D. What gives?

Basically, I print textured text on the screen, then save it as a texture which I later use

I'm not understanding 100%. You mean destination alpha, right? You need to specify that when you create the window, and you can query GL to see how many bits of alpha the current window has.
Not sure what you mean? It works fine on ATI cards. On NVidia I basically have to avoid drawing to textures, as it loses the alpha channel values. Here's how everything looks after I implemented the kludge for NVidia.

Edit: If you meant the pixel format descriptor, it's set to 32 bits for both depth and color. No dice. :|

Quote:If you meant the pixel format descriptor, it's set to 32 bits for both depth and color. No dice

It doesn't matter what you ask for, since you might not be given it, so query the window.
E.g. asking for 32-bit depth will not work on NVidia (no NVidia cards support 32-bit depth, only 16/24).
There are glGet.. commands to query this info.
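
Something like this (a minimal sketch of that query; if the alpha bits come back as zero, the framebuffer has no destination alpha for glCopyTexImage2D to read):

// Query what the current pixel format actually provides
GLint alphaBits = 0, depthBits = 0;
glGetIntegerv(GL_ALPHA_BITS, &alphaBits);  // destination alpha bits in the framebuffer
glGetIntegerv(GL_DEPTH_BITS, &depthBits);  // actual depth precision (16/24 on these NVidia cards)
// if alphaBits == 0, there is no alpha channel to copy from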
Quote:Original post by zedzeek
Quote:If you meant the pixel format descriptor, it's set to 32 bits for both depth and color. No dice

It doesn't matter what you ask for, since you might not be given it, so query the window.
E.g. asking for 32-bit depth will not work on NVidia (no NVidia cards support 32-bit depth, only 16/24).
There are glGet.. commands to query this info.


I had the same problem during the night, but finally discovered how to fix it. You _MUST_ request an 8-bit alpha channel in your PIXELFORMATDESCRIPTOR when creating the OpenGL window! If you leave it at zero, ATI will still provide you with an alpha channel, while NVidia won't.

Spent a good hour digging around, but finally it resolved all our problems... and we made the deadline at The Gathering 2005 :)
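
For anyone hitting this later, a rough sketch of what that fix looks like in a standard Win32 setup (hDC here stands in for your window's device context; the rest is the usual boilerplate, with cAlphaBits being the important line):

// Request destination alpha explicitly when creating the GL window.
// Leaving cAlphaBits at 0 lets the NVidia driver pick a format with no
// alpha planes, so glCopyTexImage2D has nothing to copy.
PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize      = sizeof(PIXELFORMATDESCRIPTOR);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cAlphaBits = 8;   // <-- the fix: ask for 8 bits of destination alpha
pfd.cDepthBits = 24;  // 24, not 32 (see zedzeek's note above)

int format = ChoosePixelFormat(hDC, &pfd);  // hDC: your window's device context
SetPixelFormat(hDC, format, &pfd);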

