MichaelMook

ATI vs NVidia: glCopyTexImage2D (pics)


So I've run into an interesting, seemingly driver-related issue. On all ATI video cards my code works fine, but on all NVidia video cards the alpha component doesn't get copied when using glCopyTexImage2D. What gives?

Basically I am printing textured text on the screen and then saving it as a texture which I later reuse (one texture per panel/window for my GUI, so it doesn't get re-drawn every frame). Some of you may ask why I'd save a textured font into a texture, but I do it because I can drop a raster font in its place just by calling the constructor differently, and that one works much faster when rendered to a texture and then reused.

Here are a pair of screenshots to illustrate what I mean:

ATI Radeon 9800, ATI Radeon 7000:

NVidia GeForce 5700, NVidia GeForce 2 MX, NVidia GeForce 4 Ti 4200:

And here's the code of how I create a blank texture:
// Allocate a zero-filled RGBA buffer to initialize the texture with
unsigned int iSize = (unsigned int)iWidth * iHeight * 4;
unsigned char *pTexture = new unsigned char[iSize];
ZeroMemory(pTexture, iSize);

glGenTextures(1, &iTexture);
glBindTexture(GL_TEXTURE_2D, iTexture);
glTexImage2D(GL_TEXTURE_2D, 0, 4, iWidth, iHeight, 0, GL_RGBA, GL_UNSIGNED_BYTE, pTexture); // internal format 4 = legacy shorthand for GL_RGBA

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

delete[] pTexture; // the driver keeps its own copy after glTexImage2D


I fill it with (after the text has been drawn to the framebuffer):
glBindTexture(GL_TEXTURE_2D, iTexture);
// grab the rendered region of the framebuffer into the texture
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, iTWidth, iTHeight, 0);


Am I doing something wrong, or is this a driver issue?

Try explicitly setting things up with glPixelTransfer[f,i](). I've heard that NVidia cards tend to be less forgiving of "bad input", so while ATI may make assumptions because it "knows what you're doing", NVidia may not.

Edit: I don't know if you've used glPixelTransfer (I know I haven't), so here's a reference: http://msdn.microsoft.com/library/default.asp?url=/library/en-us/opengl/glfunc03_881e.asp
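Something like this, for example (an untested sketch that just resets the transfer scale/bias to their defaults before the copy):

// make sure no scale/bias is applied to any channel during pixel transfers
glPixelTransferf(GL_RED_SCALE,   1.0f);  glPixelTransferf(GL_RED_BIAS,   0.0f);
glPixelTransferf(GL_GREEN_SCALE, 1.0f);  glPixelTransferf(GL_GREEN_BIAS, 0.0f);
glPixelTransferf(GL_BLUE_SCALE,  1.0f);  glPixelTransferf(GL_BLUE_BIAS,  0.0f);
glPixelTransferf(GL_ALPHA_SCALE, 1.0f);  glPixelTransferf(GL_ALPHA_BIAS, 0.0f);

glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 0, 0, iTWidth, iTHeight, 0);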
hope that helps
-Dan

Hmm... Well, setting the alpha scale to 0.5f through glPixelTransfer made the text and the black background around it simply have 0.5f alpha on NVidia video cards... So apparently it does pass the alpha through glCopyTexSubImage2D (using that now), but the alpha is set to 100%. Hrrm.
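For reference, the copy now looks roughly like this (the texture itself is still created up front with glTexImage2D as before):

glBindTexture(GL_TEXTURE_2D, iTexture);
// update the existing texture in place instead of redefining it
glCopyTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 0, 0, iTWidth, iTHeight);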

Curiously enough, changing

glClearColor(0, 0, 0, 0);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

to...

glClearColor(0, 0, 0, 0.5f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

...produces a visible gray box around my font on ATI, but NVidia remains at its 100% alpha...

Quote:
the alpha component doesn't get copied when using glCopyTexImage2D. What gives? Basically I am printing textured text on the screen and then saving it as a texture which I later reuse

I'm not understanding 100%. You mean destination alpha, right? You need to specify this when you create the window; you can query GL to see how many bits of alpha the current window has.

Not sure what you mean? It works fine on ATI cards. On NVidia I basically have to avoid drawing to textures, as it loses the alpha channel values. Here's how everything looks after I implemented the kludge for NVidia.

Edit: If you meant the pixel format descriptor, it's set to 32 bits for both depth and color. No dice. :|

Quote:
If you meant the pixel format descriptor, it's set to 32 bits for both depth and color. No dice

It doesn't matter what you ask for, since you might not be given it, so query the window. For example, asking for 32-bit depth will not work with NVidia (no NVidia cards support a 32-bit depth buffer, only 16/24). There are glGet* commands to query this info.
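For example, a minimal check (assuming the GL context is already current):

GLint iAlphaBits = 0, iDepthBits = 0;
glGetIntegerv(GL_ALPHA_BITS, &iAlphaBits); // destination alpha bits the window actually got
glGetIntegerv(GL_DEPTH_BITS, &iDepthBits); // depth buffer bits actually granted
// if iAlphaBits is 0, glCopyTexImage2D has no framebuffer alpha to copy from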

Guest Anonymous Poster
Quote:
Original post by zedzeek
Quote:
If you meant the pixel format descriptor, it's set to 32 bits for both depth and color. No dice

It doesn't matter what you ask for, since you might not be given it, so query the window. For example, asking for 32-bit depth will not work with NVidia (no NVidia cards support a 32-bit depth buffer, only 16/24). There are glGet* commands to query this info.


I had the same problem during the night, but finally discovered how to fix it. You _MUST_ request an 8-bit alpha channel in your PIXELFORMATDESCRIPTOR when creating the OpenGL window! If you leave it at zero, ATI will still provide you with an alpha channel, while NVidia won't.

Spent a good hour digging around, but that finally resolved all our problems, and we made the deadline at The Gathering 2005 :)
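For anyone hitting the same thing later, the relevant part of the window setup looks roughly like this (only cAlphaBits is the actual fix here; the other fields are typical values, with the depth buffer set to 24 bits per zedzeek's note, and hDC assumed to be the window's device context):

PIXELFORMATDESCRIPTOR pfd = {0};
pfd.nSize      = sizeof(PIXELFORMATDESCRIPTOR);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cAlphaBits = 8;   // request destination alpha explicitly; NVidia hands back 0 alpha bits otherwise
pfd.cDepthBits = 24;
pfd.iLayerType = PFD_MAIN_PLANE;

int iFormat = ChoosePixelFormat(hDC, &pfd);
SetPixelFormat(hDC, iFormat, &pfd);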
