glReadPixels - NO alpha on nVidia - alpha OK on ATI

Hello guys,

I've seen other posts on the internet about this issue but no fix for it, so maybe someone can point me in the right direction.
The problem is that on an ATI video card the result is correct - the pixels are read back with alpha in the background.
On nVidia the background is black - so no alpha.
Testing this is difficult because I only have an ATI card, so somebody else is testing the application for me - but as you can imagine that goes slowly and I'm working blind.


Can you give me some advice on what I'm doing wrong?

Why is there no alpha on nVidia when the alpha is fine on ATI?

Thank you in advance for the help!


glReadBuffer(GL_BACK);

[self drawImageForSave];

// same outcome with these two lines
// glReadPixels(0,0,rectSize,rectSize,GL_RGBA,GL_UNSIGNED_BYTE,[tempImage bitmapData]);
glReadPixels(0, 0, rectSize, rectSize, GL_RGBA, GL_UNSIGNED_INT_8_8_8_8_REV, [tempImage bitmapData]);


And the rendering function:

[[self openGLContext] makeCurrentContext];

glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
glLoadIdentity();

glClearColor(0.0f, 0.0f, 0.0f, 0.0f);

glViewport(0, 0, 512, 512);
glMatrixMode(GL_PROJECTION);
glLoadIdentity();
glOrtho(0, 512, 512, 0, -1, 1);

int count = [[RectSelections getInstance] count];

for(int i = 0; i < count; ++i)
{
    SpriteInfo* data = [[RectSelections getInstance] data:i];

    Rect rect = [data rect];

    ilTexture* tex = [[TextureManager getInstance] textureWithName:[data imagePath]];

    // double viewW = (double)self.bounds.size.width;
    double viewH = (double)self.bounds.size.height;

    double imgW = [tex width];
    double imgH = [tex height];

    double left   = rect.left;
    double right  = rect.right;
    double top    = rect.top;
    double bottom = rect.bottom;

    NSPoint pos  = [data position];
    float width  = [data width];
    float height = [data height];

    glDisable(GL_BLEND);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, [tex texture]);

    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);

    glBegin(GL_QUADS);

    glTexCoord2f(left/imgW, top/imgH);
    glVertex2f(pos.x, viewH - pos.y);

    glTexCoord2f(right/imgW, top/imgH);
    glVertex2f(pos.x + width, viewH - pos.y);

    glTexCoord2f(right/imgW, bottom/imgH);
    glVertex2f(pos.x + width, viewH - (pos.y + height));

    glTexCoord2f(left/imgW, bottom/imgH);
    glVertex2f(pos.x, viewH - (pos.y + height));

    glEnd();
}
Maybe I'm missing something, but where do you upload the texture to the graphics card?

Maybe I'm missing something, but where do you upload the texture to the graphics card?


I don't - I save the pixels as a PNG image.
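For context, a rough sketch of what that save path might look like - assuming tempImage is an NSBitmapImageRep set up with an alpha channel; the setup and output path are not shown in the original post, so the parameters here are illustrative:

NSBitmapImageRep* tempImage =
    [[NSBitmapImageRep alloc] initWithBitmapDataPlanes:NULL
                                            pixelsWide:rectSize
                                            pixelsHigh:rectSize
                                         bitsPerSample:8
                                       samplesPerPixel:4        // RGBA
                                              hasAlpha:YES      // keep the alpha channel
                                              isPlanar:NO
                                        colorSpaceName:NSDeviceRGBColorSpace
                                           bytesPerRow:rectSize * 4
                                          bitsPerPixel:32];

// ... render, then glReadPixels(0, 0, rectSize, rectSize, GL_RGBA, ..., [tempImage bitmapData]) as above ...

// Write the read-back pixels out as PNG; framebuffer alpha becomes file alpha
NSData* pngData = [tempImage representationUsingType:NSPNGFileType
                                           properties:[NSDictionary dictionary]];
[pngData writeToFile:@"/tmp/readback.png" atomically:YES];   // example path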
So here is what is happening on ATI - CORRECT behaviour:
[attachment=1718:onATI.png]

And this is on nVidia - BAD BEHAVIOUR
[attachment=1717:onNVIDIA.png]
Ok, next, how are you requesting your context? I have no experience with OS X at all, but there might be an optional alpha-bits input you're expected to fill out, otherwise the result is undefined. It looks like you're clearing the buffer THEN telling it which colour to use, too?

Ok, next, how are you requesting your context? I have no experience with OS X at all, but there might be an optional alpha-bits input you're expected to fill out, otherwise the result is undefined. It looks like you're clearing the buffer THEN telling it which colour to use, too?


If you are referring to this part - [[self openGLContext] makeCurrentContext]; - that's an NSOpenGLView method (the class I inherit from), and it's not even necessary. I only have one context.
Can you check your context creation code? Unless you explicitly request alpha in your framebuffer at creation time, the driver is not actually obliged to give you alpha, and even if you do request alpha the driver may still give you a "nearest match" format that doesn't include alpha. Either way glReadPixels is not going to give you any alpha.
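A quick way to see what the driver actually gave you is to query the framebuffer's alpha depth once the context is current - a minimal sketch, assuming the same legacy GL context used in the post above:

#import <Foundation/Foundation.h>   // NSLog
#import <OpenGL/gl.h>               // glGetIntegerv, GL_ALPHA_BITS

GLint alphaBits = 0;
glGetIntegerv(GL_ALPHA_BITS, &alphaBits);               // alpha depth of the current framebuffer
NSLog(@"Framebuffer alpha bits: %d", (int)alphaBits);   // 0 here means glReadPixels can never return alpha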



Can you check your context creation code? Unless you explicitly request alpha in your framebuffer at creation time, the driver is not actually obliged to give you alpha, and even if you do request alpha the driver may still give you a "nearest match" format that doesn't include alpha. Either way glReadPixels is not going to give you any alpha.


I will try it - seems very logical. I will come back with news. Thank you!
Fixed the issue - it seems you have to explicitly select 32-bit RGBA in Interface Builder for your OpenGL view. The default values are not the same on nVidia and ATI.
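For anyone creating the view in code instead of Interface Builder, a rough sketch of the equivalent - explicitly requesting a 32-bit colour buffer with 8 alpha bits via NSOpenGLPixelFormat (the view variable name is just an example):

#import <Cocoa/Cocoa.h>

// Explicitly request alpha so nVidia and ATI drivers pick the same framebuffer format
NSOpenGLPixelFormatAttribute attrs[] = {
    NSOpenGLPFADoubleBuffer,
    NSOpenGLPFAColorSize, 32,   // RGBA colour buffer
    NSOpenGLPFAAlphaSize, 8,    // the part that matters for glReadPixels alpha
    NSOpenGLPFADepthSize, 24,
    0
};

NSOpenGLPixelFormat* pixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:attrs];

// myGLView is an assumed NSOpenGLView (subclass) instance
[myGLView setPixelFormat:pixelFormat];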
