OpenGL - glReadPixels returns incorrect values



#1 Silverlan   Members   -  Reputation: 350


Posted 26 August 2014 - 02:08 PM

I'm probably missing something obvious, but I can't find anything wrong with this code. It's just for testing purposes: all it's supposed to do is set every pixel of the color attachment to white and then read the pixel data back. Instead, the result is an array full of 0s:

unsigned int frameBuffer;
glGenFramebuffers(1,&frameBuffer);
unsigned int texture;
unsigned int depthTexture;
glGenTextures(1,&texture);
glGenTextures(1,&depthTexture);

glBindFramebuffer(GL_FRAMEBUFFER,frameBuffer);
glBindTexture(GL_TEXTURE_2D,texture);

int w = 256;
int h = 256;

glTexImage2D(
	GL_TEXTURE_2D,
	0,
	GL_RGB,
	w,h,
	0,GL_RGB,
	GL_UNSIGNED_BYTE,
	0
);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP_TO_EDGE);
glFramebufferTexture2D(GL_FRAMEBUFFER,GL_COLOR_ATTACHMENT0,GL_TEXTURE_2D,texture,0);

glBindTexture(GL_TEXTURE_2D,depthTexture);
glTexImage2D(
	GL_TEXTURE_2D,
	0,
	GL_DEPTH_COMPONENT16,
	w,h,
	0,GL_DEPTH_COMPONENT,
	GL_FLOAT,
	0
);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_S,GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_WRAP_T,GL_CLAMP_TO_EDGE);
glFramebufferTexture2D(GL_FRAMEBUFFER,GL_DEPTH_ATTACHMENT,GL_TEXTURE_2D,depthTexture,0);

int status = glCheckFramebufferStatus(GL_FRAMEBUFFER);
if(status == GL_FRAMEBUFFER_COMPLETE)
{
	unsigned char *pixels = new unsigned char[w *h *3];
	for(unsigned int i=0;i<(w *h *3);i++)
		pixels[i] = 255;
	glDrawPixels(w,h,GL_RGB,GL_UNSIGNED_BYTE,&pixels[0]);
	glReadPixels(0,0,w,h,GL_RGB,GL_UNSIGNED_BYTE,&pixels[0]);
	for(unsigned int i=0; i<(w *h *3); i+=3)
	{
		unsigned int r = pixels[i];
		unsigned int g = pixels[i + 1];
		unsigned int b = pixels[i + 2];
		std::cout<<i<<": "<<r<<","<<g<<","<<b<<std::endl;
	}
	delete[] pixels;
	int err = glGetError(); // No error reported
}
glBindTexture(GL_TEXTURE_2D,0);
glBindFramebuffer(GL_FRAMEBUFFER,0);

Neither disabling the depth test nor removing the depth attachment helps.

No errors are reported and the framebuffer status is GL_FRAMEBUFFER_COMPLETE. What's going on here?
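For reference, a minimal sketch of per-call error checking. The checkGLError helper below is hypothetical (not part of the code above); it just drains glGetError with a label so a failing call can be pinpointed instead of checking once at the end:

// Hypothetical helper; assumes <iostream> and the same OpenGL headers as above.
static void checkGLError(const char *label)
{
	GLenum err;
	while((err = glGetError()) != GL_NO_ERROR)
		std::cerr<<label<<": glGetError() = 0x"<<std::hex<<err<<std::dec<<std::endl;
}

// Example usage around the calls in question:
// glDrawPixels(w,h,GL_RGB,GL_UNSIGNED_BYTE,&pixels[0]);
// checkGLError("glDrawPixels");
// glReadPixels(0,0,w,h,GL_RGB,GL_UNSIGNED_BYTE,&pixels[0]);
// checkGLError("glReadPixels");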



#2 NumberXaero   Prime Members   -  Reputation: 1513


Posted 26 August 2014 - 02:35 PM

Not sure what GL version you are targeting, but try:

if(status == GL_FRAMEBUFFER_COMPLETE)
{
	glBindFramebuffer(GL_DRAW_FRAMEBUFFER, frameBuffer);

	unsigned char *pixels = new unsigned char[w *h *3];
	for(unsigned int i=0;i<(w *h *3);i++)
		pixels[i] = 255;
	glDrawPixels(w,h,GL_RGB,GL_UNSIGNED_BYTE,&pixels[0]);

	glBindFramebuffer(GL_READ_FRAMEBUFFER, frameBuffer);
	glReadBuffer(GL_COLOR_ATTACHMENT0);
	glReadPixels(0,0,w,h,GL_RGB,GL_UNSIGNED_BYTE,&pixels[0]);
	for(unsigned int i=0; i<(w *h *3); i+=3)
	{
		unsigned int r = pixels[i];
		unsigned int g = pixels[i + 1];
		unsigned int b = pixels[i + 2];
		std::cout<<i<<": "<<r<<","<<g<<","<<b<<std::endl;
	}
	delete[] pixels;

	glReadBuffer(GL_BACK);

	int err = glGetError(); // No error reported
}

Edited by NumberXaero, 26 August 2014 - 02:36 PM.


#3 Silverlan   Members   -  Reputation: 350

Like
0Likes
Like

Posted 27 August 2014 - 11:41 AM

I'm using OpenGL 4.4. Either way, the glReadBuffer call was the solution, thank you!
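For reference, a condensed sketch of the read-back path (not a drop-in, and assuming the frameBuffer, w and h from the first post, plus <vector>): glReadPixels pulls from the read buffer of whatever framebuffer is bound to GL_READ_FRAMEBUFFER, which is presumably why binding the FBO for reading and selecting GL_COLOR_ATTACHMENT0 explicitly made the difference.

glBindFramebuffer(GL_READ_FRAMEBUFFER,frameBuffer); // source framebuffer for reads
glReadBuffer(GL_COLOR_ATTACHMENT0); // read from the FBO's color attachment
std::vector<unsigned char> pixels(w *h *3);
glReadPixels(0,0,w,h,GL_RGB,GL_UNSIGNED_BYTE,pixels.data());
glBindFramebuffer(GL_READ_FRAMEBUFFER,0); // back to the default framebuffer

(For widths that aren't a multiple of 4 with GL_RGB, glPixelStorei(GL_PACK_ALIGNMENT,1) would also be needed before the read; with w = 256 it doesn't matter.)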





