fms

glReadPixels returning garbage values.


Hi, I am writing a simple application where I need to verify the pixel values before processing them further. For this purpose, I am using glReadPixels() to read back the framebuffer. But glReadPixels() is returning some garbage values. My source code:
[source lang="cpp"]
#include <stdio.h>
#include <C:\Program Files\NVIDIA Corporation\Cg\include\GL\glut.h>

#define XSCREEN 4 // width
#define YSCREEN 4 // height

GLubyte buffer[YSCREEN * XSCREEN * 4];

void GLUTInit (int*, char**);
void Init (void);
void Display (void);
void Draw (void);
void ReadTexture (void);
void Error (void);

int main (int argc, char** argv)
{
	GLUTInit (&argc, argv);
	Init ();
	glutDisplayFunc (Display);
	ReadTexture ();
	glutMainLoop ();
	return 0;
} // main

void GLUTInit (int* argc, char** argv)
{
	glutInit (argc, argv);
	glutInitWindowSize (XSCREEN, YSCREEN); 
	glutInitDisplayMode (GLUT_RGBA);
	glutCreateWindow ("glReadPixels");
} // GLUTInit

void Init (void)
{
	glClearColor (1.0, 1.0, 1.0, 1.0);
	glClearDepth (1.0f);
} // Init

void Display (void)
{
	glClear (GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	Draw ();
} // Display

void Draw (void)
{	
	glViewport (0, 0, XSCREEN, YSCREEN);
	glMatrixMode (GL_PROJECTION);
	glLoadIdentity ();
	glOrtho (0, XSCREEN, 0, YSCREEN, -1, 1);
	glMatrixMode (GL_MODELVIEW);
	glLoadIdentity ();	
	glBegin (GL_QUADS);
		glColor4ub (128, 128, 128, 128);
		glVertex2f (0, 0);
		glVertex2f (XSCREEN, 0);
		glVertex2f (XSCREEN, YSCREEN);
		glVertex2f (0, YSCREEN);
	glEnd ();
	glFlush ();
} // Draw

void ReadTexture (void)
{
	Error ();
	glReadPixels (0, 0, XSCREEN, YSCREEN, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
	Error ();
} // ReadTexture

void Error (void)
{
	GLenum errCode;
	const GLubyte* errString;

	if ((errCode = glGetError ()) != GL_NO_ERROR)
	{
		errString = gluErrorString (errCode);
		printf ("GL Error : %s\n", errString);
	}
} // Error

[/source]

Please help. Thanks.

Edited to add source tags; please use them in future for large code blocks. See the Forum FAQ for details of the tags in use on this forum. [Edited by - phantom on August 15, 2005 8:00:45 AM]

I believe it may have something to do with reading the texture before you ever draw...

ReadTexture ();
glutMainLoop ();

glutMainLoop(); will start the drawing, and ReadTexture() reads the texture. glutDisplayFunc(n); just sets the pointer to the function that will draw the scene.

If you just want to draw once, and then copy, just do:


[source lang="cpp"]
int main (int argc, char** argv)
{
	GLUTInit (&argc, argv);
	Init ();
	glutDisplayFunc (Display);
	Draw ();
	ReadTexture ();
	//glutMainLoop ();
	return 0;
} // main
[/source]



I'm pretty sure that'll fix it
Hope I could help
~zix~

Quote:
Original post by zix99
I believe it may have something to do with reading the texture before you ever draw...

ReadTexture ();
glutMainLoop ();

glutMainLoop(); will start the drawing, and ReadTexture() reads the texture. glutDisplayFunc(n); just sets the pointer to the function that will draw the scene.

If you just want to draw once, and then copy, just do:

*** Source Snippet Removed ***

I'm pretty sure that'll fix it
Hope I could help
~zix~


Hi Zix,

I did as you said, and now I am getting some of the values as I expect. But the first 32 bytes still have garbage values.

Can you help me locate the reason why the first 32 bytes have garbage values?

Your help is highly appreciated.

Thanks in advance.
- FMS

I'm not sure exactly why that program isn't working; I'm having trouble with it too. It's most likely the way the buffer is set up, but I'm not sure.

With your program, if I enlarge the window I actually get a picture of my screen, but somewhat upside down and discolored :). This means that the buffer isn't being cleared.

This program I wrote works fine. It is similar to yours, but it uses double-buffering (so it has to swap buffers each frame) and perspective projection rather than ortho.

[source lang="cpp"]
#include <iostream>
#include <stdio.h>
#include <GL/glut.h>

#define width 256
#define height 256


void SavePicture(){
	unsigned char buffer[width*height*4];
	glReadPixels(0,0,width,height,GL_RGBA,GL_UNSIGNED_BYTE,buffer);
	FILE *file;
	file=fopen("file.raw","wb");
	fwrite(buffer,width*height*4,1,file);
	fclose(file);
}

void Draw(){

	glClear(GL_COLOR_BUFFER_BIT);

	glMatrixMode(GL_PROJECTION);
	glLoadIdentity();
	gluPerspective(50, 1, 1, 100);

	glMatrixMode(GL_MODELVIEW);
	glLoadIdentity();
	gluLookAt(25,15,25, 0,0,0, 0,1,0);

	//draw plane
	glColor3ub(255,0,0);
	glBegin(GL_QUADS);
		glVertex3f(-25,0,-25);
		glVertex3f(25,0,-25);
		glVertex3f(25,0,25);
		glVertex3f(-25,0,25);
	glEnd();

	//draw ball
	glColor3ub(0,255,0);
	glutSolidSphere(1,10,10);

	SavePicture(); //save pixels before swapping buffers
	glutSwapBuffers();
	glutPostRedisplay();
}

int main(int argcp, char **argv){
	std::cout << "Starting...\n";

	glutInit(&argcp, argv);
	glutInitWindowSize(width,height);
	glutInitWindowPosition(25,25);
	glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE);
	glutCreateWindow("Ball");
	glutDisplayFunc(Draw);

	glClearColor(0,0.5,1,1); //blue

	//glutMainLoop();
	Draw();

	return 0;
}
[/source]



I hope you spot your problem. This program also saves the pixels to a raw file so you can look at the picture afterwards.

Hope I could help
~zix~

It appears to me that the size of the window plays a role in producing the garbage values in the buffer. I varied the window size and started seeing the expected values.

Here is the summary for the various window sizes:

Window size    Result
4x4            Garbage values (bytes 0 - 31)
8x8            Garbage values (bytes 0 - 127)
16x16          Correct result
....           Correct result

So, I am confused now as to what could be the reason.

Does it make any sense to you?

Thanks.
- FMS

I'm not entirely sure where exactly the window itself is created. It seems to be in glutCreateWindow (it's not necessarily that obvious, as that function could just store the request to create a window and actually create it later), but I'm fairly sure the window isn't shown until you call glutMainLoop. Reading from a hidden window results in undefined pixel values, so garbage is actually the "correct" result.

Therefore, to get anything reliable concerning pixel content, you must call glutMainLoop so GLUT can show the window and call the draw function. Don't call the draw function manually, as GLUT is the one handling the window, not you.
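As a minimal sketch of that arrangement, reusing the functions from the original post (the glFinish() call is an extra precaution, not something from the posted code):

[source lang="cpp"]
// Sketch only: read the pixels back from inside the display callback,
// after GLUT has shown the window and asked us to draw.
void Display (void)
{
	glClear (GL_COLOR_BUFFER_BIT);  // no depth buffer was requested in GLUT_RGBA mode
	Draw ();                        // fills the framebuffer with the quad
	glFinish ();                    // make sure rendering has actually completed
	ReadTexture ();                 // glReadPixels now sees the rendered quad
}

int main (int argc, char** argv)
{
	GLUTInit (&argc, argv);
	Init ();
	glutDisplayFunc (Display);
	glutMainLoop ();                // shows the window and invokes Display
	return 0;
}
[/source]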

Hi,

I am only getting garbage values when the window size is 8x8 or smaller. If the window size is larger than 8x8, I get correct results.

I agree that the window is not displayed until one calls glutMainLoop(), but Draw() basically completes the quad in the framebuffer, and ReadTexture() then reads from the framebuffer.

So the issue is not with displaying the quad but with reading the contents of the framebuffer. And peculiarly enough, the results are garbage only for window sizes of 8x8 or smaller (as mentioned earlier).

I really need to know where I am missing the point. Please someone help.

Thanks in anticipation.
- FMS

Ah yes, you did say it works correctly for larger windows. I didn't read your post carefully enough, it seems; sorry.

Anyway, my point that you're reading undefined pixel values from a hidden window still remains. You should not expect correct results unless the window is correctly and entirely shown. Is the window actually shown at the time you start rendering and reading back the values? If not, do you get correct results when the window is shown correctly?

Here are the results that I am getting.

XSCREEN = 4
YSCREEN = 4

For bytes 0-31, the RGBA values are: R = 225, G = 222, B = 217, A = 0 (garbage)
For bytes 32-63, the RGBA values are: R = 128, G = 128, B = 128, A = 1 (expected)

So clearly the results are correct for bytes 32-63. No, I am not displaying the window, because I just want to read the data from the framebuffer. That said, I can certainly render the window, and I still obtain the same results as above for this window size.
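(For reference, the buffer contents can be dumped for inspection with a small helper along these lines; PrintBuffer is a hypothetical function, not part of the code posted earlier.)

[source lang="cpp"]
// Hypothetical helper used only to inspect the read-back buffer.
void PrintBuffer (void)
{
	int i;
	for (i = 0; i < XSCREEN * YSCREEN; i++)
	{
		printf ("pixel %2d : R = %3d, G = %3d, B = %3d, A = %3d\n", i,
		        buffer[i * 4 + 0], buffer[i * 4 + 1],
		        buffer[i * 4 + 2], buffer[i * 4 + 3]);
	}
}
[/source]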

- FMS

Can you post the current code as it is? I would like to see the code in its current state and try it myself. It could be that the driver (incorrectly) assumes the window should be of a certain size and doesn't handle the smaller ones. So, if possible, try to update the drivers.
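If it helps narrow things down, the driver and renderer the context actually ended up on can be printed right after the window is created. A small sketch (not part of the code above):

[source lang="cpp"]
// Sketch: print which OpenGL implementation the context is using.
// Must be called after glutCreateWindow(), i.e. once a context exists.
printf ("Vendor   : %s\n", (const char*) glGetString (GL_VENDOR));
printf ("Renderer : %s\n", (const char*) glGetString (GL_RENDERER));
printf ("Version  : %s\n", (const char*) glGetString (GL_VERSION));
[/source]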
