glReadPixels returning garbage values.

21 comments, last by fms 18 years, 8 months ago
Hi,

Here is the latest snapshot of the code. Also, I am developing on Windows XP SP2, using an ATI Mobility Radeon 9600 (64MB RAM) as the graphics card.

Source Code:
#include <stdio.h>
#include <C:\Program Files\NVIDIA Corporation\Cg\include\GL\glut.h>

#define XSCREEN 4 // width
#define YSCREEN 4 // height

GLubyte buffer[YSCREEN * XSCREEN * 4];

void GLUTInit (int*, char**);
void Init (void);
void Display (void);
void Draw (void);
void ReadBack (void);
void Error (void);

int main (int argc, char** argv)
{
	GLUTInit (&argc, argv);
	Init ();
	glutDisplayFunc (Display);
	Draw ();
	ReadBack ();
	glutMainLoop ();
	return 0;
} // main

void GLUTInit (int* argc, char** argv)
{
	glutInit (argc, argv);
	glutInitWindowSize (XSCREEN, YSCREEN);
	glutInitDisplayMode (GLUT_RGBA);
	glutCreateWindow ("glReadPixels");
} // GLUTInit

void Init (void)
{
	glClearColor (0.0, 0.0, 0.0, 1.0);
	glClearDepth (1.0f);
} // Init

void Display (void)
{
	glClear (GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	Draw ();
} // Display

void Draw (void)
{
	glViewport (0, 0, XSCREEN, YSCREEN);
	glMatrixMode (GL_PROJECTION);
	glLoadIdentity ();
	glOrtho (0, XSCREEN, 0, YSCREEN, 0, 1);
	glMatrixMode (GL_MODELVIEW);
	glLoadIdentity ();
	glBegin (GL_QUADS);
		glColor4ub (127, 127, 127, 1);
		glVertex2f (0, 0);
		glVertex2f (XSCREEN, 0);
		glVertex2f (XSCREEN, YSCREEN);
		glVertex2f (0, YSCREEN);
	glEnd ();
	glFinish ();
} // Draw

void ReadBack (void)
{
	Error ();
	glPixelStorei (GL_PACK_ALIGNMENT, 1);
	glReadPixels (0, 0, XSCREEN, YSCREEN, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
	Error ();
} // ReadBack

void Error (void)
{
	GLenum errCode;
	const GLubyte* errString;
	if ((errCode = glGetError ()) != GL_NO_ERROR)
	{
		errString = gluErrorString (errCode);
		printf ("GL Error : %s\n", errString);
	}
} // Error

Please let me know either way; I really appreciate your help.

Thanks.
- FMS

Edited to add source tags, please use them in future for large code blocks. See the Forum FAQ for details of the tags in use on this forum

[Edited by - phantom on August 15, 2005 8:26:12 AM]
You should not expect the code you have posted to give you the correct result, because, as I have said, you're reading from a hidden window. It returns garbage for me too. But when I comment out the call to Draw and ReadBack before glutMainLoop, and place the ReadBack call in Display instead, everything works as expected, since the window is now visible.

For reference, I have an ATI Radeon X300.
Quote:Original post by Brother Bob
You should not expect the code you have posted to give you the correct result, because, as I have said, you're reading from a hidden window. It returns garbage for me too. But when I comment out the call to Draw and ReadBack before glutMainLoop, and place the ReadBack call in Display instead, everything works as expected, since the window is now visible.

For reference, I have an ATI Radeon X300.


Hi Brother Bob,

Can you please post your modified source code along with the results you are getting? That way I can verify what is going on with my code. Also, can you elaborate on what you mean by "everything works as expected"?

Thanks.
-FMS

Well, seeing as Brother Bob is probably asleep now, and I'm awake again, I thought I might try to help... :) (assuming we both sleep during somewhat normal hours)

As for source:
#include <stdio.h>
#include <C:\Program Files\NVIDIA Corporation\Cg\include\GL\glut.h>

#define XSCREEN 4 // width
#define YSCREEN 4 // height

GLubyte buffer[YSCREEN * XSCREEN * 4];

void GLUTInit (int*, char**);
void Init (void);
void Display (void);
void Draw (void);
void ReadBack (void);
void Error (void);

int main (int argc, char** argv)
{
	GLUTInit (&argc, argv);
	Init ();
	glutDisplayFunc (Display);
	// I took out Draw() and ReadBack() here
	glutMainLoop (); // start the main loop (will bring up the window)
	return 0;
} // main

void GLUTInit (int* argc, char** argv)
{
	glutInit (argc, argv);
	glutInitWindowSize (XSCREEN, YSCREEN);
	glutInitDisplayMode (GLUT_RGBA);
	glutCreateWindow ("glReadPixels");
} // GLUTInit

void Init (void)
{
	glClearColor (0.0, 0.0, 0.0, 1.0);
	glClearDepth (1.0f);
} // Init

void Display (void)
{
	glClear (GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	Draw ();
	ReadBack (); // added ReadBack here after Draw(); it will read the data every frame
} // Display

void Draw (void)
{
	glViewport (0, 0, XSCREEN, YSCREEN);
	glMatrixMode (GL_PROJECTION);
	glLoadIdentity ();
	glOrtho (0, XSCREEN, 0, YSCREEN, 0, 1);
	glMatrixMode (GL_MODELVIEW);
	glLoadIdentity ();
	glBegin (GL_QUADS);
		glColor4ub (127, 127, 127, 1);
		glVertex2f (0, 0);
		glVertex2f (XSCREEN, 0);
		glVertex2f (XSCREEN, YSCREEN);
		glVertex2f (0, YSCREEN);
	glEnd ();
	glFinish ();
} // Draw

void ReadBack (void)
{
	Error ();
	glPixelStorei (GL_PACK_ALIGNMENT, 1);
	glReadPixels (0, 0, XSCREEN, YSCREEN, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
	Error ();
} // ReadBack

void Error (void)
{
	GLenum errCode;
	const GLubyte* errString;
	if ((errCode = glGetError ()) != GL_NO_ERROR)
	{
		errString = gluErrorString (errCode);
		printf ("GL Error : %s\n", errString);
	}
} // Error


I put comments in the source where I changed the code: in the main() function and in the Display() function.

Hope I could help
~zix~
---------------------------------------------------Game Programming Resources, Tutorials, and Multimedia | Free Skyboxes
Hi,

I have tried that approach, but it returns all zeros. Did you verify the results? Please let me know if this approach works on your machine.

Thanks.
- FMS
Works fine for me.

Returns all 127's, except for alpha of course. Maybe my code is slightly different from the code you tried?

~zix~
Hi Zix,

I am posting my code, and it is exactly the same code as you posted. What development environment are you using? I am using MS Visual Studio 2005 on Windows XP. Let me explain how I am getting zeros. When I run the code and set a breakpoint on the line "glutMainLoop ()", before this line is executed all the values in the buffer are zeros, as expected. But after this line executes, the program goes into a perpetual loop, and when I watch the buffer values, they still show zeros. I hope this gives you an indication of my environment and the way I am perceiving the results.

Please let me know about your development environment and tools and how you are obtaining the results.

Source Code:
#include <stdio.h>
#include <C:\Program Files\NVIDIA Corporation\Cg\include\GL\glut.h>

#define XSCREEN 4 // width
#define YSCREEN 4 // height

GLubyte buffer[YSCREEN * XSCREEN * 4];

void GLUTInit (int*, char**);
void Init (void);
void Display (void);
void Draw (void);
void ReadBack (void);
void Error (void);

int main (int argc, char** argv)
{
	GLUTInit (&argc, argv);
	Init ();
	glutDisplayFunc (Display);
	//Draw ();
	//ReadBack ();
	glutMainLoop ();
	return 0;
} // main

void GLUTInit (int* argc, char** argv)
{
	glutInit (argc, argv);
	glutInitWindowSize (XSCREEN, YSCREEN);
	glutInitDisplayMode (GLUT_RGBA);
	glutCreateWindow ("glReadPixels");
} // GLUTInit

void Init (void)
{
	glClearColor (0.0, 0.0, 0.0, 1.0);
	glClearDepth (1.0f);
} // Init

void Display (void)
{
	glClear (GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
	Draw ();
	ReadBack ();
} // Display

void Draw (void)
{
	glViewport (0, 0, XSCREEN, YSCREEN);
	glMatrixMode (GL_PROJECTION);
	glLoadIdentity ();
	glOrtho (0, XSCREEN, 0, YSCREEN, 0, 1);
	glMatrixMode (GL_MODELVIEW);
	glLoadIdentity ();
	glBegin (GL_QUADS);
		glColor4ub (127, 127, 127, 1);
		glVertex2f (0, 0);
		glVertex2f (XSCREEN, 0);
		glVertex2f (XSCREEN, YSCREEN);
		glVertex2f (0, YSCREEN);
	glEnd ();
	glFinish ();
} // Draw

void ReadBack (void)
{
	Error ();
	glPixelStorei (GL_PACK_ALIGNMENT, 1);
	glReadPixels (0, 0, XSCREEN, YSCREEN, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
	Error ();
} // ReadBack

void Error (void)
{
	GLenum errCode;
	const GLubyte* errString;
	if ((errCode = glGetError ()) != GL_NO_ERROR)
	{
		errString = gluErrorString (errCode);
		printf ("GL Error : %s\n", errString);
	}
} // Error

Thanks.
- FMS

Added source tags.

[Edited by - phantom on August 15, 2005 9:16:04 PM]
Quote:Original post by zix99
Well, seeing as brother bob is probably asleep now, and I'm awake again, I thought I might try to help... :) (All considering we both sleep in somewhat normal hours)

I don't know what you consider to be "normal hours", but I joined this thread in the morning before I went to work (and continued it there). So quite the opposite: it's been full daytime here. [rolleyes]

Quote:Original post by fms
I am posting my code, and it is exactly the same code as you posted. What development environment are you using? I am using MS Visual Studio 2005 on Windows XP. Let me explain how I am getting zeros. When I run the code and set a breakpoint on the line "glutMainLoop ()", before this line is executed all the values in the buffer are zeros, as expected. But after this line executes, the program goes into a perpetual loop, and when I watch the buffer values, they still show zeros. I hope this gives you an indication of my environment and the way I am perceiving the results.

Please let me know about your development environment and tools and how you are obtaining the results.

That's the exact code I use, and it works and gives me the expected results. However, your description of how you used the debugger made me wonder.

I agree on the result you get when you run to the breakpoint at glutMainLoop, but then I'm confused. Are you trying to step over that function call to see the values afterwards? Remember, glutMainLoop never returns, so the debugger never stops on the next line, because execution won't get there. And you won't see the values being updated in real time either, because watched values are only refreshed while execution is stopped in the debugger. What you need to do is place a breakpoint after the call to ReadBack and watch the result there.

And by the way, the choice of compiler, development environment and such is irrelevant, as this is about getting information back from OpenGL, which is completely unrelated to those things as far as what we're discussing goes. But since you asked, I use Visual C++ 2005 Express Beta 2.
Hi,

I think there is a problem with Windows XP, Visual C++ 2005 Express Beta 2, or the ATI Mobility Radeon 9600 driver. I made some minor modifications so that I write the buffer to a file after the glReadPixels() call, and when I run the exact same code on my other machine, which runs Red Hat 9.0 with gcc and an NVIDIA 6600GT graphics card, the results are as I expected regardless of the window size.

It works fine on the other machine; the problem only appears when running on Windows XP with Visual C++ 2005 Express Beta 2 on the ATI Mobility Radeon 9600.

I just wanted to ask whether this makes any sense to you.

And I completely agree with Bob's feedback regarding OpenGL's independence from the development environment or the choice of compiler. But the peculiar behavior above suggests otherwise, or else this is an issue with the graphics card driver.

Thanks.
- FMS
Then it's likely a driver issue, just as you mentioned. Given the behaviour described, I think that is the most likely explanation; it's absurd to blame the development environment in this case. If you can, update the drivers for your graphics card and see if it works.

But on the other hand, is there a reason you have to create such a small window? Wouldn't a larger window, large enough not to give you garbage results, work too, if you then only use a small part of it?

This topic is closed to new replies.
