SDL OpenGL screenshot problem

Started by karx11erx. 6 comments, last by karx11erx 19 years, 1 month ago
Hi, I am having a weird problem with the glReadPixels() function, probably in conjunction with SDL. I am trying to take a screenshot in an old 3D game that was ported to SDL+OpenGL by other people. For some reason I get total junk images, whether I call glReadPixels() right before or right after swapping the render buffers. If, however, I set a breakpoint before the glReadPixels() call and step over it in the debugger, I get a nice screenshot of my desktop. Here is how I call glReadPixels():
Quote:
GLint glDrawBuf;
GLubyte *buf = (GLuint *) malloc (w * h * 3); // w,h == width,height
glGetIntegerv (GL_DRAW_BUFFER, &glDrawBuf);
glReadBuffer (glDrawBuf);
glReadPixels (0, 0, w, h, GL_RGB, GL_UNSIGNED_BYTE, buf);
// save the screenshot as a TGA RGB file
free (buf);
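For what it's worth, stripped down to a standalone test the capture path boils down to something like the sketch below. The explicit GL_PACK_ALIGNMENT and GL_BACK settings are just how I would write an isolated version, not necessarily what the d2x code actually sets:

Quote:
/* Minimal standalone capture sketch -- not the actual d2x code.
   Assumes an SDL double-buffered GL context is current and that
   w and h hold the window size. */
#include <stdlib.h>
#include <GL/gl.h>

GLubyte *grab_frame (int w, int h)
{
    GLubyte *buf = (GLubyte *) malloc (w * h * 3);
    if (!buf)
        return NULL;
    glPixelStorei (GL_PACK_ALIGNMENT, 1); /* pack rows tightly so w*h*3 bytes is enough */
    glReadBuffer (GL_BACK);               /* grab the just-rendered frame, before SDL_GL_SwapBuffers() */
    glReadPixels (0, 0, w, h, GL_RGB, GL_UNSIGNED_BYTE, buf);
    return buf;                           /* caller writes the TGA file and frees the buffer */
}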
Any idea what's going wrong? [Edited by - karx11erx on March 8, 2005 3:01:52 AM]
_________
karx11erx
Visit my Descent site or see my case mod.
You want GLubyte *buf not GLuint *.
The more applications I write, the more I find out how little I know.
You're right, but I am using GLubyte. That was just a mistake from posting it here. Apart from that it doesn't really matter - the buffer just needs to be big enough.
_________
karx11erx
Visit my Descent site or see my case mod.
Here are two pictures of how it should look (taken with FRAPS), and how it does look (taken in-game with glReadPixels()):

http://www.brockart.de/descent/images/d2xgood.jpg

http://www.brockart.de/descent/images/d2xbogus.jpg

Actually, the in-game screenshot is very distorted: at the top, some of the textures from the game are still somewhat recognizable.

I am totally clueless what's wrong here.
_________
karx11erx
Visit my Descent site or see my case mod.
If all your GL code is fine, then maybe it's the write-to-file part.
http://sourceforge.net/projects/pingux/ <-- you know you wanna see my 2D Engine which supports DirectX and OpenGL or insert your renderer here :)
Hey, is that some version of Descent?

Anyway, your bogus pic doesn't even look like a distorted version of the good pic, which gives me the impression that you're grabbing that image from a bad memory location - like maybe the buffer you're reading from is filled with garbage. My suggestion: dump the buffer into another offscreen buffer and save the data from that buffer. If that doesn't work, then I don't know what to tell you.
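Something along these lines is what I mean - completely untested, just the idea of copying the pixels into memory you own before writing them out. The names here (save_tga, shot.tga) are placeholders, not your actual code:

Quote:
/* Sketch of the suggestion: read into a scratch buffer, copy it,
   and save from the copy. Needs <string.h> for memcpy; w, h and
   save_tga() are placeholders. */
GLubyte *scratch = (GLubyte *) malloc (w * h * 3);
GLubyte *copy = (GLubyte *) malloc (w * h * 3);
glReadPixels (0, 0, w, h, GL_RGB, GL_UNSIGNED_BYTE, scratch);
memcpy (copy, scratch, w * h * 3); /* write out the copy, not the buffer GL touched */
save_tga ("shot.tga", copy, w, h); /* hypothetical writer function */
free (copy);
free (scratch);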
[edit]Oh haha you are...[/edit]
@PinguinDude:
Quote:
#define BYTES_PER_PIXEL 3

typedef struct TGAImageHeader {
    GLubyte id;              // number of bytes in the image ID field (comes after imageDescription, before the pixel data)
    GLubyte colormap;
    GLubyte imageType;       // 2 == truecolor uncompressed, 3 == b+w uncompressed (there's no implementational difference between the two)
    GLubyte colormapSpec[5];
    GLubyte xOrigin[2];
    GLubyte yOrigin[2];
    GLubyte width[2];        // little-endian 16-bit value
    GLubyte height[2];       // little-endian 16-bit value
    GLubyte bitDepth;
    GLubyte imageDescription;
} TGAImageHeader;

TGAImageHeader tgaInfo;
GLubyte *s;
int i;
GLubyte temp;

tgaInfo.id = 0;
tgaInfo.colormap = 0;
tgaInfo.imageType = 2; // truecolor image
memset (tgaInfo.colormapSpec, 0, sizeof (GLubyte) * 5);
tgaInfo.xOrigin[0] = 0;
tgaInfo.xOrigin[1] = 0;
tgaInfo.yOrigin[0] = 0;
tgaInfo.yOrigin[1] = 0;
tgaInfo.width[0] = w % 256;
tgaInfo.width[1] = w / 256;
tgaInfo.height[0] = h % 256;
tgaInfo.height[1] = h / 256;
tgaInfo.bitDepth = 8 * BYTES_PER_PIXEL;
tgaInfo.imageDescription = 0;
write (f, &tgaInfo, sizeof (tgaInfo));
// flip RGB to BGR, since TGA stores pixels in BGR order
for (s = buf, i = w * h; i; i--, s += BYTES_PER_PIXEL) {
    temp = s[0];
    s[0] = s[2];
    s[2] = temp;
}
if (write (f, buf, w * h * BYTES_PER_PIXEL) != w * h * BYTES_PER_PIXEL) {
#if TRACE
    con_printf (CON_DEBUG, "screenshot error, couldn't write to %s (err %i)\n", savename, errno);
#endif
}
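For context, the snippet above is driven roughly like this on my end. This is simplified - the real code builds savename and handles errors - so treat it as a sketch rather than the actual d2x source:

Quote:
/* Hypothetical caller: grab the frame and hand it to the TGA code above.
   Assumes w, h and savename are already set up, and that <fcntl.h> and
   <unistd.h> are included for open()/write()/close(). */
int f = open (savename, O_WRONLY | O_CREAT | O_TRUNC, 0644);
GLubyte *buf = (GLubyte *) malloc (w * h * BYTES_PER_PIXEL);
glReadPixels (0, 0, w, h, GL_RGB, GL_UNSIGNED_BYTE, buf);
/* ...header setup and the write() calls quoted above... */
close (f);
free (buf);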


@Arenth:

If you look very closely, you can see some distorted parts of the upper area of the good screenshot in it.

Yes, this is d2x, an OpenGL version of Descent.

[Edited by - karx11erx on March 8, 2005 12:12:14 PM]
_________
karx11erx
Visit my Descent site or see my case mod.
Obviously the problem is hardware/driver related. I tried the screenshot function on older NVidia hardware today, and everything worked fine. On my X800XT PE w/ Catalyst 5.2 it doesn't. ATI and OpenGL. ):<

Update: Doesn't work on a GF FX 5950 either.

[Edited by - karx11erx on March 10, 2005 9:32:22 AM]
_________
karx11erx
Visit my Descent site or see my case mod.
