karx11erx

OpenGL SDL OpenGL screenshot problem


Hi, I am having a weird problem with the glReadPixels() function, probably in conjunction with SDL. I am trying to take a screenshot in an old 3D game that was ported to SDL+OpenGL by other people. For some reason, I get total junk images, even when calling glReadPixels() right before or after swapping the render buffers. If, however, I set a breakpoint before the glReadPixels() call and step over it in the debugger, I get a nice screenshot of my desktop. Here is how I call glReadPixels():
Quote:
GLint glDrawBuf;
GLubyte *buf = (GLuint *) malloc(w * h * 3); // w,h == width,height
glGetIntegerv(GL_DRAW_BUFFER, &glDrawBuf);
glReadBuffer(glDrawBuf);
glReadPixels(0, 0, w, h, GL_RGB, GL_UNSIGNED_BYTE, buf);
// save the screenshot as a TGA RGB file
free(buf);
Any idea what's going wrong? [Edited by - karx11erx on March 8, 2005 3:01:52 AM]

You're right, but I am using GLubyte. That was just a mistake from posting it here. Apart from that, it doesn't really matter - the buffer just needs to be big enough.

Here are two pictures of how it should look (taken with FRAPS), and how it does look (taken in-game with glReadPixels()):

http://www.brockart.de/descent/images/d2xgood.jpg

http://www.brockart.de/descent/images/d2xbogus.jpg

Actually, the in-game screenshot is very distorted: at the top, some of the textures from the game are still somehow recognizable.

I am totally clueless what's wrong here.

Hey, is that some version of Descent?

Anyway, your bogus pic doesn't even contain a distorted version of the good pic, which gives me the impression that you're grabbing that image from a bad memory location - like maybe the buffer you're reading from is filled with garbage. My suggestion: dump the buffer into another offscreen buffer and save the data from that one. If that doesn't work, then I don't know what to tell you.
[edit]Oh haha, you are...[/edit]

@PinguinDude:
Quote:
#define BYTES_PER_PIXEL 3

typedef struct TGAImageHeader {
    // imageType 2 == truecolour uncompressed,
    // 3 == b+w uncompressed (there's no implementational difference between the two)
    GLubyte id; // the number of bytes in the image ID (comes after imageDescription, before the actual image data)
    GLubyte colormap;
    GLubyte imageType;
    GLubyte colormapSpec[5];
    GLubyte xOrigin[2];
    GLubyte yOrigin[2];
    GLubyte width[2];
    GLubyte height[2];
    GLubyte bitDepth;
    GLubyte imageDescription;
} TGAImageHeader;

TGAImageHeader tgaInfo;
GLubyte *s;
int i;
GLubyte temp;

tgaInfo.id = 0;
tgaInfo.colormap = 0;
tgaInfo.imageType = 2; // true colour image
memset(tgaInfo.colormapSpec, 0, sizeof(GLubyte) * 5);
tgaInfo.xOrigin[0] = 0;
tgaInfo.xOrigin[1] = 0;
tgaInfo.yOrigin[0] = 0;
tgaInfo.yOrigin[1] = 0;
tgaInfo.width[0] = w % 256; // 16-bit little-endian width
tgaInfo.width[1] = w / 256;
tgaInfo.height[0] = h % 256; // 16-bit little-endian height
tgaInfo.height[1] = h / 256;
tgaInfo.bitDepth = 8 * BYTES_PER_PIXEL;
tgaInfo.imageDescription = 0;
write(f, &tgaInfo, sizeof(tgaInfo));
for (s = buf, i = w * h; i; i--, s += BYTES_PER_PIXEL) { // flip RGB to BGR
    temp = s[0];
    s[0] = s[2];
    s[2] = temp;
}
if (write(f, buf, w * h * BYTES_PER_PIXEL) != w * h * BYTES_PER_PIXEL) {
#if TRACE
    con_printf(CON_DEBUG, "screenshot error, couldn't write to %s (err %i)\n", savename, errno);
#endif
}


@Arenth:

If you look very closely, you will see some distorted parts of the upper area of the good screen shot.

Yes, this is d2x, an OpenGL version of Descent.

[Edited by - karx11erx on March 8, 2005 12:12:14 PM]

Obviously the problem is hardware/driver related. I tried the screenshot function on older NVidia hardware today, and everything worked fine. On my X800XT PE w/ Catalyst 5.2 it doesn't. ATI and OpenGL. ):<

Update: Doesn't work on a GF FX 5950 either.

[Edited by - karx11erx on March 10, 2005 9:32:22 AM]
