Win32 - capturing screenshots and odd resolutions

9 comments, last by SymLinked 11 years, 10 months ago
Hi!

I've implemented a simple screenshot capture method where I do the following (rough sketch after the list):

  1. Get the screen DC by calling GetDC (NULL);
  2. Use CreateCompatibleDC () to create a device context.
  3. Grab width/height of the screen by calling GetDeviceCaps ().
  4. Calling CreateCompatibleBitmap (screenDC, width, height);
  5. Calling SelectObject (deviceContext, bitmap);
  6. Finally, calling BitBlt (deviceContext, 0, 0, width, height, screenDC, 0, 0, SRCCOPY | CAPTUREBLT); and then GetDIBits () to read back the pixel data.
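
For reference, a minimal sketch of those steps (names like CaptureScreen are illustrative, not the actual code; error handling omitted, and the bitmap is deselected before GetDIBits, which the documentation requires):

    #include <windows.h>
    #include <vector>

    // Sketch of the capture steps above; error checks omitted for brevity.
    std::vector<BYTE> CaptureScreen (int& width, int& height)
    {
        HDC screenDC = GetDC (NULL);
        HDC memDC    = CreateCompatibleDC (screenDC);

        width  = GetDeviceCaps (screenDC, HORZRES);
        height = GetDeviceCaps (screenDC, VERTRES);

        HBITMAP bitmap = CreateCompatibleBitmap (screenDC, width, height);
        HGDIOBJ old    = SelectObject (memDC, bitmap);

        BitBlt (memDC, 0, 0, width, height, screenDC, 0, 0, SRCCOPY | CAPTUREBLT);
        SelectObject (memDC, old); // deselect before calling GetDIBits

        BITMAPINFO bmi = {};
        bmi.bmiHeader.biSize        = sizeof (BITMAPINFOHEADER);
        bmi.bmiHeader.biWidth       = width;
        bmi.bmiHeader.biHeight      = -height; // negative = top-down rows
        bmi.bmiHeader.biPlanes      = 1;
        bmi.bmiHeader.biBitCount    = 32;
        bmi.bmiHeader.biCompression = BI_RGB;

        std::vector<BYTE> pixels (width * height * 4);
        GetDIBits (memDC, bitmap, 0, height, pixels.data (), &bmi, DIB_RGB_COLORS);

        DeleteObject (bitmap);
        DeleteDC (memDC);
        ReleaseDC (NULL, screenDC);
        return pixels;
    }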

This works fine except at odd resolutions like 1366x768, where the output shows up as white/black lines. Anyone got any ideas what's causing this? Is the data output from GetDIBits() padded in this case or something? It looks as if there's an extra byte somewhere that distorts the image, as though I were reading the wrong pixel size, but the bitmap really is 32-bit.

Any advice is greatly appreciated!
Read up a little about the bitmap format.

The format has some details that you may have missed.

The data probably IS NOT CONTIGUOUS.

Every pixel entry can potentially have some unused space in it; as you noticed, only 24 bits carry color even though 32 bits are stored. Every scan line may also have some unused space at the end, e.g. a row may occupy space for 1024 pixels even though the image is only 1022 pixels wide.

You need to account for both.
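
For example (a sketch, not the poster's code - the buffer names and the function are illustrative): compute the padded row size and copy row by row rather than with one big memcpy:

    #include <windows.h>
    #include <cstring>

    // Sketch: copy a DIB's pixel data into a tightly packed buffer,
    // accounting for the per-row padding GetDIBits() produces.
    void CopyDibRows (const BYTE* srcBits, BYTE* dstPixels,
                      int width, int height, int bitsPerPixel)
    {
        // Each scan line is padded up to a DWORD (4-byte) boundary.
        int stride      = ((width * bitsPerPixel + 31) / 32) * 4;
        int bytesPerRow = width * (bitsPerPixel / 8); // bytes that carry pixels

        for (int y = 0; y < height; ++y)
            memcpy (dstPixels + y * bytesPerRow, srcBits + y * stride, bytesPerRow);
    }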
That sounds logical, but how exactly would you detect when this happens? It's definitely not a different bit depth - I already account for that. It's something else.
I've Googled and Googled but have come up with nothing.

I retrieve the width, height, numBits and format, but they look exactly the same as in all other resolutions.
How are you processing the lpvBits argument from the GetDIBits function?
Just a memory copy, which works at every resolution I've tested except 1366x768. Even the bitmap header says the size of the lpvBits data is what I expected (i.e., 4 * width * height), so I'm not sure how I could read it any differently.
Yes, the symptoms are consistent with padding that hasn't been accounted for. In addition to per-pixel padding, the BMP format uses per-row padding. You can calculate the number of bytes used per row of pixels like so:

rowSize = DWordAlign( bpp/8 * width );
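
(DWordAlign isn't defined in the post; a typical implementation just rounds a byte count up to the next multiple of 4:)

    // Round a byte count up to the next DWORD (4-byte) boundary.
    inline unsigned DWordAlign (unsigned bytes)
    {
        return (bytes + 3) & ~3u;
    }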

You may find these useful: Mirror: GDI-based mirror window (src code included), GDI Memory Bitmap
Thanks Amr0!

But DWordAlign returns its argument unchanged for (32 / 8 * 1366), so it has no effect.
GDI Memory Bitmap is a very useful resource, though - I'll keep poking and see what's different from your implementation!

Thanks again.
If you have 4 bytes per pixel and a width of 1366, every row is already aligned to 4 bytes (1366 * 4 = 5464, a multiple of 4). It would have been different at 3 bytes per pixel (1366 * 3 = 4098, padded up to 4100), but that's not the case here.
I'm still looking for suggestions. (Edit: fixed - see my last response for the solution.)

[Attached screenshot: 1366x768.jpg]
Code, please. Without the actual details of how you obtained this result, we can't do any better than vaguely theorize about why you aren't getting what you want. The shearing clearly indicates you are getting out of sync with the pixels on a per-row basis, and the color corruption is just weird.


The image shearing looks like a textbook example of not accounting for the row length. Each row picks up a few pixels from the next row, so each successive row is shifted a few pixels further.
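
A toy demonstration of that effect (illustrative only, not from the thread): store a 6-pixel-wide image with an 8-byte row stride, then read it back assuming tightly packed rows, and the padding bleeds in while every following row drifts sideways:

    #include <cstdio>

    // Toy demo of row-stride shear: a 6-pixel-wide image stored with
    // 8 bytes per row (1 byte per pixel, 2 bytes of padding per row).
    int main ()
    {
        const int width = 6, height = 4, stride = 8;
        unsigned char bits[height * stride];
        for (int y = 0; y < height; ++y)
            for (int x = 0; x < stride; ++x)
                bits[y * stride + x] = (x < width) ? '0' + y : '.';

        // Reading with the wrong row length pulls padding into the image
        // and shifts each following row - the diagonal shear in the screenshot.
        for (int y = 0; y < height; ++y)
        {
            for (int x = 0; x < width; ++x)
                putchar (bits[y * width + x]); // wrong: ignores stride
            putchar ('\n');
        }
    }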

