Bitmap Size problem

Started by
10 comments, last by Evil Steve 15 years, 3 months ago
How does a program like Photoshop open MASSIVE bitmaps? I can open a bitmap using something like:

hbm = (HBITMAP)LoadImage(NULL, palBitmapName, IMAGE_BITMAP, 0, 0, LR_LOADFROMFILE);

If the bitmap is beyond 50 megabytes, or bigger than 4000 x 4000, either the LoadImage() call fails or the subsequent GetObject(hbm, sizeof(bm), &bm) fails. Is it a lack of memory? If anyone has information/links regarding this question, feel free to respond.

Nokame
Quote:Original post by Nokame
if the bitmap is beyond 50 megabytes, or bigger than 4000 x 4000... either the LoadImage() function fails,
Functions tend to indicate why they failed, precisely to give more details than "It failed!" for diagnosis. In the case of LoadImage, the documentation mentions GetLastError.


They'll probably use their own loader.

There are also ways to avoid having to load the complete image. Probably something with memory-mapped files, where you only access parts of the full image at once.


It's possible you're hitting memory fragmentation issues when this fails. One solution is to use a better allocator that's less prone to fragmentation. I agree that memory-mapped files are probably how they get around it.
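To illustrate the fragmentation point: a 60 MB image needs 60 MB of contiguous address space if stored as one block, which a fragmented heap may not have even when plenty of total memory is free. Storing the image as many small allocations, one per row here, sidesteps that requirement. A sketch (my own layout, not how any real editor does it):

```cpp
#include <cstddef>
#include <vector>

// Store a 32bpp image as independently allocated rows. No single
// allocation exceeds width * 4 bytes, so address-space fragmentation
// matters far less than with one monolithic pixel buffer.
class ChunkedImage {
public:
    ChunkedImage(std::size_t width, std::size_t height)
        : width_(width),
          rows_(height, std::vector<unsigned char>(width * 4)) {}

    // Pointer to the 4 bytes (e.g. BGRA) of pixel (x, y).
    unsigned char* pixel(std::size_t x, std::size_t y) {
        return &rows_[y][x * 4];
    }

    std::size_t totalBytes() const { return width_ * 4 * rows_.size(); }

private:
    std::size_t width_;
    std::vector<std::vector<unsigned char>> rows_;
};
```

The trade-off is an extra indirection per row and the loss of a single flat buffer to hand to APIs that expect one.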
Photoshop fails with massive images in my experience - I get an "Out of memory" error opening a 600MB TGA file.

I suspect they load the entire image file into memory, but use a DIB section to map the visible region to a Windows HBITMAP, which is then displayed in the window.

Out of interest, does anyone know why Windows fails so miserably with large images? 50MB is pitiful for modern OSs, and it's not as if that 50MB is allocated from kernel memory or anything else silly.
Thanks for all the responses, especially the GetLastError() response. It turns out that the problem is memory. I suppose I'll look into the memory-mapped file idea.

Thanks guys. Merry Xmas, Kwanzaa, and/or Hanukkah.
Quote:Original post by Evil Steve
Out of interest, does anyone know why Windows fails so miserably with large images? 50MB is pitiful for modern OSs, and it's not as if that 50MB is allocated from kernel memory or anything else silly.


Memory fragmentation, in my experience. At a previous company, we ran into issues where we couldn't display images of a certain size. I replaced the default allocator with jemalloc and the problems went away.

I believe the default allocator on Linux is much less prone to fragmentation.
Quote:Original post by Evil Steve
Photoshop fails with massive images in my experience - I get an "Out of memory" error opening a 600MB TGA file.


Actually, I have proof that this is not true. However, it depends on which function you use to load.

I recently changed some loading code in our application from the IPicture interface (which is the same OLE interface that IE supposedly uses to load images) to GDI+, and after this change I am capable of successfully loading images that are 170MB on disk.

Said bitmap is 13000x13000 pixels, is a standard bitmap, and works fine to load, convert into thumbnails, etc.

The problem might lie with the older kernel functions not being able to cope with gigantic images due to poorly written code; I'm not sure. I would also assume that Photoshop uses its own loader routines instead of the built-in OS ones.

Nokame: If you want to load large(r) images, why not give GDI+ a go?

Toolmaker
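For the thumbnail side of this, the usual approach is a simple fit-within-bounds scale. A sketch of that math (my own helper, not a GDI+ API):

```cpp
#include <algorithm>
#include <utility>

// Scale (srcW, srcH) to fit inside (maxW, maxH) while preserving
// the aspect ratio; never upscale an image that already fits.
std::pair<int, int> thumbnailSize(int srcW, int srcH, int maxW, int maxH) {
    double scale = std::min(double(maxW) / srcW, double(maxH) / srcH);
    if (scale >= 1.0) return { srcW, srcH };
    return { std::max(1, int(srcW * scale)),
             std::max(1, int(srcH * scale)) };
}
```

Once the target size is known, GDI+ can do the actual resampling when drawing the source bitmap into a smaller destination.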

Quote:Original post by Toolmaker
Said Bitmap is 13000x13000 pixels big, is a standard Bitmap and works fine to load, convert into thumbnails, etc.
Have you tried accessing a 13000x13000 bitmap as an HBITMAP, though? I'd be surprised if that didn't fail.
Quote:Original post by Evil Steve
Have you tried accessing a 13000x13000 bitmap as an HBITMAP, though? I'd be surprised if that didn't fail.

No, I haven't. I only loaded it into a GDI+ Bitmap object and used LockBits()/UnlockBits() to obtain access to the raw pixel data.

But in essence you can get away without converting it into an HBITMAP, as rendering could be done through GDI+'s Graphics object.

Toolmaker
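For reference, LockBits-style access boils down to a base pointer plus a stride (bytes per scanline, possibly padded). A portable sketch of what reading a 32bpp pixel looks like once you have those two values (illustrative only, not actual GDI+ calls):

```cpp
#include <cstdint>

// Given the base pointer and stride a LockBits-style call hands back,
// read one 32bpp pixel. The stride may exceed width * 4 due to
// scanline padding, so never compute offsets from the width alone.
std::uint32_t readPixel(const std::uint8_t* base, int stride,
                        int x, int y) {
    const std::uint8_t* p = base + y * stride + x * 4;
    return p[0] | (p[1] << 8) | (p[2] << 16)
         | (std::uint32_t(p[3]) << 24);
}
```

The same base-plus-stride arithmetic works for writes, which is why LockBits lets you process a huge bitmap without ever converting it to an HBITMAP.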

