Memory Optimization

2 comments, last by RDragon1 15 years, 7 months ago
I am developing an image decoder for an embedded processor whose main memory is 32 MB. Apart from the image decoder, another application is running that needs a large amount of memory, so the memory left over is too small to decode large images (> 500 KB). Somebody suggested using the heap for small allocations and a static buffer for the large allocation inside the decoder code. For one thing, that is not easy to implement. Is this the best way to optimize memory usage? Do you have any other suggestions?
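For reference, here is a minimal sketch of the suggested hybrid scheme, assuming a single fixed-size decode buffer reserved at compile time (the buffer size and function names below are hypothetical, not from any particular decoder): small bookkeeping structures still come from the heap, while the one big allocation is served from a static array so it cannot fail at runtime.

```c
#include <stddef.h>
#include <stdbool.h>

/* Hypothetical size -- tune to the actual decoder and platform budget. */
#define DECODE_BUF_SIZE (2u * 1024u * 1024u)  /* 2 MB reserved at link time */

static unsigned char g_decode_buf[DECODE_BUF_SIZE];
static bool g_decode_buf_in_use = false;

/* Hand out the static buffer for the single large allocation;
 * return NULL if it is already taken or the request is too big. */
void *large_alloc(size_t size)
{
    if (g_decode_buf_in_use || size > DECODE_BUF_SIZE)
        return NULL;
    g_decode_buf_in_use = true;
    return g_decode_buf;
}

void large_free(void *p)
{
    if (p == g_decode_buf)
        g_decode_buf_in_use = false;
}
```

The upside is that the biggest allocation is guaranteed to succeed and never fragments the heap; the downside is that the reserved memory is unavailable to anything else even when the decoder is idle.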
Just sounds to me like you can't load the whole image in a single go, so use smaller chunks.
Quote:Original post by Dave
Just sounds to me like you can't load the whole image in a single go, so use smaller chunks.



I am doing the same, but for larger images it is still the same issue: out of memory.

No, my question is: how can we optimize memory usage?
What does "optimize the memory" mean?

Why can't a 500k allocation fit into 32MB of space?

If you can't fit the whole image into memory, process just a part of it at a time. This solves your problem.
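To illustrate, here is a sketch of row-at-a-time (strip-based) processing, assuming hypothetical decode_next_row() and consume_row() callbacks rather than any real decoder API: only one row buffer ever lives in memory instead of the whole decoded image.

```c
#include <stdlib.h>
#include <stdbool.h>

/* Hypothetical decoder interface: fills one row of pixels per call,
 * returns false when the image is exhausted. */
bool decode_next_row(unsigned char *row, size_t row_bytes);

/* Hypothetical consumer: scale, display, write to flash, etc. */
void consume_row(const unsigned char *row, size_t row_bytes);

int process_image(size_t width, size_t bytes_per_pixel)
{
    size_t row_bytes = width * bytes_per_pixel;
    unsigned char *row = malloc(row_bytes);   /* one row, not the whole image */
    if (!row)
        return -1;

    while (decode_next_row(row, row_bytes))
        consume_row(row, row_bytes);

    free(row);
    return 0;
}
```

Formats such as baseline JPEG and PNG can be decoded this way because they emit scanlines sequentially, so peak memory is bounded by one row (or one strip of MCU rows) regardless of image size.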

This topic is closed to new replies.
