Memory Allocation

2 comments, last by mumpo 18 years, 8 months ago
I'm writing a class in Visual Studio C++ to deal with lots of data. One of my members is a float pointer. In the constructor, the caller passes the size of the data set and I call "m_pfData = new float[size_of_data_set];". It correctly allocates the proper size, and I can still access it like an array.

However, the data sets I deal with are huge. I wrote a standard Windows GUI app and used my class inside it. With a small data set the window is created quickly, but with a large data set the window creation slows down noticeably (note: I'm not using the data to draw to the window, just for calculations). Once the window is created, the app runs fine; it's this startup lag that is the problem.

I tried making my data member a static array, and the lag goes away. However, predefining a huge array seems like a waste if the caller doesn't need all of it. Would GlobalAlloc or VirtualAlloc be any better for me? Thanks.
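Roughly, the relevant part of the class looks like this (a simplified sketch; the names match what I described above, and copy-control and error handling are omitted):

#include <cstddef>

// Simplified sketch of the class in question.
class DataSet
{
public:
    explicit DataSet(std::size_t size_of_data_set)
        : m_size(size_of_data_set),
          m_pfData(new float[size_of_data_set]) // the big allocation happens here
    {
    }

    ~DataSet() { delete[] m_pfData; }

    float& operator[](std::size_t i) { return m_pfData[i]; } // array-style access

private:
    std::size_t m_size;
    float*      m_pfData;
};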
Probably not. GlobalAlloc still hands you heap memory, and VirtualAlloc is almost certainly overkill here. Recraft your calculations so that they require less memory; the sketch below shows one way.
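For instance, if the data comes from a file, you could stream it through a small fixed-size buffer instead of holding the whole set in memory at once (just a sketch; the file name and the running-sum calculation are made-up stand-ins for whatever your data and calculations actually are):

#include <cstddef>
#include <cstdio>
#include <vector>

int main()
{
    const std::size_t kChunk = 65536;      // floats processed per pass
    std::vector<float> buffer(kChunk);     // small, reusable buffer

    std::FILE* f = std::fopen("data.bin", "rb"); // hypothetical data file
    if (!f) return 1;

    double total = 0.0;
    std::size_t n;
    while ((n = std::fread(buffer.data(), sizeof(float), kChunk, f)) > 0)
    {
        for (std::size_t i = 0; i < n; ++i)
            total += buffer[i];            // accumulate one chunk at a time
    }
    std::fclose(f);

    std::printf("sum = %f\n", total);
    return 0;
}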
"I thought what I'd do was, I'd pretend I was one of those deaf-mutes." - the Laughing Man
You may want to consider using a std::deque. It's a data structure that lets you start off small and grow as needed, without ever requesting one large contiguous section of memory, while still giving you array-style indexing.
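Something like this (a minimal sketch; the size and the fill values are just placeholders):

#include <cstddef>
#include <deque>

int main()
{
    const std::size_t size_of_data_set = 10000000; // stand-in for the caller's value

    // std::deque allocates in fixed-size blocks as it grows, so it never
    // asks the heap for one huge contiguous chunk up front.
    std::deque<float> data;
    for (std::size_t i = 0; i < size_of_data_set; ++i)
        data.push_back(0.0f); // real samples would go here

    float x = data[42]; // random access still works like an array
    (void)x;
    return 0;
}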
If you can't find a way to avoid dynamically allocating a large chunk of memory, you could hide the lag by doing the allocation in a separate thread. You will still have to wait for it to finish before you can use the memory, of course, but the user can do something else while it loads. The catch is that you then need some multithreaded programming in order to tell safely when the memory is ready.
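A rough sketch of the idea in modern C++ (std::async and std::future from <future>; loadDataSet and the sizes are hypothetical stand-ins for however the data actually gets produced):

#include <chrono>
#include <cstddef>
#include <future>
#include <vector>

std::vector<float> loadDataSet(std::size_t n)
{
    std::vector<float> data(n); // the slow allocation/fill, now off the UI thread
    // ... fill 'data' with real values here ...
    return data;
}

int main()
{
    // Kick off the load before (or while) the window is created.
    std::future<std::vector<float>> pending =
        std::async(std::launch::async, loadDataSet, std::size_t(100000000)); // ~400 MB of floats

    // ... create the window and run the message loop here ...

    // When a calculation is requested, check whether the data is ready yet:
    if (pending.wait_for(std::chrono::seconds(0)) == std::future_status::ready)
    {
        std::vector<float> data = pending.get(); // safe to use from here on
        // ... run the calculations ...
    }
    return 0;
}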