dynamic memory capacity

Started by Humbaba
4 comments, last by Humbaba 20 years, 9 months ago
I have a whole bunch of game data that I access through arrays of pointers. As needed, I create new items in the game by allocating dynamic memory to the next available pointer in the array. My problem is I run into a sudden performance degradation when I exceed a certain number of these items. I'm nowhere near using up all my system's physical memory, so I'm stumped as to what bottleneck I'm hitting. I'm also deleting the dynamic memory correctly. I'm wondering if Windows is limiting the amount of dynamic memory my program can use and is therefore tossing the excess into virtual memory, which would definitely kill performance. Should I avoid using dynamic memory for large amounts of game data? Or could it be something else? Any help would be great. Cheers, Stephen
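For reference, a minimal sketch of the pattern described above, assuming a fixed array of pointers filled by calling new per item (names like GameObject and MAX_ITEMS are illustrative, not from the original post):

```cpp
#include <cstddef>

// Illustrative item type standing in for the poster's game data.
struct GameObject {
    float x, y, z;
};

const std::size_t MAX_ITEMS = 1024;
GameObject* items[MAX_ITEMS] = { 0 };  // array of pointers, initially empty

// Allocate a new item into the next free slot; returns its index, or -1 if full.
int createItem() {
    for (std::size_t i = 0; i < MAX_ITEMS; ++i) {
        if (items[i] == 0) {
            items[i] = new GameObject();
            return static_cast<int>(i);
        }
    }
    return -1;
}

// Free an item and mark its slot as available again.
void destroyItem(int i) {
    delete items[i];
    items[i] = 0;
}
```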
You're always in virtual memory (unless you're something like a driver). A process doesn't get assigned all of physical memory, or even close. Eventually it'll swap to disk.

Check the disk activity when that comes up. Or use task manager to see it.
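One portable way to narrow this down is to time the allocations themselves and see whether the cost jumps at a particular count, which would point at paging or heap growth rather than a logic bug. A rough sketch, where the batch size and item size are made-up values:

```cpp
#include <vector>
#include <ctime>
#include <cstddef>

// Allocate n items of sz bytes each into 'items' and return the elapsed
// clock ticks; a sudden jump between batches suggests the heap grew or
// pages started going out to disk.
long timeBatch(std::vector<char*>& items, int n, int sz) {
    std::clock_t t0 = std::clock();
    for (int i = 0; i < n; ++i)
        items.push_back(new char[sz]);
    return static_cast<long>(std::clock() - t0);
}

// Release everything allocated by timeBatch.
void freeAll(std::vector<char*>& items) {
    for (std::size_t i = 0; i < items.size(); ++i)
        delete[] items[i];
    items.clear();
}
```

Calling timeBatch in a loop and printing the per-batch times shows whether the slowdown correlates with a specific total, which you can then compare against the disk activity Task Manager shows.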

Are you using DirectX? Maybe you're running out of video memory and the system has started using system memory, which would obviously be a lot slower?
This is my physics class, which runs independently of DirectX (which I am using for graphics). So none of it is in video memory. It's just that I've allocated all the memory for my physical objects dynamically (by calling 'new' in C++). So I'm wondering if there are tight limitations on the dynamic memory available to a program, i.e. tighter than the memory available if everything were declared statically.

And if page swapping is the problem, do commercial applications with large memory requirements somehow ask windows for more room in physical memory before pages start swapping to disk?

BTW: virtual memory is the page swap space on disk.

-Stephen
I don't think there is a limit for each program, but I do think that if you're using XP, a large chunk of memory is used by Windows. But if you aren't, well..
quote:Original post by Humbaba
I'm wondering if Windows is limiting the amount of dynamic memory my program can use and is therefore tossing the excess into virtual memory, which would definitely kill performance.

Virtual memory is just a mechanism for seamlessly representing disk space as RAM. What you call 'dynamic memory' could therefore come from the disk or 'real' memory, and you have little control over that. The fast memory is allocated first, but once all the applications on your system exhaust it, some of it will be written out to disk via the virtual memory system so that there's memory free for your objects. It doesn't matter whether you allocate dynamically at runtime or statically at startup; the same thing applies: when you ask for memory, the system gets it for you, but may have to push other things out to disk first. Even if your entire program is resident in memory, Windows probably still had to write everything else out to disk in the process.
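Following on from that, what often bites with thousands of small 'new' calls is not a Windows limit but per-allocation overhead and poor cache locality once objects end up scattered across the heap. One common alternative, sketched here with made-up names (Body, step) rather than anything from the thread, is to store the objects by value in one contiguous block:

```cpp
#include <vector>
#include <cstddef>

// Illustrative physics object stored by value.
struct Body {
    float pos;
    float vel;
};

// Integrate all bodies; a contiguous std::vector keeps them packed
// together, so the traversal is cache-friendly and there is one big
// allocation instead of one 'new' per object.
void step(std::vector<Body>& bodies, float dt) {
    for (std::size_t i = 0; i < bodies.size(); ++i)
        bodies[i].pos += bodies[i].vel * dt;
}
```

Reserving capacity up front (bodies.reserve(n)) also avoids repeated reallocation as the game world fills up.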




This topic is closed to new replies.
