Yes, sorry - I read that as a 512 MB page file, not virtual memory as you actually said.
Windows will do all sorts of things to prevent an out-of-memory condition; what exactly it does depends on the version and your setup. Lazily committing memory and dynamically growing the page file are two examples, and both could kick in if you allocate a 300 MB chunk.
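To illustrate the lazy-allocation point, here is a minimal Win32 sketch (my own, not from the thread; the 300 MB figure is just the number being discussed) that separates the cheap address-space reservation from the commit step, which is where the page file actually gets charged and, if necessary, grown:

// Minimal sketch, assuming Win32. Reserving address space costs almost
// nothing; committing pages is what charges physical memory + page file.
#include <windows.h>
#include <cstdio>

int main()
{
    const SIZE_T size = 300u * 1024 * 1024;  // ~300 MB, as in the thread

    // Stage 1: reserve address space only - no commit charge yet.
    void* p = VirtualAlloc(NULL, size, MEM_RESERVE, PAGE_NOACCESS);
    if (!p) { std::printf("reserve failed\n"); return 1; }

    // Stage 2: commit the pages - this is where the page file may have
    // to grow, and where failure is reported if it can't.
    if (!VirtualAlloc(p, size, MEM_COMMIT, PAGE_READWRITE))
    {
        std::printf("commit failed: %lu\n", GetLastError());
        VirtualFree(p, 0, MEM_RELEASE);
        return 1;
    }

    // ... use the memory ...
    VirtualFree(p, 0, MEM_RELEASE);
    return 0;
}

Plain malloc/new go through the same machinery under the hood; you just don't see the two stages separately.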
It can still fail, but your code would know: malloc or new would return 0 instead of a valid pointer, the new_handler would be invoked, an SEH exception would be raised, etc. The same thing would happen to any other (user-mode) process that tries to allocate memory during this time, which is why the system becomes unstable. I suppose if every program handled out-of-memory conditions gracefully this wouldn't be the case; I'm just not sure how you can handle them gracefully.
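For what it's worth, here is a minimal standard-C++ sketch of those failure paths (again the 300 MB size is just the thread's number): malloc returning 0, the new_handler firing, and new reporting failure. As I recall, older MSVC runtimes returned 0 from plain new instead of throwing, which is why checking the pointer was so common at the time.

// Minimal sketch of out-of-memory reporting in standard C++.
#include <new>
#include <cstdio>
#include <cstdlib>

// new_handler: invoked when operator new can't satisfy a request.
// A handler must free memory, throw std::bad_alloc, or terminate;
// this one just reports and throws so the caller sees the failure.
void on_out_of_memory()
{
    std::fputs("out of memory\n", stderr);
    throw std::bad_alloc();
}

int main()
{
    std::set_new_handler(on_out_of_memory);

    // malloc reports failure by returning 0.
    void* raw = std::malloc(300u * 1024 * 1024);
    if (!raw) { std::fputs("malloc failed\n", stderr); return 1; }
    std::free(raw);

    // Standard-conforming new reports failure as std::bad_alloc.
    try
    {
        char* big = new char[300u * 1024 * 1024];
        delete[] big;
    }
    catch (const std::bad_alloc&)
    {
        std::fputs("new failed\n", stderr);
        return 1;
    }
    return 0;
}

"Handling it gracefully" usually amounts to exactly this: detect the failure at the call site (or in the handler), release what you can, and shut down cleanly instead of dereferencing a null pointer later.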
Perhaps I'm misunderstanding you. Did you detect the memory allocation failure in your code and shut down? If not, it should have raised an exception.
Edited by - Magmai Kai Holmlor on February 20, 2002 9:33:12 PM
Exception handling
I had a weird bug and I never figured out its cause or why I was allocating 300 MB of memory. It's hard to do stuff when you have to wait 5 minutes for the 'step' command to complete. I was just saying that NT can recover from a program demanding all system memory, that's all.