Quote: Original post by Promit
Quote: Original post by Deyja
It depends. Severe heap fragmentation plus a large allocation can cause this to happen; it's frequently a problem for server-type applications that run for a long time. You can get a similar effect by simply trying to allocate a huge block of memory, like 2 GB or so. It will fail, even if your system has enough physical memory to handle it.
The chances of `new` actually failing are slim in the first place.
That's a hardcoded limit in Windows: a process can't make a single allocation larger than 2 GB minus roughly 30 bytes unless it creates a separate heap (with HeapCreate).
Presumably, this is to stop you from shooting yourself in the foot by accidentally allocating all available virtual memory (although 2 GB minus 30 bytes is still most of the address space on 32-bit systems).
Combined with heap fragmentation, this can cause allocations to fail even if you have enough RAM and swap to cover your needs.