Can I get Windows 7 to overcommit memory?

So I have this beefy machine with four gigs of RAM running 64-bit Windows 7. Unfortunately, I don't have the disk space to match, so I figure that, with that much RAM, I should be able to run without a page file to conserve a bit. Also, having a program die because of a failed allocation tends to be preferable to having the system go into 15 minutes of frantic paging activity, which usually doesn't end until the memory hog is killed anyway.

This has worked out beautifully on Linux, which I used to use as my primary OS until my other disk died and took my Linux partition with it. Unfortunately, no such luck in Windows 7. I get constant nag screens about the system being low on memory, and occasionally programs commit suicide because of failed allocations. Turning on the page file solves the problem, but the file is barely used at all. This is of course due to the difference in memory handling between W7 and Linux: W7 requires that every allocated page of memory actually exists. Linux, on the other hand, allows programs to allocate quite a bit more than what's actually available (and then goes on a killing spree if programs actually touch more memory than what's available), the rationale being that programs quite frequently allocate huge chunks of memory without using anywhere near all of it. Java is a great example of this, and pretty much every program that uses a custom memory allocator, I would presume.
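
To illustrate, here's a minimal sketch of that Linux behavior (assuming the default overcommit heuristic, vm.overcommit_memory = 0, and a 64-bit build):

[code]
/* Sketch of Linux's default overcommit: a malloc far beyond RAM + swap
 * can succeed, because pages are only backed by physical storage when
 * first written. Touching them all is what invites the OOM killer. */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    size_t size = (size_t)16 << 30; /* 16 GB on a 4 GB machine */
    char *p = malloc(size);
    if (p == NULL) {
        printf("refused up front -- Windows-style commit accounting\n");
        return 1;
    }
    printf("got %lu GB of address space without touching a page\n",
           (unsigned long)(size >> 30));
    /* for (size_t i = 0; i < size; i += 4096) p[i] = 1;  <- OOM killer bait */
    free(p);
    return 0;
}
[/code]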

Now, I think it's pretty silly to have to give up six gigs of HD space just because some programs like to lie about how much memory they're going to use, so I was wondering if there's any way to get Windows 7 to overcommit memory the way Linux does?
I read something a while back about this and the general opinion was it's best left on. That said, if you do need more space, why not just turn it down? You don't have to use 6 GB. Tell it to use less; I checked mine and it's only at 4 GB (it does say recommended 6 GB, though).

Interested in Fractals? Check out my App, Fractal Scout, free on the Google Play store.

I guess this isn't the answer that you're looking for, but why not buy a bigger hard drive? Even if you're a bit hard up on money, a 1 TB hard drive is quite inexpensive, maybe $100 AUD or less.
[size="2"][size=2]Mort, Duke of Sto Helit: NON TIMETIS MESSOR -- Don't Fear The Reaper

[quote]Unfortunately, no such luck in Windows 7. I get constant nag screens about the system being low on memory, and occasionally programs commit suicide because of failed allocations.[/quote]
I've been running without a page file on many systems since XP times.

I've encountered an out-of-memory problem precisely once, when a version of Flash had a nasty memory leak that gobbled up gigabytes.

Unless there is something fishy going on, the explanation is quite simply that you don't have enough RAM.

[quote]some programs like to lie[/quote]
They don't lie; they don't know.

If you're a heavy user, you'll simply need to buy more RAM. And is this a 64-bit version of Windows?

[quote]I read something a while back about this and the general opinion was it's best left on. That said, if you do need more space, why not just turn it down? You don't have to use 6 GB. Tell it to use less; I checked mine and it's only at 4 GB (it does say recommended 6 GB, though).[/quote]
It's a myth. There is nothing that requires the system to scribble over the disk.

Having a pagefile can help in some corner cases, but when you run out of memory, you run out. There is a breaking point for everyone; it just turns out that 4 GB of RAM plus a 6 GB pagefile is usually enough.

The size of the pagefile is also often overstated. Writing 6 GB to disk in full takes around 4 minutes; RAM needs about a second to do the same. So when an application hits the pagefile seriously, things slow down to utterly unusable.

The reason why one would want to turn it off is, perhaps, to reduce disk activity (on a laptop this could allow the system to spin the disk down), or to prevent wear on an SSD. Unless you're doing some really heavy image manipulation or some strange computations, 8 GB is enough these days; 4 GB will typically work, but may be a bit tight.

[quote]W7 requires that every allocated page of memory actually exists.[/quote]

That's not true. VirtualAlloc can reserve a virtual address range without actually allocating physical RAM/pagefile space for it. The problem is simply that programmers WANT a huge pile of just-in-case RAM already there for them; it makes it faster to use when they actually decide to use it.
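
For what it's worth, here's a minimal sketch of that reserve/commit distinction (error handling trimmed; this is just to show the flags, not production code):

[code]
/* Reserve a large range without charging RAM or pagefile, then commit
 * only the piece that's actually needed. */
#include <windows.h>
#include <string.h>
#include <stdio.h>

int main(void)
{
    SIZE_T size = (SIZE_T)1 << 30; /* 1 GB of address space */

    /* MEM_RESERVE sets aside addresses only; no commit charge yet. */
    char *base = (char *)VirtualAlloc(NULL, size, MEM_RESERVE, PAGE_NOACCESS);
    if (base == NULL) return 1;

    /* MEM_COMMIT is the point where Windows charges the pages against
     * the commit limit (physical RAM + pagefile). */
    char *page = (char *)VirtualAlloc(base, 64 * 1024, MEM_COMMIT, PAGE_READWRITE);
    if (page == NULL) return 1;

    memset(page, 0xAB, 64 * 1024); /* physical pages arrive on first touch */

    VirtualFree(base, 0, MEM_RELEASE);
    return 0;
}
[/code]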
You should probably wait until you have 8 GB or more to turn off the page file. I have it off on my laptop, since 16 GB of memory seems to be the point where a page file just seems silly. I'm still not convinced by the arguments to keep it on; I keep thinking those discussions are aimed at people wanting to turn it off with only 2 GB of memory.
4 GB isn't a beefy machine for 64-bit Windows 7. If you don't want to get a bigger HD (or consider a second HD), get more RAM.

www.simulatedmedicine.com - medical simulation software

Looking to find experienced Ogre & shader developers/artists. PM me or contact through website with a contact email address if interested.


[quote]I guess this isn't the answer that you're looking for, but why not buy a bigger hard drive? Even if you're a bit hard up on money, a 1 TB hard drive is quite inexpensive, maybe $100 AUD or less.[/quote]
It's not that much of a practical problem, really; it's more of an annoyance.


[quote][quote name='valderman' timestamp='1301135563' post='4790630']Unfortunately, no such luck in Windows 7. I get constant nag screens about the system being low on memory, and occasionally programs commit suicide because of failed allocations.[/quote]
I've been running without a page file on many systems since XP times.

I've encountered an out-of-memory problem precisely once, when a version of Flash had a nasty memory leak that gobbled up gigabytes.

Unless there is something fishy going on, the explanation is quite simply that you don't have enough RAM.[/quote]
I used to do that under XP too, and it worked just fine. However, everything I've read indicates that Windows requires all allocated memory pages to be backed by physical storage, so I sort of wonder how it did work. Perhaps programs were less prone to pre-allocate huge (in relation to that time's memory sizes) chunks of memory back then.

[quote][quote]some programs like to lie[/quote]They don't lie; they don't know.[/quote]
They aren't telling the truth; isn't that the definition of a lie, regardless of the reason?

[quote]If you're a heavy user, you'll simply need to buy more RAM. And is this a 64-bit version of Windows?[/quote]
A freshly started copy of Eclipse with one three-class project loaded, plus Firefox, is enough to cause the problem. I don't think they and W7 between them actually touch enough pages to cause memory to run out. Of course buying more RAM is a solution, but huge chunks of it are going to remain unused "just in case," which seems like a waste of perfectly good memory to me.



[quote][quote name='valderman' timestamp='1301135563' post='4790630']W7 requires that every allocated page of memory actually exists.[/quote]
That's not true. VirtualAlloc can reserve a virtual address range without actually allocating physical RAM/pagefile space for it. The problem is simply that programmers WANT a huge pile of just-in-case RAM already there for them; it makes it faster to use when they actually decide to use it.[/quote]
And physical storage gets allocated on first access to a page? That's the behavior I want to enforce for all allocations, but I guess Windows wouldn't let me? I'm a bit sceptical of the actual performance hit incurred by overcommitting, as you'll only get a page fault the first time you hit a previously untouched page of memory; it seems more like a case of "my application is more important than the rest of the system" than actual performance concerns to me.
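
Something like this would show the difference, I suppose (a sketch only; you'd link with psapi.lib, and the exact numbers will vary):

[code]
/* Commit a block, then watch the working set: the commit charge is taken
 * immediately, but physical pages only show up as each page is touched. */
#include <windows.h>
#include <psapi.h>
#include <stdio.h>

static SIZE_T working_set(void)
{
    PROCESS_MEMORY_COUNTERS pmc;
    GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc));
    return pmc.WorkingSetSize;
}

int main(void)
{
    SIZE_T size = (SIZE_T)256 << 20; /* 256 MB */
    SIZE_T i;
    char *p = (char *)VirtualAlloc(NULL, size, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE);
    if (p == NULL) return 1;

    printf("after commit: working set ~%lu MB\n", (unsigned long)(working_set() >> 20));
    for (i = 0; i < size; i += 4096)
        p[i] = 1; /* first touch faults each page in */
    printf("after touch:  working set ~%lu MB\n", (unsigned long)(working_set() >> 20));

    VirtualFree(p, 0, MEM_RELEASE);
    return 0;
}
[/code]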

[quote]They aren't telling the truth; isn't that the definition of a lie, regardless of the reason?[/quote]
How much memory will Word need? ~200 MB. Unless the user happens to load a 10,000x10,000 bitmap and insert it.

[quote]A freshly started copy of Eclipse with one three-class project loaded, plus Firefox, is enough to cause the problem. I don't think they and W7 between them actually touch enough pages to cause memory to run out.[/quote]

Then there's something wrong here.

A fresh Eclipse is around 64 MB, though it will grow arbitrarily. A fresh Firefox is ~90 MB. There is no way to exhaust memory using these.

Open Task Manager, or better yet Process Explorer, and see what's consuming all the memory.

I've written in another thread about juggling Java applications exceeding 4 GB of memory with no problem. There must be something else running: services, perhaps some memory-leaking application, or an antivirus.

[quote]Of course buying more RAM is a solution, but huge chunks of it are going to remain unused "just in case," which seems like a waste of perfectly good memory to me.[/quote]
Then turn on the page file. It's precisely what it was made for: the "just in case".

[quote]And physical storage gets allocated on first access to a page? That's the behavior I want to enforce for all allocations, but I guess Windows wouldn't let me? I'm a bit sceptical of the actual performance hit incurred by overcommitting, as you'll only get a page fault the first time you hit a previously untouched page of memory; it seems more like a case of "my application is more important than the rest of the system" than actual performance concerns to me.[/quote]
Except that an application doesn't allocate all the memory it will ever need at startup.

FF runs for an hour, using 150 MB and not a byte more. Then the user goes to YouTube. Suddenly, FF needs to load the plugin, the plugin needs to allocate driver resources for hardware acceleration, Windows needs to allocate whatever desktop handles, then images need to be loaded, which causes several calls to VirtualAlloc. But some images are of unknown size, so FF uses a heuristic and allocates one 100 KB block for the file and one 4 MB block for the unpacked image. Suddenly, FF uses 210 MB per process, 20 MB for the plugin sandbox and 60 MB in kernel and driver resources.

Applications simply aren't written in a way that knows how much they will need. Except for the Java VM, which can be limited to min/max values (-Xms/-Xmx).

But each time VirtualAlloc is called, memory is checked.
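
That check is easy to see in action; a quick sketch (it deliberately leaks everything until the process exits, so the failure point stays visible):

[code]
/* Keep committing 64 MB blocks until the commit limit (RAM + pagefile)
 * is hit. With the pagefile off, VirtualAlloc starts failing long before
 * any of that memory has actually been touched. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    SIZE_T block = (SIZE_T)64 << 20;
    SIZE_T total = 0;
    while (VirtualAlloc(NULL, block, MEM_COMMIT | MEM_RESERVE, PAGE_READWRITE) != NULL)
        total += block;
    printf("commit refused after %lu MB\n", (unsigned long)(total >> 20));
    return 0;
}
[/code]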
Considering the cost per GB of a hard disk, all this messing about seems a little pointless.

www.simulatedmedicine.com - medical simulation software

Looking to find experienced Ogre & shader developers/artists. PM me or contact through website with a contact email address if interested.

This topic is closed to new replies.
