Paging file increase/decrease

Associate
Joined
18 Apr 2004
Posts
332
Location
Milton Keynes, UK
Hey all,

What's your consensus on touching the paging file on a Windows 10 computer? I'm constantly crashing out and my memory usage sits around 7.3/8GB. Yes, I can get more memory by going through IT/procurement for authorisation, which can take 3 months... in the meantime, though, the advice I've read on changing the default is hit and miss. Some say the initial size should be 1.5x your current memory and the maximum 3x. What do you think?
 
Associate
OP
Joined
18 Apr 2004
Posts
332
Location
Milton Keynes, UK
Based on what I do at work, I run multiple browsers for the wallboard, webchat and call logging systems, with a lot of tabs open to help with my job. I've also got an Excel rota and Word docs open. Closing them all down as a test, memory use still hovers around 5.5GB. I also use Skype and Teams... and that's all before I even start Power BI.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,240
If you are crashing a lot under a high memory load I'd be inclined to suspect memory stability - Windows 10 supposedly has maintenance tasks to detect faulty RAM, but they aren't very reliable.

At the end of the day, if you don't have enough RAM, page file tweaks won't help much. There are really only three approaches worth considering: let Windows decide; a 1024MB initial size with a maximum based on your use (8192MB will generally cover most people, unless you use big databases etc. or have a specific need to cover all physical RAM plus commit); or a static size if you're on an older HDD.
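
If you want to see what you're currently set to without digging through the GUI, a rough read-only Python sketch like this (uses the standard winreg module; the registry value is where the System Properties dialog stores custom sizes, as far as I know) will show it - a custom setting looks like "c:\pagefile.sys 1024 8192" (initial/max in MB), while fully system managed usually shows up as "?:\pagefile.sys" with no sizes:

```python
# Read-only sketch: print the PagingFiles registry value where Windows keeps
# page file settings. A custom entry looks like "c:\pagefile.sys 1024 8192"
# (initial/max in MB); a system-managed setup typically shows "?:\pagefile.sys".
# Split page files across drives appear as multiple entries.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    paging_files, _ = winreg.QueryValueEx(key, "PagingFiles")  # REG_MULTI_SZ

for entry in paging_files:
    print(entry)
```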
 
Don
Joined
19 May 2012
Posts
17,208
Location
Spalding, Lincolnshire
Mark Russinovich's answer is still fairly relevant

How Big Should I Make the Paging File?

Perhaps one of the most commonly asked questions related to virtual memory is, how big should I make the paging file? There’s no end of ridiculous advice out on the web and in the newsstand magazines that cover Windows, and even Microsoft has published misleading recommendations. Almost all the suggestions are based on multiplying RAM size by some factor, with common values being 1.2, 1.5 and 2. Now that you understand the role that the paging file plays in defining a system’s commit limit and how processes contribute to the commit charge, you’re well positioned to see how useless such formulas truly are.

Since the commit limit sets an upper bound on how much private and pagefile-backed virtual memory can be allocated concurrently by running processes, the only way to reasonably size the paging file is to know the maximum total commit charge for the programs you like to have running at the same time. If the commit limit is smaller than that number, your programs won’t be able to allocate the virtual memory they want and will fail to run properly.

So how do you know how much commit charge your workloads require? You might have noticed in the screenshots that Windows tracks that number and Process Explorer shows it: Peak Commit Charge. To optimally size your paging file you should start all the applications you run at the same time, load typical data sets, and then note the commit charge peak (or look at this value after a period of time where you know maximum load was attained). Set the paging file minimum to be that value minus the amount of RAM in your system (if the value is negative, pick a minimum size to permit the kind of crash dump you are configured for). If you want to have some breathing room for potentially large commit demands, set the maximum to double that number.

Some feel having no paging file results in better performance, but in general, having a paging file means Windows can write pages on the modified list (which represent pages that aren’t being accessed actively but have not been saved to disk) out to the paging file, thus making that memory available for more useful purposes (processes or file cache). So while there may be some workloads that perform better with no paging file, in general having one will mean more usable memory being available to the system (never mind that Windows won’t be able to write kernel crash dumps without a paging file sized large enough to hold them).
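
To put rough numbers on the "peak commit minus RAM" approach above, here's a quick Python sketch - it uses ctypes to call GetPerformanceInfo from psapi, which exposes the same peak commit figure Process Explorer shows. Treat it as illustrative rather than gospel, and run it after a session at full load or the peak won't reflect your worst case:

```python
# Rough sizing sketch based on the article above: page file minimum =
# peak commit charge minus physical RAM, maximum = double that for headroom.
# GetPerformanceInfo reports counts in pages, so multiply by the page size.
import ctypes
from ctypes import wintypes

class PERFORMANCE_INFORMATION(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("CommitTotal", ctypes.c_size_t),
        ("CommitLimit", ctypes.c_size_t),
        ("CommitPeak", ctypes.c_size_t),
        ("PhysicalTotal", ctypes.c_size_t),
        ("PhysicalAvailable", ctypes.c_size_t),
        ("SystemCache", ctypes.c_size_t),
        ("KernelTotal", ctypes.c_size_t),
        ("KernelPaged", ctypes.c_size_t),
        ("KernelNonpaged", ctypes.c_size_t),
        ("PageSize", ctypes.c_size_t),
        ("HandleCount", wintypes.DWORD),
        ("ProcessCount", wintypes.DWORD),
        ("ThreadCount", wintypes.DWORD),
    ]

perf = PERFORMANCE_INFORMATION()
perf.cb = ctypes.sizeof(perf)
if not ctypes.windll.psapi.GetPerformanceInfo(ctypes.byref(perf), perf.cb):
    raise ctypes.WinError()

page = perf.PageSize
peak_commit_mb = perf.CommitPeak * page // (1024 * 1024)
ram_mb = perf.PhysicalTotal * page // (1024 * 1024)

# If this comes out at 0, the article's advice is to pick a minimum big
# enough for whatever crash dump type you have configured.
suggested_min_mb = max(peak_commit_mb - ram_mb, 0)
suggested_max_mb = suggested_min_mb * 2

print(f"Peak commit: {peak_commit_mb} MB, RAM: {ram_mb} MB")
print(f"Suggested page file: min {suggested_min_mb} MB, max {suggested_max_mb} MB")
```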
 
Soldato
Joined
24 Jun 2021
Posts
3,633
Location
UK
Let windows manage it.

It's all based on the premise that Windows would be writing a crash dump to the pagefile, so the pagefile needs to be big enough to hold it, plus whatever else was already in there based on the stuff you have running. 1.5x was advice from the 32-bit days; when 64-bit came out people bickered about using 2.5x instead. That's all nonsense, because Windows has chosen a 3GB pagefile on my PC with 16GB RAM and it has been absolutely fine. Don't see what I'd want a crash dump for anyway - it has clearly never come up.

In your example the laptop just doesn't have enough ram, so tinkering with the pagefile isn't going to help, you need to download more ram.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,240
That's a good read. Sounds a bit hardcore to size your paging file exactly to what you use though.

It is pretty simple really - don't go under a 1024MB minimum, as that gives Windows enough to operate correctly in every circumstance, including panic situations for the OS. Very few people need a max above 8192MB, but it doesn't really hurt to set it higher. You could hand-tune the minimum to your maximum commit situation, but it doesn't make much odds unless you have a slow OS/swap disc.

Pretty much any other settings will result in less than optimal operation, outside of specific situations where having a static pagefile is an advantage, though whether it's noticeable is another matter - on many modern systems it will be largely unnoticeable, unless it causes crashing because the pagefile is too limited.
 
Associate
Joined
20 Mar 2022
Posts
87
Location
d
This made a difference about 15-20 years ago; these days you're wasting your time, just let Windows manage it. Order some new RAM, and in the meantime close programs you don't use to free up memory (check the system tray).
 
Don
Joined
19 May 2012
Posts
17,208
Location
Spalding, Lincolnshire
these days you're wasting your time, just let Windows manage it.

Which is fine if you have enough memory, or aren't ever running up against memory pressure, but the OP has 8GB and is likely struggling.

Setting the pagefile to e.g. 16GB min and max would allow Windows to start paging out less-used memory earlier (or even memory that has been requested but not necessarily used - i.e. the "Committed" figure in Task Manager), leaving actual RAM available and letting things run a bit more smoothly.
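
For what it's worth, a fixed min/max ends up stored in the same PagingFiles registry value the GUI writes to. A sketch like the one below would pin it at 16GB (needs an elevated prompt and a reboot to take effect) - I'd still just use the System Properties dialog for real changes, this is only to show what's going on underneath:

```python
# Sketch only: write a fixed 16GB min/max page file entry to the value the
# System Properties dialog edits. Run elevated; takes effect after a reboot.
# Back up the existing value first and prefer the GUI for real changes.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"
ENTRY = r"c:\pagefile.sys 16384 16384"  # path, initial MB, maximum MB

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                    winreg.KEY_READ | winreg.KEY_SET_VALUE) as key:
    current, _ = winreg.QueryValueEx(key, "PagingFiles")
    print("Current setting:", current)
    winreg.SetValueEx(key, "PagingFiles", 0, winreg.REG_MULTI_SZ, [ENTRY])
```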

Some more good discussion here:
 
Associate
Joined
20 Mar 2022
Posts
87
Location
d
Which is fine if you have enough memory, or aren't ever running up against memory pressure, but the OP has 8GB and is likely struggling.

Setting the pagefile to e.g. 16GB min and max would allow Windows to start paging out less-used memory earlier (or even memory that has been requested but not necessarily used - i.e. the "Committed" figure in Task Manager), leaving actual RAM available and letting things run a bit more smoothly.

Some more good discussion here:

Ain't got time to read it at the moment lol, but...
If it's set to system managed, won't it just increase the page file automatically if it's insufficient?
 
Don
Joined
19 May 2012
Posts
17,208
Location
Spalding, Lincolnshire
Ain't got time to read it at the moment lol, but...
If it's set to system managed, won't it just increase the page file automatically if it's insufficient?

It should, but doesn't always.

As above, in most cases system managed is fine, but when it isn't - it really isn't :)


I've got 32GB RAM in my current work PC - system managed suggests a 4903MB recommended page file. Left set to system managed, I get frequent "Resource-Exhaustion-Detector" Event 2004 messages in Event Viewer and sluggish performance when logging on every day.

I set the page file to 32GB to match my RAM and haven't had an issue since.
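
If anyone wants to check whether they're hitting the same thing, a quick way is to query the System log for those 2004 events. Rough Python sketch below - I'm assuming the full provider name is "Microsoft-Windows-Resource-Exhaustion-Detector", so check Event Viewer if it doesn't match on your machine:

```python
# Sketch: list the last few low-virtual-memory warnings (Event 2004) from the
# System log using wevtutil. The provider name is my best guess - adjust it
# if your Event Viewer shows the source differently.
import subprocess

query = ("*[System[Provider[@Name='Microsoft-Windows-Resource-Exhaustion-Detector']"
         " and (EventID=2004)]]")

result = subprocess.run(
    ["wevtutil", "qe", "System", f"/q:{query}", "/c:5", "/f:text", "/rd:true"],
    capture_output=True, text=True)
print(result.stdout or result.stderr)
```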
 
Soldato
Joined
25 Jan 2007
Posts
4,738
Location
King's Lynn
I just set mine manually.
The OS drive is set to 16GB and my scratch disk (both OS and scratch are NVMe) has a second 64GB one, the same size as my RAM.

Do I need this much? Who knows, but what I do know is that I need some on the C: drive, because some programs require a page file there, and having had my PCs set up with split page files for around 10 years (if not longer) I've never had any issues... mind you, I'm not really going to miss those amounts on 2x2TB drives.
 
Man of Honour
Joined
18 Oct 2002
Posts
100,404
Location
South Coast
You still need at least the amount of RAM you have in pagefile allocation. If the system crashes, then Windows drops a memory dump to disk, so if it crashes whilst you have 28GB of RAM consumed for example, then you should have at least a 32GB pagefile assuming you have 32GB RAM. System Managed accounts for this sort of scenario.

My setup is set to system managed, I have 64GB RAM and my pagefile is exactly 64GB.

You don't need a pagefile on non OS disks.
 
Man of Honour
Joined
13 Oct 2006
Posts
91,240
You still need at least the amount of RAM you have in pagefile allocation. If the system crashes, then Windows drops a memory dump to disk, so if it crashes whilst you have 28GB of RAM consumed for example, then you should have at least a 32GB pagefile assuming you have 32GB RAM. System Managed accounts for this sort of scenario.

My setup is set to system managed, I have 64GB RAM and my pagefile is exactly 64GB.

You don't need a pagefile on non OS disks.

Yeah, if the system is configured for a complete memory dump you will need a maximum setting high enough to cover all memory in use. You might also hit situations where the system can't manage the page file properly in that scenario, so a static size or a minimum of 1.5x RAM would be better. In Windows 10/11 there is a newer "automatic memory dump" setting which attempts to work around that limitation with a system-managed page file.

The default on most systems will be an automatic or kernel dump which generally needs around 600-700MB (hence having a minimum size of at least 1024MB is better) - having a complete memory dump configured only really makes sense when doing specific troubleshooting.
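
If you're not sure which dump type you've got configured, the setting lives under CrashControl in the registry - a quick read-only Python sketch, with the value meanings as documented (0 none, 1 complete, 2 kernel, 3 small, 7 automatic):

```python
# Read-only sketch: report which crash dump type is currently configured.
import winreg

DUMP_TYPES = {0: "None", 1: "Complete", 2: "Kernel", 3: "Small (256KB)", 7: "Automatic"}

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                    r"SYSTEM\CurrentControlSet\Control\CrashControl") as key:
    value, _ = winreg.QueryValueEx(key, "CrashDumpEnabled")

print(f"CrashDumpEnabled = {value} ({DUMP_TYPES.get(value, 'unknown')})")
```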

If you do have a complete dump configured ignore my advice in earlier posts - I generally assume most people don't, and those who do know what they are doing.
 