
Thread: Page filing revisited...again!

  1. #1
    SG Enthusiast Think's Avatar
    Join Date
    Sep 2001
    Posts
    2,283

Page filing revisited...again!

When it comes to page filing, I've heard so many different approaches depending on one's configuration that it ends up confusing users, myself included.

However, I managed to find a good article by a credible individual and was wondering what you all think.

    Here's a quote:

    How big should the page file be?
    There is a great deal of myth surrounding this question. Two big fallacies are:

    The file should be a fixed size so that it does not get fragmented, with minimum and maximum set the same
    The file should be 2.5 times the size of RAM (or some other multiple)
Both are wrong in a modern, single-user system. (A machine using Fast User Switching is a special case, discussed below.)

    Windows will expand a file that starts out too small and may shrink it again if it is larger than necessary, so it pays to set the initial size as large enough to handle the normal needs of your system to avoid constant changes of size. This will give all the benefits claimed for a ‘fixed’ page file. But no restriction should be placed on its further growth. As well as providing for contingencies, like unexpectedly opening a very large file, in XP this potential file space can be used as a place to assign those virtual memory pages that programs have asked for, but never brought into use. Until they get used — probably never — the file need not come into being. There is no downside in having potential space available.

For any given workload, the total need for virtual addresses will not depend on the size of RAM alone. It will be met by the sum of RAM and the page file. Therefore in a machine with small RAM, the extra amount represented by the page file will need to be larger — not smaller — than that needed in a machine with big RAM. Unfortunately the default settings for system management of the file have not caught up with this: it will assign an initial amount that may be quite excessive for a large machine, while at the same time leaving too little for contingencies on a small one.

    How big a file will turn out to be needed depends very much on your work-load. Simple word processing and e-mail may need very little — large graphics and movie making may need a great deal. For a general workload, with only small dumps provided for (see note to ‘Should the file be left on Drive C:?’ above), it is suggested that a sensible start point for the initial size would be the greater of (a) 100 MB or (b) enough to bring RAM plus file to about 500 MB. EXAMPLE: Set the Initial page file size to 400 MB on a computer with 128 MB RAM; 250 on a 256 MB computer; or 100 MB for larger sizes.

    But have a high Maximum size — 700 or 800 MB or even more if there is plenty of disk space. Having this high will do no harm. Then if you find the actual pagefile.sys gets larger (as seen in Explorer), adjust the initial size up accordingly. Such a need for more than a minimal initial page file is the best indicator of benefit from adding RAM: if an initial size set, for a trial, at 50MB never grows, then more RAM will do nothing for the machine's performance.

    Bill James MS MVP has a convenient tool, ‘WinXP-2K_Pagefile’, for monitoring the actual usage of the Page file, which can be downloaded here. A compiled Visual Basic version is available from Doug Knox's site which may be more convenient for some users. The value seen for ‘Peak Usage’ over several days makes a good guide for setting the Initial size economically.

Note that these aspects of Windows XP have changed significantly from earlier Windows NT versions, and practices that have been common there may no longer be appropriate. Also, the ‘PF Usage’ (Page File in Use) measurement in Task Manager | Performance includes those potential uses by pages that have not been taken up. It makes a good indicator of the adequacy of the ‘Maximum’ size setting, but not for the ‘Initial’ one, let alone for any need for more RAM.
    by Alex Nichol
    (MS-MVP - Windows Storage Management/File Systems)
    © 2002 by Author, All Rights Reserved
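The quoted sizing rule — the greater of (a) 100 MB or (b) enough to bring RAM plus page file to about 500 MB — can be sketched in a few lines of Python. The 50 MB rounding granularity is my own assumption, chosen only so the results line up with the article's worked examples; the article itself gives approximate figures.

```python
import math

def initial_pagefile_mb(ram_mb: int) -> int:
    """Suggested initial page file size (MB) per the quoted rule of thumb:
    the greater of (a) 100 MB or (b) enough to bring RAM + page file to
    about 500 MB.  Rounding up to the nearest 50 MB is an assumption,
    not part of the original article."""
    shortfall = 500 - ram_mb
    if shortfall <= 0:
        return 100                      # large-RAM machines: 100 MB floor
    return max(100, math.ceil(shortfall / 50) * 50)

print(initial_pagefile_mb(128))   # 400, matching the article's example
print(initial_pagefile_mb(256))   # 250
print(initial_pagefile_mb(512))   # 100
```

This is just the initial size; per the article, the maximum should still be set well above it (700–800 MB or more) so the file can grow if needed.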

    Here's the link

    http://aumha.org/win5/a/xpvm.htm
    got old



  2. #2
    Elite Member TonyT's Avatar
    Join Date
    Jan 2000
    Location
    Fairfax, VA
    Posts
    10,337
    My opinion:

Let Windows manage memory and use the defaults, unless one does a lot of audio/video editing regularly.

Here's why:

The obvious reason to have memory and the page file set correctly is to assist in system performance. I say 'assist' because many other factors contribute to or reduce performance. Memory management is a small part of system performance.

The obvious downfalls of mismanaged memory are page faults (resulting in program or Windows crashes) or poor system performance. The main cause of page faults lies NOT in memory management, but in the code of the program that triggers the fault. Not all programs are coded the same.

There are 'standards & rules of thumb' for each programming language, but there are also many, many variables to these standards and rules. It is NOT an exact science, and there exist loopholes and workarounds to known, documented problems; therefore a program's code is never perfect. Now, add in an individual system's hardware differences and you have a broad general area of potential for error.

    There are basically 2 types of programs that cause or generate page faults. They are:

1. Programs that are coded without regard to stock Windows DLL regulations and guidelines — sometimes knowingly violated — or outright malicious programs (viruses and spyware).
    2. Poorly coded programs made by lazy programmers who failed to take the DLL rules into consideration or failed to test their applications thoroughly in many possible environments before they release them.

    I have not seen a page fault on my systems in over 3 yrs. Why? Because I ONLY use programs that are well coded and I never get spyware or viruses.

    The other side to this is RAM. There is no such thing as too much RAM on an NT based system. It is far far easier and better to add more RAM than it is to muck around with page file settings.

    And above all, configure your system to ONLY load what it needs to load when it starts up. This is accomplished by learning what Services are and TSR programs are.
    No one has any right to force data on you
    and command you to believe it or else.
    If it is not true for you, it isn't true.

    LRH

  3. #3

  4. #4
    SG Enthusiast Think's Avatar
    Join Date
    Sep 2001
    Posts
    2,283
Thank you, TonyT, for your input, and Burke as well; those are the types of threads that tend to cloud a clear understanding of how one should set these parameters, but they are very informative.

One point that seems to be the consensus amongst a few members here, such as TonyT, UOD and a few others, is not to set the min and max to the same values, nor to disable page filing.

I believe that TonyT has given a very logical explanation.


