Actually, the length of this interval could be even smaller
and is often a point of furious debate.
The 2Q algorithm seems to have solved this problem by not
using an interval, but a FIFO queue of small, fixed length.
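To make that concrete, here is a toy sketch of the *simplified* 2Q idea
(the full algorithm uses two auxiliary queues, A1in/A1out; this is the
one-FIFO simplification, with made-up queue sizes for illustration):
new pages enter a small FIFO, and only pages re-referenced while still
in the FIFO get promoted to the main LRU list.

```python
from collections import OrderedDict, deque

class SimplifiedTwoQ:
    """Simplified 2Q sketch: first-touch pages enter a small FIFO (a1);
    only pages referenced again while still in a1 are promoted to the
    main LRU list (am).  Sizes here are illustrative, not tuned."""

    def __init__(self, a1_size=4, am_size=8):
        self.a1 = deque()            # FIFO of freshly faulted pages
        self.a1_size = a1_size
        self.am = OrderedDict()      # LRU list: page -> None
        self.am_size = am_size

    def access(self, page):
        if page in self.am:
            self.am.move_to_end(page)        # LRU hit: mark most recent
        elif page in self.a1:
            self.a1.remove(page)             # re-referenced: promote to am
            self.am[page] = None
            if len(self.am) > self.am_size:
                self.am.popitem(last=False)  # evict least-recently-used
        else:
            self.a1.append(page)             # first touch: into the FIFO
            if len(self.a1) > self.a1_size:
                self.a1.popleft()            # never reused: quietly dropped
```

The nice property: a long sequential scan of pages touched only once
flows through the FIFO and falls off the end, never displacing the
frequently-used pages in the main LRU list.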
> > This seems to be generally accepted theory and practice in the
> > field of page replacement.
>
> Okay, just seems odd to me, but IANAVMGuru.
Seems odd at first glance, true.
Let me give you an example:
- sequential access of a file
- script reads the file in 80-byte segments
(parsing some arcane data structure)
- these segments are accessed in rapid succession
- each 80-byte segment is accessed ONCE
In this case, even though the data is accessed only
once, each page is touched PAGE_SIZE/80 times, with
one 80-byte read() each time.
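The arithmetic above can be checked with a small simulation (assuming a
4 KB page and a hypothetical 3-page file; the numbers are illustrative):

```python
PAGE_SIZE = 4096   # assumed page size
READ_SIZE = 80     # the script's read() granularity
FILE_SIZE = 3 * PAGE_SIZE  # hypothetical small file

touches = {}       # page index -> number of read() calls touching it
offset = 0
while offset < FILE_SIZE:
    end = min(offset + READ_SIZE, FILE_SIZE)
    # a single 80-byte read may straddle a page boundary,
    # so count a touch for every page the read overlaps
    for page in range(offset // PAGE_SIZE, (end - 1) // PAGE_SIZE + 1):
        touches[page] = touches.get(page, 0) + 1
    offset = end

print(touches)  # each page is touched ~PAGE_SIZE/80 times, i.e. ~51-52
```

So although the file's *data* is used exactly once, every page looks
"frequently referenced" to anything counting page accesses.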
I hope this example gives you a handhold that makes this
somewhat counterintuitive concept easier to grasp.
regards,
Rik
--
Executive summary of a recent Microsoft press release:
   "we are concerned about the GNU General Public License (GPL)"
http://www.surriel.com/ http://www.conectiva.com/ http://distro.conectiva.com/