Originally posted by: Moralpanic
If you're running on NTFS partitions file system fragmentation is not usually a big issue
Why is this so? I've even heard others claim you don't even need to defrag with NTFS... but my 80 GB HDD hadn't been defragged in over 8 months, and I just did it the other day, and I noticed an improvement. It was entirely storage, with only one game loaded on it that I never played, so I figured why bother defragging... but I started to play the game again and noticed an improvement after the defrag.
AFAIK there are basically two reasons why fragmentation of the file system slows system performance, and this only really applies (on modern systems) when there is a LOT of hard drive access going on:
1. The file system's allocation tables (generic use of the term, here) are inefficient, and fragmentation causes them to become more inefficient. (The allocation tables tell the OS where all the file's fragments are. In a linear table system like you have in FATxx partitions, it can take the OS quite a bit of time to find out where all the pieces are.)
2. The drive heads have to jump around a lot more to scoop up all the file pieces when the file system is heavily fragmented.
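The first point can be made concrete with a toy sketch. These are invented structures, not the real FAT or NTFS on-disk formats: the idea is that a FAT-style lookup walks a cluster chain one link at a time, while an extent-based scheme like NTFS's records each contiguous run of clusters as a single entry, so a fragmented file costs one entry per fragment rather than one lookup per cluster.

```python
# Toy illustration only -- not the real FAT or NTFS on-disk formats.

def fat_clusters(fat, start):
    """Walk a FAT-style chain: one table lookup per cluster."""
    clusters = []
    c = start
    while c is not None:          # None stands in for the end-of-chain marker
        clusters.append(c)
        c = fat[c]
    return clusters

def extent_clusters(extents):
    """Expand an NTFS-style run list: one (start, length) pair per fragment."""
    clusters = []
    for start, length in extents:
        clusters.extend(range(start, start + length))
    return clusters

# A 6-cluster file split into two fragments: clusters 10-12 and 40-42.
fat = {10: 11, 11: 12, 12: 40, 40: 41, 41: 42, 42: None}
extents = [(10, 3), (40, 3)]

assert fat_clusters(fat, 10) == extent_clusters(extents)
# The FAT walk took 6 table lookups; the run list needed only 2 entries.
```

Both schemes name the same clusters; the difference is how much work (and how many table entries) it takes to find them, and that gap grows with fragmentation.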
But these two factors don't have as much effect as you might expect on most systems, and that's especially true of modern operating systems and modern file systems. Why? First, a file system like NTFS has a much more efficient way of indexing the locations of file fragments. Second, most applications don't read in the whole of any kind of file all at once. The fact that the file is scattered about doesn't mean much when the system is only looking for 4K or 16K or whatever of it at a time.

Obviously there are some exceptions to this. Some text editors, image editors, and games read in a whole large file at one time and then work on it in volatile memory. You would think that defragging would give you a big advantage when running such applications, but remember that an OS like Win 2000 or XP does a lot of scratching about on the drive for anything it calls into volatile memory, too. Pagefiles and metadata are a fact of life with these operating systems. Their use may actually slow a machine down a little under specific circumstances compared to how fast things might move if everything were done in volatile memory, but file system journaling and the many other benefits of these operating systems (including stability) are well worth it.
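The point above about applications reading only small pieces of a file at a time can be sketched generically. The 4K chunk size here is just illustrative; each read asks the OS for only that many bytes, so where the rest of the file physically sits on disk doesn't matter until you actually get to it:

```python
import io

def process_in_chunks(f, chunk_size=4096):
    """Read a file a small piece at a time, the way most software does.
    Only chunk_size bytes are requested per read, so the physical layout
    of the still-unread portions is irrelevant at each step."""
    total = 0
    while True:
        chunk = f.read(chunk_size)
        if not chunk:
            break
        total += len(chunk)   # stand-in for real per-chunk processing
    return total

# Works on any file-like object; here an in-memory stand-in for a 1 MB file.
data = io.BytesIO(b"x" * (1024 * 1024))
assert process_in_chunks(data) == 1024 * 1024
```

An editor or game that instead slurps the entire file before touching it is the exceptional case the next sentences describe.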
I don't doubt that, if you stop everything you're doing and do a full defrag, and then try your game, you might perceive a more rapid startup and even improved in-game performance. Defraggers that make proper use of the file-handling APIs in Windows XP will also use the file system's file-use data to place frequently and recently used stuff in prefetch caches. So the file system's fragmentation level will not have been the only variable on the system changed by your actions.
I also wouldn't discount the old placebo effect. Hey, I like to defrag personal systems on occasion, too. I feel better afterwards. But I don't really think I'm making a significant change in the performance of those systems when I defrag them.
Now, the places where I have seen truly noticeable and measurable differences in performance with the level of fragmentation are file servers being hit by hundreds of users simultaneously and some types of high-end (usually graphics and database) workstations that are chewing hard on the file system for data from many threads at once.
Bear in mind that this is essentially a shopworn (but experienced) hobbyist's observations. But I can tell you that I was a true believer in defragging at one time, and I've come to see it as being much less efficacious today than it was when I was using older operating systems with different memory and disk space management schemes and much slower hard drives.
- prosaic