This is sometimes feasible, but sometimes it is a hack with painful
consequences in the form of software incompatibilities.
> I guess what I really mean is that I think Linus' strategy of
> generally optimizing for the "usual case" is a good thing. It
> is actually quite annoying in general to have that many files in
> a single directory (think \winnt\... here). So maybe it would
> be better to focus on the normal situation of, say, a few hundred
> files in a directory rather than thousands ...
Linus' strategy is to not let optimizations for uncommon cases penalize
the common case. However, I think we can make an improvement here that
will work well even on moderate-sized directories.
My main problem with the fixed-depth tree proposal is that it seems to
work well for a certain range of directory sizes, but the range seems a
bit arbitrary. The case of very small directories is also quite
important.
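To illustrate the trade-off (this is only a toy sketch, not any actual
or proposed filesystem code; the threshold and structure are invented
for illustration): keep small directories as a plain linear list, which
is cheap in the common case, and build a hashed index only once the
directory grows past a cut-over point.

```python
class ToyDirectory:
    """Toy model of an adaptive directory: linear scan while small,
    hashed index once large.  Purely illustrative; the threshold of
    256 entries is an assumption, not from any real filesystem."""

    THRESHOLD = 256

    def __init__(self):
        self._entries = []   # flat list: the "usual case"
        self._index = None   # hash index, built only when large

    def add(self, name, inode):
        self._entries.append((name, inode))
        if self._index is not None:
            # Index already exists: keep it up to date.
            self._index[name] = inode
        elif len(self._entries) > self.THRESHOLD:
            # Crossed the cut-over point: build the index once.
            self._index = dict(self._entries)

    def lookup(self, name):
        if self._index is not None:
            return self._index.get(name)       # O(1) for big dirs
        for n, ino in self._entries:           # O(n), fine when small
            if n == name:
                return ino
        return None
```

The point of the sketch is that the small-directory case pays nothing
for the indexing machinery until it is actually needed.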
-hpa
-- 
<hpa@transmeta.com> at work, <hpa@zytor.com> in private!
"Unix gives you enough rope to shoot yourself in the foot."
http://www.zytor.com/~hpa/puzzle.txt