> -----Original Message-----
> From: Bert Huijben [mailto:bert_at_qqmail.nl]
> Sent: Friday, April 09, 2010 8:27 AM
> To: 'Paul Holden'; dev_at_subversion.apache.org
> Subject: RE: Severe performance issues with large directories
>
> This issue is actually worse on Windows than on Linux, because NTFS is a
> fully transactional filesystem with more advanced lock handling, and
> because of that it has to do more work to open a file. (Some tests I
> performed 1.5 years ago indicated that NTFS is more than 100 times slower
> at handling extremely small files than the ext3 filesystem on Linux, while
> throughput within a single file is not far apart.)
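For anyone who wants to reproduce that comparison, below is a rough
micro-benchmark sketch of my own (the file count and sizes are arbitrary;
this is not Bert's actual test). Run it on an NTFS volume and an ext3
volume and compare the timings:

    import os
    import time

    def bench_small_files(root, count=10000, size=32):
        # Create `count` tiny files under `root`, then reopen each one,
        # timing the two passes separately.
        os.makedirs(root, exist_ok=True)
        payload = b"x" * size

        start = time.time()
        for i in range(count):
            with open(os.path.join(root, "f%06d" % i), "wb") as f:
                f.write(payload)
        create_secs = time.time() - start

        start = time.time()
        for i in range(count):
            with open(os.path.join(root, "f%06d" % i), "rb") as f:
                f.read()
        reopen_secs = time.time() - start

        print("create: %.2fs  reopen: %.2fs" % (create_secs, reopen_secs))

    if __name__ == "__main__":
        bench_small_files("bench_tmp")
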
I've had to go to some lengths to deal with poor performance of large
working copy folders under Windows.
I've got working copies containing media snippets that are shared between
Solaris and Windows systems. Since the layout was originally designed for
the Solaris systems, putting many thousands of files in a single directory
was never a problem there. Under Windows, a checkout of one of these folders
starts out zipping along but eventually slows to a crawl. Left to itself,
it'll take several days to complete. (Under Linux, the same checkout takes a
couple of hours.)
And before anyone suggests it: this has nothing to do with virus checking.
I don't install virus checking until after I've set up the working copies.
To create a new Windows working copy, or to apply a major addition, we've
taken to doing the update/checkout on a Linux system, archiving the working
copy, and extracting the archive onto the Windows system. Obviously, not our
preferred method.
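For what it's worth, the transfer step is easy to script. Here's a rough
sketch (the repository URL and paths are placeholders, not our real layout):

    import subprocess
    import tarfile

    # Placeholders for illustration only.
    REPO_URL = "http://svn.example.com/repos/media"
    LINUX_WC = "/tmp/media-wc"
    ARCHIVE = "/tmp/media-wc.tar.gz"

    # 1. Do the checkout where it's fast: on the Linux box.
    subprocess.check_call(["svn", "checkout", REPO_URL, LINUX_WC])

    # 2. Archive the entire working copy, .svn metadata included, so the
    #    result is still a valid working copy when unpacked elsewhere.
    with tarfile.open(ARCHIVE, "w:gz") as tar:
        tar.add(LINUX_WC, arcname="media-wc")

    # 3. Copy the archive to the Windows machine (scp, a network share,
    #    etc.) and extract it there, e.g.:
    #        with tarfile.open(r"C:\temp\media-wc.tar.gz", "r:gz") as tar:
    #            tar.extractall(r"C:\work")

Extraction still writes the same number of files, but it skips the per-item
administrative churn (tempfiles, log files, property files) that the
working-copy library performs, which seems to be where NTFS falls over.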
Any performance testing for Subversion should include testing under Windows.
Theological discussions aside, it's an important market segment.
---
Geoff Rowell
geoff.rowell_at_gmail.com