On Mon, 2011-03-07, John Beranek wrote:
> On 05/03/2011 23:23, John Beranek wrote:
> > Hi,
> > I'm not sure if much performance comparison has been performed, but I'm
> > unhappy to report a significant _reduction_ in speed in a checkout.
> Hmm...I'm surprised (and disappointed). Is no one interested in
> Subversion 1.7 performing worse than 1.6?

I'm interested. Thank you for your report. I read it yesterday and
couldn't think of anything terribly helpful to say in reply. Some of us
are working on the speed issues. (Personally, I'm looking at correctness
issues at the moment.) It's good to see hard numbers. That's a speed
regression that we certainly need to address.

It's helpful for development if we can see measurements that are
isolated to one area at a time. If you're interested in the HTTP
changes, do speed comparisons on operations that don't involve the WC,
such as 'import', 'export', and 'log'. On the other hand, to check on WC
performance, measure 'checkout', 'update' (when nothing has changed),
'update' (when lots has changed), 'commit', 'add', 'move', 'delete', and
so on, using local ('file://') repository access.

One thing we don't have, but need, is a set of performance criteria that
we can apply to decide when 1.7 is fast enough to release. We just have
a general agreement that it needs to be "at least as fast as 1.6".
Because of the wide-ranging changes, the change in performance will vary
quite widely depending on the platform, the file system, and the
operation. I assume we would accept a little bit of slow-down in some
configurations if necessary.

But we don't yet have any standard set of benchmarks at all. It would
therefore be very useful to have a script that runs a speed test like
yours, or a set of such tests. Would you be able and willing to write a
self-contained script that creates a load of test data from nothing and
then exercises checkout (and perhaps other operations)? It needs to
work with both svn 1.6 and svn 1.7, of course, and it needs to run on
Windows and Linux, so I'd suggest writing it in Python like most of our
tests.

I think we'll want to run this script on the build-bots so that we can
see frequent reports, tied to source code changes, on a variety of
platforms. (It would be lovely to see the numbers plotted automatically
on a graph on the web site, but it's more important just to have the raw
test capability. Flashy presentation can come later, if at all.)

Would you be interested in helping to set up any of this? It would be
jolly helpful if so.

Received on 2011-03-08 15:41:12 CET