
Re: Enh rqst: subversion being more robust in the face of poorly performing network or unreliable connection

From: Johan Corveleyn <jcorvel_at_gmail.com>
Date: Sun, 22 Apr 2018 23:54:29 +0200

On Wed, Apr 18, 2018 at 11:07 PM, Andrew Marlow <marlow.agents_at_gmail.com> wrote:
> Hello everyone,
>
> I am currently working at a place where, IMHO, the network is poorly
> configured. Connections are unreliable and average transmission rates are
> worse than poor ADSL dialup. Under these conditions we have a few problems
> with subversion. A brand new checkout of the entire repo is often
> interrupted with socket errors. Sometimes these force us to say 'svn
> cleanup' before resuming with 'svn up'. Even when the connection is not
> dropped it is very slow to get all the files. Hence, I was wondering if the
> network code might be improved to make it more robust in the face of network
> glitches. Also, could the file fetches be done in parallel?
>
> I realize these are non-trivial changes but is there any chance a future
> version of subversion may have them? Please? Otherwise the robustness would
> have to be added at application level in tools such as Tortoise SVN. That's
> fine for Windows but I also see the issue on the command line on Linux.

Hi Andrew,

What version of SVN are you using? Can you test with our latest release, 1.10?

As of 1.8, SVN uses the "serf" HTTP library, and already fetches the
files with individual GET requests, in parallel (by default, unless
the client or server is configured to force "bulk reports"). This is
called "skelta mode", as opposed to "bulk mode". See a bit of
explanation in the 1.8 release notes [1]. The amount of parallelism
depends on the config setting http-max-connections in the "servers"
file in your SVN runtime configuration area.
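
For example, assuming a default setup (the runtime configuration area
is ~/.subversion on Linux, %APPDATA%\Subversion on Windows), you could
raise it in the [global] section of the servers file. The value 8 below
is purely illustrative, not a recommendation:

  [global]
  http-max-connections = 8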

There are a couple of other configuration parameters in the servers
file that you can experiment with to make your svn operations more
robust (see the example after the list):
### http-timeout              Timeout for HTTP requests in seconds
### http-compression          Whether to compress HTTP requests
### http-max-connections      Maximum number of parallel server
###                           connections to use for any given
###                           HTTP operation.
### http-chunked-requests     Whether to use chunked transfer
###                           encoding for HTTP requests body.

That said, if you have any concrete suggestions on how to improve the
robustness, we're always glad to hear them.

[1] http://subversion.apache.org/docs/release-notes/1.8.html#neon-deleted

-- 
Johan
Received on 2018-04-22 23:55:00 CEST
