
RE: Re: timeout issues over vpn

From: Rob van Oostrum <rob.vanoostrum_at_blastradius.com>
Date: 2006-01-09 01:53:39 CET

I've been having this same problem: a large repository (>16 GB) and all
Windows clients, with lots of switching between branches (a fresh
checkout of the >5 GB codebase can take a long time, depending on
things like disk fragmentation).

My Apache configuration already had KeepAlive enabled. I increased my
KeepAliveTimeout and also raised MaxClients to keep users from getting
refused connections (I only have to worry about 200 or so users at
most).
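
For anyone wanting to try the same thing, the relevant httpd.conf
directives look roughly like this (the timeout and client-limit values
below are only examples; pick numbers that suit your own MPM and load):

  # Let clients reuse connections instead of opening a new one per request
  KeepAlive On
  # Give idle kept-alive connections more time before Apache closes them (seconds)
  KeepAliveTimeout 60
  # Raise the worker limit so kept-alive connections don't exhaust the pool
  MaxClients 256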

Potentially related: I had also previously set MaxRequestsPerChild
because client-serving threads would periodically lock up. That forced
me to stop Apache and kill the offending process(es) manually,
resulting in minor downtime and inconvenience for end users. Forcing
the Apache child processes to die and be respawned from time to time
seems to have made that problem go away.
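
If you want to try that as well, it is a single directive; the count
below is an arbitrary example rather than a tuned value:

  # Recycle each Apache child process after it has served this many requests;
  # periodic recycling seems to keep the lock-ups from building up
  MaxRequestsPerChild 10000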

Hope this helps.

Cheers
Rob

-----Original Message-----
From: Steve Williams [mailto:stevewilliams@kromestudios.com]
Sent: Sunday, January 08, 2006 7:09 PM
To: Ximon Eighteen
Cc: steven.higgan; users@subversion.tigris.org
Subject: Re: timeout issues over vpn

Ximon Eighteen wrote:
> steven.higgan wrote:
>
>>Has anybody had any issues regarding receiving timeout errors when
>>attempting to commit small (50kb) change sets over a VPN?
>>
>>The exact error we're getting is 'Can't read from connection. An
>>existing connection was forcibly closed by the remote host'.
>>
>>Fwiw we have tried setting http-timeout to 30 but it didn't appear
>>to do anything.
>
>
> I think someone recently had a similar problem and solved it by
> changing the value of the Apache KeepAlive setting. I can't find the
> message now using the Google search at svn.haxx.se and I've deleted
> the mail, but there was something about KeepAlive recently.

Possibly this one from Carlo Hogeveen on Jan 5 2006:
=====================================================
I had a consistent bug importing or checking in very large
numbers of files through http. Smaller numbers were no problem.
And the problem occurred only from Windows clients.

Checking this mailing list revealed many similar reports,
with the suggestion to set LimitXMLRequestBody to 0 or a high value,
but to no avail. The suggestion to use Ethereal was educational,
but it only revealed that the client's final failing PUT never even
reached the client's own interface as a packet.

The solution turned out to be this:

In httpd.conf change "KeepAlive Off" to "KeepAlive On",
and restart Apache.
=====================================================

-- 
Sly
---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@subversion.tigris.org
For additional commands, e-mail: users-help@subversion.tigris.org
Received on Mon Jan 9 01:56:18 2006
