On Wed, May 5, 2010 at 09:21, Brendan Farr-Gaynor
<brendan_at_resolutionim.com> wrote:
> I run a small team of web developers (6) who all work from an in-house repository. We make lots of commits and often notice that performance gets pretty bad, sometimes taking as long as 2 minutes to commit a change. We rely heavily on server-side software, so we often need to commit just to see what our code will do. This gets painful during debugging, when a developer will often want to commit many small changes quickly within the span of 5-10 minutes.
Set up each developer's workstation so that it can run the
application. Test locally, and don't commit code until you know
whether it works. Local debugging will be faster, too.
> We're using SVN on a newer quad-core Xserve with about 4 GB of RAM and a standard 7200 RPM disk (eSATA). Is there something we can do on the hardware side that would help? A solid-state drive? More RAM? Is there something in our SVN config that I should be tuning? We're currently using the stock install and config that Apple ships with OS X Server 10.6 (Snow Leopard).
Before you throw hardware at the problem, you need to determine what
the bottleneck is. Is it your network? Do you have long-running hook
scripts? Are you committing large binary files which Subversion
struggles with diffing, or has to write the full contents out every
time? Do you have a lot of path-based authorization rules? How do you
serve the repository (I'm guessing Apache, but you don't say)? What
else is running on the server? Does performance improve after an
Apache restart (assuming you serve with Apache) and then degrade over
time?
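If large binaries are one of your suspects, a quick sweep of a working
copy can confirm it. A minimal sketch; the 10 MB threshold and the
default path are illustrative, not a recommendation:

```shell
#!/bin/sh
# Sketch: list files over 10 MB under a directory, skipping .svn metadata.
# Large binaries make Subversion deltify or store a lot of data on every
# commit, which can dominate commit time. The 10 MB cutoff is illustrative.
find_large() {
  find "${1:-.}" -type f -size +10M ! -path '*/.svn/*' -print
}

find_large .
```

Run it at the root of a checkout; anything it prints is worth asking
whether it really belongs under version control.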
Apply the same sort of troubleshooting & optimization you apply to
your software development - observe, measure, then address the worst
offender first. Don't just throw random "fixes" at it hoping that
something works.
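One way to put numbers on those observations is to time each suspect
operation separately and compare. A minimal sketch; the commented svn,
curl, and ssh lines are placeholders for your own server and paths:

```shell
#!/bin/sh
# Minimal sketch: wall-clock timing for any command, so suspects can be
# compared. Whole-second resolution is plenty when commits take 2 minutes.
measure() {
  start=$(date +%s)
  "$@" >/dev/null 2>&1
  end=$(date +%s)
  echo "$* took $((end - start))s"
}

# Things worth comparing (placeholders; substitute your own URLs/hosts):
# measure svn commit -m "timing test" somefile
# measure curl -s http://your-svn-server/repos/project/
# measure ssh your-svn-server true

measure sleep 1
```

If the raw HTTP round-trip is fast but the commit is slow, look at the
server side (hooks, authz, disk); if even the round-trip is slow, look
at the network first.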
Received on 2010-05-05 15:30:52 CEST