
Running a repository out of RAM?

From: Troy Curtis Jr <troycurtisjr_at_gmail.com>
Date: 2007-06-17 19:31:26 CEST

A solution to this is probably not Subversion-specific, so the question
may not be 100% on-topic, but I am hoping that someone on the list has
some helpful experience.

I *may* be getting a new server which will have more RAM than you can
shake a stick at. In an effort to make the best use of said RAM, I am
trying to determine whether I can successfully run a Subversion
repository straight out of RAM, but in a write-through, caching sort of
way such that the data is also written to disk and can survive a power
failure (important for version control, wouldn't you say? :) )
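
To make the "written to disk" half concrete, the rough idea I have in
mind is hosting the live repository on tmpfs and having a post-commit
hook append each new revision to an incremental dump file on persistent
disk. A minimal sketch (the paths are just placeholders, and for a
repository this size the per-commit overhead would need testing):

#!/usr/bin/env python
# Hypothetical post-commit hook: the live repository sits on tmpfs, and
# each new revision is appended to an incremental dump file on real disk
# so the history survives a power failure. Paths are placeholders.

import subprocess
import sys

DUMP_FILE = "/var/backups/svn/big-repo.dump"  # lives on persistent disk

def main():
    repos, rev = sys.argv[1], sys.argv[2]  # Subversion passes REPOS and REV
    with open(DUMP_FILE, "ab") as out:
        # "svnadmin dump --incremental -r REV" emits only the new revision.
        subprocess.check_call(
            ["svnadmin", "dump", "--incremental", "-r", rev, repos],
            stdout=out)

if __name__ == "__main__":
    main()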

This will be running on a Red Hat Enterprise Linux 4 Update 4 box, with
the repositories served out via Apache. There will be several
repositories, but there is really only one that I think would benefit:
it is ~2.5GB in size and uses the BDB back end. (Actually, running from
RAM might improve FSFS performance enough that I could switch to it!)

I know that Linux is natively aggressive in its disk caching, and after
the first request much of the data WILL be cached. However, this server
will also be serving out 5+ GB VMware images, which will likely evict
the cached repository blocks fairly quickly, so many requests will
effectively hit a cold cache. I have been doing some Google searches,
but all the info I have run into is related to caching network file
systems on the client side.
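
The only workaround I have come up with so far is a crude cache-warming
script that periodically re-reads every file in the repository so its
blocks stay resident in the page cache while the VMware images are being
served. Something along these lines (path and interval are placeholders,
and this only re-touches the pages periodically; it does not pin them
the way mlock would):

#!/usr/bin/env python
# Hypothetical cache-warming script: walk the repository and read every
# file so its blocks stay in the Linux page cache. Path and interval are
# placeholders.

import os
import time

REPO_PATH = "/srv/svn/big-repo"
INTERVAL = 300  # seconds between passes

def warm(path):
    for dirpath, _dirnames, filenames in os.walk(path):
        for name in filenames:
            full = os.path.join(dirpath, name)
            try:
                f = open(full, "rb")
                try:
                    while f.read(1 << 20):  # read in 1 MiB chunks
                        pass
                finally:
                    f.close()
            except (IOError, OSError):
                pass  # file vanished or is unreadable; skip it

if __name__ == "__main__":
    while True:
        warm(REPO_PATH)
        time.sleep(INTERVAL)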

Ideas anyone?

Thanks ahead of time for any info.

Troy

-- 
"Beware of spyware. If you can, use the Firefox browser." - USA Today
Download now at http://getfirefox.com
Registered Linux User #354814 ( http://counter.li.org/)
---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@subversion.tigris.org
For additional commands, e-mail: users-help@subversion.tigris.org
Received on Sun Jun 17 19:51:37 2007
