
Re: stopping webcrawlers using robots.txt

From: Bob Proulx <bob_at_proulx.com>
Date: 2006-07-10 00:28:40 CEST

Thomas Beale wrote:
> I don't think changing the URL is an option - it is published all over
> the place - everyone in our community knows it.

I understand the pain of it.

> I am quite surprised to find that the configuration which seems to
> be preferred by the subversion manual is not compatible with
> managing web robots...

Where does the Subversion manual say this? I can't find anything that
recommends that, and if such a recommendation exists I presume it is a
documentation bug that should be reported. The examples I see all use
/repos. The Subversion project itself also uses the /repos convention:

  http://svn.collab.net/repos/svn/trunk
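
For instance, with every repository under a common /repos prefix, a
single robots.txt rule is enough to keep crawlers out of all of them
(a minimal sketch; the prefix is only an example, adjust it to your
own layout):

  User-agent: *
  Disallow: /repos/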

Bob

