
Re: Robots and Spiders, oh my!

From: Justin Erenkrantz <justin_at_erenkrantz.com>
Date: 2004-03-12 07:33:12 CET

--On Thursday, March 11, 2004 10:01 PM -0600 "Brian W. Fitzpatrick"
<fitz@red-bean.com> wrote:

> Shouldn't we have a big bold warning box in the book telling people to
> create a robots.txt file in their DOCUMENT_ROOT that contains:
>
> User-agent: *
> Disallow: /
>
> We've had this on svn.collab.net for ages, and I'm thinking we should
> really let people know about it.

For apache.org, I don't think we would do this. I don't see how excluding
robots is possibly a good idea.

> # Disallow browsing of Subversion working copy administrative
> # directories.
> <DirectoryMatch "^/.*/\.svn/">
>     Order deny,allow
>     Deny from all
> </DirectoryMatch>
>
> Thoughts?

Not for this one either. What's possibly sensitive here? The auth info isn't
stored there any more. -- justin
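
[Note: the quoted block above uses Apache 2.2-style access control
(Order/Deny). On Apache 2.4 and later, where mod_authz_core supersedes
those directives, a minimal equivalent sketch would be:

<DirectoryMatch "^/.*/\.svn/">
    # Refuse all web access to Subversion working-copy admin directories.
    Require all denied
</DirectoryMatch>

To check that the block is in effect, request a path such as
http://example.com/project/.svn/entries (a hypothetical URL used only for
illustration); the server should answer 403 Forbidden.]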

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@subversion.tigris.org
For additional commands, e-mail: dev-help@subversion.tigris.org
Received on Fri Mar 12 07:33:28 2004
