
Re: Potential bug, svn delete with large number of files in directory

From: Andy Levy <andy.levy_at_gmail.com>
Date: 2006-05-05 15:28:52 CEST

On 5/5/06, Ryan Schmidt <subversion-2006q2@ryandesign.com> wrote:
> On May 5, 2006, at 04:13, Gary Bentley wrote:
>
> > Did a svn delete <directory> of a directory which contains 18000
> > files. The delete started off ok but after about 30 minutes it
> > just "froze", that is, no more output was appearing in the
> > shell. [snip]
>
> Wasn't there some problem with the Windows shell mentioned on this
> list a couple times now where it just blows up if it gets too much
> data? Would printing 18,000 paths qualify as too much data? If this
> was the problem, then I believe the only solution was to get
> Microsoft to fix the Windows shell or to print less data to it (i.e.
> delete fewer files at a time?).... Or to get an OS with a better
> shell....

Or use the -q switch for svn (variation on "print less data").
Or redirect the output of the command to a file.
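Something along these lines should do it (the log file name below is
just an example):

    svn delete -q <directory>
    svn delete <directory> > delete.log 2>&1

The 2>&1 part also captures any error output in the same file.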

Maybe PowerShell (formerly known as Monad) doesn't have such a limitation?

BUT, I don't think that's what's happening here. When that limit is
hit, the shell usually reports an error, terminates the program and
returns you to a prompt. For some reason I can't reproduce it right
now, so I can't quote the actual message.

---------------------------------------------------------------------
To unsubscribe, e-mail: users-unsubscribe@subversion.tigris.org
For additional commands, e-mail: users-help@subversion.tigris.org
Received on Fri May 5 15:29:57 2006
