
RE: Recursively finding locks based on a URL

From: Phillip Gussow <pgussow_at_cordys.com>
Date: Thu, 16 Sep 2010 13:12:09 +0200

Hi Allan,

Thanks for your response. It seems this is indeed the path to take to solve
my problem.
The only downside is that it lists everything in the project, and there might
be thousands of files in there with only three locks among them. So it would be
nice to have SVN do the filtering server side (in svnserve or the Apache module).

Thanks again and regards,

Phillip

-----Original Message-----
From: Campbell Allan [mailto:campbell.allan_at_sword-ciboodle.com]
Sent: Thursday, 16 September 2010 12:09
To: users_at_subversion.apache.org
Cc: Phillip Gussow
Subject: Re: Recursively finding locks based on a URL

On Thursday 16 Sep 2010, Phillip Gussow wrote:
> Hi group,
>
>
>
> I’m running into a situation where I need to know all locks that exist on
> a certain repository path.
>
> I know there is the svnadmin lslocks command, which is in general what I
> need, but I need to do it on a URL, because I don’t have access to the
> server path.
>
>
>
> Background: we’re using a software package which utilizes the locks in SVN
> to prevent editing of unmergeable content. But sometimes the product puts
> ‘illegal’ locks on some files without showing them to its end users.
>
> So what I would like is to ask SVN for a list of all paths that are locked.
>
> So something like this:
>
> svn lslocks https://server/svn/repository_name/project1/trunk
>
>
>
> Which produces something like this:
>
> user1 2010-09-16 10:00:00 /folder1/folder2/file.txt
> The comment
>
> user3 2010-09-16 10:00:00 /folder1/folder2/folder3/something.jsp
> The comment
>
> user1 2010-09-16 10:00:00 /folder1/morefile.xml
> The comment
>
>
>
> Do I need to log a feature request for this? And if yes, how should I
> phrase this?
>
> Or is there some other way to achieve this? Keep in mind that I have to be
> able to run this on a URL from any client (of course using proper SVN
> credentials :) )
>
>
>
> Thanks in advance and regards,
>
>
>
> Phillip Gussow

If you are in an environment with access to perl,

svn ls -R --verbose <svn-url> \
 | perl -n -e 'print "$_" if /^\s*\d+\s\S+\s+O\s+/;'

will do some of it. All you're looking for is an O in the third column of the
output. I didn't bother optimising the regex into something nicer, but it's
possible, or you could get svn to list in XML format, and then it becomes
quite a bit easier to parse.
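
If you do go the XML route, something along these lines should work (an
untested sketch: it assumes svn ls --xml emits a <lock> element with owner,
created and comment children for locked entries, and XML::Twig has to be
installed from CPAN):

#!/usr/bin/perl
# Sketch: list locked paths (lock owner, lock creation date, lock comment)
# from the XML output of "svn ls". Adjust the element names if your svn
# client's XML output differs.
use strict;
use warnings;
use XML::Twig;

my $url = shift or die "usage: $0 <svn-url>\n";

my $xml = qx(svn ls --xml -R "$url");
die "svn ls failed\n" if $? != 0;

XML::Twig->new(
    twig_handlers => {
        entry => sub {
            my ($twig, $entry) = @_;
            my $lock = $entry->first_child('lock');
            return 1 unless $lock;                   # not locked, skip it
            printf "%s  %s  %s\n  %s\n",
                $lock->first_child_text('owner'),
                $lock->first_child_text('created'),
                $entry->first_child_text('name'),
                $lock->first_child_text('comment') || '(no lock comment)';
            $twig->purge;                            # listings can be large
            return 1;
        },
    },
)->parse($xml);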

I don't know what the comment would be as subversion does not require a
comment to be provided when locking. If you wanted the last commit message
then that would require a second request for each locked file. I also do not
believe the date is the time of the lock being obtained.
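
The second request per locked file could be as simple as (untested, using the
example URL and path from your mail):

 svn info https://server/svn/repository_name/project1/trunk/folder1/folder2/file.txt

which, for a locked file, prints Lock Owner, Lock Created and Lock Comment
lines, while svn log --limit 1 on the same URL returns the last commit message.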
