
Re: Mike Pilato, a question for you

From: C. Michael Pilato <cmpilato_at_collab.net>
Date: Mon, 30 Jun 2008 12:37:25 -0400

Karl Fogel wrote:
> "Rui, Guo" <timmyguo_at_mail.ustc.edu.cn> writes:
>> On Sun, Jun 15, 2008 at 02:53:42PM +0800, Rui, Guo wrote:
>>> On Sat, Jun 14, 2008 at 08:58:08PM -0400, Karl Fogel wrote:
>>>>> --- branches/issue-2843-dev/subversion/libsvn_wc/adm_crawler.c (r31723)
>>>>> +++ branches/issue-2843-dev/subversion/libsvn_wc/adm_crawler.c (r31724)
>>>>> @@ -278,6 +278,19 @@ report_revisions_and_depths(svn_wc_adm_a
>>>>> continue;
>>>>> }
>>>>>
>>>>> + /* Report the excluded path, no matter whether report_everything. */
>>>>> + if (current_entry->depth == svn_depth_exclude)
>>>>> + {
>>>>> + SVN_ERR(reporter->set_path(report_baton,
>>>>> + this_path,
>>>>> + SVN_INVALID_REVNUM,
>>>>> + svn_depth_exclude,
>>>>> + FALSE,
>>>>> + NULL,
>>>>> + iterpool));
>>>>> + continue;
>>>>> + }
>>>>> +
>>>> I think it would be good for this comment to explain in more detail why
>>>> we're reporting the excluded path always.
>>> Because the report_everything flag indicates that the server will treat the
>>> wc as empty and thus push the full content of the files/subdirs. We just want
>>> to say no to this. Is that clear enough? I'll write it up as the new comment.
>> I found a bug here while working on a new test case. The check is too
>> restrictive. We should bypass the svn_depth_exclude report when
>> depth_is_sticky is set, so as to let the server-side modifications through.
>>
>> After reading the crawler code, I was quite confused. There seems to be no
>> easy way to get the depth_is_sticky information. Why does the code work
>> without knowing it? Server-side magic? It seems that the server will always
>> push edits to the degree required by 'requested_depth', while the wc_dir is
>> only used to determine whether full content or just a delta is needed. I did
>> not read the server code very carefully, so correct me if I am wrong.
>
> (In a previous mail, I mistakenly referred to this problem as being
> elsewhere in the code. Sorry for the confusion. I know where we are
> now.)
>
> Uh. Hmmm. That's an interesting question... I think cmpilato is
> probably better qualified to answer it. Mike? By the way, the full
> thread is
>
> http://subversion.tigris.org/servlets/BrowseList?list=dev&by=thread&from=660362
>
> and the message I'm responding to here is:
>
> http://subversion.tigris.org/servlets/ReadMsg?list=dev&msgNo=140208

I. Uh. I dunno.

The crawler is responsible for telling the server what the working copy
contains. The ra_do_update call tells the server what the working copy
wants. The server gives the client whatever it needs to become what it wants
to become. This much I know. I know no more.

Given this, I would assume that the crawler doesn't need to know the
"depth_is_sticky" bit because that is a facet of the requested operation,
not the working copy description.
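
For what it's worth, here's how I picture that division of labor, sketched
loosely against the 1.5-era RA interfaces. The signatures are from memory and
the surrounding variables (session, editor, edit_baton, requested_depth, and
so on) are just placeholders, not a quote of the real code:

   /* Rough sketch from memory -- not the actual update code. */
   const svn_ra_reporter3_t *reporter;
   void *report_baton;

   /* The operation: do_update carries what the working copy WANTS,
      including the requested depth.  If anything cares about
      depth_is_sticky, it's the caller on this side of the fence. */
   SVN_ERR(svn_ra_do_update2(session, &reporter, &report_baton,
                             revision_to_update_to,
                             "",                 /* update target */
                             requested_depth,    /* what we want to become */
                             FALSE,              /* send_copyfrom_args */
                             editor, edit_baton, pool));

   /* The description: the crawler walks the working copy and tells the
      server what we HAVE, entry by entry.  An excluded entry is just
      another fact about the working copy: */
   SVN_ERR(reporter->set_path(report_baton, this_path,
                              SVN_INVALID_REVNUM, /* no revision of our own */
                              svn_depth_exclude,  /* nothing here, keep it so */
                              FALSE, NULL, iterpool));

   /* ...and when the walk is done, the server drives the editor back at
      us with whatever it takes to turn HAVE into WANT. */
   SVN_ERR(reporter->finish_report(report_baton, pool));

At least, that's the mental model I'm working from.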

-- 
C. Michael Pilato <cmpilato_at_collab.net>
CollabNet   <>   www.collab.net   <>   Distributed Development On Demand

Received on 2008-06-30 18:37:41 CEST
