The take-away for me is that if we adopted a convention (or did
something in the test code) to ensure that every XFail is associated
with an issue, it would collectively save the project a lot of time
and effort in understanding the significance of each XFailing test.
This becomes especially important in the run-up to a release.
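To give a rough idea of what that could look like, here is a minimal
sketch against svntest/testcase.py; the parameters (issues, milestone)
are only illustrative and are not part of the current API:

    class XFail:
        """Wrap a test that is expected to fail, recording the issue(s)
        and target milestone it is associated with."""

        def __init__(self, test_func, issues=None, milestone=None):
            self.test_func = test_func
            self.xfail = True               # marks the wrapped test as XFail
            self.issues = issues or []      # e.g. [3242]
            self.milestone = milestone      # e.g. '1.7.0'

        def __call__(self, *args, **kwargs):
            # Run the wrapped test; the harness would treat a failure
            # here as the expected result.
            return self.test_func(*args, **kwargs)

    # Hypothetical usage in a module's test_list:
    #
    #   test_list = [ None,
    #                 basic_checkout,
    #                 XFail(merge_with_renames, issues=[1234],
    #                       milestone='1.7-consider'),
    #               ]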
On Tue, Feb 1, 2011 at 9:53 PM, Paul Burba <ptburba_at_gmail.com> wrote:
> Hi All,
>
> One of the roadmap items yet to be started was a 'test review' to
> 'Determine which XFail and WIP tests should remain so, and which need
> to be fixed before release.'
>
> I took a look at this today. We currently have 61 tests set to XFail
> (2 of these are WIPs). Here is how they break down:
>
> A) Tests with an associated issue (35).
>
> The target milestones of these 35 break down like this:
>
> (8) Never scheduled, i.e. '---'
> (9) Unscheduled [3 of these are marked as RESOLVED]
> (1) 1.6.1
> (3) 1.7.0
> (8) 1.7-consider
> (1) 1.8.0
> (5) 1.8.0-consider
>
> B) 26 tests have no issue associated with them (well, there *might*
> be an issue, but a cursory look in the issue tracker didn't reveal
> anything obvious).
>
> So a whopping 55 tests are *potentially* release blockers (all but the
> 1.8.0* milestone issues).
>
> I put a summary of all 61 XFailing tests in
> ^/subversion/trunk/notes/xfail-status[1].
>
> For many of the tests I listed a "point person" as an educated guess
> as to who is best equipped to decide the fate of the test for 1.7. In
> the case of tests with an associated issue that is assigned to
> someone, the point person is the assignee. For issues not assigned to
> anyone, I left this blank. In the case of tests with no associated
> issues, this is usually the person who committed the test or changed
> it to XFail. If I cc'ed you then you are one of the lucky point
> people :-P
>
> Keep in mind this is just a guess; if your name is there, you aren't
> compelled to do anything...though I suspect we'd all agree it's bad
> form to create an XFailing test with no associated issue when we
> aren't planning on fixing the underlying problem.
>
> If you have some time, please look through the list. If a test you
> wrote needs an issue, please write one up. If there is an existing
> issue that should be associated with a test, add it. If an associated
> issue is unscheduled or never scheduled and you're familiar with it,
> please take a stab at some issue triage.
>
> Thanks,
>
> Paul
>
> [1] I'm happy to find this a new home if it belongs elsewhere.
> Ideally I'd like to tweak svntest/testcase.py:XFail() so that we have
> a way to associate an issue and a target milestone with each XFailing
> test, so we can easily create this list. I'll take a look at doing
> just that tomorrow once I get through the items I've assigned myself.
>
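Building on Paul's footnote: once each XFail carries that metadata,
regenerating the xfail-status summary could be close to trivial. A
rough sketch (again, the 'xfail', 'issues' and 'milestone' attributes
are hypothetical, not something the test harness provides today):

    def report_xfails(test_lists):
        """Print one line per XFailing test: module, test name,
        associated issue(s) and target milestone.  'test_lists' maps a
        module name to that module's test_list."""
        for module_name, tests in sorted(test_lists.items()):
            for test in tests:
                if test is None or not getattr(test, 'xfail', False):
                    continue
                name = getattr(getattr(test, 'test_func', test),
                               '__name__', repr(test))
                issues = getattr(test, 'issues', None) or ['<none>']
                milestone = getattr(test, 'milestone', None) or '---'
                print('%-20s %-35s issue(s): %-10s milestone: %s'
                      % (module_name, name,
                         ', '.join(str(i) for i in issues), milestone))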
--
Thanks
Mark Phippard
http://markphip.blogspot.com/
Received on 2011-02-02 13:47:54 CET