On Fri, Mar 25, 2011 at 12:33 PM, Mark Phippard <markphip_at_gmail.com> wrote:
> I have been working on a framework for writing tests to record
> performance. I have something good enough to share:
> It is pretty easy to add new tests if you have ideas on more we
> should add. I think I have pretty good coverage of the major
> functions. The wiki on the site I linked to above has details on how
> I have constructed the current tests. I am going to put out a call to
> users for feedback and try to get more people to run the tests and
> record results.
> I am not claiming these are anything definitive or even that we will
> use them to help us make the release decision, but I think it is a
> start on coming up with some reproducible tests that people can run
> easily. If after people look at and run the tests they think they are
> useful or can be tweaked to be useful, then great. If not, then at
> least I got to write some code for a change :)
> The tests are written in Java because that is what I know and it gives
> me good cross-platform coverage. However, the Java code just drives the
> command line, so all you need to do is have the svn command line in your
> PATH; that is what it uses for all the work.
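A minimal sketch of the CLI-driving approach Mark describes (hypothetical code, not the actual framework — class and method names are made up): the Java harness does no Subversion work itself, it just launches whichever `svn` binary is on the PATH and times the invocation.

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch of a Java harness that shells out to a command-line
// tool found on the PATH and records how long the invocation takes.
public class CliTimer {
    // Run `exe args...` once and return the wall-clock time in milliseconds.
    public static long timeCommand(String exe, String... args)
            throws IOException, InterruptedException {
        List<String> cmd = new ArrayList<>();
        cmd.add(exe);
        cmd.addAll(Arrays.asList(args));
        long start = System.nanoTime();
        // inheritIO() lets the child's output flow to the console,
        // so the timing includes producing (but not parsing) its output.
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        p.waitFor();
        return TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);
    }

    public static void main(String[] args) throws Exception {
        // Requires the svn command line to be in PATH, as the mail notes.
        System.out.println("svn --version: "
                + timeCommand("svn", "--version") + " ms");
    }
}
```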
Very cool to see something that will hopefully give us a
quantitative measure of performance.
I've seen people submit reports based on particular revisions. Would
it be possible to run the same suite of tests across a number of
different revisions, to give us some sense of change over time? It'd
be nice to know whether we're getting better or worse, how particular
changes affected performance, etc.
Just a thought.
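One way to sketch the revision-sweep idea above (hypothetical code: the paths, revision labels, and class name are all made up — it assumes one pre-built svn binary per revision under test):

```java
import java.io.IOException;
import java.util.Arrays;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch: run the same timed command against several
// builds of a tool and collect per-revision wall-clock timings, so
// results can be compared over time.
public class RevisionSweep {
    // Run each revision's command once; return revision -> elapsed ms,
    // preserving insertion order so output reads oldest-to-newest.
    public static Map<String, Long> sweep(Map<String, List<String>> commandsByRevision)
            throws IOException, InterruptedException {
        Map<String, Long> timings = new LinkedHashMap<>();
        for (Map.Entry<String, List<String>> e : commandsByRevision.entrySet()) {
            long start = System.nanoTime();
            Process p = new ProcessBuilder(e.getValue()).inheritIO().start();
            p.waitFor();
            timings.put(e.getKey(),
                    TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start));
        }
        return timings;
    }

    public static void main(String[] args) throws Exception {
        // Made-up install locations for two hypothetical revision builds.
        Map<String, List<String>> runs = new LinkedHashMap<>();
        runs.put("r1081000", Arrays.asList("/opt/svn-r1081000/bin/svn", "--version"));
        runs.put("r1085000", Arrays.asList("/opt/svn-r1085000/bin/svn", "--version"));
        sweep(runs).forEach((rev, ms) -> System.out.println(rev + ": " + ms + " ms"));
    }
}
```

In a real sweep the command would be one of the framework's timed svn operations rather than `--version`, run several times per revision to smooth out noise.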
Received on 2011-03-28 19:12:51 CEST