[Zope-CMF] Re: CMFTestCase: Best way to create the CMF site?

Geoff Davis geoff at phds.org
Thu Oct 6 14:41:43 EDT 2005


Hi Tres--

I think this is a case of us having a violent agreement :)  Sorry if you
get this twice -- my first attempt to send appears to have disappeared
into the aether.

I agree completely that minimalist test rigs with dummy components are a
good fit for some things.  However, the point I was trying to make
(perhaps not very clearly) is that (1) such tests come at a cost, and (2)
they are not appropriate in _all_ cases; there are a lot of cases for
which CMFTestCase-like tests are a better fit.  See comments interspersed
below.

On Thu, 2005-10-06 at 11:47 -0400, Tres Seaver wrote:
> > 
> > * As you note, dummy components take a lot of time to write.
> 
> Not necessarily.  They *do* require some knowledge of the API of the
> thing they are fronting for, as well as a sense of what the calling test
> needs.
>
> > * Dummy components create the need for new tests to ensure that the dummy
> > components' functionality really does match that of the components they
> > are replacing.  Do we have such tests in the CMF?  I'm not sure we do.
> 
> I don't think we need to test the tests.  The point of the dummies is to
> emulate the published API (the interface) of the tool / content they are
> replacing.  Often, they won't actually *do* the work required, and may
> in fact have extra stuff in them to make testing the *caller* easier.

Yes, instrumenting dummy components can offer some real advantages.  I
am definitely not saying that one should never do these kinds of
things!  
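
To make that concrete, here is roughly what I understand by an
instrumented dummy -- a sketch only, with made-up names, not anything
shipping in CMFCore:

    class DummyCatalogTool:
        """Stand-in for portal_catalog that records what callers do."""

        def __init__(self):
            self.indexed = []   # instrumentation: remember every call

        def indexObject(self, object):
            # Deliberately does no real indexing; it just records the
            # call so the test of the *caller* can assert on it.
            self.indexed.append(object)

A test for the caller then just asserts that len(tool.indexed) came out
right, without dragging ZCatalog into the picture.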

My concern is that the CMF's API evolves over time.  If you have a bunch
of dummy components, you have a bunch of things that can get out of
sync.  If you forget to update a dummy component's API, you could have
tests passing that would fail with the real components.
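
One cheap guard against that drift: have the dummy declare the same
interface as the real component, and verify it in the dummy's own test
module.  A sketch using zope.interface (the interface and tool here are
invented for illustration, not the real CMF ones):

    from zope.interface import Interface, implements
    from zope.interface.verify import verifyClass

    class IMemberish(Interface):
        """Stands in for the real tool's published interface."""
        def getAuthenticatedMember():
            """Return the member object for the current user."""

    class DummyMembershipTool:
        implements(IMemberish)

        def getAuthenticatedMember(self):
            return None   # enough for callers that need "someone"

    # Blows up at import time if the dummy falls behind the interface:
    verifyClass(IMemberish, DummyMembershipTool)

That way a dummy that lags the published API fails loudly instead of
letting stale tests pass.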

> > * Dummy components create the need for additional documentation.  The
> > absence of such documentation creates barriers to test writing and, as a
> > result, to the contribution of code to the CMF.
> 
> Nope.  Dummy components do *not* need documentation.  Their purpose
> should be clear from use / naming, and their API is supposed to be the
> same as that of the (already documented, we assume) real component.
> The price of maintenance (occasionally having to extend / fix the jig)
> is a necessary cost.

I personally find the existing dummy components to be rather obscure. 
Perhaps my understanding of the deep innards of the CMF is simply
insufficient.  I don't think there needs to be extensive documentation,
but coming up with the incantation needed to, say, produce a content
object with a view in a skin is not entirely straightforward.  The more
difficult it is to write tests, the fewer tests you'll get.

> > At some point I think we have to trust the stack.
> 
> I do not believe that "trusting the stack" makes sense when trying to
> test a component of the stack.  If you are writing tests for an
> application (or higher layer) which *uses* the stack, then you can
> safely trust it.  For instance, I'm willing to use OFS.SimpleItem and
> OFS.Folder when building out a test jig, because they belong to a lower
> layer of the stack, and have their own tests.

Right.  I think we are in agreement here.

> Such assumptions don't create unwanted dependencies, true.  They may or
> may not make for useful tests:
> 
>   - If the "trusted" component has no side effects which might affect
>     this or later tests;
> 
>   - If the "trusted" component does not make unwarranted assumptions
>     about the state of the system;
> 
>   - If the test being written does not need to "instrument" the
>     component in order to write a better / clearer / more comprehensive
>     test of its target component.

Of course.  On the other hand, dummy components make their own set of
assumptions about the state of the system, which may or may not hold in
a real production system.  And dummy components can have bugs that can
mask other problems.
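
To illustrate the kind of masking I mean (a contrived example): the
real portal_catalog hands back "brains" that need getObject(), so a
careless dummy that returns the objects themselves lets a buggy caller
pass:

    class DummyBrain:
        """What the real catalog hands back."""
        def __init__(self, obj):
            self._obj = obj

        def getObject(self):
            return self._obj

    class CarelessDummyCatalog:
        def __init__(self, objects):
            self._objects = objects

        def searchResults(self, **kw):
            # BUG in the dummy: should wrap results in DummyBrain.  A
            # caller that forgets getObject() passes against this dummy
            # and breaks against the real catalog.
            return list(self._objects)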

I'm not arguing that dummy components are bad or should not be used.  I
just think that CMFTestCase-type tests have an important place as well.
I think it would be possible to construct something like CMFTestCase
that would assume a very stripped-down CMFCore site.  That would make
test writing for CMFDefault-level things much simpler.  And I think
CMFTestCase is potentially quite useful for people doing pure CMF sites.
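
For comparison, a CMFTestCase-style test reads roughly like this (API
sketched from memory and from its PloneTestCase heritage, so the
details may differ):

    from Products.CMFTestCase import CMFTestCase

    # Builds a CMF site once and reuses it across test cases.
    CMFTestCase.setupCMFSite()

    class TestDocumentWorkflow(CMFTestCase.CMFTestCase):

        def afterSetUp(self):
            # self.portal and self.folder come with the fixture
            self.folder.invokeFactory('Document', id='doc')

        def testInitialState(self):
            wf = self.portal.portal_workflow
            self.assertEqual(
                wf.getInfoFor(self.folder.doc, 'review_state'),
                'private')

    def test_suite():
        from unittest import TestSuite, makeSuite
        return TestSuite((makeSuite(TestDocumentWorkflow),))

The fixture cost is real, but nearly all of the incantation disappears
from the individual tests.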

> Timing *may* be a red herring;  the issue is likely worse for folks
> trying to run tests on machines with less-than-blazing CPUs.  

Yes.  My development box is no speed demon!  

> There is a
> classic back-and-forth in the test-driven development community
> (documented by Beck and others) in which people write more and more
> tests, until the run-time for the entire suite becomes so painful that
> people begin avoiding running them all;  the team then has to stop and
> profile / refactor the tests themselves, in order to remove that burden.

The desire for speed creates a number of tradeoffs.  

1) One can invest time in optimizing tests.  I think a bit of investment
is warranted, but after a point I think developer effort is better spent
writing code and tests.  Exactly where that point lies is an open question.

2) One can write fewer tests.  My preference is for more tests at the
expense of speed, especially since the cost of CPU has been dropping
rapidly over time.  I'm sure you would agree.

3) One can run tests less frequently.  I agree with you that this is an
undesirable outcome.  Three cheers for whoever added the "zopectl test"
feature with the additional options for selecting a particular directory
and a particular test.  In my own work I tend to run a focused set of
tests frequently while working on a single feature, then I periodically
run all tests to make sure I haven't created any side effects.  That may
be anathema to the XP crowd, but I find it works well for me.  Maybe one
solution would be to create some kind of sensible partitioning of tests?
Tests are already partitioned by module; why not partition further so
people can focus their test cycles on the things that matter most to the
work at hand?
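
Nothing fancy is needed for that; even plain unittest lets a product
publish a quick suite alongside the full one.  A sketch (the partition
itself is hypothetical):

    import unittest

    class QuickTests(unittest.TestCase):
        def test_arithmetic(self):
            self.assertEqual(1 + 1, 2)

    class SlowTests(unittest.TestCase):
        def test_whole_site(self):
            pass   # imagine a full-site functional test here

    def fast_suite():
        # the partition to run on every change
        return unittest.makeSuite(QuickTests)

    def test_suite():
        # the full partition, run before checkin
        suite = unittest.TestSuite()
        suite.addTest(unittest.makeSuite(QuickTests))
        suite.addTest(unittest.makeSuite(SlowTests))
        return suite

    if __name__ == '__main__':
        unittest.main(defaultTest='test_suite')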

> Here are timings for the stock CMF components and Plone on my box:
> 
>   Product         # tests   Wall-clock (s)
>   ---------       -------   --------------
>   CMFCore             382           28.775
>   CMFDefault          164            2.980
>   CMFActionIcons       11            0.002
>   CMFCalendar          23            1.636
>   CMFTopic             58            1.898
>   CMFSetup            341            2.028
>   DCWorkflow           10            0.025

I don't see the Plone numbers, but I'm guessing that you probably got
numbers along the lines of

CMFPlone        ~1500   ~175 sec

Plone's tests are slow, but there are a _lot_ of them.  I think that's an
important thing to note.  There are a lot of tests for Plone in part
because they are easy to write.  I'd rather have full(er) coverage with
slow tests than thinner coverage with fast ones.  Note that I'm _not_
claiming that CMF's tests are insufficient or that Plone's are complete.
Rather, I'm saying that good tools can help everyone get more tests
written and more good code into the CMF.

> My guess is that CMFCore's tests are ripe for such a refactoring (there
> is a noticeable lag of a couple of seconds several times during the test
> run, for instance).
> 
> False dependencies, poor separation of concerns, and poor test coverage
> are real issues with using a "functional" test jig where a unit test jig
> would be more appropriate.

I agree completely.  There are definitely places where narrow, focused
tests are the right way to go.  More "functional" style tests do have
their disadvantages.  

On the flip side, though, minimalist test tools that require developers to
jump through extra hoops can raise barriers to entry for community
participation and can hinder development.  Open source is as much social
engineering as software engineering; simplifying test writing is one way
to address needs on the social side.

Geoff


