[Zope] Large XML files

Toby Dickenson tdickenson@geminidataloggers.com
Wed, 06 Dec 2000 16:44:01 +0000


On Tue, 5 Dec 2000 23:02:38 -0000, "Phil Harris"
<phil.harris@zope.co.uk> wrote:



>The XMLDocument type is rather 'expensive' and you may be able to 'get away'
>with using a simpler type such as DTMLDocument.

Simpler, but that's not necessarily an advantage. DTMLDocument will
store the whole document in memory, but XMLDocument uses ZODB
effectively, so it only loads the DOM nodes that are in use.

If you are always using the *whole* document then this is no help, and
simpler may indeed be better.
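
Roughly what that looks like at the ZODB level. This is only a sketch:
the Node class below is a made-up stand-in for XMLDocument's real node
objects, and the imports are spelled the way the current ZODB release
does it, so adjust for your version:

    import transaction
    from persistent import Persistent
    from ZODB import DB
    from ZODB.FileStorage import FileStorage

    class Node(Persistent):
        # Hypothetical stand-in for a DOM node stored as its own
        # ZODB object; not XMLDocument's actual class.
        def __init__(self, name, children=()):
            self.name = name
            self.children = list(children)

    db = DB(FileStorage('nodes.fs'))
    conn = db.open()
    root = conn.root()

    # Each Node becomes a separate record in the storage.
    root['doc'] = Node('doc', [Node('child-%d' % i) for i in range(1000)])
    transaction.commit()

    # Drop everything from the cache (as happens under memory
    # pressure), then touch just the top of the document:
    conn.cacheMinimize()
    doc = conn.root()['doc']
    print(doc.name)   # loads only the 'doc' record
    # doc.children stay on disk as "ghosts" until something
    # actually accesses them.

That per-node loading is what makes XMLDocument worth the extra
machinery when you only ever touch part of a big document.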

>From: <paul_s_johnson@urscorp.com>

>> I am testing the possibilities of delivering the content of XML Documents
>> through the Zope environment.  Unfortunately, some of the proposed files are
>> rather large (up to 760KB) and just uploading them and viewing them on our
>> current Zope server is prohibitively slow.  Our server, running Z2, is a
>> blazing P133 running NT 4.0 with 32 MB of RAM (I get the bottom feeders).
>> Is the bottleneck the hardware; is there something I can do software-wise
>> to improve performance; or is development not yet advanced enough to handle
>> this scenario efficiently? Any opinions on this?

For a machine of that size I suggest you use only one publisher thread
(that's -T 1 on the command line), rather than the default of 4. Each
thread gets its own copy of the ZODB object cache, and you probably don't
want to keep four copies of your 700k document in memory.
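
In ZODB terms the cache is per connection, and each publisher thread
uses its own connection. A rough sketch (not Zope's actual startup
code, and the cache_size figure is just an example):

    from ZODB import DB
    from ZODB.FileStorage import FileStorage

    # cache_size is the number of objects *each connection* may keep
    # in memory; every publisher thread has its own connection.
    db = DB(FileStorage('Data.fs'), cache_size=400)

    # Default of four threads: four connections, so up to four
    # caches, each potentially holding its own copy of the big
    # document's objects.
    connections = [db.open() for _ in range(4)]

    # With -T 1 there is a single thread, hence one connection and
    # one cache:
    # connections = [db.open()]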

Toby Dickenson
tdickenson@geminidataloggers.com