[ZODB-Dev] ZODB memory problems (was: processing a Very Large file)

Christian Heimes christian at cheimes.de
Sat May 21 11:38:32 EDT 2005


DJTB wrote:
> What should I do to make sure RAM is no longer a limiting factor?
> (in other words: The program should work with any (large) value of
> self.__range and self.__et_count
> Because in my case, self.__et_count = 5000 is only a toy example...)
> I'm now working on a PC with 2.5 GB RAM and even that's not enough!

Grab the Zope2 sources and read lib/python/OFS/Image.py. Zope's 
OFS.Image.Image class (and Zope3's implementation as well) uses a 
so-called "possibly large data" class (Pdata) that is a subclass of 
Persistent.

Pdata uses a simple and ingenious approach to minimize memory 
usage when storing large binary data in ZODB. The data is read from a 
temporary file chunk by chunk. Each chunk is stored inside a Pdata 
object and committed in a subtransaction, so already-stored chunks can 
be flushed from memory. The Pdata objects are linked in a simple linear 
chain, much like a singly linked list connected with pointers in 
old-style C.
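The idea can be sketched in plain Python. This is an illustrative toy, 
not Zope's actual code: the class and function names (Chunk, store, 
read_back) and the chunk size are my own; in the real OFS.Image code 
each chunk subclasses persistent.Persistent and the loop commits 
subtransactions so earlier chunks leave RAM.

```python
import io

CHUNK_SIZE = 64 * 1024  # small fixed-size chunks instead of one big string

class Chunk:
    """One link in the chain; holds a small slice of the data.
    (In ZODB this would subclass persistent.Persistent.)"""
    def __init__(self, data):
        self.data = data
        self.next = None  # pointer to the following chunk, C-style

def store(fileobj, chunk_size=CHUNK_SIZE):
    """Read a file chunk by chunk, building a linked chain of Chunks."""
    head = tail = None
    while True:
        data = fileobj.read(chunk_size)
        if not data:
            break
        node = Chunk(data)
        if head is None:
            head = tail = node
        else:
            tail.next = node
            tail = node
        # With ZODB you would commit a subtransaction here, so the
        # chunks written so far can be evicted from the object cache.
    return head

def read_back(head):
    """Walk the chain and reassemble the original bytes."""
    out = io.BytesIO()
    node = head
    while node is not None:
        out.write(node.data)
        node = node.next
    return out.getvalue()

payload = b"x" * (3 * CHUNK_SIZE + 123)
head = store(io.BytesIO(payload))
assert read_back(head) == payload
```

The point is that at no time does the writer need the whole blob in 
memory: only one chunk is live per iteration, and each committed chunk 
becomes an independent persistent object.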

Try to understand the code; it may help you solve your problem. In 
general: don't store large data as one block, such as a single binary 
string. Use small, persistent chunks.

Christian


More information about the ZODB-Dev mailing list