Inserting large amounts of data (Was: Re: [Zope] Postgres adapters)

Christopher N. Deckard cnd@ecn.purdue.edu
Mon, 21 May 2001 10:42:53 -0500


I'm trying to take a "File" object in ZODB, iterate over the
entire thing (about 300,000 lines), and insert it into a Postgres
database.  The file is about 37MB of comma-separated numbers.  I
guess Zope treats the whole operation as one transaction and never
really comes back from attempting to insert every row.  Does anyone
have an idea on how to split this up into multiple transactions, or
do a commit of some sort, so that Zope doesn't croak?

Thanks,
-Chris

Federico Di Gregorio wrote:
> 
> > Can anyone comment on past experiences with various postgres adapters? I am
> > using the latest PoPy (201) and have just recently noticed psychopg. Is it
> > better, faster, more reliable or any of the above?
> 
> obviously yes. but look at my email address and at the psycopg
> hosting site before giving too much weight to my words... :)
> 
> more seriously, psycopg is actively developed (i.e., we usually fix bugs
> in 2-3 days and release a new version about 24h afterwards) and we are
> really committed to DB-API compliance. what can i say apart from: give
> psycopg a try? http://initd.org/Software/psycopg/
> 
> ciao,
> federico