Inserting large amounts of data (Was: Re: [Zope] Postgres adapters)
Mon, 21 May 2001 12:58:48 -0400
On Mon, May 21, 2001 at 10:42:53AM -0500, Christopher N. Deckard wrote:
> I'm trying to take a "File" object in ZODB, iterate over the
> entire thing (about 300,000 lines), and insert the rows into a
> Postgres database. The file is about 37MB of comma-separated
> numbers. I guess Zope treats the whole operation as a single
> transaction and never really comes back from attempting to
> insert every row. Do you have an idea of how to split this up
> into multiple transactions, or do a commit of some sort, so
> that Zope doesn't croak?
Has the file already been exported to the filesystem?
Is it a Unix-like system? Are you comfortable writing edit scripts?
If yes to all of these, then I would recommend using psql to load it
into the database. If the data is still only inside the ZODB, I would
first drop it to the filesystem and then use psql ;-).
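The drop-to-the-filesystem route might look like this (a sketch only: the path, the table name `prices`, and the assumption that the File object's raw data is reachable as a string are all hypothetical). Once the CSV is on disk, psql's `\copy` does the bulk load in one server-side pass, which is far faster than row-by-row INSERTs:

```python
import os
import tempfile

def dump_to_csv(data, path):
    """Write raw comma-separated data (e.g. pulled from a ZODB File
    object) out to the filesystem so psql can bulk-load it."""
    with open(path, "w") as f:
        f.write(data)
    return os.path.getsize(path)

# Hypothetical data standing in for the 37MB File object's contents.
data = "1,2\n3,4\n5,6\n"
path = os.path.join(tempfile.mkdtemp(), "dump.csv")
size = dump_to_csv(data, path)
print(size)  # -> 12

# Then, from the shell (requires a running Postgres server), something like:
#   psql mydb -c "\copy prices FROM 'dump.csv' DELIMITER ','"
```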
> Federico Di Gregorio wrote:
> > > Can anyone comment on past experiences with various postgres adapters? I am
> > > using the latest PoPy (201) and have just recently noticed psychopg. Is it
> > > better, faster, more reliable or any of the above?
> > obviously yes. but look at my email address and at the psycopg hosting
> > site before giving too much weight to my words... :)
> > more seriously, psycopg is actively developed (i.e., we usually fix bugs
> > in 2-3 days and release a new version about 24h afterwards) and we are
> > really committed to DB-API compliance. what can i say apart from give
> > psyco a try? http://initd.org/Software/psycopg/
> > ciao,
> > federico