cayenne-user mailing list archives

From Tony Giaccone <t...@giaccone.org>
Subject Commit Exception
Date Thu, 27 Sep 2018 19:35:55 GMT
I'm processing a large number of rows, over 600,000. The key value should be
unique in this file, but I'd like to ensure that, and I also want this to
happen with some rapidity. To speed the process up, I'm going to read lines
from the file, create objects, and commit the changes after 500 have been
created.
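A minimal sketch of that load loop, with one addition: since the keys are only supposed to be unique, a HashSet can screen out duplicates before they ever reach a commit. The class and method names here are illustrative, not Cayenne API; the comment marks where a real `context.commitChanges()` would go.

```java
import java.util.*;

// Sketch: dedupe keys up front with a HashSet, then group rows into
// batches of `size` for committing. Names (BatchLoader, batches) are
// hypothetical, not part of Cayenne.
public class BatchLoader {
    // Split lines into commit-sized batches, skipping lines whose key
    // (here, the whole line) has already been seen.
    public static List<List<String>> batches(List<String> lines, int size) {
        Set<String> seen = new HashSet<>();
        List<List<String>> result = new ArrayList<>();
        List<String> current = new ArrayList<>();
        for (String line : lines) {
            if (!seen.add(line)) continue;  // duplicate key: drop before commit
            current.add(line);
            if (current.size() == size) {
                result.add(current);        // real code: context.commitChanges()
                current = new ArrayList<>();
            }
        }
        if (!current.isEmpty()) result.add(current);
        return result;
    }

    public static void main(String[] args) {
        List<String> lines = Arrays.asList("a", "b", "a", "c");
        System.out.println(batches(lines, 2)); // [[a, b], [c]]
    }
}
```

The set of 600,000 string keys fits comfortably in memory, so this catches the in-file duplicates without any extra round trips to the database.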

The problem with this is that if I have a duplicate key value, I won't catch
it until I do the commit.

When I insert a second row with the same key value, the first exception is a
database-level org.postgresql.util.PSQLException, which eventually gets
wrapped in a Cayenne commit exception.

So I'd like to get a sense of what folks think, given that I want to balance
these conflicting goals of speed and accuracy.

Can I easily figure out which object or objects caused the error, exclude
them from the context, and redo the commit?

Is this a reasonable path to follow?
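One common fallback pattern for the "exclude and redo" idea, sketched generically: try the whole batch, and if the commit fails, retry the objects one at a time so only the actual offenders are dropped. The `Consumer` here is a stand-in for "add these objects to a fresh context and call commitChanges()"; it is not a Cayenne interface, and in real Cayenne code the failed batch would be rebuilt in a new ObjectContext rather than reused.

```java
import java.util.*;
import java.util.function.Consumer;

// Fallback sketch: commit the whole batch; on failure, isolate the
// offending objects by committing one at a time. The Consumer is a
// hypothetical stand-in for committing via a fresh Cayenne context.
public class RetryCommit {
    public static List<String> commitSalvaging(List<String> batch,
                                               Consumer<List<String>> commit) {
        List<String> rejected = new ArrayList<>();
        try {
            commit.accept(batch);            // fast path: one commit per 500
        } catch (RuntimeException batchFailure) {
            for (String obj : batch) {       // slow path: find the offenders
                try {
                    commit.accept(Collections.singletonList(obj));
                } catch (RuntimeException dup) {
                    rejected.add(obj);       // e.g. duplicate-key violation
                }
            }
        }
        return rejected;
    }

    public static void main(String[] args) {
        // Fake "database" that rejects duplicate keys, for demonstration.
        Set<String> db = new HashSet<>();
        Consumer<List<String>> commit = objs -> {
            for (String o : objs)
                if (db.contains(o)) throw new RuntimeException("duplicate: " + o);
            db.addAll(objs);
        };
        System.out.println(commitSalvaging(Arrays.asList("a", "b"), commit)); // []
        System.out.println(commitSalvaging(Arrays.asList("b", "c"), commit)); // [b]
    }
}
```

The cost of the slow path only matters on batches that actually contain a duplicate, so the common case stays at one commit per 500 rows.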



Tony Giaccone
