cayenne-user mailing list archives

From Peter Schröder <Peter.Schroe...@freenet-ag.de>
Subject Re: updating large number of data
Date Tue, 05 Dec 2006 14:53:39 GMT
We are deleting all rows with TRUNCATE TABLE first, then loading the CSV with LOAD DATA INFILE.

I would prefer not to use this method, but it is simple and fast.
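For reference, the full-replace approach described above can be sketched in MySQL roughly as follows (the table name, column layout, and file path are hypothetical placeholders, not taken from the thread):

```sql
-- Wipe the table, then bulk-load the hourly CSV export.
-- 'users' and the file path are illustrative placeholders.
TRUNCATE TABLE users;

LOAD DATA INFILE '/var/imports/users.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip a header row, if the export has one
```

One caveat with this pattern: TRUNCATE TABLE is not transactional in MySQL's MyISAM engine, so readers can briefly see an empty table between the two statements. A common mitigation is to load into a staging table and then swap it in with an atomic RENAME TABLE, though the thread itself does not discuss that variant.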

-----Original Message-----
From: Michael Gentry [mailto:blacknext@gmail.com]
Sent: Tuesday, December 5, 2006 14:38
To: cayenne-user@incubator.apache.org
Subject: Re: updating large number of data

Are you deleting all of the original data and then doing inserts or
are you doing updates?

Thanks,

/dev/mrg


On 12/5/06, Peter Schröder <Peter.Schroeder@freenet-ag.de> wrote:
> hi,
>
> we get a CSV file with a large number of user data every hour, and we want to replace
> the existing data in our database with it. is there a best practice for doing something
> like this?
> currently we are doing that with PHP, using MySQL's LOAD DATA INFILE with the CSV file.
> but i think that doing this with Cayenne would leave the context in a bad state.
>
> have a nice day,
> peter
>
