cayenne-user mailing list archives

From Peter Schröder <Peter.Schroe...@freenet-ag.de>
Subject AW: AW: updating large number of data
Date Tue, 05 Dec 2006 16:52:03 GMT
Yes, we have auto-increment on that table. How do I invalidate the caches? I know there is
an invalidateObjects method, but is this the way to do it?
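A minimal sketch of the invalidateObjects approach with the incubator-era Cayenne API (the `User` entity, table, and method placement are assumptions for illustration, not from this thread): fetch the objects mapped to the reloaded table and pass them to `DataContext.invalidateObjects()`, which discards their cached snapshots so the next access refetches current rows from the database.

```java
import java.util.List;

import org.apache.cayenne.access.DataContext;
import org.apache.cayenne.query.SelectQuery;

public class UserCacheFlush {

    // Sketch: "User" is a hypothetical DataObject subclass mapped to the
    // bulk-loaded table; adapt the query to however you obtain the cached objects.
    public static void invalidateUsers(DataContext context) {
        // Obtain the registered objects for the reloaded table...
        List users = context.performQuery(new SelectQuery(User.class));

        // ...and invalidate their snapshots. The objects become hollow faults;
        // the next property access refetches fresh rows from the database.
        context.invalidateObjects(users);
    }
}
```

This only invalidates objects in the one DataContext it is given; if several contexts (or the shared snapshot cache) hold rows from that table, each needs the same treatment after the hourly load.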

-----Original Message-----
From: Michael Gentry [mailto:blacknext@gmail.com] 
Sent: Tuesday, December 5, 2006 16:30
To: cayenne-user@incubator.apache.org
Subject: Re: AW: updating large number of data

You know, sometimes simple and fast is a good way to do things.  Do
you have an auto-increment PK in that table? That would be helpful.  As
for Cayenne, can you flush (invalidate) any active DataContexts (at
least the objects for that table) when the load occurs?

/dev/mrg


On 12/5/06, Peter Schröder <Peter.Schroeder@freenet-ag.de> wrote:
> We are deleting all rows with TRUNCATE TABLE first, then loading the CSV with LOAD DATA INFILE.
>
> I would prefer not to use this method, but it is simple and fast.
>
> -----Original Message-----
> From: Michael Gentry [mailto:blacknext@gmail.com]
> Sent: Tuesday, December 5, 2006 14:38
> To: cayenne-user@incubator.apache.org
> Subject: Re: updating large number of data
>
> Are you deleting all of the original data and then doing inserts or
> are you doing updates?
>
> Thanks,
>
> /dev/mrg
>
>
> On 12/5/06, Peter Schröder <Peter.Schroeder@freenet-ag.de> wrote:
> > hi,
> >
> > We get a CSV file with a large number of user data every hour, and we want to replace
> > the existing data in our database with it. Is there a best practice for something like
> > this?
> > Currently we are doing that with PHP, using MySQL's LOAD DATA INFILE with the CSV file.
> > But I think that doing this with Cayenne would leave the context in a bad state.
> >
> > Have a nice day,
> > Peter
> >
>
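The hourly reload described in the quoted thread (TRUNCATE, then LOAD DATA INFILE on the CSV) boils down to something like the following sketch; the database name, table name, file path, and CSV options are assumptions for illustration, not from the thread:

```shell
# Hypothetical hourly reload: wipe the table, then bulk-load the new CSV.
# Credentials, database "appdb", table "user_data", and the file path are
# placeholders; adjust FIELDS/LINES options to match the actual feed format.
mysql --user=app --password="$DB_PASS" appdb <<'SQL'
TRUNCATE TABLE user_data;
LOAD DATA INFILE '/var/feeds/users.csv'
INTO TABLE user_data
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';
SQL
```

Because this bypasses Cayenne entirely, any DataContext holding snapshots of rows from that table is stale after each run, which is why the invalidation step discussed at the top of the thread is needed.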
