cayenne-user mailing list archives

From "Musall, Maik" <m...@selbstdenker.ag>
Subject Re: Truncating a table
Date Thu, 21 Dec 2017 16:55:47 GMT
So far, I don't use query caches. This application already grows to tens of GB of RAM just from
filling the shared object cache, I use short-lived ObjectContexts, and I don't really want or need
yet another level of caching that I could forget to invalidate. Plus, I don't have that many
explicit queries anyway; users are navigating object graphs all the time.

This one table is different. I _might_ use a query cache for that one. It could be sensible
in this case, and invalidation would also be trivial.
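
Something roughly like this, I suppose (a sketch only, assuming the Cayenne 4.0 ObjectSelect
API; LookupEntry, the "lookup-table" group name and the context/runtime variables are
placeholders):

    // Fetch through the shared query cache, tagged with a dedicated cache group.
    List<LookupEntry> entries = ObjectSelect.query(LookupEntry.class)
            .sharedCache("lookup-table")
            .select(context);

    // After the periodic refill, drop everything cached under that group.
    runtime.getInjector().getInstance(QueryCache.class).removeGroup("lookup-table");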

But I don't see how I could avoid those objects ending up in the shared object cache anyway.
Can I?

Maik


> On 21.12.2017 at 16:07, John Huss <johnthuss@gmail.com> wrote:
> 
> It's going to depend on which cache you mean.  The query cache can be
> cleared by setting a cache group on the query that fetches the objects and
> then removing that cache group later.
> 
> The shared object cache can be cleared by finding the objects you want in
> context.getGraphManager().registeredNodes() and then invalidating them one
> by one. It would be better to use the query cache.
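> 
> Roughly like this (the entity class name here is just a placeholder):
> 
>     List<Object> stale = new ArrayList<>();
>     for (Object node : context.getGraphManager().registeredNodes()) {
>         if (node instanceof LookupEntry) {
>             stale.add(node);
>         }
>     }
>     context.invalidateObjects(stale);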
> 
> On Thu, Dec 21, 2017 at 6:48 AM Musall, Maik <maik@selbstdenker.ag> wrote:
> 
>> Hi Michael,
>> 
>> How to deal with the caches is basically my actual question. Ideally, I'd
>> like to call something like myentity.truncateTable(), and just have all
>> data deleted and all caches purged by that, but of course that doesn't
>> exist yet.
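>> 
>> Something in that direction, purely hypothetical (the names are made up, the raw SQL goes
>> through an SQLTemplate, and the database is assumed to support TRUNCATE):
>> 
>>     public static void truncateTable(ObjectContext context, QueryCache queryCache) {
>>         // Raw SQL truncate, bypassing the ORM layer entirely.
>>         context.performGenericQuery(
>>                 new SQLTemplate(LookupEntry.class, "TRUNCATE TABLE lookup_entry"));
>> 
>>         // Purge cached query results for this table's cache group.
>>         queryCache.removeGroup("lookup-table");
>> 
>>         // Instances already registered in a context would still have to be
>>         // invalidated, e.g. via context.invalidateObjects(...).
>>     }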
>> 
>> Maik
>> 
>> 
>>> On 21.12.2017 at 13:13, Michael Gentry <blacknext@gmail.com> wrote:
>>> 
>>> Hi Maik,
>>> 
>>> Raw SQL would certainly be the most efficient way. Even if you didn't
>>> use raw SQL, though, how were you planning on dealing with Cayenne's
>>> caches? I think this issue would exist regardless of how you truncated
>>> the table. There are various options; I'm just trying to get a feel for
>>> your use case and thoughts.
>>> 
>>> Thanks,
>>> 
>>> mrg
>>> 
>>> 
>>> On Thu, Dec 21, 2017 at 5:10 AM, Musall, Maik <maik@selbstdenker.ag> wrote:
>>> 
>>>> Hi,
>>>> 
>>>> I have a lookup table with >400k rows that I want to periodically
>>>> refill from external sources. Since it also contains precomputed
>>>> values that are not part of the external source, my plan is to read
>>>> the external data and batch-insert it all into the table.
>>>> 
>>>> How can I truncate the entire table to prepare it for new inserts? The
>>>> only thing that comes to mind is raw SQL, but that would obviously leave
>>>> stale data in Cayenne's various caches.
>>>> 
>>>> Thanks
>>>> Maik

