cassandra-user mailing list archives

From Weijun Li <weiju...@gmail.com>
Subject Re: Heap sudden jump during import
Date Sat, 03 Apr 2010 09:08:34 GMT
Thank you Benoit. I did a search but couldn't find any of the tools you mentioned.
Both jhat and NetBeans load the entire heap dump file into memory. Do you know the
names of the tools that require less memory to view a heap dump?
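As an aside, on a HotSpot JVM a heap dump can also be captured programmatically through the `HotSpotDiagnosticMXBean`, rather than via jhat or an external tool. This is a minimal sketch, assuming a HotSpot JVM (the `com.sun.management` MBean is HotSpot-specific); the output file name is arbitrary:

```java
import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;

public class HeapDumpExample {
    public static void main(String[] args) throws Exception {
        // Obtain a proxy to the HotSpot diagnostic MBean.
        HotSpotDiagnosticMXBean bean = ManagementFactory.newPlatformMXBeanProxy(
                ManagementFactory.getPlatformMBeanServer(),
                "com.sun.management:type=HotSpotDiagnostic",
                HotSpotDiagnosticMXBean.class);
        // Write an hprof-format dump; "true" dumps only live objects,
        // which forces a full GC first and keeps the file smaller.
        String path = "example-heap.hprof";
        bean.dumpHeap(path, true);
        System.out.println("Wrote " + path);
    }
}
```

The resulting .hprof file is the same format `jmap -dump` produces, so any dump browser can open it.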

Thanks,
-Weijun

On Sat, Apr 3, 2010 at 12:55 AM, Benoit Perroud <benoit@noisette.ch> wrote:

> There are other tools than jhat for browsing a heap dump; they stream
> the dump instead of loading it fully into memory the way jhat does.
>
> Kind regards,
>
> Benoit.
>
> 2010/4/3 Weijun Li <weijunli@gmail.com>:
> > I'm running a test that writes 30 million columns (700 bytes each) to
> > Cassandra. The process ran smoothly for about 20 million, then heap usage
> > suddenly jumped from 2GB to 3GB, which is the JVM's upper limit. From that
> > point Cassandra froze for a long time (terrible latency, no response to
> > nodetool, so I had to stop the import client) before it came back to
> > normal. It's a single-node cluster with a JVM maximum heap size of 3GB.
> > So what could cause this spike? What kind of tool can I use to find out
> > which objects are filling the additional 1GB of heap? I did a heap dump
> > but couldn't get jhat to work to browse the dumped file.
> >
> > Thanks,
> >
> > -Weijun
> >
>
