zookeeper-user mailing list archives

From Attila Szabo <asz...@cloudera.com>
Subject Re: garbage collector choice and tuning
Date Fri, 13 May 2016 18:42:36 GMT
Hi all,

I do have one specific story worth mentioning on this subject:
About one year ago I was working for a financial company, where we had to
maintain a system responsible for tons of EOD stock-related calculations.
The system worked quite okay, but during the quarterly rebalances we always
experienced serious performance issues. We were still using CMS with just
12GB of memory, and despite my team's advice to do GC tuning and put more
memory in the machine, we were not allowed to change anything. That lasted
until, on the next rebalancing date, we hit an 8 minute SLA miss (you can
imagine what kind of complaint storm that started).

So I spent one weekend in the office analyzing GC logs, playing with
GC params, and learning about G1.

The result was the following:
Once I had fully understood how our application worked at that time, I was
able to tune it (without touching the code) to work perfectly in the 12GB
scenario, without a single stop-the-world full GC, and to bring the total
runtime 2 minutes below the average runtime (so not the worst case, but the
average everyday runtime!). Throughput increased from 68% up to 99.8%. I had
to be aware that our heap contained quite a big amount of old objects that
were still in use, thus I had to set the … bigger
and -XX:G1MixedGCLiveThresholdPercent higher, and AFAIR I also set …
to 500ms (but about this last one I'm not totally sure).
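A tuning along those lines would look roughly like this on a JDK 8 command
line. The mail doesn't name the flag behind the 500ms target;
-XX:MaxGCPauseMillis is the usual suspect, and that is what this sketch
assumes. All concrete values and the jar name are illustrative placeholders,
not the real ones:

```shell
# Hypothetical G1 tuning in the spirit of the story above (JDK 8).
# G1MixedGCLiveThresholdPercent: include regions in mixed collections
# even when they hold a fairly large share of live old objects.
# MaxGCPauseMillis: assumed to be the 500ms target mentioned above.
java -Xms12g -Xmx12g \
  -XX:+UseG1GC \
  -XX:G1MixedGCLiveThresholdPercent=85 \
  -XX:MaxGCPauseMillis=500 \
  -jar eod-calculations.jar
```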

When I switched to Java 8 and turned on string deduplication, the
performance got even better. (With that turned on, this specific use case
was fine with just 10GB of memory.)
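For what it's worth, string deduplication on Java 8 (8u20+) is a G1-only
feature; the flag names below are the standard HotSpot ones, not taken from
the mail, and the jar name is a placeholder:

```shell
# Enable G1 string deduplication and print its statistics at each GC,
# so you can see how much heap it actually reclaims.
java -XX:+UseG1GC \
  -XX:+UseStringDeduplication \
  -XX:+PrintStringDeduplicationStatistics \
  -jar eod-calculations.jar
```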

The best resource I used on my journey was:
http://www.infoq.com/news/2015/12/java-garbage-collection-minibook (since
removed, but you can find it here: https://www.reddit.com/comments/3d8nfo )

Since that time I've really been in love with G1: I'm very willing to use it
for small heap size cases (like the one you depicted), and it's nearly a
must for big heap size cases.

However, I did learn a very pragmatic approach during this journey.

Whenever you plan a change like this, follow these steps:

   1. Build an easily repeatable test case for measurements.
   2. Measure with both the old and the new GC settings.
   3. Analyze the logs with GCViewer; understand how your application
   works from a memory-consumption POV.
   4. Repeat steps 2 and 3 until you have tuned the relevant parameters
   for each GC algorithm.
   5. Use the better one.
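A minimal sketch of steps 1-3 on JDK 8, assuming a hypothetical benchmark
jar as the repeatable test case: run the same workload under both
collectors with GC logging enabled, then compare the logs in GCViewer:

```shell
# Same workload, two collectors; GC logs go to separate files (JDK 8 flags).
COMMON="-Xms6g -Xmx6g -XX:+PrintGCDetails -XX:+PrintGCDateStamps"

java $COMMON -XX:+UseConcMarkSweepGC -Xloggc:gc-cms.log -jar benchmark.jar
java $COMMON -XX:+UseG1GC            -Xloggc:gc-g1.log  -jar benchmark.jar

# Then inspect pause times and throughput for each run, e.g.:
#   java -jar gcviewer.jar gc-cms.log
```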

For the massive-load-of-short-sessions scenario, I would start by playing
with the -XX:G1HeapRegionSize and -XX:G1MaxNewSizePercent params and
checking the results.
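As a starting point, those two flags might be set like this. The values are
placeholders to be tuned against your own measurements, and note that
G1MaxNewSizePercent is an experimental flag, so it needs unlocking first:

```shell
# Illustrative starting point only; region size must be a power of two
# between 1m and 32m, and the unlock flag must precede the experimental one.
java -XX:+UseG1GC \
  -XX:G1HeapRegionSize=8m \
  -XX:+UnlockExperimentalVMOptions \
  -XX:G1MaxNewSizePercent=40 \
  -jar zookeeper-server.jar
```

For a real ZooKeeper deployment you would normally put these flags in
JVMFLAGS for zkServer.sh rather than invoking java directly.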

My 2 cents,


p.s.: If you're interested, I'd happily offer my help with the measurements
and tuning.

On Fri, May 13, 2016 at 4:18 PM, Guy Laden <guy.laden@gmail.com> wrote:

> Hi, We are considering CMS vs G1 for ZooKeeper running under Oracle JDK8.
> The expected heap size is 4-6GB.
> How workload-specific is this choice in your opinion, and in what ways?
> E.g. if many short sessions prefer G1, etc...
> Has anybody had experience they're willing to share regarding this?
> We'd also be very interested to hear about any gc-tuning flags you've had
> good experience with.
> Thanks

Best regards,

Attila Szabo
Software Engineer

