hive-user mailing list archives

From Edward Capriolo <edlinuxg...@gmail.com>
Subject Re: Seeing strange limit
Date Wed, 30 Dec 2015 14:55:11 GMT
This message means the JVM is spending almost all of its time in garbage
collection but is unable to free any meaningful amount of memory.

This can happen for a lot of reasons. With Hive it usually happens when a
query has a lot of intermediate data.

For example, imagine that a few months ago count(distinct(ip)) returned 20k.
Everything worked; then your data changed and suddenly you have issues.
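
For concreteness, here is a sketch of the kind of query that behaves this way
(the table and column names are made up for illustration):

    SELECT COUNT(DISTINCT ip) FROM access_logs;

The distinct aggregation has to hold every distinct ip it has seen in memory
(in the map-side hash aggregation and again on the reducer), so the heap
footprint grows with the data rather than with the query text.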

Try tuning the JVM settings, mostly raising your -Xmx.
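
As a rough sketch, assuming the heap that overflows belongs to the MapReduce
tasks (the property names are the standard Hadoop 2.x ones; the values are
only illustrative, so size them to fit your containers):

    SET mapreduce.map.memory.mb=4096;
    SET mapreduce.map.java.opts=-Xmx3276m;
    SET mapreduce.reduce.memory.mb=4096;
    SET mapreduce.reduce.java.opts=-Xmx3276m;

If the error comes from the Hive client JVM itself rather than from a task,
raising the client heap (HADOOP_HEAPSIZE, or -Xmx via HADOOP_CLIENT_OPTS)
before starting Hive is the usual knob.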

On Wednesday, December 30, 2015, Gary Clark <gclark@neces.com> wrote:

> Hello,
>
>
>
> I have a multi-node cluster (Hadoop 2.6.0) and am seeing the below message
> cause the Hive workflow to fail:
>
>
>
> Looking at the hadoop logs I see the below:
>
>
>
> 45417 [main] ERROR org.apache.hadoop.hive.ql.Driver  - FAILED: Execution
> Error, return code -101 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask.
> GC overhead limit exceeded
>
>
>
> I have been running for months without problems. When I removed a large
> number of the files from the directory I was running the query on, the
> query succeeded. It looks like I'm hitting a limit, but I'm not sure how to
> remedy this.
>
>
>
> Has anybody else seen this problem?
>
>
>
> Thanks,
>
> Gary C
>


-- 
Sorry this was sent from mobile. Will do less grammar and spell check than
usual.
