hive-user mailing list archives

From Deepak Khandelwal <dkhandelwal....@gmail.com>
Subject Re: Hive UDAF function throwing "java.lang.OutOfMemoryError: GC overhead limit exceeded" error
Date Fri, 17 Feb 2017 04:19:25 GMT
How much data is there? Is your table partitioned?
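
The reason those two questions matter: a UDAF's aggregation buffer lives on
the reducer heap, so an unpartitioned scan or a heavily skewed group key can
pile one group's entire data into a single buffer. Below is a minimal sketch
of the usual hazard, against the Hive GenericUDAFEvaluator API (the
CollectingBuffer class is hypothetical, for illustration only):

import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator;

// Hypothetical buffer showing the "collect everything, compute at the end"
// pattern: every value for a group stays resident until terminate() runs,
// so one oversized group can exhaust the reducer heap and trigger the
// "GC overhead limit exceeded" error quoted below.
class CollectingBuffer extends GenericUDAFEvaluator.AbstractAggregationBuffer {
    final List<Object> values = new ArrayList<>();
}

If the aggregate can instead be folded row by row into a fixed-size buffer in
iterate() (sums, counts, bounded top-N), collecting rows like this can be
avoided entirely.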

On Thursday, February 16, 2017, Srinivas Pogiri <pogiri.sri@gmail.com>
wrote:

> Hi,
>
> We have a custom Hive UDAF that aggregates a lot of data per group. The
> reduce task fails with the stack trace below. Any suggestion would be very
> helpful.
>
> The MR job had 5 map tasks, all of which completed fine. Of the 6 reduce
> tasks, only 5 completed. Here is a sample error from MR
> job_1476197655848_2037226:
> INFO communication thread org.apache.hadoop.mapred.Task: Communication
> exception: java.lang.OutOfMemoryError: GC overhead limit exceeded
> at java.io.BufferedReader.<init>(BufferedReader.java:105)
> at java.io.BufferedReader.<init>(BufferedReader.java:116)
> at org.apache.hadoop.yarn.util.ProcfsBasedProcessTree.constructProcessInfo(ProcfsBasedProcessTree.java:525)
> at org.apache.hadoop.yarn.util.ProcfsBasedProcessTree.updateProcessTree(ProcfsBasedProcessTree.java:223)
> at org.apache.hadoop.mapred.Task.updateResourceCounters(Task.java:847)
> at org.apache.hadoop.mapred.Task.updateCounters(Task.java:986)
> at org.apache.hadoop.mapred.Task.access$500(Task.java:79)
> at org.apache.hadoop.mapred.Task$TaskReporter.run(Task.java:735)
> at java.lang.Thread.run(Thread.java:745)
>
> Thank you,
> Srinivas Pogiri
>
>

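For what it's worth: the OOM above happens to surface in the task's
communication thread while it reads /proc to update resource counters, but
that is only where allocation failed; the heap was already saturated by the
aggregation. A common first mitigation is a larger reduce container and heap.
Here is a sketch using the standard MRv2 properties from a plain MapReduce
driver (the values are illustrative, not tuned; in a Hive session the same
properties can be set with SET statements):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class ReducerHeapConfig {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // YARN container size for each reduce task, in MB.
        conf.set("mapreduce.reduce.memory.mb", "6144");
        // JVM heap inside that container; leave headroom below the
        // container size for non-heap memory.
        conf.set("mapreduce.reduce.java.opts", "-Xmx5120m");
        Job job = Job.getInstance(conf, "udaf-aggregation"); // name is illustrative
        // ... mapper, reducer, and input/output setup elided ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}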