hive-user mailing list archives

From "Arthur.hk.chan@gmail.com" <arthur.hk.c...@gmail.com>
Subject Re: Smoke Test after 1 days 7 hours 5 minutes 19 seconds 70 msec, Failed with Error: GC overhead limit exceeded
Date Mon, 13 Oct 2014 10:12:58 GMT
Hi,

I have managed to resolve the issue by tuning the SQL.
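
For the archives: the settings below are a sketch of common mitigations for "GC overhead limit exceeded" in a map-only stage like the failing Stage-19 (3 mappers, 0 reducers), which often points at an auto-converted map join whose in-memory hash table exhausts the mapper heap. The values are illustrative assumptions, not necessarily the exact tuning applied here:

```sql
-- Option 1: give each map task more container memory and heap
-- (values are illustrative; heap should stay below container size).
SET mapreduce.map.memory.mb=4096;
SET mapreduce.map.java.opts=-Xmx3276m;

-- Option 2: stop Hive from auto-converting joins to map joins,
-- forcing a slower but more memory-stable shuffle join.
SET hive.auto.convert.join=false;
```

Rewriting the implicit comma joins as explicit JOINs with the most selective filters applied first can also shrink intermediate data and reduce memory pressure.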

Regards
Arthur
On 12 Oct, 2014, at 6:49 am, Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com> wrote:

> Hi,
> 
> My Hive version is 0.13.1. I tried a smoke test; after 1 day 7 hours 5 minutes 19 seconds 70 msec, the job failed with: Error: GC overhead limit exceeded
> 
> 
> LOG:
> 2014-10-12 06:16:07,288 Stage-6 map = 100%,  reduce = 50%, Cumulative CPU 425.35 sec
> 2014-10-12 06:16:12,431 Stage-6 map = 100%,  reduce = 67%, Cumulative CPU 433.01 sec
> 2014-10-12 06:16:15,515 Stage-6 map = 100%,  reduce = 100%, Cumulative CPU 447.59 sec
> …...
> Hadoop job information for Stage-19: number of mappers: 3; number of reducers: 0
> 2014-10-12 06:16:30,643 Stage-19 map = 0%,  reduce = 0%
> 2014-10-12 06:16:55,494 Stage-19 map = 33%,  reduce = 0%, Cumulative CPU 153.83 sec
> 2014-10-12 06:16:56,520 Stage-19 map = 0%,  reduce = 0%
> 2014-10-12 06:17:57,037 Stage-19 map = 0%,  reduce = 0%
> 2014-10-12 06:18:27,720 Stage-19 map = 100%,  reduce = 0%
> MapReduce Total cumulative CPU time: 2 minutes 33 seconds 830 msec
> Ended Job = job_1413024651684_0033 with errors
> Error during job, obtaining debugging information...
> Examining task ID: task_1413024651684_0033_m_000001 (and more) from job job_1413024651684_0033
> 
> Task with the most failures(4):
> -----
> Task ID:
>   task_1413024651684_0033_m_000002
> 
> URL:
>   http://m1:8088/taskdetails.jsp?jobid=job_1413024651684_0033&tipid=task_1413024651684_0033_m_000002
> -----
> Diagnostic Messages for this Task:
> Error: GC overhead limit exceeded
> 
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> MapReduce Jobs Launched:
> Job 0: Map: 5  Reduce: 1   Cumulative CPU: 10705.42 sec   HDFS Read: 829911667 HDFS Write: 693918010684 SUCCESS
> Job 1: Map: 2684  Reduce: 721   Cumulative CPU: 100612.23 sec   HDFS Read: 720031197955 HDFS Write: 56301916 SUCCESS
> Job 2: Map: 25  Reduce: 6   Cumulative CPU: 447.59 sec   HDFS Read: 5785850462 HDFS Write: 22244710 SUCCESS
> Job 3: Map: 3   Cumulative CPU: 153.83 sec   HDFS Read: 0 HDFS Write: 0 FAIL
> Total MapReduce CPU Time Spent: 1 days 7 hours 5 minutes 19 seconds 70 msec
> 
> My smoke test SQL:
> SELECT O_YEAR, 
>        SUM(CASE 
>              WHEN NATION = 'BRAZIL' THEN VOLUME 
>              ELSE 0 
>            END) / SUM(VOLUME) AS MKT_SHARE 
> FROM   (SELECT  YEAR(cast(O_ORDERDATE as date))     AS O_YEAR, 
>                L_EXTENDEDPRICE * ( 1 - L_DISCOUNT ) AS VOLUME, 
>                N2.N_NAME                            AS NATION 
>         FROM   PART, 
>                SUPPLIER, 
>                LINEITEM, 
>                ORDERS, 
>                CUSTOMER, 
>                NATION N1, 
>                NATION N2, 
>                REGION 
>         WHERE  P_PARTKEY = L_PARTKEY 
>                AND S_SUPPKEY = L_SUPPKEY 
>                AND L_ORDERKEY = O_ORDERKEY 
>                AND O_CUSTKEY = C_CUSTKEY 
>                AND C_NATIONKEY = N1.N_NATIONKEY 
>                AND N1.N_REGIONKEY = R_REGIONKEY 
>                AND R_NAME = 'AMERICA' 
>                AND S_NATIONKEY = N2.N_NATIONKEY 
>                AND cast(O_ORDERDATE as date)  >= cast('1995-01-01' as date) 
>                AND cast(O_ORDERDATE as date)  <= cast('1996-12-31' as date) 
>                AND P_TYPE = 'ECONOMY ANODIZED STEEL') AS ALL_NATIONS 
> GROUP  BY O_YEAR 
> ORDER  BY O_YEAR;
> 
> 
> Please help.
> Regards
> Arthur
> 
> 
> 

