hive-user mailing list archives

From Steven Wong <>
Subject RE: Issue on using hive Dynamic Partitions on larger tables
Date Sat, 18 Jun 2011 01:24:34 GMT
The name of the parameter is actually hive.exec.max.created.files. The wiki has a typo, which
I'll fix.
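
With the corrected name, the setting would be entered in the Hive CLI like this (the 500000 value below simply mirrors the one from Bejoy's original attempt):

```sql
-- Raise the file-creation cap using the correct parameter name;
-- run this before the dynamic-partition insert in the same session.
set hive.exec.max.created.files=500000;
```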

From: Bejoy Ks []
Sent: Thursday, June 16, 2011 9:35 AM
To: hive user group
Subject: Issue on using hive Dynamic Partitions on larger tables

Hi Hive Experts
    I'm facing an issue while using Hive dynamic partitions on larger tables. Dynamic
partitions worked fine when I tried them on smaller tables, but unfortunately when I tried
the same on a larger table, the MapReduce job terminated with the following error:

2011-06-16 12:14:28,592 Stage-1 map = 74%,  reduce = 0%
[Fatal Error] total number of created files exceeds 100000. Killing the job.
Ended Job = job_201106061630_0536 with errors
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask

I tried setting the parameter hive.max.created.files to a larger value, but the same error occurred:
hive> set hive.max.created.files=500000;
The same error 'total number of created files exceeds 100000' was thrown even after I changed
the value to 500000. I suspect that the value I set for this config parameter is not taking
effect, or that I am setting the wrong parameter for this issue. Please advise.

The other parameters I set in the Hive CLI for dynamic partitions are:
set hive.exec.dynamic.partition.mode=nonstrict;
set hive.exec.dynamic.partition=true;
set hive.exec.max.dynamic.partitions.pernode=300;

The HiveQL query I used for the dynamic partition is:
SELECT p.seq_id,p.lead_id,p.arr_datetime,p.computed_value,
p.del_date,p.location FROM parameter_def p;
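
(For anyone hitting the same error: as written, that is a plain SELECT. A dynamic-partition load normally wraps such a SELECT in an INSERT OVERWRITE whose PARTITION clause names the dynamic column, which must also appear last in the select list. A minimal sketch, assuming a hypothetical target table parameter_part partitioned by location; the table name is illustrative, not from the original post:)

```sql
-- Hypothetical target table partitioned by location; the dynamic
-- partition column (location) must be the last column selected.
INSERT OVERWRITE TABLE parameter_part PARTITION (location)
SELECT p.seq_id, p.lead_id, p.arr_datetime, p.computed_value,
       p.del_date, p.location
FROM parameter_def p;
```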

Please help me out in resolving this.

Thank You.
