hadoop-user mailing list archives

From Dieter De Witte <drdwi...@gmail.com>
Subject Re: heap space error
Date Mon, 24 Feb 2014 10:35:58 GMT
No problem, it's not easy to learn all of Hadoop's configuration
options. Definitely consider looking into the reference (Tom White's
Hadoop: The Definitive Guide).


2014-02-24 11:20 GMT+01:00 Raj hadoop <raj.hadoop@gmail.com>:

> Thanks a ton Dieter
>
>
> On Mon, Feb 24, 2014 at 3:45 PM, Dieter De Witte <drdwitte@gmail.com> wrote:
>
>> You can configure the heap size of the mappers with the following
>> parameter (in mapred-site.xml):
>>
>> mapred.map.child.java.opts=-Xmx3200m
>>
>> Also, setting the number of map tasks is not useful. You should set the
>> number of map slots per node instead:
>>
>> mapred.tasktracker.map.tasks.maximum=6
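
A minimal sketch of how the two properties above would sit in mapred-site.xml (Hadoop 1.x property names; the 3200 MB heap and 6 slots are the example values from this thread, not recommendations — tune them to the memory and cores of your nodes):

```xml
<!-- mapred-site.xml (Hadoop 1.x) - example values from this thread -->
<configuration>
  <property>
    <!-- JVM options for each map task's child JVM -->
    <name>mapred.map.child.java.opts</name>
    <value>-Xmx3200m</value>
  </property>
  <property>
    <!-- concurrent map slots per TaskTracker node -->
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>6</value>
  </property>
</configuration>
```

Note that slots-per-node times per-task heap must fit in the node's physical RAM, or the TaskTrackers will swap.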
>>
>> Regards,
>> Dieter
>>
>>
>> 2014-02-24 11:08 GMT+01:00 Raj hadoop <raj.hadoop@gmail.com>:
>>
>>> Hi All,
>>>
>>> In our MapReduce code, when we supply more than 10 input sequence
>>> files, we hit a Java heap space error. Please find attached a
>>> screenshot of the error and the log file for the failed task. The
>>> program works fine when the number of input files is 10.
>>>
>>> I tried to set the number of map tasks to 10 from the code and also as
>>> an input to the jar, but neither works.
>>>
>>> Alternatives tried:
>>>
>>>    1. While running the MapReduce jar, passing the generic option
>>>       -D mapred.map.tasks=10 on the command line.
>>>
>>>    2. In the code, changing Job to JobConf to set the number of map
>>>       tasks:
>>>
>>>       JobConf job = new JobConf(conf, SplitAutomation.class);
>>>       job.setNumMapTasks(10);
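
One likely reason the -D option in alternative 1 is silently ignored: Hadoop only parses generic options such as -D when the driver goes through ToolRunner / GenericOptionsParser. A sketch of a Tool-based driver is below (it assumes the Hadoop 1.x client jars on the classpath; the class name SplitAutomation is taken from the snippet above, everything else is illustrative, not the poster's actual code):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Sketch only: a driver implementing Tool, so that generic options like
// "-D mapred.map.tasks=10" given before the job arguments are parsed
// into the Configuration before run() is called.
public class SplitAutomation extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // getConf() already contains any -D overrides from the command line
        JobConf job = new JobConf(getConf(), SplitAutomation.class);
        // ... set input/output paths, mapper, reducer, etc. here ...
        return 0;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(),
                                   new SplitAutomation(), args));
    }
}
```

Even then, mapred.map.tasks is only a hint to the framework; the actual number of map tasks is driven by the input splits, which is why capping it rarely fixes heap errors.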
>>>
>>> Kindly help us as soon as possible as this is high priority.
>>>
>>>
>>>
>>>
>>>
>>
>
