hadoop-hdfs-user mailing list archives

From Suresh V <verdi...@gmail.com>
Subject Outofmemory error with Java Heap space when running mapreduce
Date Thu, 21 Jan 2016 21:52:05 GMT
We have a MapReduce job that processes text files contained inside a zip file.
The program ran fine when we gave it zip files up to 40GB in size.

When we gave a zip file of size 80MB as input (the zip file has a 1.2GB
text file inside), the MapReduce job failed with the error below:

2016-01-21 14:47:19,384 FATAL [main]
org.apache.hadoop.mapred.YarnChild: Error running child :
java.lang.OutOfMemoryError: Java heap space
	at java.util.Arrays.copyOf(Arrays.java:2271)

We suspect it could be due to memory overrun in the container...
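For what it's worth, the Arrays.copyOf frame in the trace suggests that one large byte[] is being grown to hold the entire 1.2GB entry. A reader that streams the entry record by record would keep per-record heap usage constant; below is a hypothetical sketch (not our actual job code) of that pattern:

```java
import java.io.*;
import java.util.zip.*;

// Hypothetical sketch (not our actual job code): stream a zip entry record by
// record instead of growing one giant byte[] for the whole 1.2GB entry, which
// is what the Arrays.copyOf frame in the stack trace hints at.
public class ZipStreamSketch {

    // Count lines across all entries while holding only one line in memory.
    public static long countLines(InputStream in) throws IOException {
        long lines = 0;
        ZipInputStream zin = new ZipInputStream(in);
        ZipEntry entry;
        while ((entry = zin.getNextEntry()) != null) {
            // Do not close this reader: that would close the ZipInputStream.
            BufferedReader r =
                new BufferedReader(new InputStreamReader(zin, "UTF-8"));
            while (r.readLine() != null) {
                lines++;            // process one record at a time
            }
        }
        zin.close();
        return lines;
    }

    // Build a tiny in-memory zip (one 3-line text entry) for demonstration.
    public static byte[] sampleZip() throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ZipOutputStream zos = new ZipOutputStream(bos);
        zos.putNextEntry(new ZipEntry("data.txt"));
        zos.write("a\nb\nc\n".getBytes("UTF-8"));
        zos.closeEntry();
        zos.close();
        return bos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(countLines(new ByteArrayInputStream(sampleZip())));
    }
}
```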

Could you please help us with the parameters we should set for the
MapReduce job so that it can process this zip file?

This is running on YARN.
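If it is simply the container heap, the usual knobs (assuming Hadoop 2.x / MRv2 property names; the values below are hypothetical) would be the YARN container size and the map task's child JVM heap, set in the job configuration:

```xml
<!-- Hypothetical values; mapreduce.map.memory.mb and mapreduce.map.java.opts
     are the standard MRv2 properties for container size and map JVM heap. -->
<property>
  <name>mapreduce.map.memory.mb</name>
  <value>4096</value> <!-- YARN container size for map tasks, in MB -->
</property>
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xmx3276m</value> <!-- JVM heap, kept below the container size -->
</property>
```

The JVM heap (-Xmx) must stay comfortably under the container size, or YARN will kill the container for exceeding its memory limit before the JVM ever throws OutOfMemoryError.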

Please let me know if any additional information is required.

Thank you,

