hadoop-common-user mailing list archives

From Boyu Zhang <boyuzhan...@gmail.com>
Subject TaskTracker: Java heap space error
Date Thu, 11 Mar 2010 16:24:58 GMT
Dear All,

I am running a Hadoop job that processes data. The map output is quite large
and spills 15 times, so I tried setting io.sort.mb = 256 instead of the default
100, leaving everything else at its defaults (a sketch of the change is
included after the error output below). I am using version 0.20.2. When I run
the job, I get the following errors:

2010-03-11 11:09:37,581 INFO org.apache.hadoop.metrics.jvm.JvmMetrics:
Initializing JVM Metrics with processName=MAP, sessionId=
2010-03-11 11:09:38,073 INFO org.apache.hadoop.mapred.MapTask: numReduceTasks: 1
2010-03-11 11:09:38,086 INFO org.apache.hadoop.mapred.MapTask: io.sort.mb = 256
2010-03-11 11:09:38,326 FATAL org.apache.hadoop.mapred.TaskTracker:
Error running child : java.lang.OutOfMemoryError: Java heap space
	at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:781)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:350)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
	at org.apache.hadoop.mapred.Child.main(Child.java:170)
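
For reference, the io.sort.mb change is the only thing I modified. A minimal
sketch of how I set it in my job driver follows; this assumes the old 0.20
mapred API, and the class name, job name, and the implicit identity
mapper/reducer are placeholders standing in for my real job, which I have
omitted here:

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class MyJob {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(MyJob.class);
        conf.setJobName("my-data-processing-job");   // placeholder name

        // The only tuning change: raise the map-side sort buffer from the
        // default 100 MB to 256 MB. Everything else stays at its defaults.
        conf.setInt("io.sort.mb", 256);

        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}

I set the property per-job here just to show the change; setting it in
mapred-site.xml should be equivalent.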


I can't figure out why. Could anyone please give me a hint? Any help would be
appreciated! Thanks a lot!

Sincerely,

Boyu
