hadoop-common-dev mailing list archives

From "P.ILAYARAJA" <ilayar...@rediff.co.in>
Subject Nutch/Hadoop: Crawl is crashing
Date Mon, 03 Nov 2008 14:57:09 GMT
Hi,

I started an internet crawl of 30 million pages in a single segment.
The crawl was crashing with the following exception:

java.lang.ArrayIndexOutOfBoundsException: 17
 at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.mergeParts(MapTask.java:540)
 at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:607)
 at org.apache.hadoop.mapred.MapTask.run(MapTask.java:193)
 at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1760)


Any idea why this is happening, and what the solution might be?

I am using Hadoop 0.15.3 and Nutch 1.0.

Regards,
Ilay