Delivered-To: apmail-lucene-hadoop-dev-archive@locus.apache.org
Received: (qmail 15408 invoked from network); 15 Dec 2007 23:59:07 -0000
Received: from hermes.apache.org (HELO mail.apache.org) (140.211.11.2) by minotaur.apache.org with SMTP; 15 Dec 2007 23:59:07 -0000
Received: (qmail 95250 invoked by uid 500); 15 Dec 2007 23:58:55 -0000
Delivered-To: apmail-lucene-hadoop-dev-archive@lucene.apache.org
Received: (qmail 95216 invoked by uid 500); 15 Dec 2007 23:58:55 -0000
Mailing-List: contact hadoop-dev-help@lucene.apache.org; run by ezmlm
Precedence: bulk
Reply-To: hadoop-dev@lucene.apache.org
Delivered-To: mailing list hadoop-dev@lucene.apache.org
Received: (qmail 95206 invoked by uid 99); 15 Dec 2007 23:58:55 -0000
Received: from nike.apache.org (HELO nike.apache.org) (192.87.106.230) by apache.org (qpsmtpd/0.29) with ESMTP; Sat, 15 Dec 2007 15:58:55 -0800
X-ASF-Spam-Status: No, hits=-100.0 required=10.0 tests=ALL_TRUSTED
X-Spam-Check-By: apache.org
Received: from [140.211.11.4] (HELO brutus.apache.org) (140.211.11.4) by apache.org (qpsmtpd/0.29) with ESMTP; Sat, 15 Dec 2007 23:58:51 +0000
Received: from brutus (localhost [127.0.0.1]) by brutus.apache.org (Postfix) with ESMTP id 282C1714243; Sat, 15 Dec 2007 15:58:43 -0800 (PST)
Message-ID: <13772344.1197763123162.JavaMail.jira@brutus>
Date: Sat, 15 Dec 2007 15:58:43 -0800 (PST)
From: "arkady borkovsky (JIRA)"
To: hadoop-dev@lucene.apache.org
Subject: [jira] Created: (HADOOP-2438) In streaming, jobs that used to work, crash in the map phase -- even if the mapper is /bin/cat
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: 7bit
X-Virus-Checked: Checked by ClamAV on apache.org

In streaming, jobs that used to work, crash in the map phase -- even if the mapper is /bin/cat
----------------------------------------------------------------------------------------------

                 Key: HADOOP-2438
                 URL: https://issues.apache.org/jira/browse/HADOOP-2438
             Project: Hadoop
          Issue Type: Bug
    Affects Versions: 0.15.1
            Reporter: arkady borkovsky

The exception is either "out of memory" or "broken pipe" -- see both stack dumps below.
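(For context, a minimal sketch of the kind of streaming invocation being described -- the cat-as-mapper case from the summary. The jar path and the input/output directories here are placeholders for illustration only, not taken from the actual failing job:)

    hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-*-streaming.jar \
        -input  /user/someone/input \
        -output /user/someone/output \
        -mapper /bin/cat \
        -reducer /bin/cat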
last Hadoop input: |null|
last tool output: |[B@20fa83|
Date: Sat Dec 15 21:02:18 UTC 2007
java.io.IOException: Broken pipe
    at java.io.FileOutputStream.writeBytes(Native Method)
    at java.io.FileOutputStream.write(FileOutputStream.java:260)
    at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:65)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:123)
    at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:124)
    at java.io.DataOutputStream.flush(DataOutputStream.java:106)
    at org.apache.hadoop.streaming.PipeMapper.map(PipeMapper.java:96)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:192)
    at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1760)
    at org.apache.hadoop.streaming.PipeMapper.map(PipeMapper.java:107)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:192)
    at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1760)

-------------------------------------------------

java.io.IOException: MROutput/MRErrThread failed: java.lang.OutOfMemoryError: Java heap space
    at java.util.Arrays.copyOf(Arrays.java:2786)
    at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:94)
    at java.io.DataOutputStream.write(DataOutputStream.java:90)
    at org.apache.hadoop.io.Text.write(Text.java:243)
    at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.collect(MapTask.java:347)
    at org.apache.hadoop.streaming.PipeMapRed$MROutputThread.run(PipeMapRed.java:344)
    at org.apache.hadoop.streaming.PipeMapper.map(PipeMapper.java:76)
    at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:192)
    at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:1760)

--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.