Subject: Re: Trouble in running MapReduce application
From: Fatih Haltas
To: user@hadoop.apache.org
Date: Sun, 24 Feb 2013 15:03:45 +0400

Thank you very much, but no, this is the file in HDFS, and it is the exact path of the NetFlow data in HDFS. hadoop-data is the HDFS home directory; before I downgraded my JDK, this command worked well.

On Sunday, 24 February 2013, sudhakara st wrote:

> Hi,
> You are specifying the input directory in the local file system, not in
> HDFS. Copy some text file to the HDFS user home directory using '-put' or
> '-copyFromLocal', then try to execute wordcount with the home directory
> as the input directory.
>
> On Sun, Feb 24, 2013 at 3:29 PM, Fatih Haltas wrote:
>
> Hi Hemanth,
>
> Thanks for your great help; I am really much obliged to you.
>
> I solved this problem by changing my Java compiler version, but now,
> even though I changed every node's configuration, I am getting this
> error even when I try to run the wordcount example without making any
> changes.
>
> What may be the reason? I believe I checked all config files and changed
> the home variables, as well as /etc/hosts.
>
> Here is my problem:
> ************************************************************
> [hadoop@ADUAE042-LAP-V logs]$ hadoop jar ../hadoop-examples-1.0.4.jar wordcount /home/hadoop/project/hadoop-data/NetFlow test1353.out
>
> Warning: $HADOOP_HOME is deprecated.
>
> 13/02/24 13:32:28 INFO input.FileInputFormat: Total input paths to process : 1
> 13/02/24 13:32:28 INFO util.NativeCodeLoader: Loaded the native-hadoop library
> 13/02/24 13:32:28 WARN snappy.LoadSnappy: Snappy native library not loaded
> 13/02/24 13:32:29 INFO mapred.JobClient: Running job: job_201301141457_0034
> 13/02/24 13:32:30 INFO mapred.JobClient:  map 0% reduce 0%
> 13/02/24 13:32:37 INFO mapred.JobClient: Task Id : attempt_201301141457_0034_m_000002_0, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000002_0: execvp: No such file or directory
> 13/02/24 13:32:43 INFO mapred.JobClient: Task Id : attempt_201301141457_0034_r_000002_0, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000002_0: execvp: No such file or directory
> 13/02/24 13:32:50 INFO mapred.JobClient: Task Id : attempt_201301141457_0034_m_000002_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_m_000002_1: execvp: No such file or directory
> 13/02/24 13:32:56 INFO mapred.JobClient: Task Id : attempt_201301141457_0034_r_000002_1, Status : FAILED
> java.lang.Throwable: Child Error
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
> Caused by: java.io.IOException: Task process exit with nonzero status of 1.
>         at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
>
> attempt_201301141457_0034_r_000002_1: execvp: No such file or directory
> 13/02/24 13:33:02 INFO mapred.JobClient: Task Id : attempt_201301141457_0034_m_000002_2, Status : FAILED
> Error initializing attempt_201301141457_0034_m_000002_2:
> java.lang.InternalError
>         at sun.misc.URLClassPath$JarLoader.getResource(URLClassPath.java:769)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:185)
>         at sun.misc.URLClassPath.getResource(URLClassPath.java:237)
>         at java.lang.ClassLoader.getBootstrapResource(ClassLoader.java:1113)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:974)
>         at java.lang.ClassLoader.getResource(ClassLoader.java:972)
>         at java.lang.ClassLoader.getSystemResource(ClassLoader.java:1075)
>         at java.lang.ClassLoader.getSystemResourceAsStream(ClassLoader.java:1181)
>         at java.lang.Class.getResourceAsStream(Class.java:2045)
>         at com.sun.org.apache.xml.internal.serializer.OutputPropertiesFactory$1.run(OutputPropertiesFactory.java:370)
>         at java.security.AccessController.doPrivileged(Native M
>
> --
>
> Regards,
> ..... Sudhakara.st
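For reference, sudhakara's suggestion in the quoted reply can be sketched as a few shell commands. This is only an illustrative sketch: the input path is the one from the thread, the HDFS destination name `NetFlow` is a hypothetical choice, and the commands assume a working Hadoop 1.x installation with the `hadoop` CLI on the PATH (the block skips gracefully if it is not):

```shell
# Sketch of the suggested workflow: check whether the input is in HDFS,
# copy it there if needed, then run the bundled wordcount example.
# Skip gracefully on machines that do not have the hadoop CLI installed.
if ! command -v hadoop >/dev/null 2>&1; then
  echo "hadoop CLI not found; nothing to do"
else
  # 1) Verify the input path exists in HDFS (not just on the local disk).
  #    If this fails, the path only exists in the local file system.
  hadoop fs -ls /home/hadoop/project/hadoop-data/NetFlow

  # 2) If it is only local, copy it into the HDFS user home directory
  #    ('NetFlow' here is a hypothetical destination name):
  hadoop fs -put /home/hadoop/project/hadoop-data/NetFlow NetFlow

  # 3) Run the wordcount example against the HDFS path:
  hadoop jar hadoop-examples-1.0.4.jar wordcount NetFlow wordcount.out
fi
```

Note that `hadoop fs -put` and `-copyFromLocal` are interchangeable for this purpose; both copy from the local file system into HDFS, and relative destination paths resolve against the HDFS user home directory.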