Subject: Re: Can not follow Single Node Setup example.
From: Peng Yu <pengyu.ut@gmail.com>
To: user@hadoop.apache.org
Date: Thu, 27 Jun 2013 12:54:12 -0500

Hi,

Here is what I got. Is there anything wrong?
~/Downloads/hadoop-install/hadoop$ bin/hadoop fs -put conf/ /input/
13/06/27 12:53:39 WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /input/conf/capacity-scheduler.xml could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1639)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:736)
        at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:578)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)
        at org.apache.hadoop.ipc.Client.call(Client.java:1107)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
        at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
        at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3686)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3546)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2600(DFSClient.java:2749)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2989)
13/06/27 12:53:39 WARN hdfs.DFSClient: Error Recovery for block null bad datanode[0] nodes == null
13/06/27 12:53:39 WARN hdfs.DFSClient: Could not get block locations. Source file "/input/conf/capacity-scheduler.xml" - Aborting...
put: java.io.IOException: File /input/conf/capacity-scheduler.xml could only be replicated to 0 nodes, instead of 1
13/06/27 12:53:39 ERROR hdfs.DFSClient: Failed to close file /input/conf/capacity-scheduler.xml
org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /input/conf/capacity-scheduler.xml could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1639)
        at org.apache.hadoop.hdfs.server.namenode.NameNode.addBlock(NameNode.java:736)
        at sun.reflect.GeneratedMethodAccessor9.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:578)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1393)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1389)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1387)
        at org.apache.hadoop.ipc.Client.call(Client.java:1107)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
        at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
        at com.sun.proxy.$Proxy1.addBlock(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:3686)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:3546)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2600(DFSClient.java:2749)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2989)
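[Note: "could only be replicated to 0 nodes, instead of 1" generally means
the NameNode cannot find a single live DataNode to write the block to,
i.e. the DataNode is down or never registered. A quick way to check,
assuming the Hadoop 1.x layout used in this thread (run from the hadoop/
directory):

    # A healthy pseudo-distributed setup lists NameNode, DataNode,
    # SecondaryNameNode, JobTracker and TaskTracker.
    jps

    # Ask the NameNode how many DataNodes it sees ("Datanodes available").
    bin/hadoop dfsadmin -report

If no DataNode is running, its log under logs/ usually says why; one
common cause after re-running "hadoop namenode -format" is a namespaceID
mismatch between the NameNode and the DataNode's storage directory.]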
On Thu, Jun 27, 2013 at 12:40 PM, Mohammad Tariq wrote:
> No. This means that you are trying to copy an entire directory instead
> of a file. Do this:
> bin/hadoop fs -put conf/ /input/
>
> Warm Regards,
> Tariq
> cloudfront.blogspot.com
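[Note: "fs -put" in Hadoop 1.x copies directories recursively, so putting
the conf directory is itself legal. The "put: Target input/conf is a
directory" message quoted below most likely means an earlier attempt
already created input/conf in HDFS, and put refuses to overwrite it. A
minimal way to retry, assuming the leftover copy can be discarded:

    # rmr is the Hadoop 1.x recursive remove; start from a clean slate.
    bin/hadoop fs -rmr input

    # With no existing "input", put recreates it as a copy of conf/, so
    # the *.xml files land directly inside it.
    bin/hadoop fs -put conf input
    bin/hadoop fs -ls input

Note also that "/input/" above is an absolute HDFS path, while the grep
example quoted further down reads from the relative path "input", i.e.
/user/py/input.]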
> On Thu, Jun 27, 2013 at 10:37 PM, Peng Yu wrote:
>>
>> Hi,
>>
>> ~/Downloads/hadoop-install/hadoop$ rm -rf ~/input/conf/
>> ~/Downloads/hadoop-install/hadoop$ bin/hadoop fs -put conf input
>> put: Target input/conf is a directory
>>
>> I get the above output. Is it the correct output? Thanks.
>>
>> On Wed, Jun 26, 2013 at 10:51 AM, Shahab Yunus wrote:
>> > It is looking for a file within your login folder
>> > /user/py/input/conf
>> >
>> > You are running your job from
>> > hadoop/bin
>> > and I think the hadoop job is looking for files in the current
>> > folder.
>> >
>> > Regards,
>> > Shahab
>> >
>> > On Wed, Jun 26, 2013 at 11:02 AM, Peng Yu wrote:
>> >>
>> >> Hi,
>> >>
>> >> Here is what I have.
>> >>
>> >> ~/Downloads/hadoop-install/hadoop$ ls
>> >> CHANGES.txt  README.txt  c++      hadoop-ant-1.1.2.jar     hadoop-examples-1.1.2.jar     hadoop-tools-1.1.2.jar  ivy.xml  logs     src
>> >> LICENSE.txt  bin         conf     hadoop-client-1.1.2.jar  hadoop-minicluster-1.1.2.jar  input                   lib      sbin     webapps
>> >> NOTICE.txt   build.xml   contrib  hadoop-core-1.1.2.jar    hadoop-test-1.1.2.jar         ivy                     libexec  share
>> >> ~/Downloads/hadoop-install/hadoop$ ls input/
>> >> capacity-scheduler.xml  core-site.xml  fair-scheduler.xml
>> >> hadoop-policy.xml  hdfs-site.xml  mapred-queue-acls.xml
>> >> mapred-site.xml
>> >>
>> >> On Wed, Jun 26, 2013 at 10:00 AM, Shahab Yunus wrote:
>> >> > Basically whether this step worked or not:
>> >> >
>> >> > $ cp conf/*.xml input
>> >> >
>> >> > Regards,
>> >> > Shahab
>> >> >
>> >> > On Wed, Jun 26, 2013 at 10:58 AM, Shahab Yunus wrote:
>> >> >>
>> >> >> Have you verified that the 'input' folder that your job needs
>> >> >> exists on the hdfs (single node setup)?
>> >> >>
>> >> >> Regards,
>> >> >> Shahab
>> >> >>
>> >> >> On Wed, Jun 26, 2013 at 10:53 AM, Peng Yu wrote:
>> >> >>>
>> >> >>> Hi,
>> >> >>>
>> >> >>> http://hadoop.apache.org/docs/r1.1.2/single_node_setup.html
>> >> >>>
>> >> >>> I followed the above instructions. But I get the following errors.
>> >> >>> Does anybody know what is wrong? Thanks.
>> >> >>>
>> >> >>> ~/Downloads/hadoop-install/hadoop$ bin/hadoop jar hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
>> >> >>> Warning: $HADOOP_HOME is deprecated.
>> >> >>>
>> >> >>> 13/06/26 09:49:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>> >> >>> 13/06/26 09:49:14 WARN snappy.LoadSnappy: Snappy native library not loaded
>> >> >>> 13/06/26 09:49:14 INFO mapred.FileInputFormat: Total input paths to process : 2
>> >> >>> 13/06/26 09:49:14 INFO mapred.JobClient: Cleaning up the staging area hdfs://localhost:9000/opt/local/var/hadoop/cache/mapred/staging/py/.staging/job_201306260838_0001
>> >> >>> 13/06/26 09:49:14 ERROR security.UserGroupInformation: PriviledgedActionException as:py cause:java.io.IOException: Not a file: hdfs://localhost:9000/user/py/input/conf
>> >> >>> java.io.IOException: Not a file: hdfs://localhost:9000/user/py/input/conf
>> >> >>>         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:215)
>> >> >>>         at org.apache.hadoop.mapred.JobClient.writeOldSplits(JobClient.java:1051)
>> >> >>>         at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1043)
>> >> >>>         at org.apache.hadoop.mapred.JobClient.access$700(JobClient.java:179)
>> >> >>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:959)
>> >> >>>         at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:912)
>> >> >>>         at java.security.AccessController.doPrivileged(Native Method)
>> >> >>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>> >> >>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
>> >> >>>         at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:912)
>> >> >>>         at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:886)
>> >> >>>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1323)
>> >> >>>         at org.apache.hadoop.examples.Grep.run(Grep.java:69)
>> >> >>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>> >> >>>         at org.apache.hadoop.examples.Grep.main(Grep.java:93)
>> >> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >> >>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >> >>>         at java.lang.reflect.Method.invoke(Method.java:597)
>> >> >>>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>> >> >>>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>> >> >>>         at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>> >> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >> >>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >> >>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >> >>>         at java.lang.reflect.Method.invoke(Method.java:597)
>> >> >>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> >> >>>
>> >> >>> --
>> >> >>> Regards,
>> >> >>> Peng
>>
>> --
>> Regards,
>> Peng

--
Regards,
Peng
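[Note on the original "Not a file: hdfs://localhost:9000/user/py/input/conf"
error: the old-API FileInputFormat used by the grep example does not
recurse into subdirectories, so every entry directly under input/ must be
a plain file. An input/conf subdirectory (created, e.g., by running
"fs -put conf input" while input already existed) therefore fails the
job. A minimal fix, assuming input/conf is the only non-file entry:

    # Remove the nested directory, keep the *.xml files...
    bin/hadoop fs -rmr input/conf
    bin/hadoop fs -ls input

    # ...and re-run the example (remove any output dir left by an
    # earlier run first, since the job will not overwrite it).
    bin/hadoop fs -rmr output
    bin/hadoop jar hadoop-examples-*.jar grep input output 'dfs[a-z.]+'
]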