From: Rajshekar <rajashekar.s@excelindia.com>
To: core-user@hadoop.apache.org
Subject: Re: Not able to copy a file to HDFS after installing
Date: Thu, 5 Feb 2009 22:17:13 -0800 (PST)
Message-ID: <21867199.post@talk.nabble.com>
Hi, thanks Rasit,

Since yesterday evening I have been able to start the NameNode. I made a few changes in hadoop-site.xml and it is working now, but the new problem is that I am not able to run map/reduce jobs from .jar files. It gives the following error:

hadoop@excel-desktop:/usr/local/hadoop$ bin/hadoop jar hadoop-0.19.0-examples.jar wordcount gutenberg gutenberg-output
java.io.IOException: Error opening job jar: hadoop-0.19.0-examples.jar
        at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
        at org.apache.hadoop.mapred.JobShell.run(JobShell.java:194)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.mapred.JobShell.main(JobShell.java:220)
Caused by: java.util.zip.ZipException: error in opening zip file
        at java.util.zip.ZipFile.open(Native Method)
        at java.util.zip.ZipFile.<init>(ZipFile.java:131)
        at java.util.jar.JarFile.<init>(JarFile.java:150)
        at java.util.jar.JarFile.<init>(JarFile.java:87)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
        ... 4 more

Please help me out.

Rasit OZDAS wrote:
>
> Rajshekar,
> It seems that your namenode isn't able to load the FsImage file.
>
> Here is a thread about a similar issue:
> http://www.nabble.com/Hadoop-0.17.1-%3D%3E-EOFException-reading-FSEdits-file,-what-causes-this---how-to-prevent--td21440922.html
>
> Rasit
>
> 2009/2/5 Rajshekar :
>>
>> The name node is localhost with an IP address. Now I checked: when I run
>> bin/hadoop namenode, I get this error:
>>
>> root@excel-desktop:/usr/local/hadoop/hadoop-0.17.2.1# bin/hadoop namenode
>> 09/02/05 13:27:43 INFO dfs.NameNode: STARTUP_MSG:
>> /************************************************************
>> STARTUP_MSG: Starting NameNode
>> STARTUP_MSG:   host = excel-desktop/127.0.1.1
>> STARTUP_MSG:   args = []
>> STARTUP_MSG:   version = 0.17.2.1
>> STARTUP_MSG:   build = https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.17 -r 684969; compiled by 'oom' on Wed Aug 20 22:29:32 UTC 2008
>> ************************************************************/
>> 09/02/05 13:27:43 INFO metrics.RpcMetrics: Initializing RPC Metrics with hostName=NameNode, port=9000
>> 09/02/05 13:27:43 INFO dfs.NameNode: Namenode up at: localhost/127.0.0.1:9000
>> 09/02/05 13:27:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=NameNode, sessionId=null
>> 09/02/05 13:27:43 INFO dfs.NameNodeMetrics: Initializing NameNodeMetrics using context object:org.apache.hadoop.metrics.spi.NullContext
>> 09/02/05 13:27:43 INFO fs.FSNamesystem: fsOwner=root,root
>> 09/02/05 13:27:43 INFO fs.FSNamesystem: supergroup=supergroup
>> 09/02/05 13:27:43 INFO fs.FSNamesystem: isPermissionEnabled=true
>> 09/02/05 13:27:44 INFO ipc.Server: Stopping server on 9000
>> 09/02/05 13:27:44 ERROR dfs.NameNode: java.io.EOFException
>>         at java.io.RandomAccessFile.readInt(RandomAccessFile.java:776)
>>         at org.apache.hadoop.dfs.FSImage.isConversionNeeded(FSImage.java:488)
>>         at org.apache.hadoop.dfs.Storage$StorageDirectory.analyzeStorage(Storage.java:283)
>>         at org.apache.hadoop.dfs.FSImage.recoverTransitionRead(FSImage.java:149)
>>         at org.apache.hadoop.dfs.FSDirectory.loadFSImage(FSDirectory.java:80)
>>         at org.apache.hadoop.dfs.FSNamesystem.initialize(FSNamesystem.java:274)
>>         at org.apache.hadoop.dfs.FSNamesystem.<init>(FSNamesystem.java:255)
>>         at org.apache.hadoop.dfs.NameNode.initialize(NameNode.java:133)
>>         at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:178)
>>         at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:164)
>>         at org.apache.hadoop.dfs.NameNode.createNameNode(NameNode.java:846)
>>         at org.apache.hadoop.dfs.NameNode.main(NameNode.java:855)
>>
>> 09/02/05 13:27:44 INFO dfs.NameNode: SHUTDOWN_MSG:
>> /************************************************************
>> SHUTDOWN_MSG: Shutting down NameNode at excel-desktop/127.0.1.1
>> ************************************************************/
>> Rajshekar
>>
>> Sagar Naik-3 wrote:
>>>
>>> Where is the namenode running? localhost or some other host?
>>>
>>> -Sagar
>>> Rajshekar wrote:
>>>> Hello,
>>>> I am new to Hadoop and I just installed it on Ubuntu 8.04 LTS following
>>>> the guidance of a web site. I tested it and found it working fine. I
>>>> tried to copy a file, but it gives an error. Please help me out.
>>>>
>>>> hadoop@excel-desktop:/usr/local/hadoop/hadoop-0.17.2.1$ bin/hadoop jar hadoop-0.17.2.1-examples.jar wordcount /home/hadoop/Download\ URLs.txt download-output
>>>> 09/02/02 11:18:59 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 1 time(s).
>>>> 09/02/02 11:19:00 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 2 time(s).
>>>> 09/02/02 11:19:01 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 3 time(s).
>>>> 09/02/02 11:19:02 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 4 time(s).
>>>> 09/02/02 11:19:04 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 5 time(s).
>>>> 09/02/02 11:19:05 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 6 time(s).
>>>> 09/02/02 11:19:06 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 7 time(s).
>>>> 09/02/02 11:19:07 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 8 time(s).
>>>> 09/02/02 11:19:08 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 9 time(s).
>>>> 09/02/02 11:19:09 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 10 time(s).
>>>> java.lang.RuntimeException: java.net.ConnectException: Connection refused
>>>>         at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:356)
>>>>         at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:331)
>>>>         at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:304)
>>>>         at org.apache.hadoop.examples.WordCount.run(WordCount.java:146)
>>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>         at org.apache.hadoop.examples.WordCount.main(WordCount.java:155)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:6
>>>>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>>>>         at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:53)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>>         at java.lang.reflect.Method.invoke(Method.java:616)
>>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>>>>         at org.apache.hadoop.mapred.JobShell.run(JobShell.java:194)
>>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>>>>         at org.apache.hadoop.mapred.JobShell.main(JobShell.java:220)
>>>> Caused by: java.net.ConnectException: Connection refused
>>>>         at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>>>>         at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:592)
>>>>         at sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:11
>>>>         at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:174)
>>>>         at org.apache.hadoop.ipc.Client.getConnection(Client.java:623)
>>>>         at org.apache.hadoop.ipc.Client.call(Client.java:546)
>>>>         at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:212)
>>>>         at org.apache.hadoop.dfs.$Proxy0.getProtocolVersion(Unknown Source)
>>>>         at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:313)
>>>>         at org.apache.hadoop.dfs.DFSClient.createRPCNamenode(DFSClient.java:102)
>>>>         at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:17
>>>>         at org.apache.hadoop.dfs.DistributedFileSystem.initialize(DistributedFileSystem.java:6
>>>>         at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1280)
>>>>         at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:56)
>>>>         at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1291)
>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:203)
>>>>         at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:10
>>>>         at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:352)
>>>>
>>>
>>
>> --
>> View this message in context:
>> http://www.nabble.com/Not-able-to-copy-a-file-to-HDFS-after-installing-tp21845768p21846923.html
>> Sent from the Hadoop core-user mailing list archive at Nabble.com.
>>
>
> --
> M. Raşit ÖZDAŞ

--
View this message in context: http://www.nabble.com/Not-able-to-copy-a-file-to-HDFS-after-installing-tp21845768p21867199.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.
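[Editorial note on the "Error opening job jar ... java.util.zip.ZipException" at the top of this message: that exception usually means the jar path is wrong relative to the current directory, or the file itself is truncated/corrupt. A minimal sketch of the sanity check, using a deliberately bad file under a made-up /tmp path so the failure mode is visible:]

```shell
# Sketch: before running "bin/hadoop jar <file> ...", confirm the jar is
# a readable zip archive. The /tmp path and its contents are hypothetical,
# standing in for a truncated or corrupt download.
JAR=/tmp/fake-examples.jar
printf 'not a real zip' > "$JAR"         # garbage bytes, not a zip
if unzip -t "$JAR" >/dev/null 2>&1; then
  msg="jar looks valid: $JAR"
else
  msg="jar is corrupt or not a zip: $JAR"
fi
echo "$msg"
```

On a healthy install, running `unzip -t hadoop-0.19.0-examples.jar` from the same directory you pass to `bin/hadoop jar` should report no errors; also note the prompt above shows /usr/local/hadoop while the earlier messages used /usr/local/hadoop/hadoop-0.17.2.1, so a 0.19 jar may simply not exist in the current directory.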
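[Editorial note on the java.io.EOFException in the quoted NameNode startup log: the stack shows RandomAccessFile.readInt failing inside FSImage.isConversionNeeded, which suggests the stored image file is empty or truncated. A sketch of the first diagnostic step, where the path is an assumption (the 0.17-era default under hadoop.tmp.dir; substitute your configured dfs.name.dir):]

```shell
# Sketch: inspect the NameNode storage directory. A zero-length fsimage
# there would explain an EOFException on the very first readInt.
NAME_DIR=/tmp/hadoop-root/dfs/name       # hypothetical default; adjust
msg=$(ls -l "$NAME_DIR" 2>/dev/null || echo "name directory not found: $NAME_DIR")
echo "$msg"
```

If the directory exists but its image/edits files are zero-length, the usual options in that era were restoring a copy from a secondary namenode checkpoint or, on a fresh test cluster with no data worth keeping, reformatting with `bin/hadoop namenode -format`.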
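[Editorial note on the repeated "Retrying connect to server: localhost/127.0.0.1:9000" lines in the oldest quoted message: "Connection refused" means nothing is listening on the NameNode RPC port, i.e. the NameNode process is down, which matches the EOFException crash above. A rough sketch of two checks, assuming a pseudo-distributed setup with fs.default.name pointing at port 9000:]

```shell
# Sketch: diagnose "Connection refused" against the NameNode RPC port.
# 1) Is a NameNode JVM running at all?
if jps 2>/dev/null | grep -qi namenode; then
  nn="NameNode process found"
else
  nn="NameNode process not found"
fi
# 2) Is anything listening on the port fs.default.name points at?
if nc -z 127.0.0.1 9000 2>/dev/null; then
  port="port 9000 is listening"
else
  port="port 9000 refused"
fi
echo "$nn"
echo "$port"
```

If both checks fail, start HDFS (e.g. `bin/start-dfs.sh`) and re-check the NameNode log for the FsImage error before retrying the wordcount job.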