hadoop-common-user mailing list archives

From Sagar Naik <sn...@attributor.com>
Subject Re: Not able to copy a file to HDFS after installing
Date Thu, 05 Feb 2009 07:53:29 GMT

Where is the namenode running? On localhost or some other host?

-Sagar
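
(The repeated retries against localhost/127.0.0.1:9000 usually mean no NameNode is listening at the address the client is configured to use. A quick check, assuming the usual single-node tutorial setup with fs.default.name set to hdfs://localhost:9000, run from the install directory shown in the prompt:

  # see which Hadoop daemons are actually running
  jps

  # confirm the filesystem address the client is configured with
  grep -A1 fs.default.name conf/hadoop-site.xml

  # if HDFS has never been started: format the namenode (first time only), then start it
  bin/hadoop namenode -format
  bin/start-dfs.sh

If jps shows no NameNode, or fs.default.name points at a different host/port than the one in the error, that mismatch is the likely cause.)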
Rajshekar wrote:
> Hello, 
> I am new to Hadoop and I just installed it on Ubuntu 8.04 LTS following the
> guidance of a web site. I tested it and it was working fine. When I tried to
> copy a file, it gave the error below. Please help me out.
>
> hadoop@excel-desktop:/usr/local/hadoop/hadoop-0.17.2.1$ bin/hadoop jar hadoop-0.17.2.1-examples.jar wordcount /home/hadoop/Download\ URLs.txt download-output
> 09/02/02 11:18:59 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 1 time(s).
> 09/02/02 11:19:00 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 2 time(s).
> 09/02/02 11:19:01 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 3 time(s).
> 09/02/02 11:19:02 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 4 time(s).
> 09/02/02 11:19:04 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 5 time(s).
> 09/02/02 11:19:05 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 6 time(s).
> 09/02/02 11:19:06 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 7 time(s).
> 09/02/02 11:19:07 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 8 time(s).
> 09/02/02 11:19:08 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 9 time(s).
> 09/02/02 11:19:09 INFO ipc.Client: Retrying connect to server: localhost/127.0.0.1:9000. Already tried 10 time(s).
> java.lang.RuntimeException: java.net.ConnectException: Connection refused
> at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:356)
> at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:331)
> at org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:304)
> at org.apache.hadoop.examples.WordCount.run(WordCount.java:146)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.examples.WordCount.main(WordCount.java:155)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:616)
> at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:6
> at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
> at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:53)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:616)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
> at org.apache.hadoop.mapred.JobShell.run(JobShell.java:194)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
> at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
> at org.apache.hadoop.mapred.JobShell.main(JobShell.java:220)
> Caused by: java.net.ConnectException: Connection refused
> at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
> at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:592)
> at sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:11
> at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:174)
> at org.apache.hadoop.ipc.Client.getConnection(Client.java:623)
> at org.apache.hadoop.ipc.Client.call(Client.java:546)
> at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:212)
> at org.apache.hadoop.dfs.$Proxy0.getProtocolVersion(Unknown Source)
> at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:313)
> at org.apache.hadoop.dfs.DFSClient.createRPCNamenode(DFSClient.java:102)
> at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:17
> at org.apache.hadoop.dfs.DistributedFileSystem.initialize(DistributedFileSystem.java:6
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1280)
> at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:56)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1291)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:203)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:10
> at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:352)
>   
