hadoop-common-user mailing list archives

From Raj Vishwanathan <rajv...@yahoo.com>
Subject Re: Error in Formatting NameNode
Date Sun, 12 Feb 2012 15:40:51 GMT
Manish

If you read the error message, it says "connection refused". Big clue :-) 

You probably have a firewall configured.
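One quick way to narrow it down is to test whether anything is listening on the port being refused (8021 here, the JobTracker RPC port from the log below). This is a hedged sketch using bash's /dev/tcp pseudo-device; the `check_port` helper is hypothetical, and the host/port should match whatever your mapred.job.tracker is set to:

```shell
#!/bin/bash
# Hypothetical helper: report whether a TCP port accepts connections.
# Uses bash's /dev/tcp redirection (bash-specific, works under cygwin's bash).
check_port() {
  local host=$1 port=$2
  # Open fd 3 to host:port in a subshell; the fd closes when the subshell exits.
  if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
    echo "open"
  else
    echo "closed"
  fi
}

# 8021 is the port from the "Connection refused" error in this thread.
check_port 127.0.0.1 8021
```

If it reports "closed" with no firewall in the way, the JobTracker most likely never started; its log under the Hadoop logs directory should say why.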

Raj

Sent from my iPad
Please excuse the typos. 

On Feb 12, 2012, at 1:41 AM, Manish Maheshwari <myloginid@gmail.com> wrote:

> Thanks,
> 
> I tried with hadoop-1.0.0 and JRE6 and things are looking good. I was able
> to format the namenode and bring up the NameNode 'calvin-PC:47110' and
> Hadoop Map/Reduce Administration webpages.
> 
> Further i tried the example of TestDFSIO but get the below error of
> connection refused.
> 
> -bash-4.1$ cd share/hadoop
> -bash-4.1$ ../../bin/hadoop jar hadoop-test-1.0.0.jar TestDFSIO -write
> -nrFiles 1 -filesize 10
> Warning: $HADOOP_HOME is deprecated.
> 
> TestDFSIO.0.0.4
> 12/02/12 15:05:08 INFO fs.TestDFSIO: nrFiles = 1
> 12/02/12 15:05:08 INFO fs.TestDFSIO: fileSize (MB) = 1
> 12/02/12 15:05:08 INFO fs.TestDFSIO: bufferSize = 1000000
> 12/02/12 15:05:08 INFO fs.TestDFSIO: creating control file: 1 mega bytes, 1
> files
> 12/02/12 15:05:08 INFO fs.TestDFSIO: created control files for: 1 files
> 12/02/12 15:05:11 INFO ipc.Client: Retrying connect to server: calvin-PC/
> 127.0.0.1:8021. Already tried 0 time(s).
> 12/02/12 15:05:13 INFO ipc.Client: Retrying connect to server: calvin-PC/
> 127.0.0.1:8021. Already tried 1 time(s).
> 12/02/12 15:05:15 INFO ipc.Client: Retrying connect to server: calvin-PC/
> 127.0.0.1:8021. Already tried 2 time(s).
> 12/02/12 15:05:17 INFO ipc.Client: Retrying connect to server: calvin-PC/
> 127.0.0.1:8021. Already tried 3 time(s).
> 12/02/12 15:05:19 INFO ipc.Client: Retrying connect to server: calvin-PC/
> 127.0.0.1:8021. Already tried 4 time(s).
> 12/02/12 15:05:21 INFO ipc.Client: Retrying connect to server: calvin-PC/
> 127.0.0.1:8021. Already tried 5 time(s).
> 12/02/12 15:05:23 INFO ipc.Client: Retrying connect to server: calvin-PC/
> 127.0.0.1:8021. Already tried 6 time(s).
> 12/02/12 15:05:25 INFO ipc.Client: Retrying connect to server: calvin-PC/
> 127.0.0.1:8021. Already tried 7 time(s).
> 12/02/12 15:05:27 INFO ipc.Client: Retrying connect to server: calvin-PC/
> 127.0.0.1:8021. Already tried 8 time(s).
> 12/02/12 15:05:29 INFO ipc.Client: Retrying connect to server: calvin-PC/
> 127.0.0.1:8021. Already tried 9 time(s).
> java.net.ConnectException: Call to calvin-PC/127.0.0.1:8021 failed on
> connection exception: java.net.ConnectException: Connection refused: no
> further information
>        at org.apache.hadoop.ipc.Client.wrapException(Client.java:1095)
>        at org.apache.hadoop.ipc.Client.call(Client.java:1071)
>        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>        at org.apache.hadoop.mapred.$Proxy2.getProtocolVersion(Unknown
> Source)
>        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>        at
> org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:480)
>        at org.apache.hadoop.mapred.JobClient.init(JobClient.java:474)
>        at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:457)
>        at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1260)
>        at org.apache.hadoop.fs.TestDFSIO.runIOTest(TestDFSIO.java:257)
>        at org.apache.hadoop.fs.TestDFSIO.readTest(TestDFSIO.java:295)
>        at org.apache.hadoop.fs.TestDFSIO.run(TestDFSIO.java:459)
>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>        at org.apache.hadoop.fs.TestDFSIO.main(TestDFSIO.java:317)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at
> org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>        at
> org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>        at org.apache.hadoop.test.AllTestDriver.main(AllTestDriver.java:81)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Caused by: java.net.ConnectException: Connection refused: no further
> information
>        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>        at
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
>        at
> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:656)
>        at
> org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
>        at
> org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:560)
>        at
> org.apache.hadoop.ipc.Client$Connection.access$2000(Client.java:184)
>        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1202)
>        at org.apache.hadoop.ipc.Client.call(Client.java:1046)
>        ... 26 more
> -bash-4.1$
> 
> Is this a problem with SSH? The SSH daemon is still working. I could not
> find anything on this on Google.
> 
> Thanks
> Manish
> 
> 
> 
> 
> 
> On Sat, Feb 11, 2012 at 4:37 PM, N Keywal <nkeywal@gmail.com> wrote:
>> 
>> Hi,
>> 
>> You should try with java 1.6.
>> See
>> 
>> http://hadoop.apache.org/common/docs/r0.22.0/single_node_setup.html#PreReqs
>> 
>> You don't want to use the hadoop release 1.0?
>> 
>> N.
>> 
>> On Sat, Feb 11, 2012 at 11:51 AM, Manish Maheshwari <myloginid@gmail.com
>> wrote:
>> 
>>> Hi All,
>>> 
>>> I made 4-5 attempts at configuring the namenode but always get a Java
>>> error.
>>> 
>>> I am using hadoop-0.22.0 on cygwin. SSH is working
>>> 
>>> I have installed JDK 5 in JAVA_HOME=/cygdrive/c/Java/jdk1.5.0_22 and the
>>> same
>>> is set in hadoop-env.sh as well as the cygwin .bash_profile.
>>> When i log in into Cygwin i see the JAVA_HOME set.
>>> 
>>> Below is the error -
>>> 
>>> -bash-4.1$ pwd
>>> /cygdrive/c/hadoop/hadoop-0.22.0
>>> -bash-4.1$ bin/hadoop namenode -format
>>> cygpath: can't convert empty path
>>> java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName
>>> Exception in thread "main" cygpath: can't convert empty path
>>> cygpath: can't convert empty path
>>> DEPRECATED: Use of this script to execute hdfs command is deprecated.
>>> Instead use the hdfs command for it.
>>> 
>>> cygpath: can't convert empty path
>>> java.lang.NoClassDefFoundError: org/apache/hadoop/util/PlatformName
>>> Exception in thread "main" cygpath: can't convert empty path
>>> cygpath: can't convert empty path
>>> java.lang.UnsupportedClassVersionError: Bad version number in .class
>>> file
>>>       at java.lang.ClassLoader.defineClass1(Native Method)
>>>       at java.lang.ClassLoader.defineClass(ClassLoader.java:621)
>>>       at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:124)
>>>       at java.net.URLClassLoader.defineClass(URLClassLoader.java:260)
>>>       at java.net.URLClassLoader.access$100(URLClassLoader.java:56)
>>>       at java.net.URLClassLoader$1.run(URLClassLoader.java:195)
>>>       at java.security.AccessController.doPrivileged(Native Method)
>>>       at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>>>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:268)
>>>       at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
>>>       at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
>>> Exception in thread "main" -bash-4.1$
>>> -bash-4.1$
>>> 
>>> -bash-4.1$ grep JAVA hadoop-env.sh
>>> # The only required environment variable is JAVA_HOME.  All others are
>>> # set JAVA_HOME in this file, so that it is correctly defined on
>>> # export JAVA_HOME=/usr/lib/j2sdk1.6-sun
>>> export JAVA_HOME=/cygdrive/c/Java/jdk1.5.0_22
>>>
>>> Any help is much appreciated.
>>> 
>>> Thanks,
>>> Manish
>>> 
