From: praveen.peddi@nokia.com
To: common-user@hadoop.apache.org
Subject: Unable to use hadoop cluster on the cloud
Date: Thu, 3 Mar 2011 22:16:58 +0000

Hello all,

I installed Hadoop 0.20.2 on physical machines and everything works like a charm. Now I have installed Hadoop from the same hadoop-install gz file on the cloud, and the installation seems fine. I can even copy files to HDFS from the master machine. But when I try to do the same from another "non-Hadoop" machine, I get the error pasted below. I did some googling and a lot of people have hit this error, but I could not find any solution. I also didn't see any exceptions in the Hadoop logs.

Any thoughts?
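Since the copy is driven from the non-Hadoop machine, that machine's client configuration is what points it at the cluster. A minimal sanity check of that config, with a hypothetical NameNode address (namenode.example.com:9000 is an assumption, not my actual host):

# Run on the non-Hadoop client. The 0.20.2 client reads conf/core-site.xml;
# fs.default.name should name the cluster's NameNode, not localhost.
grep -A 1 fs.default.name /usr/local/hadoop-0.20.2/conf/core-site.xml
# The value would look something like:
#   <value>hdfs://namenode.example.com:9000</value>

# The same thing can be tested without relying on local config by using a
# fully qualified URI (hypothetical host again):
/usr/local/hadoop-0.20.2/bin/hadoop fs -ls hdfs://namenode.example.com:9000/tmp/hadoop-test

The error output from the actual copy follows.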
$ /usr/local/hadoop-0.20.2/bin/hadoop fs -copyFromLocal Merchandising-ear.tar.gz /tmp/hadoop-test/Merchandising-ear.tar.gz
11/03/03 21:58:50 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection timed out
11/03/03 21:58:50 INFO hdfs.DFSClient: Abandoning block blk_-8243207628973732008_1005
11/03/03 21:58:50 INFO hdfs.DFSClient: Waiting to find target node: xx.xx.12:50010
11/03/03 21:59:17 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection timed out
11/03/03 21:59:17 INFO hdfs.DFSClient: Abandoning block blk_2852127666568026830_1005
11/03/03 21:59:17 INFO hdfs.DFSClient: Waiting to find target node: xx.xx.16.12:50010
11/03/03 21:59:44 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection timed out
11/03/03 21:59:44 INFO hdfs.DFSClient: Abandoning block blk_2284836193463265901_1005
11/03/03 21:59:44 INFO hdfs.DFSClient: Waiting to find target node: xx.xx.16.12:50010
11/03/03 22:00:11 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.ConnectException: Connection timed out
11/03/03 22:00:11 INFO hdfs.DFSClient: Abandoning block blk_-5600915414055250488_1005
11/03/03 22:00:11 INFO hdfs.DFSClient: Waiting to find target node: xx.xx.16.11:50010
11/03/03 22:00:17 WARN hdfs.DFSClient: DataStreamer Exception: java.io.IOException: Unable to create new block.
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2845)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
11/03/03 22:00:17 WARN hdfs.DFSClient: Error Recovery for block blk_-5600915414055250488_1005 bad datanode[0] nodes == null
11/03/03 22:00:17 WARN hdfs.DFSClient: Could not get block locations. Source file "/tmp/hadoop-test/Merchandising-ear.tar.gz" - Aborting...
copyFromLocal: Connection timed out
11/03/03 22:00:17 ERROR hdfs.DFSClient: Exception closing file /tmp/hadoop-test/Merchandising-ear.tar.gz : java.net.ConnectException: Connection timed out
java.net.ConnectException: Connection timed out
        at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
        at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:567)
        at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:404)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2870)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2826)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)
[C4554954_admin@c4554954vl03 relevancy]$
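Reading the trace: the client does reach the NameNode (blocks keep getting allocated and then abandoned), but every attempt to open the data connection to a DataNode on port 50010 times out. A quick check from the non-Hadoop machine to tell a network/firewall problem apart from a Hadoop configuration problem (hypothetical hostnames, and assuming the default DataNode data-transfer port 50010):

# Hypothetical hosts; substitute the real NameNode and DataNode addresses.
# NameNode RPC port (appears reachable, since blocks are being allocated):
nc -z -w 5 namenode.example.com 9000 && echo "namenode port reachable"
# DataNode data-transfer port, which is where the ConnectException occurs:
nc -z -w 5 datanode1.example.com 50010 && echo "datanode port reachable"

If the 50010 check also times out, the cloud firewall / security group between the client and the DataNodes would be my first suspect; if it succeeds, the DataNodes may be registering with the NameNode under internal addresses that the external client cannot reach.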