From: bzheng <bing.zheng@gmail.com>
To: core-user@hadoop.apache.org
Date: Wed, 22 Oct 2008 15:47:47 -0700 (PDT)
Subject: Re: distcp port for 0.17.2
Thanks.  The fs.default.name is "file:///" and dfs.http.address is
"0.0.0.0:50070".  I tried:

  hadoop dfs -ls /path/file     (to make sure the file exists on cluster1)
  hadoop distcp file:///cluster1_master_node_ip:50070/path/file \
                file:///cluster2_master_node_ip:50070/path/file

It gives this error message:

08/10/22 15:43:47 INFO util.CopyFiles: srcPaths=[file:/cluster1_master_node_ip:50070/path/file]
08/10/22 15:43:47 INFO util.CopyFiles: destPath=file:/cluster2_master_node_ip:50070/path/file
With failures, global counters are inaccurate; consider running with -i
Copy failed: org.apache.hadoop.mapred.InvalidInputException: Input source
file:/cluster1_master_node_ip:50070/path/file does not exist.
        at org.apache.hadoop.util.CopyFiles.checkSrcPath(CopyFiles.java:578)
        at org.apache.hadoop.util.CopyFiles.copy(CopyFiles.java:594)
        at org.apache.hadoop.util.CopyFiles.run(CopyFiles.java:743)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.util.CopyFiles.main(CopyFiles.java:763)

If I use hdfs:// instead of file:///, I get:

Copy failed: java.net.SocketTimeoutException: timed out waiting for rpc response
        at org.apache.hadoop.ipc.Client.call(Client.java:559)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:212)
        at org.apache.hadoop.dfs.$Proxy0.getProtocolVersion(Unknown Source)
        at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:313)
        at org.apache.hadoop.dfs.DFSClient.createRPCNamenode(DFSClient.java:102)
        at org.apache.hadoop.dfs.DFSClient.<init>(DFSClient.java:178)
        at org.apache.hadoop.dfs.DistributedFileSystem.initialize(DistributedFileSystem.java:68)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1280)
        at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:56)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1291)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:203)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
        at org.apache.hadoop.util.CopyFiles.checkSrcPath(CopyFiles.java:572)
        at org.apache.hadoop.util.CopyFiles.copy(CopyFiles.java:594)
        at org.apache.hadoop.util.CopyFiles.run(CopyFiles.java:743)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.util.CopyFiles.main(CopyFiles.java:763)

s29752-hadoopuser wrote:
> 
> Hi,
> 
> There is no such thing as a distcp port.  distcp uses the (generic) file
> system API, so it does not care about file system implementation details
> such as port numbers.
> 
> It is common to use distcp with HDFS or HFTP.  The URLs look like
> hdfs://namenode:port/path and hftp://namenode:port/path for HDFS and HFTP,
> respectively.  The HDFS and HFTP ports are specified by fs.default.name
> and dfs.http.address, respectively.
> 
> Nicholas Sze
> 
> ----- Original Message ----
>> From: bzheng
>> To: core-user@hadoop.apache.org
>> Sent: Wednesday, October 22, 2008 11:57:43 AM
>> Subject: distcp port for 0.17.2
>>
>> What's the port number for distcp in 0.17.2?  I can't find any
>> documentation on distcp for version 0.17.2.  For version 0.18, the
>> documentation says it's 8020.
>>
>> I'm using a standard install, and the only open ports associated with
>> hadoop are 50030, 50070, and 50090.  None of them work with distcp.
>> So, how do you use distcp in 0.17.2?  Is any extra setup/configuration
>> needed?
>>
>> Thanks in advance for your help.
>> -- 
>> View this message in context:
>> http://www.nabble.com/distcp-port-for-0.17.2-tp20117463p20117463.html
>> Sent from the Hadoop core-user mailing list archive at Nabble.com.
> 

-- 
View this message in context: http://www.nabble.com/distcp-port-for-0.17.2-tp20117463p20121246.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.
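The first failure above follows from standard URI parsing: in file:///host:port/path the authority component is empty, so everything after the third slash (host and port included) is taken as a local path, which is why distcp logs it as file:/cluster1_master_node_ip:50070/... and reports that no such local file exists.  A minimal sketch with Python's urllib.parse, which applies the same scheme://authority/path split as the java.net.URI parsing behind Hadoop's Path.  The host names are the thread's placeholders, and 8020 is only the example RPC port the 0.18 documentation mentions, not a value confirmed for this cluster:

```python
from urllib.parse import urlsplit

# The URI actually passed to distcp in the thread: the authority (netloc)
# is empty, so the host:port ends up inside the local path.
bad = urlsplit("file:///cluster1_master_node_ip:50070/path/file")
print(bad.scheme, repr(bad.netloc), bad.path)
# -> file '' /cluster1_master_node_ip:50070/path/file

# The shape distcp expects for HDFS: scheme://namenode:rpc-port/path,
# where the port is the namenode RPC port from fs.default.name (not the
# 50070 web UI port from dfs.http.address).
good = urlsplit("hdfs://cluster1_master_node_ip:8020/path/file")
print(good.scheme, good.netloc, good.path)
# -> hdfs cluster1_master_node_ip:8020 /path/file
```

The same distinction is consistent with the hdfs:// timeout: 50070 is the dfs.http.address (web/HFTP) port, so it is an hftp://namenode:50070/path URL that would use it, while an hdfs:// URL needs the namenode RPC port taken from the cluster's fs.default.name.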