Subject: Re: Piping output of hadoop command
From: Jean-Marc Spaggiari
To: user@hadoop.apache.org
Date: Mon, 18 Feb 2013 12:23:28 -0500

Hi Julian,

I think it's not outputting on the standard output but on the error
one. You might want to test this:

hadoop fs -copyToLocal FILE_IN_HDFS 2>&1 | ssh REMOTE_HOST "dd of=FILE_ON_REMOTE_HOST"

which redirects stderr to stdout as well. Not sure, but that might be
your issue.

JM

2013/2/18, Julian Wissmann:
> Hi,
>
> we're running a Hadoop cluster with HBase for the purpose of
> evaluating it as the database for a research project, and we've more
> or less decided to go with it. So now I'm exploring backup
> mechanisms, and have decided to experiment with Hadoop's export
> functionality for that.
>
> What I am trying to achieve is getting data out of HBase and into
> HDFS via hadoop export, and then copying it out of HDFS onto a backup
> system. However, while copying data out of HDFS to the backup machine
> I am running into problems.
>
> What I am trying to do is the following:
>
> hadoop fs -copyToLocal FILE_IN_HDFS | ssh REMOTE_HOST "dd of=TARGET_FILE"
>
> It creates a file on the remote host, but this file is 0 kB in size;
> instead of any data being copied over there, the file just lands in
> my home folder.
>
> The command output looks like this:
>
> hadoop fs -copyToLocal FILE_IN_HDFS | ssh REMOTE_HOST "dd of=FILE_ON_REMOTE_HOST"
> 0+0 records in
> 0+0 records out
> 0 bytes (0 B) copied, 1.10011 s, 0.0 kB/s
>
> I cannot think of any reason why this command would behave in this
> way. Is this some Java-ism that I'm missing here (like not correctly
> treating stdout), or am I actually doing something wrong?
>
> The Hadoop version is 2.0.0-cdh4.1.2.
>
> Regards,
>
> Julian
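The stderr-vs-stdout behavior JM describes can be reproduced with plain
shell redirection, no Hadoop needed (a minimal sketch; "progress" stands
in for whatever a command prints to stderr):

```shell
#!/bin/sh
# A command that writes only to stderr sends nothing into a pipe,
# which gives exactly the symptom above: dd sees 0 bytes.
first=$(sh -c 'echo progress 1>&2' 2>/dev/null | cat)

# The same command with stderr merged into stdout (2>&1) *before*
# the pipe: now the message travels through the pipe.
second=$(sh -c 'echo progress 1>&2' 2>&1 | cat)

echo "plain pipe: '$first'"    # empty: stderr bypassed the pipe
echo "with 2>&1:  '$second'"   # the message reached the pipe
```

That said, merging streams only helps if the data is on one of them at
all. Since the file "just lands in my home folder", -copyToLocal with a
single argument is writing to the local filesystem rather than to
stdout; `hadoop fs -cat FILE_IN_HDFS | ssh REMOTE_HOST "dd of=TARGET_FILE"`
streams the file's bytes to stdout directly and may be the simpler fix.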