From: johny casanova <pcgamer2426@outlook.com>
To: user@hadoop.apache.org
Subject: RE: Copying files to hadoop.
Date: Wed, 17 Dec 2014 18:03:37 -0500

What you can do is copy the files to the Linux box, then use hadoop fs -put. You can do the copy like this: scp /directory/i/want (or "file.name") "username"@"hostname":/directorytoputfiles/

For example: scp dude.txt dude@main-hadoop:/opt/


Date: Thu, 18 Dec 2014 09:58:43 +1100
Subject: Re: Copying files to hadoop.
From: anil.jagtap@gmail.com
To: user@hadoop.apache.org

Yes, I can do that, but I have connected from my Mac OS terminal to Linux using ssh. Now when I run the ls command it shows me the list of files & folders from Linux, not from Mac OS. I have files which I need to put onto Hadoop directly from Mac OS. So, something like below.

From Mac OS Terminal:

[root@sandbox ~]# hadoop fs -put <MAC OS FOLDER PATH/FILE> <HADOOP PATH>

Hope my requirement is clear.

Rgds, Anil


On Thu, Dec 18, 2014 at 9:39 AM, johny casanova <pcgamer2426@outlook.com> wrote:

Hi Anil,

You can use hadoop fs -put "file" (or a directory) and that should add it to your HDFS.


Date: Thu, 18 Dec 2014 09:29:34 +1100
Subject: Copying files to hadoop.
From: anil.jagtap@gmail.com
To: user@hadoop.apache.org

Dear All,

I'm pretty new to Hadoop technology and the Linux environment, hence struggling even to find solutions for the basic stuff.

For now, the Hortonworks Sandbox is working fine for me and I managed to connect to it through SSH.

Now I have some csv files in my Mac OS folders which I want to copy onto Hadoop. As per my knowledge, I can copy those files first to Linux and then put them to Hadoop. But is there a way in which just one command will copy to Hadoop directly from a Mac OS folder?

Appreciate your advice.

Thank you guys...

Rgds, Anil
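The two-step workflow johny describes (scp to the Linux box, then hadoop fs -put) can be sketched as a small script. This is only a sketch: the user "anil", the host "main-hadoop", and all paths are hypothetical placeholders, and the second step runs hadoop fs -put over ssh so both steps can be issued from the Mac.

```shell
#!/bin/sh
# Two-step copy: Mac -> Linux box -> HDFS.
# "anil", "main-hadoop", and all paths below are hypothetical placeholders.
LOCAL_FILE="$HOME/data/report.csv"
REMOTE="anil@main-hadoop"
STAGING_DIR="/tmp"
HDFS_DIR="/user/anil/csv"

# Step 1: stage the file on the Linux box.
scp "$LOCAL_FILE" "$REMOTE:$STAGING_DIR/"

# Step 2: from the Mac, push the staged copy into HDFS over ssh.
ssh "$REMOTE" "hadoop fs -put $STAGING_DIR/$(basename "$LOCAL_FILE") $HDFS_DIR/"
```

The $(basename ...) keeps the script working for any source path without repeating the file name.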
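Anil's actual question (one command, no copy landing on the Linux box first) can be approached by streaming the file over ssh, relying on hadoop fs -put accepting "-" to read from stdin. Again a sketch with hypothetical host, user, and paths, not something the thread itself confirms:

```shell
#!/bin/sh
# One-command variant: stream the csv from the Mac straight into HDFS,
# with no staging copy on the Linux box. "anil", "main-hadoop", and the
# paths are hypothetical placeholders; "-put -" reads from stdin.
cat "$HOME/data/report.csv" |
  ssh anil@main-hadoop "hadoop fs -put - /user/anil/csv/report.csv"
```

Note the HDFS destination must be a full file name here, since there is no local file name for HDFS to reuse.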
