hadoop-hdfs-user mailing list archives

From johny casanova <pcgamer2...@outlook.com>
Subject RE: Copying files to hadoop.
Date Wed, 17 Dec 2014 23:03:37 GMT

What you can do is copy the files to the Linux box first, then use hadoop fs -put. You can do
the copy like this: scp /directory/i/want (or file.name) username@hostname:/directorytoputfiles/

For example: scp dude.txt dude@main-hadoop:/opt/
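A minimal sketch of the full two-step flow, assuming the file lands in /opt/ on the Linux box
and /user/dude/ is the HDFS target directory (the HDFS path is an assumption, not from the thread):

  # step 1, run on the Mac: copy the file to the Linux box over SSH
  scp dude.txt dude@main-hadoop:/opt/

  # step 2, run on the Linux box (after ssh dude@main-hadoop):
  # load the copied file into HDFS
  hadoop fs -put /opt/dude.txt /user/dude/

  # sanity check that the file is now in HDFS
  hadoop fs -ls /user/dude/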
Date: Thu, 18 Dec 2014 09:58:43 +1100
Subject: Re: Copying files to hadoop.
From: anil.jagtap@gmail.com
To: user@hadoop.apache.org

Yes, I can do that, but I have connected from my Mac OS terminal to Linux using SSH. Now when
I run the ls command it shows me the list of files & folders from Linux, not from Mac OS. I
have files which I need to put onto Hadoop directly from Mac OS. So, something like below.

From Mac OS Terminal:
[root@sandbox ~]# hadoop fs -put <MAC OS FOLDER PATH/FILE> <HADOOP PATH>

Hope my requirement is clear.

Rgds, Anil
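One way the single-command version could look: hadoop fs -put accepts - as the source, meaning
it reads from stdin, so a file can be streamed from the Mac over SSH without ever being written
to the Linux filesystem. A minimal sketch, assuming the sandbox is reachable as root@sandbox and
the target HDFS directory is /user/root/ (hostname and paths are assumptions, and mydata.csv is
a hypothetical file):

  # run on the Mac: stream a local CSV straight into HDFS on the sandbox
  cat ~/Documents/mydata.csv | ssh root@sandbox "hadoop fs -put - /user/root/mydata.csv"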



On Thu, Dec 18, 2014 at 9:39 AM, johny casanova <pcgamer2426@outlook.com> wrote:


Hi Anil,

you can use hadoop fs -put with a file or directory and that should add it to your HDFS
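A quick sketch of that usage, run on the Linux box; the local and HDFS paths here are hypothetical:

  # put a single file into an HDFS directory
  hadoop fs -put /opt/data.csv /user/root/

  # put a whole local directory (copied recursively) into HDFS
  hadoop fs -put /opt/data /user/root/data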

Date: Thu, 18 Dec 2014 09:29:34 +1100
Subject: Copying files to hadoop.
From: anil.jagtap@gmail.com
To: user@hadoop.apache.org

Dear All,

I'm pretty new to Hadoop technology and the Linux environment, hence struggling to find
solutions even for the basic stuff.

For now, the Hortonworks Sandbox is working fine for me and I managed to connect to it through SSH.

Now I have some CSV files in my Mac OS folders which I want to copy onto Hadoop. As far as I
know, I can copy those files first to Linux and then put them onto Hadoop. But is there a way
in which just one command will copy them to Hadoop directly from a Mac OS folder?

Appreciate your advice.

Thank you guys...

Rgds, Anil