hadoop-hdfs-user mailing list archives

From Rich Haase <rha...@pandora.com>
Subject Re: Copying files to hadoop.
Date Wed, 17 Dec 2014 23:38:47 GMT
Anil,

Happy to help!

Cheers,
Rich

Rich Haase | Sr. Software Engineer | Pandora
m 303.887.1146 | rhaase@pandora.com

From: Anil Jagtap <anil.jagtap@gmail.com>
Reply-To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Date: Wednesday, December 17, 2014 at 4:35 PM
To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Subject: Re: Copying files to hadoop.

Hi Rich,

Yes, in fact I was thinking the same, but somehow it slipped my mind. I guess the second
option would be really great, since I won't even need to build complex and lengthy commands.
The shared folder will in any case appear as local in the VM.

Thanks a lot Rich.

Rgds, Anil


On Thu, Dec 18, 2014 at 10:03 AM, Rich Haase <rhaase@pandora.com> wrote:
Anil,

You have two main options:

  1.  Install the Hadoop software on OS X and add the configuration files appropriate for your
sandbox, then use hdfs dfs -put <local> <remote> (see the sketch after this list).
  2.  Set up your sandbox VM to share a directory between OS X and Linux.  All virtual machines
that I know of support sharing a file system between the VM and host.  This is probably the
easiest solution, since it will let you see the files you have on OS X in your Linux VM
and then use the hdfs/hadoop/yarn commands on Linux (which you already have configured).
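
For example, a minimal sketch of option 1, assuming Hadoop is installed on the Mac and
HADOOP_CONF_DIR points at a copy of the sandbox configuration files (the paths here are
hypothetical):

  export HADOOP_CONF_DIR=$HOME/sandbox-conf   # core-site.xml etc. copied from the VM
  hdfs dfs -put ~/data/sample.csv /user/anil/sample.csv

With option 2, the shared directory simply appears as a local path inside the VM (VirtualBox,
for instance, mounts shared folders under /media/sf_<name> on Linux guests), so the same put
command can be run from inside the VM against that path.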

Cheers,

Rich

Rich Haase | Sr. Software Engineer | Pandora
m 303.887.1146 | rhaase@pandora.com

From: Anil Jagtap <anil.jagtap@gmail.com>
Reply-To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Date: Wednesday, December 17, 2014 at 3:58 PM
To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Subject: Re: Copying files to hadoop.

Yes, I can do that, but I have connected from my Mac OS terminal to Linux using SSH.
Now when I run the ls command it shows me the list of files and folders from Linux, not from
Mac OS.
I have files which I need to put onto Hadoop directly from Mac OS.
So something like below.

From Mac OS Terminal:

[root@sandbox ~]# hadoop fs -put <MAC OS FOLDER PATH/FILE> <HADOOP PATH>
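
One way that can actually work in a single command is to stream the file over SSH, since
-put accepts "-" to read from standard input (a sketch; the host name and paths are
hypothetical):

  cat ~/data/sample.csv | ssh root@sandbox "hadoop fs -put - /user/anil/sample.csv"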

Hope my requirement is clear.

Rgds, Anil




On Thu, Dec 18, 2014 at 9:39 AM, johny casanova <pcgamer2426@outlook.com> wrote:
Hi Anil,

You can use hadoop fs -put <file or directory> and that should add it to your HDFS.
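
For example, run from the sandbox shell (a quick sketch; the paths are hypothetical):

  hadoop fs -put /tmp/sample.csv /user/anil/
  hadoop fs -ls /user/anil/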

________________________________
Date: Thu, 18 Dec 2014 09:29:34 +1100
Subject: Copying files to hadoop.
From: anil.jagtap@gmail.com
To: user@hadoop.apache.org

Dear All,

I'm pretty new to Hadoop technology and the Linux environment, hence I'm struggling to find
solutions even for the basic stuff.

For now, the Hortonworks Sandbox is working fine for me and I managed to connect to it through SSH.

Now I have some CSV files in my Mac OS folders which I want to copy onto Hadoop. As far as I
know, I can copy those files first to Linux and then put them into Hadoop. But is there a way
in which a single command will copy them to Hadoop directly from a Mac OS folder?
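
The two-step approach I have in mind would look something like this (a sketch; the host name
and paths are hypothetical):

  scp ~/data/sample.csv root@sandbox:/tmp/
  ssh root@sandbox "hadoop fs -put /tmp/sample.csv /user/anil/"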

Appreciate your advice.

Thank you guys...

Rgds, Anil

