hadoop-mapreduce-user mailing list archives

From Abhishek Singh <23singhabhis...@gmail.com>
Subject Re: Putting multiple files..
Date Sun, 28 Dec 2014 21:26:11 GMT
Hello Anil,

There are two ways I'm aware of:

1) Use the put command

put

Usage: hadoop fs -put <localsrc> ... <dst>

Copy single src, or multiple srcs from local file system to the destination
filesystem. Also reads input from stdin and writes to destination
filesystem.

    hadoop fs -put localfile /user/hadoop/hadoopfile
    hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir
    hadoop fs -put localfile hdfs://nn.example.com/hadoop/hadoopfile
    hadoop fs -put - hdfs://nn.example.com/hadoop/hadoopfile
    Reads the input from stdin.

Exit Code:

Returns 0 on success and -1 on error.
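For your wildcard case specifically: since put accepts multiple local sources,
your local shell's glob expansion already does most of the work. A minimal
sketch, with placeholder paths:

    # the shell expands *.csv into a list of files before hadoop runs,
    # so put just receives multiple localsrc arguments (paths are placeholders)
    hadoop fs -put /local/data/*.csv /user/hadoop/input

    # the same works with a remote destination URI
    hadoop fs -put /var/log/myapp/app-*.log hdfs://nn.example.com/logs/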

2) Create a shell script for your custom need.

To give you a rough idea, here's a link to a Stack Overflow question similar to
what you are asking for:

http://stackoverflow.com/questions/12790166/shell-script-to-move-files-into-a-hadoop-cluster
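As a rough sketch of that approach (the directory names and the .csv pattern
below are just placeholders), a script could loop over the matching local files
and skip any that are already in HDFS:

    #!/bin/bash
    # copy every matching local file into HDFS, skipping files
    # that already exist there (all paths here are placeholders)
    SRC_DIR=/local/staging
    DEST_DIR=/user/hadoop/incoming

    for f in "$SRC_DIR"/*.csv; do
        name=$(basename "$f")
        if hadoop fs -test -e "$DEST_DIR/$name"; then
            echo "skipping $name (already in HDFS)"
        else
            hadoop fs -put "$f" "$DEST_DIR/"
        fi
    done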

Please reach out for further discussion!

Thanks!

Regards,

Abhishek Singh
On Dec 28, 2014 3:52 AM, "Anil Jagtap" <anil.jagtap@gmail.com> wrote:

> Dear All,
>
> Just wanted to know if there is a way to copy multiple files using hadoop
> fs -put.
>
> Instead of specifying individual names, I'd provide wildcards and the matching
> files should get copied.
>
> Thank You.
>
> Rgds, Anil
>
