hadoop-common-dev mailing list archives

From "Milind Bhandarkar (JIRA)" <j...@apache.org>
Subject [jira] Updated: (HADOOP-209) Add a program to recursively copy directories across file systems
Date Thu, 11 May 2006 20:33:05 GMT
     [ http://issues.apache.org/jira/browse/HADOOP-209?page=all ]

Milind Bhandarkar updated HADOOP-209:

    Attachment: CopyDir.patch

Here is a patch for recursively copying directories across multiple file systems. It consists
of a new cp command in bin/hadoop, a MapReduce program that copies the files, and a unit test
that covers all four combinations of local and DFS filesystems as source and destination.
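
(The attached CopyDir.patch is not reproduced in this message. Purely as an illustration of
the directory walk such a copy tool needs, and not the patch itself, a minimal sketch is shown
below; CopyLister and listFiles are made-up names, and the calls target the later FileSystem
API, i.e. FileSystem.get(URI, Configuration) and listStatus, rather than the 0.3-era
interfaces.)

    import java.io.IOException;
    import java.net.URI;
    import java.util.ArrayList;
    import java.util.List;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    // Hypothetical helper (not part of CopyDir.patch): walk the source tree and
    // collect every file so the copy job can hand one file to each map task.
    public class CopyLister {

      public static List<Path> listFiles(URI srcUri, Configuration conf)
          throws IOException {
        // FileSystem.get resolves the URI scheme (e.g. a local file:// path or
        // the distributed filesystem) to a concrete FileSystem implementation.
        FileSystem srcFs = FileSystem.get(srcUri, conf);
        List<Path> files = new ArrayList<Path>();
        collect(srcFs, new Path(srcUri.getPath()), files);
        return files;
      }

      private static void collect(FileSystem fs, Path dir, List<Path> files)
          throws IOException {
        for (FileStatus stat : fs.listStatus(dir)) {
          if (stat.isDir()) {
            collect(fs, stat.getPath(), files);  // recurse into subdirectories
          } else {
            files.add(stat.getPath());           // a leaf file: one copy unit
          }
        }
      }
    }

Each file collected this way becomes one unit of work for the copy job's map tasks.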

> Add a program to recursively copy directories across file systems
> -----------------------------------------------------------------
>          Key: HADOOP-209
>          URL: http://issues.apache.org/jira/browse/HADOOP-209
>      Project: Hadoop
>         Type: New Feature
>   Components: fs
>     Versions: 0.3
>  Environment: All
>     Reporter: Milind Bhandarkar
>  Attachments: CopyDir.patch
> A useful feature would be a simple command to copy directories recursively across filesystems.
> The source and destination paths should be specified using filesystem-neutral URIs, such as:
> hadoop cp dfs://namenode1:port1/path/to/srcdir file:///path/to/local/destination/dir
> The "cp" command would invoke a map-reduce program to copy files recursively.
> I will attach a patch as soon as svn is up and running.
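
(The actual implementation is in the attached CopyDir.patch. As a rough, hypothetical sketch
of the per-file copy a map task could perform, written against the later
org.apache.hadoop.mapred API with generics rather than the 0.3-era interfaces, and assuming a
made-up tab-separated "srcPath<TAB>dstPath" input record format, the idea looks roughly like
this.)

    import java.io.IOException;

    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IOUtils;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.JobConf;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    // Hypothetical map task (not the attached patch): each input record is
    // "srcPath<TAB>dstPath"; the task opens the source file on its filesystem
    // and streams the bytes to the destination filesystem.
    public class CopyFileMapper extends MapReduceBase
        implements Mapper<LongWritable, Text, Text, Text> {

      private JobConf conf;

      public void configure(JobConf job) {
        this.conf = job;
      }

      public void map(LongWritable key, Text value,
                      OutputCollector<Text, Text> out, Reporter reporter)
          throws IOException {
        String[] pair = value.toString().split("\t");
        Path src = new Path(pair[0]);
        Path dst = new Path(pair[1]);

        // Each Path resolves its own FileSystem from the URI, so source and
        // destination may live on different filesystems.
        FileSystem srcFs = src.getFileSystem(conf);
        FileSystem dstFs = dst.getFileSystem(conf);

        // copyBytes closes both streams when the last argument is true.
        IOUtils.copyBytes(srcFs.open(src), dstFs.create(dst, true), conf, true);
        reporter.setStatus("copied " + src);
      }
    }

Because each Path resolves its own FileSystem from the URI scheme, the same map task handles
dfs-to-local, local-to-dfs, and dfs-to-dfs copies alike, which is what makes the map-reduce
formulation filesystem-neutral.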

This message is automatically generated by JIRA.
