hadoop-common-user mailing list archives

From: Espen Amble Kolstad <es...@trank.no>
Subject: Re: Problem copying to local from DFS in Hadoop 0.12.1
Date: Tue, 13 Mar 2007 17:05:16 GMT
Hi,

Here's a patch that I use to fix this:
Index: src/java/org/apache/hadoop/fs/ChecksumFileSystem.java
===================================================================
--- src/java/org/apache/hadoop/fs/ChecksumFileSystem.java       (revision 517190)
+++ src/java/org/apache/hadoop/fs/ChecksumFileSystem.java       (working copy)
@@ -599,7 +599,7 @@
     } else {
       Path[] srcs = listPaths(src);
       for (Path srcFile : srcs) {
-        copyToLocalFile(srcFile, dst, copyCrc);
+        copyToLocalFile(srcFile, new Path(dst, srcFile.getName()), copyCrc);
       }
     }
   }
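
The unpatched loop passed the same dst for every file in the source
directory, so the second file collides with the first ("Target rohan/data
already exists"). The one-line change gives each file its own target under
dst instead. A minimal sketch of just the path logic (the class name
CopyDestSketch and the example directory contents are hypothetical, not
part of the patch):

import org.apache.hadoop.fs.Path;

public class CopyDestSketch {
    public static void main(String[] args) {
        Path dst = new Path("rohan");
        // Hypothetical children of the source directory "crawldb"
        String[] children = { "data", "index" };

        for (String name : children) {
            Path srcFile = new Path("crawldb", name);
            // Before the patch: every file targets dst itself ("rohan"),
            // so the second copy fails with "already exists".
            Path before = dst;
            // After the patch: each file targets dst/<name>, e.g. "rohan/data".
            Path after = new Path(dst, srcFile.getName());
            System.out.println(srcFile + " -> " + after + " (was: " + before + ")");
        }
    }
}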

- Espen

Rohan Mehta wrote:
> All,
> 
> I am having problems with -copyToLocal in Hadoop version 0.12.1.
> I have injected some data into the distributed file system of Hadoop 0.12.1.
> When I try to do -copyToLocal on a file it works fine, but when I try to
> do -copyToLocal on a directory I get the following error:
> 
> *[user@bob01 nutch]$ bin/hadoop fs -copyToLocal crawldb rohan/
> copyToLocal: Target rohan/data already exists*
> 
> The same thing happens when I use the -get or -cp option.
> 
> System config:
> - using cluster of 5 computers
> - Nutch trunk version 0.9
> - Hadoop trunk version 0.12.1
> 
> How can I make this work? Any pointers are appreciated.
> 
> Thanks,
> Rohan
> 

