hadoop-common-user mailing list archives

From Johannes Zillmann <...@101tec.com>
Subject Re: hadoop and local files
Date Thu, 24 Jan 2008 16:53:57 GMT
Hi Jerrro,

Take a look at the DistributedCache. It looks like what you are searching for; I think the interesting part is the usage example in its documentation.
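For what it's worth, here is a rough sketch of the usual pattern (not from the original mail; written against the JobConf-based API of the later 0.1x releases, so exact signatures may differ slightly in the release current at the time, and the file name lookup.dat plus the paths are made-up placeholders):

import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class CacheFileJob {

  public static class MapClass extends MapReduceBase
      implements Mapper<LongWritable, Text, Text, Text> {

    private Path[] localFiles;

    public void configure(JobConf job) {
      try {
        // Local paths on the tasktracker's disk where the framework copied
        // the cached files before the task started.
        localFiles = DistributedCache.getLocalCacheFiles(job);
      } catch (IOException e) {
        throw new RuntimeException("could not get cache files", e);
      }
    }

    public void map(LongWritable key, Text value,
                    OutputCollector<Text, Text> output, Reporter reporter)
        throws IOException {
      // localFiles[0] is a plain local path to lookup.dat; read it with
      // ordinary java.io if you like. (Mapper body omitted.)
    }
  }

  public static void main(String[] args) throws Exception {
    JobConf conf = new JobConf(CacheFileJob.class);

    // The file has to be in HDFS first (e.g. via 'hadoop dfs -put');
    // the framework then copies it onto the local disk of each node
    // that runs a task of this job.
    DistributedCache.addCacheFile(new URI("/user/jerr/lookup.dat"), conf);

    conf.setMapperClass(MapClass.class);
    // ... set input/output paths and formats as usual ...
    JobClient.runJob(conf);
  }
}

In other words: the file goes into HDFS once, the framework copies it onto each node's local disk, and getLocalCacheFiles() hands you the local path, which also answers the "where does it end up" part of the question.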

jerrro wrote:
> Hello,
> When launching a map-reduce job, I am interested in copying a certain file
> to the datanodes, not into HDFS but onto the local file system, so I can
> access that file from my job on the datanode. (The file is around 500KB, so
> I don't think there will be much overhead.) Is there a way to tell hadoop to
> do that (I heard it is possible, but I am not sure how)? Also, how do I know
> where the file is copied to? (I understood it can be copied to /tmp or
> something of that sort on the datanode.)
> Thanks.
> Jerr.

101tec GmbH

Halle (Saale), Saxony-Anhalt, Germany
