hadoop-hdfs-user mailing list archives

From "Botelho, Andrew" <Andrew.Bote...@emc.com>
Subject RE: Distributed Cache
Date Wed, 10 Jul 2013 13:31:26 GMT
Ok, using job.addCacheFile() seems to compile correctly.
However, how do I then access the cached file in my Mapper code?  Is there a method that will
look up the files in the cache?
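For reference, one pattern that appears to work with the Hadoop 2.x mapreduce API is to read the cache URIs back in the Mapper's setup() method via context.getCacheFiles(). This is a sketch, not verified against 2.0.5 specifically; the file name lookup.txt and the word-filtering logic are placeholder assumptions, not anything from this thread:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.net.URI;
import java.util.HashSet;
import java.util.Set;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class CacheAwareMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private final Set<String> keywords = new HashSet<String>();

    @Override
    protected void setup(Context context) throws IOException {
        // getCacheFiles() returns the URIs registered with
        // Job#addCacheFile() in the driver.
        URI[] cacheFiles = context.getCacheFiles();
        if (cacheFiles == null) {
            return;
        }
        // Cached files are localized into the task's working
        // directory, so they can be opened by their base name.
        // "lookup.txt" is a hypothetical example file.
        BufferedReader reader =
                new BufferedReader(new FileReader("lookup.txt"));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                keywords.add(line.trim());
            }
        } finally {
            reader.close();
        }
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Example use of the cached data: emit only words that
        // appear in the lookup file.
        for (String word : value.toString().split("\\s+")) {
            if (keywords.contains(word)) {
                context.write(new Text(word), new IntWritable(1));
            }
        }
    }
}
```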



From: Ted Yu [mailto:yuzhihong@gmail.com]
Sent: Tuesday, July 09, 2013 6:08 PM
To: user@hadoop.apache.org
Subject: Re: Distributed Cache

You should use Job#addCacheFile()
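A minimal driver-side sketch of that call, assuming Hadoop 2.x and an HDFS path of your choosing (/cache/lookup.txt below is a hypothetical placeholder):

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;

public class CacheDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "cache example");
        job.setJarByClass(CacheDriver.class);

        // The non-deprecated replacement for
        // DistributedCache.addCacheFile(uri, conf): register the
        // file on the Job itself before submission.
        job.addCacheFile(new URI("/cache/lookup.txt"));

        // ... set mapper, reducer, input/output paths as usual ...
    }
}
```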

On Tue, Jul 9, 2013 at 3:02 PM, Botelho, Andrew <Andrew.Botelho@emc.com> wrote:
I was wondering if I can still use the DistributedCache class in the latest release of Hadoop
(Version 2.0.5).
In my driver class, I use this code to try and add a file to the distributed cache:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.*;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.*;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

Configuration conf = new Configuration();
DistributedCache.addCacheFile(new URI("file path in HDFS"), conf);
Job job = Job.getInstance(conf);

However, I keep getting warnings that the method addCacheFile() is deprecated.
Is there a more current way to add files to the distributed cache?

Thanks in advance,

