hadoop-common-user mailing list archives

From: Serge Blazhievsky <hadoop...@gmail.com>
Subject: Re: Hadoop 2.2.0 Distributed Cache
Date: Thu, 27 Mar 2014 18:17:02 GMT
How are you putting the files into the distributed cache?
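
If you are adding the file programmatically, the usual 2.x pattern is roughly the snippet below, inside the driver after Job.getInstance(...); the HDFS path and the #symlink name are just placeholders:

// the file has to already be on HDFS (or another filesystem the cluster can reach);
// the #props.txt fragment sets the symlink name in the task working directory
job.addCacheFile(new URI("hdfs:///user/me/props.txt#props.txt"));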

Sent from my iPhone

> On Mar 27, 2014, at 9:20 AM, Jonathan Poon <jkpoon@ucdavis.edu> wrote:
> 
> 
> Hi Stanley,
> 
> Sorry about the confusion, but I'm trying to read a txt file into my Mapper function.
> I am trying to copy the file using the -files option when submitting the Hadoop job.
> 
> I try to obtain the filename using the following lines of code in my Mapper:
> 
> URI[] localPaths = context.getCacheFiles();
> String configFilename = localPaths[0].toString();
> 
> However, when I run the JAR on Hadoop, I get a NullPointerException.
> 
> Error: java.lang.NullPointerException
> 
> I'm running Hadoop 2.2 in Single Node mode.  Not sure if that affects things...
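> 
> In case it helps, the snippet above sits in my Mapper roughly like this (a trimmed sketch; the class name and the properties-file handling are simplified placeholders):
> 
> import java.io.FileReader;
> import java.io.IOException;
> import java.net.URI;
> import java.util.Properties;
> 
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.io.LongWritable;
> import org.apache.hadoop.io.NullWritable;
> import org.apache.hadoop.io.Text;
> import org.apache.hadoop.mapreduce.Mapper;
> 
> public class MyMapper extends Mapper<LongWritable, Text, Text, NullWritable> {
> 
>   private final Properties props = new Properties();
> 
>   @Override
>   protected void setup(Context context) throws IOException, InterruptedException {
>     // the NullPointerException seems to come from here: localPaths is apparently null
>     URI[] localPaths = context.getCacheFiles();
>     String configFilename = localPaths[0].toString();
> 
>     // as I understand it, -files also symlinks the file into the task's
>     // working directory under its base name, so I read it from there
>     String baseName = new Path(configFilename).getName();
>     try (FileReader in = new FileReader(baseName)) {
>       props.load(in);
>     }
>   }
> 
>   @Override
>   protected void map(LongWritable key, Text value, Context context)
>       throws IOException, InterruptedException {
>     // props.getProperty(...) is used here in the real code
>     context.write(value, NullWritable.get());
>   }
> }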
> 
> 
> 
> 
>> On Wed, Mar 26, 2014 at 8:21 PM, Stanley Shi <sshi@gopivotal.com> wrote:
>> Where did you get the error? From the compiler or at runtime?
>> 
>> Regards,
>> Stanley Shi,
>> 
>> 
>> 
>>> On Thu, Mar 27, 2014 at 7:34 AM, Jonathan Poon <jkpoon@ucdavis.edu> wrote:
>>> Hi Everyone,
>>> 
>>> I'm submitting a MapReduce job using the -files option to copy a text file that contains properties I use for the map and reduce functions.
>>> 
>>> I'm trying to obtain the local cache files in my mapper function using:
>>> 
>>> Path[] paths = context.getLocalCacheFiles();
>>> 
>>> However, I get an error saying getLocalCacheFiles() is undefined.  I've imported the hadoop-mapreduce-client-core-2.2.0.jar as part of my build environment in Eclipse.
>>> 
>>> Any ideas on what could be incorrect?  
>>> 
>>> If I'm incorrectly using the distributed cache, could someone point me to an example using the distributed cache with Hadoop 2.2.0?
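>>> 
>>> For reference, the pattern I'm trying to follow is roughly the driver below, which goes through ToolRunner so that GenericOptionsParser picks up -files (class names and paths are placeholders):
>>> 
>>> import org.apache.hadoop.conf.Configuration;
>>> import org.apache.hadoop.conf.Configured;
>>> import org.apache.hadoop.fs.Path;
>>> import org.apache.hadoop.io.NullWritable;
>>> import org.apache.hadoop.io.Text;
>>> import org.apache.hadoop.mapreduce.Job;
>>> import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
>>> import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
>>> import org.apache.hadoop.util.Tool;
>>> import org.apache.hadoop.util.ToolRunner;
>>> 
>>> public class MyDriver extends Configured implements Tool {
>>> 
>>>   @Override
>>>   public int run(String[] args) throws Exception {
>>>     // ToolRunner has already run GenericOptionsParser here, so -files is consumed
>>>     Job job = Job.getInstance(getConf(), "distributed cache test");
>>>     job.setJarByClass(MyDriver.class);
>>>     job.setMapperClass(MyMapper.class);   // reducer left out of this sketch
>>>     job.setOutputKeyClass(Text.class);
>>>     job.setOutputValueClass(NullWritable.class);
>>>     job.setNumReduceTasks(0);
>>>     FileInputFormat.addInputPath(job, new Path(args[0]));
>>>     FileOutputFormat.setOutputPath(job, new Path(args[1]));
>>>     return job.waitForCompletion(true) ? 0 : 1;
>>>   }
>>> 
>>>   public static void main(String[] args) throws Exception {
>>>     System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
>>>   }
>>> }
>>> 
>>> and I submit it with something like:
>>> 
>>> hadoop jar myjob.jar MyDriver -files props.txt /input /output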
>>> 
>>> Thanks for your help!
>>> 
>>> Jonathan 
>> 
> 
