hadoop-common-user mailing list archives

From brien colwell <xcolw...@gmail.com>
Subject Re: Not able to compile '.java' files
Date Sat, 06 Feb 2010 05:26:00 GMT
To get a feel for Hadoop, I'd recommend using Eclipse and starting with a
single node. If you add all the Hadoop JARs to your Eclipse build path (I
think there are five), Eclipse will manage the classpath for you.
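
If you'd rather compile from the command line instead of Eclipse, the same
idea applies: the Hadoop core jar (and the jars under lib/) must be on the
javac classpath. Note that `-cp .:/hadoop/lib` points at a directory, which
picks up .class files but not jars, so the org.apache.hadoop packages are
never found. A sketch, with a placeholder install path and jar name
(substitute the core jar that ships with your release):

```shell
# /hadoop and hadoop-core.jar are placeholders for your actual install
# directory and the versioned core jar inside it.
javac -cp .:/hadoop/hadoop-core.jar \
    MaxTemperature.java MaxTemperatureMapper.java MaxTemperatureReducer.java
```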

The following config settings will set up Hadoop to use the local file
system and run your MapReduce job in a single JVM. That way you can also
set breakpoints and step through the job.

Configuration baseConf = new Configuration();
baseConf.set("mapred.job.tracker", "local");
baseConf.set("fs.default.name", "file:///");
// Note: the property names on the next three lines were truncated in the
// archive; these are the standard Hadoop property names matching the paths.
baseConf.set("mapred.system.dir", String.format("%s/hadoop/mapred/system", LOCAL_TEMP_DIR));
baseConf.set("mapred.local.dir", String.format("%s/hadoop/mapred/data", LOCAL_TEMP_DIR));
baseConf.set("dfs.name.dir", String.format("%s/hadoop/namespace", LOCAL_TEMP_DIR));
baseConf.set("dfs.data.dir", String.format("%s/hadoop/data", LOCAL_TEMP_DIR));

Then use this configuration when setting up a JobConf.
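
A minimal driver sketch showing how that might look with the old-style
org.apache.hadoop.mapred API (the MaxTemperature* classes are from the
question; the "input"/"output" paths, the job name, and the Text/IntWritable
key-value types are assumptions for illustration):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class MaxTemperatureLocalDriver {
    public static void main(String[] args) throws Exception {
        // Local-mode settings as described above.
        Configuration baseConf = new Configuration();
        baseConf.set("mapred.job.tracker", "local");
        baseConf.set("fs.default.name", "file:///");

        // Build the JobConf on top of the local-mode configuration.
        JobConf conf = new JobConf(baseConf, MaxTemperature.class);
        conf.setJobName("max-temperature-local");
        conf.setMapperClass(MaxTemperatureMapper.class);
        conf.setReducerClass(MaxTemperatureReducer.class);
        conf.setOutputKeyClass(Text.class);
        conf.setOutputValueClass(IntWritable.class);

        // Placeholder local paths; output must not already exist.
        FileInputFormat.setInputPaths(conf, new Path("input"));
        FileOutputFormat.setOutputPath(conf, new Path("output"));

        // runJob blocks and runs map and reduce in this JVM,
        // so breakpoints in your mapper and reducer will hit.
        JobClient.runJob(conf);
    }
}
```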

hope that helps,


On 2/5/2010 5:58 PM, Prateek Jindal wrote:
>   Hi everyone,
> I am new to MapReduce and am trying to run a very basic MapReduce
> application. I ran into the following problem. Can someone help me with
> it?
> 1) I have 3 files, namely MaxTemperature.java, MaxTemperatureMapper.java,
> MaxTemperatureReducer.java. Now, I have to compile them to get the '.class'
> files which would be used by 'hadoop' command. I tried the following:
> 'javac -cp .:/hadoop/lib MaxTemperatureMapper.java'
> But it gives me the error that it doesn't recognize the packages '
> org.apache.hadoop.io', 'org.apache.hadoop.mapred' and so on.
> Can someone suggest something about that?
> 2) Also, do we necessarily have to produce the '.class' files ourselves,
> or can hadoop somehow compile the .java source files itself?
> Thanks,
> Prateek.
