hadoop-common-user mailing list archives

From Keith Stevens <fozzietheb...@gmail.com>
Subject Setting up MapReduce 2 on a test cluster
Date Mon, 12 Mar 2012 01:18:53 GMT
Hi All,

I've been trying to set up Cloudera's CDH4 Beta 1 release of MapReduce 2.0 on
a small cluster for testing, but I'm not having much luck getting things
running.  I've been following the guide at
http://hadoop.apache.org/common/docs/r0.23.1/hadoop-yarn/hadoop-yarn-site/ClusterSetup.html
to configure everything.  HDFS seems to be working properly in that I can
access the file system, load files, and read them.
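
For reference, the HDFS checks I ran were just basic shell operations along
these lines (the file name here is only an example):

# List the root of the file system, upload a local file, and read it back.
hadoop fs -ls /
hadoop fs -put some-local-file.txt /tmp/some-local-file.txt
hadoop fs -cat /tmp/some-local-file.txt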

However, running jobs doesn't seem to work correctly.  I'm trying to run
just a sample job with:

hadoop jar \
  /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-0.23.0-cdh4b1.jar \
  randomwriter \
  -Dmapreduce.job.user.name=$USER \
  -Dmapreduce.clientfactory.class.name=org.apache.hadoop.mapred.YarnClientFactory \
  -Dmapreduce.randomwriter.bytespermap=10000 \
  -Ddfs.blocksize=536870912 \
  -Ddfs.block.size=536870912 \
  -libjars /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-0.23.0-cdh4b1.jar \
  output
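
If it helps with diagnosis, I can also dump the classpath that the hadoop
launcher script resolves on that node and check for the HDFS entries, with
something along these lines:

# Print the resolved classpath one entry per line, keeping only hdfs entries.
hadoop classpath | tr ':' '\n' | grep -i hdfs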

When running I get a ClassNotFoundException for
org.apache.hadoop.hdfs.DistributedFileSystem on the local node running the
task.  I have fs.hdfs.impl set to org.apache.hadoop.hdfs.DistributedFileSystem,
which I believe is correct, but I'm not sure why the node isn't finding the
class.
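
In case it matters, that property is set in core-site.xml (assuming that is
the right file for it), and I can confirm it is picked up on the node with
something like:

# Show the fs.hdfs.impl property as it appears in the conf directory the
# daemons are started with (path copied from the classpath below).
grep -A 1 'fs.hdfs.impl' /usr/local/hadoop/conf/core-site.xml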

In my setup, everything is located under /usr/local/hadoop on all the nodes,
and all the relevant environment variables point to that directory.  So when
the daemons on the local nodes start up they include this:

    -classpath
/usr/local/hadoop/conf:/usr/local/hadoop/conf:/usr/local/hadoop/conf:/usr/local/hadoop-0.23.0-cdh4b1/sbin/..:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../lib/*:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../*:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../share/hadoop/common/lib/*:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../share/hadoop/common/*:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../share/hadoop/hdfs:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../share/hadoop/hdfs/lib/*:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../share/hadoop/hdfs/*:/usr/local/hadoop/share/hadoop/mapreduce/lib/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/usr/local/hadoop/share/hadoop/mapreduce/lib/*:/usr/local/hadoop/conf/nm-config/log4j.properties

which looks correct to me.  So I'm not exactly sure where the problem is
coming from.
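
In case the wildcards are the issue, I can also verify on the node that the
HDFS jars are actually present in the directories the classpath globs expand
to (directories copied from the classpath above), e.g.:

# Check that the hadoop-hdfs jar and its lib dependencies exist where the
# classpath expects them.
ls /usr/local/hadoop-0.23.0-cdh4b1/share/hadoop/hdfs/hadoop-hdfs-*.jar
ls /usr/local/hadoop-0.23.0-cdh4b1/share/hadoop/hdfs/lib/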

Any suggestions on what might be wrong or how to further diagnose the
problem would be greatly appreciated.

Thanks!
--Keith
