hadoop-common-user mailing list archives

From: Keith Stevens <fozziethebeat@gmail.com>
Subject: Re: Setting up MapReduce 2 on a test cluster
Date: Mon, 12 Mar 2012 04:34:53 GMT
Just to double-check that I'm reading the logs correctly,

I need the hadoop-hdfs-0.23.0-cdh4b1.jar specifically, yes?  The logs on
my local nodes report that this jar is included in the classpath for the
ResourceManager and the NodeManagers.  Is there some other task that might
be missing the jar?
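
As a sanity check, one way to confirm the jar is actually visible on a
node is to expand the classpath and grep for it, something like this
(the paths assume my tarball layout under /usr/local/hadoop):

    hadoop classpath | tr ':' '\n' | grep hdfs
    ls /usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-*.jar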

On Sun, Mar 11, 2012 at 9:26 PM, Keith Stevens <fozziethebeat@gmail.com> wrote:

> Hi Harsh,
>
> Thanks for getting back to me on this on a Sunday.
>
> Your guess is the same as mine, but I'm not sure where this is happening
> or how.
>
> I installed this manually using the tarballs because the cluster I'm
> working on is mostly cut off from the internet.  I also can't seem to
> install createrepo to create a local yum repository.
>
> Is there a way to install the CDH4 packages using just yum install if I
> downloaded them all?  I tried to do this, but yum says the yarn RPM
> depends on the hadoop RPM and vice versa.
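>
> From what I understand, mutually dependent RPMs can usually be installed
> in a single transaction so the resolver can handle the cycle; the file
> names below are just placeholders for the actual package files:
>
>     rpm -Uvh hadoop-*.rpm hadoop-yarn-*.rpm
>
> or, with yum:
>
>     yum localinstall hadoop-*.rpm hadoop-yarn-*.rpm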
>
> Thanks,
> --Keith
>
>
> On Sun, Mar 11, 2012 at 8:53 PM, Harsh J <harsh@cloudera.com> wrote:
>
>> Hey Keith,
>>
>> You're most likely missing the HDFS jar somehow. I use the package
>> installation and am able to run the following successfully:
>>
>> hadoop jar /usr/lib/hadoop/hadoop-mapreduce-examples.jar randomwriter \
>>     -Dmapreduce.job.user.name=$USER \
>>     -Dmapreduce.clientfactory.class.name=org.apache.hadoop.mapred.YarnClientFactory \
>>     -Dmapreduce.randomwriter.bytespermap=10000 \
>>     -Ddfs.blocksize=536870912 -Ddfs.block.size=536870912 \
>>     -libjars /usr/lib/hadoop/hadoop-mapreduce-client-jobclient.jar output
>>
>> My `hadoop classpath` looks like:
>>
>> /etc/hadoop/conf:/usr/lib/hadoop:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/*:/usr/lib/hadoop/share/hadoop/common/*:/usr/lib/hadoop/share/hadoop/hdfs/*:/usr/lib/hadoop/share/hadoop/mapreduce/*
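>>
>> If it helps to compare, dumping the expanded classpath on each machine,
>> e.g.
>>
>>     hadoop classpath | tr ':' '\n' | sort > /tmp/classpath.txt
>>
>> makes it easy to diff a broken node against the layout above (the /tmp
>> path is just an example).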
>>
>> How have you installed this?
>>
>> On Mon, Mar 12, 2012 at 6:48 AM, Keith Stevens <fozziethebeat@gmail.com>
>> wrote:
>> > Hi All,
>> >
>> > I've been trying to set up Cloudera's CDH4 Beta 1 release of MapReduce
>> > 2.0 on a small cluster for testing, but I'm not having much luck getting
>> > things running.  I've been following the guide at
>> > http://hadoop.apache.org/common/docs/r0.23.1/hadoop-yarn/hadoop-yarn-site/ClusterSetup.html
>> > to configure everything.  HDFS seems to be working properly in that I can
>> > access the file system, load files, and read them.
>> >
>> > However, running jobs doesn't seem to work correctly.  I'm trying to run
>> > just a sample job with
>> >
>> > hadoop jar \
>> >     /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-0.23.0-cdh4b1.jar \
>> >     randomwriter -Dmapreduce.job.user.name=$USER \
>> >     -Dmapreduce.clientfactory.class.name=org.apache.hadoop.mapred.YarnClientFactory \
>> >     -Dmapreduce.randomwriter.bytespermap=10000 \
>> >     -Ddfs.blocksize=536870912 -Ddfs.block.size=536870912 \
>> >     -libjars /usr/local/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-0.23.0-cdh4b1.jar \
>> >     output
>> >
>> > When running I get a ClassNotFoundException for
>> > org.apache.hadoop.hdfs.DistributedFileSystem on the local node
>> > running the task.  I have fs.hdfs.impl set to
>> > org.apache.hadoop.hdfs.DistributedFileSystem, which I believe is
>> > correct, but I'm not sure why the node isn't finding the class.
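>> >
>> > For reference, the relevant entry in my core-site.xml looks roughly
>> > like this (other properties omitted):
>> >
>> >     <property>
>> >       <name>fs.hdfs.impl</name>
>> >       <value>org.apache.hadoop.hdfs.DistributedFileSystem</value>
>> >     </property>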
>> >
>> > In my setup, everything is located under /usr/local/hadoop on all the
>> > nodes, and all the relevant environment variables point to that
>> > directory.  So when the local nodes start up they include this:
>> >
>> >    -classpath
>> >
>> /usr/local/hadoop/conf:/usr/local/hadoop/conf:/usr/local/hadoop/conf:/usr/local/hadoop-0.23.0-cdh4b1/sbin/..:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../lib/*:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../*:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../share/hadoop/common/lib/*:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../share/hadoop/common/*:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../share/hadoop/hdfs:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../share/hadoop/hdfs/lib/*:/usr/local/hadoop-0.23.0-cdh4b1/sbin/../share/hadoop/hdfs/*:/usr/local/hadoop/share/hadoop/mapreduce/lib/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/usr/local/hadoop/share/hadoop/mapreduce/lib/*:/usr/local/hadoop/conf/nm-config/log4j.properties
>> >
>> > which looks to be correct.  So I'm not exactly sure where the problem is
>> > coming from.
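>> >
>> > One more thing I can check is whether the class itself is actually
>> > packaged in the jar on each node, e.g.:
>> >
>> >     unzip -l /usr/local/hadoop/share/hadoop/hdfs/hadoop-hdfs-0.23.0-cdh4b1.jar \
>> >       | grep DistributedFileSystem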
>> >
>> > Any suggestions on what might be wrong or how to further diagnose the
>> > problem would be greatly appreciated.
>> >
>> > Thanks!
>> > --Keith
>>
>>
>>
>> --
>> Harsh J
>>
>
>
