hadoop-mapreduce-user mailing list archives

From Harsh J <ha...@cloudera.com>
Subject Re: Running example application with capacity scheduler ?
Date Thu, 15 Sep 2011 11:53:21 GMT
Hello Arun,

To me it looks like your HDFS isn't set up properly in this case. Can
you ensure all the DNs are properly up? Your NN appears to have gotten
stuck in safe mode somehow. Check your http://nn-host:50070 page for
more details on why.
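
If you'd rather check from the shell, these dfsadmin calls (standard in the
0.20-era CLI; adjust the bin/ path for your install) show the same thing:

bin/hadoop dfsadmin -report        # lists live/dead DNs and their capacity
bin/hadoop dfsadmin -safemode get  # prints whether the NN is still in safe mode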

Your JT won't come up until the NN is properly up and out of safe mode
(for which it needs the DNs). Once it comes up, I think you should be
good to go, keeping in mind the changes Thomas mentioned earlier.
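
Once the DNs have reported in, the NN should leave safe mode by itself. If
you need to wait on that in a script, or force it out (only when you are
sure the DNs and their blocks are fine), a rough sketch:

bin/hadoop dfsadmin -safemode wait   # blocks until the NN exits safe mode
bin/hadoop dfsadmin -safemode leave  # forces the NN out of safe mode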

On Thu, Sep 15, 2011 at 3:58 PM, arun k <arunk786@gmail.com> wrote:
> Hi all!
>
> Thanks, Thomas! It's working in the terminal.
> I saw the queues in the JT web UI.
> When I try to run normally again (default queue) I get this error:
> I tried formatting the namenode, turning safe mode off, and restarting,
> but that didn't work.
>
> hduser@arun-Presario-C500-RU914PA-ACJ:/usr/local/hadoop$ bin/hadoop jar hadoop*examples*.jar wordcount /user/hduser/wcinput /user/hduser/wcoutput6
> java.io.IOException: Call to localhost/127.0.0.1:54311 failed on local exception: java.io.IOException: Connection reset by peer
>
> The JobTracker log shows:
> 2011-09-15 12:46:13,346 INFO org.apache.hadoop.mapred.JobTracker: JobTracker up at: 54311
> 2011-09-15 12:46:13,347 INFO org.apache.hadoop.mapred.JobTracker: JobTracker webserver: 50030
> 2011-09-15 12:46:13,634 INFO org.apache.hadoop.mapred.JobTracker: Cleaning up the system directory
> 2011-09-15 12:46:13,646 INFO org.apache.hadoop.mapred.JobTracker: problem cleaning system directory: hdfs://localhost:54310/app/hadoop/tmp/mapred/system
> org.apache.hadoop.ipc.RemoteException: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot delete /app/hadoop/tmp/mapred/system. Name node is in safe mode.
>
> Thanks,
> Arun
>
> On Wed, Sep 14, 2011 at 7:46 PM, Thomas Graves <tgraves@yahoo-inc.com> wrote:
>>
>> I believe a job is submitted to the default queue if you don't specify
>> one, and you don't have a default queue defined in your list of
>> mapred.queue.names. So add -Dmapred.job.queue.name=myqueue1 (or another
>> queue you have defined) to the wordcount command, like this:
>>
>> bin/hadoop jar hadoop*examples*.jar wordcount -Dmapred.job.queue.name=myqueue1 /user/hduser/wcinput /user/hduser/wcoutput5
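>>
>> For what it's worth (assuming the stock 0.20 command line), you can also
>> double-check which queues the JobTracker actually picked up with:
>>
>> bin/hadoop queue -list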
>>
>> Tom
>>
>>
>> On 9/14/11 5:57 AM, "arun k" <arunk786@gmail.com> wrote:
>>
>> > Hi!
>> >
>> > I have set up a single-node cluster using
>> > http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
>> > and could run the wordcount example application.
>> > I was trying to run this application with the capacity scheduler.
>> > As per
>> > http://hadoop.apache.org/common/docs/current/capacity_scheduler.html
>> > I have done the following (a rough sketch of the resulting conf is below):
>> > 1. Copied the hadoop-capacity-scheduler-*.jar from the
>> >    contrib/capacity-scheduler directory to HADOOP_HOME/lib.
>> > 2. Set mapred.jobtracker.taskScheduler.
>> > 3. Set mapred.queue.names to myqueue1,myqueue2.
>> > 4. Set mapred.capacity-scheduler.queue.<queue-name>.capacity to 30 and 70
>> >    for the two queues.
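>> >
>> > Roughly, the resulting settings look like this (property names as in the
>> > capacity scheduler docs; conf/ file names assumed for a standard tarball
>> > layout):
>> >
>> > In conf/mapred-site.xml:
>> >   <property>
>> >     <name>mapred.jobtracker.taskScheduler</name>
>> >     <value>org.apache.hadoop.mapred.CapacityTaskScheduler</value>
>> >   </property>
>> >   <property>
>> >     <name>mapred.queue.names</name>
>> >     <value>myqueue1,myqueue2</value>
>> >   </property>
>> >
>> > In conf/capacity-scheduler.xml:
>> >   <property>
>> >     <name>mapred.capacity-scheduler.queue.myqueue1.capacity</name>
>> >     <value>30</value>
>> >   </property>
>> >   <property>
>> >     <name>mapred.capacity-scheduler.queue.myqueue2.capacity</name>
>> >     <value>70</value>
>> >   </property>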
>> >
>> > When I run it I get this error:
>> > hduser@arun-Presario-C500-RU914PA-ACJ:/usr/local/hadoop$ bin/hadoop jar hadoop*examples*.jar wordcount /user/hduser/wcinput /user/hduser/wcoutput5
>> > 11/09/14 16:00:56 INFO input.FileInputFormat: Total input paths to process : 4
>> > org.apache.hadoop.ipc.RemoteException: java.io.IOException: Queue "default" does not exist
>> >     at org.apache.hadoop.mapred.JobTracker.submitJob(JobTracker.java:2998)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >     at java.lang.reflect.Method.invoke(Method.java:597)
>> >     at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:508)
>> >     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:959)
>> >     at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:955)
>> >     at java.security.AccessController.doPrivileged(Native Method)
>> >     at javax.security.auth.Subject.doAs(Subject.java:396)
>> >     at org.apache.hadoop.ipc.Server$Handler.run(Server.java:953)
>> >
>> >     at org.apache.hadoop.ipc.Client.call(Client.java:740)
>> >     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:220)
>> >     at org.apache.hadoop.mapred.$Proxy0.submitJob(Unknown Source)
>> >     at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:800)
>> >     at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
>> >     at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
>> >     at org.apache.hadoop.examples.WordCount.main(WordCount.java:67)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >     at java.lang.reflect.Method.invoke(Method.java:597)
>> >     at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>> >     at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>> >     at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >     at java.lang.reflect.Method.invoke(Method.java:597)
>> >     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> >
>> > I didn't submit jobs to a particular queue as such. Do I need to do
>> > that? How can I do it? Any help?
>> >
>> > Thanks,
>> > Arun
>> >
>> >
>>
>
>



-- 
Harsh J
