Subject: Re: simple word count program remains un assigned...
From: gunjan mishra <gunjanmish@gmail.com>
To: user@hadoop.apache.org
Date: Sun, 20 Oct 2013 09:28:35 -0400

Thanks very much Harsh,

I am a newbie to this, so I don't know much about MR1 or MR2 (YARN). I just
updated my old Cloudera distribution to the latest, following the
instructions on the Cloudera website. That probably led me to YARN, and I
checked and I don't have the MR1 files any more.

So please tell me whether I should somehow revert back to MR1, or whether I
can continue with YARN and troubleshoot the problems there.

BTW, thanks a lot one more time.

Thanks
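
(For reference: on Hadoop 2.x, which MapReduce runtime a submitted job runs on
is selected by the mapreduce.framework.name property in mapred-site.xml. The
snippet below is only a minimal sketch of that property, assuming a standard
mapred-site.xml; it is not the actual configuration of the cluster in this
thread.)

  <!-- mapred-site.xml: chooses which MapReduce runtime the client submits to.
       "yarn" = MR2/YARN; "classic" = the MR1 JobTracker/TaskTracker runtime. -->
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>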



On Sat, Oct 19, 2013 at 11:26 PM, Harsh J <harsh@cloudera.com> wrote:
For CDH-specific questions, you can post them onto the Cloudera
Community forums: http://community.cloudera.com

On Sun, Oct 20, 2013 at 2:45 AM, gunjan mishra <gunjanmish@gmail.com> wrote:
> Hitesh,
>
> My apologies, is there a different distribution list? Could someone
> point me to that.
>
> Thanks
>
>
> On Sat, Oct 19, 2013 at 5:11 PM, Hitesh Shah <hitesh@apache.org> wrote:
>>
>> Hello Gunjan,
>>
>> This mailing list is for Apache Hadoop related questions. Please post
>> questions for other distributions to the appropriate vendor's mailing list.
>>
>> thanks
>> -- Hitesh
>>
>> On Oct 19, 2013, at 11:27 AM, gunjan mishra wrote:
>>
>> > Hi, I am trying to run a simple word count program, like this. The job
>> > keeps running but is not being assigned to mappers and reducers.
>> > Here is what I see when I check the status:
>> >
>> >
>> > ===============================================================================================
>> >
>> > [root@localhost ~]# hadoop jar
>> > /usr/lib/hadoop-0.20-mapreduce/hadoop-examples-2.0.0-mr1-cdh4.4.0.jar
>> > wordcount /usr/read.txt /usr/output
>> > 13/10/19 15:05:02 INFO service.AbstractService:
>> > Service:org.apache.hadoop.yarn.client.YarnClientImpl is inited.
>> > 13/10/19 15:05:02 INFO service.AbstractService:
>> > Service:org.apache.hadoop.yarn.client.YarnClientImpl is started.
>> > 13/10/19 15:05:03 INFO input.FileInputFormat: Total input paths to
>> > process : 1
>> > 13/10/19 15:05:03 INFO mapreduce.JobSubmitter: number of splits:1
>> > 13/10/19 15:05:03 WARN conf.Configuration: mapred.jar is deprecated.
>> > Instead, use mapreduce.job.jar
>> > 13/10/19 15:05:03 WARN conf.Configuration: mapred.output.value.class is
>> > deprecated. Instead, use mapreduce.job.output.value.class
>> > 13/10/19 15:05:03 WARN conf.Configuration: mapreduce.combine.class is
>> > deprecated. Instead, use mapreduce.job.combine.class
>> > 13/10/19 15:05:03 WARN conf.Configuration: mapreduce.map.class is
>> > deprecated. Instead, use mapreduce.job.map.class
>> > 13/10/19 15:05:03 WARN conf.Configuration: mapred.job.name is
>> > deprecated. Instead, use mapreduce.job.name
>> > 13/10/19 15:05:03 WARN conf.Configuration: mapreduce.reduce.class is
>> > deprecated. Instead, use mapreduce.job.reduce.class
>> > 13/10/19 15:05:03 WARN conf.Configuration: mapred.input.dir is
>> > deprecated. Instead, use mapreduce.input.fileinputformat.inputdir
>> > 13/10/19 15:05:03 WARN conf.Configuration: mapred.output.dir is
>> > deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
>> > 13/10/19 15:05:03 WARN conf.Configuration: mapred.map.tasks is
>> > deprecated. Instead, use mapreduce.job.maps
>> > 13/10/19 15:05:03 WARN conf.Configuration: mapred.output.key.class is
>> > deprecated. Instead, use mapreduce.job.output.key.class
>> > 13/10/19 15:05:03 WARN conf.Configuration: mapred.working.dir is
>> > deprecated. Instead, use mapreduce.job.working.dir
>> > 13/10/19 15:05:03 INFO mapreduce.JobSubmitter: Submitting tokens for
>> > job: job_1382144693199_0005
>> > 13/10/19 15:05:03 INFO client.YarnClientImpl: Submitted application
>> > application_1382144693199_0005 to ResourceManager at /0.0.0.0:8032
>> > 13/10/19 15:05:04 INFO mapreduce.Job: The url to track the job:
>> > http://localhost.localdomain:8088/proxy/application_1382144693199_0005/
>> > 13/10/19 15:05:04 INFO mapreduce.Job: Running job:
>> > job_1382144693199_0005
>> >
>> >
>> > ===============================================================================================
>> >
>> > Here is my Cloudera distribution:
>> >
>> >
>> > Hadoop 2.0.0-cdh4.4.0
>> > Subversion
>> > file:///data/1/jenkins/workspace/generic-package-rhel64-6-0/topdir/BUILD/hadoop-2.0.0-cdh4.4.0/src/hadoop-common-project/hadoop-common
>> > -r c0eba6cd38c984557e96a16ccd7356b7de835e79
>> > Compiled by jenkins on Tue Sep 3 19:33:17 PDT 2013
>> > From source with checksum ac7e170aa709b3ace13dc5f775487180
>> > This command was run using
>> > /usr/lib/hadoop/hadoop-common-2.0.0-cdh4.4.0.jar
>> >
>> >
>> > and here is the output of jps (run as root):
>> >
>> > -----------------------------------------
>> > [root@localhost ~]# jps
>> > 2202 TaskTracker
>> > 4161 Bootstrap
>> > 3134 DataNode
>> > 3520 Application
>> > 3262 NameNode
>> > 1879 ThriftServer
>> > 1740 Main
>> > 3603 RunJar
>> > 1606 HMaster
>> > 2078 JobTracker
>> > 16277 Jps
>> > 3624 RunJar
>> > 4053 RunJar
>> > 4189 Sqoop
>> > 3582 Bootstrap
>> > 3024 JobHistoryServer
>> > 3379 SecondaryNameNode
>> > 4732 ResourceManager
>> >
>> > ------------------------------------------------------
>> > --
>> > Thanks & Regards
>> > Gunjan Mishra
>> > 732-200-5839(H)
>> > 917-216-9739(C)
>>
>
>
>
> --
> Thanks & Regards
> Gunjan Mishra
> 732-200-5839(H)
> 917-216-9739(C)



--
Harsh J



--
Thanks & Regards
Gunjan Mishra
732-200-5839(H)
917-216-9739(C)
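
(A hedged sketch of how one might check why the submitted application never
gets assigned. The commands below use the standard YARN CLI, and the CDH4
service name at the end is an assumption rather than something taken from
this thread. Note that the quoted jps listing shows a ResourceManager but no
NodeManager, and without a running NodeManager no containers can be
allocated.)

  # List YARN applications and their current state (a stuck job typically
  # sits in the ACCEPTED state with 0% progress):
  yarn application -list

  # Show state, tracking URL and diagnostics for the specific application:
  yarn application -status application_1382144693199_0005

  # Check which daemons are actually running; a NodeManager must be up for
  # containers to be assigned:
  sudo jps | grep -E 'ResourceManager|NodeManager'

  # On CDH4 the NodeManager service name is assumed to be:
  sudo service hadoop-yarn-nodemanager status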