Subject: Re: YARN with local filesystem
From: Rod Paulk
To: user@hadoop.apache.org
Date: Tue, 13 Aug 2013 08:10:26 -0700
I was able to execute the example by running the job as the yarn user. For
example, the following completes successfully:

    sudo -u yarn yarn org.apache.hadoop.examples.RandomWriter /tmp/random-out

Whereas this fails when run as the local user rpaulk:

    yarn org.apache.hadoop.examples.RandomWriter /tmp/random-out

On Wed, Jul 31, 2013 at 2:28 PM, Rod Paulk wrote:

> I am having an issue running 2.0.5-alpha (BigTop-0.6.0) YARN-MapReduce on
> the local filesystem instead of HDFS. The appTokens file that the error
> says is missing does exist after the job fails. I saw similar issues noted
> in YARN-917, YARN-513, and YARN-993. When I switch to HDFS, the jobs run
> fine.
>
> In core-site.xml:
>
>   <property>
>     <name>fs.defaultFS</name>
>     <value>file:///</value>
>   </property>
>
> In mapred-site.xml:
>
>   <property>
>     <name>mapreduce.framework.name</name>
>     <value>yarn</value>
>   </property>
>
> 2013-07-29 16:13:06,549 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl:
> Start request for container_1375138534137_0003_01_000001 by user rpaulk
>
> 2013-07-29 16:13:06,549 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.ContainerManagerImpl:
> Creating a new application reference for app application_1375138534137_0003
>
> 2013-07-29 16:13:06,549 INFO
> org.apache.hadoop.yarn.server.nodemanager.NMAuditLogger: USER=rpaulk
> IP=172.20.130.215 OPERATION=Start Container Request
> TARGET=ContainerManageImpl RESULT=SUCCESS
> APPID=application_1375138534137_0003
> CONTAINERID=container_1375138534137_0003_01_000001
>
> 2013-07-29 16:13:06,551 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.application.Application:
> Application application_1375138534137_0003 transitioned from NEW to INITING
>
> 2013-07-29 16:13:06,551 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.application.Application:
> Adding container_1375138534137_0003_01_000001 to application
> application_1375138534137_0003
>
> 2013-07-29 16:13:06,554 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.application.Application:
> Application application_1375138534137_0003 transitioned from INITING to
> RUNNING
>
> 2013-07-29 16:13:06,555 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.container.Container:
> Container container_1375138534137_0003_01_000001 transitioned from NEW to
> LOCALIZING
>
> *2013-07-29 16:13:06,555 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource:
> Resource
> file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/appTokens
> transitioned from INIT to DOWNLOADING*
>
> 2013-07-29 16:13:06,556 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource:
> Resource
> file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/job.jar
> transitioned from INIT to DOWNLOADING
>
> 2013-07-29 16:13:06,556 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource:
> Resource
> file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/job.splitmetainfo
> transitioned from INIT to DOWNLOADING
>
> 2013-07-29 16:13:06,556 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource:
> Resource
> file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/job.split
> transitioned from INIT to DOWNLOADING
>
> 2013-07-29 16:13:06,556 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource:
> Resource
> file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/job.xml
> transitioned from INIT to DOWNLOADING
>
> 2013-07-29 16:13:06,556 INFO
> org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService:
> Created localizer for container_1375138534137_0003_01_000001
>
2013-07-29 16:13:06,559 INFO > org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: > Writing credentials to the nmPrivate file > /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/nmPrivate/container_1375138534137_0003_01_000001.tokens. > Credentials list: > > 2013-07-29 16:13:06,560 INFO > org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: > Initializing user rpaulk > > 2013-07-29 16:13:06,564 INFO > org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: Copying > from > /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/nmPrivate/container_1375138534137_0003_01_000001.tokens > to > /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/rpaulk/appcache/application_1375138534137_0003/container_1375138534137_0003_01_000001.tokens > > 2013-07-29 16:13:06,564 INFO > org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor: CWD set > to > /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/rpaulk/appcache/application_1375138534137_0003 > = > file:/var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/rpaulk/appcache/application_1375138534137_0003 > > *2013-07-29 16:13:06,646 ERROR > org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException > as:rpaulk (auth:SIMPLE) cause:java.io.FileNotFoundException: File > file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/appTokens > does not exist* > > 2013-07-29 16:13:06,648 INFO > org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService: > DEBUG: FAILED { > file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/appTokens, > 1375139459000, FILE, null } > > RemoteTrace: > > java.io.FileNotFoundException: File > file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/appTokens > does not exist > > at > org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:492) > > at > 
org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:395) > > at > org.apache.hadoop.yarn.util.FSDownload.copy(FSDownload.java:176) > > at > org.apache.hadoop.yarn.util.FSDownload.access$000(FSDownload.java:51) > > at > org.apache.hadoop.yarn.util.FSDownload$1.run(FSDownload.java:284) > > at > org.apache.hadoop.yarn.util.FSDownload$1.run(FSDownload.java:282) > > at java.security.AccessController.doPrivileged(Native Method) > > at javax.security.auth.Subject.doAs(Subject.java:396) > > at > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1478) > > at > org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:280) > > at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:51) > > at > java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303) > > at java.util.concurrent.FutureTask.run(FutureTask.java:138) > > at > java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441) > > at > java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303) > > at java.util.concurrent.FutureTask.run(FutureTask.java:138) > > at > java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) > > at > java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) > > at java.lang.Thread.run(Thread.java:662) > > at LocalTrace: > > org.apache.hadoop.yarn.exceptions.impl.pb.YarnRemoteExceptionPBImpl: > File > file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/appTokens > does not exist > > at > org.apache.hadoop.yarn.server.nodemanager.api.protocolrecords.impl.pb.LocalResourceStatusPBImpl.convertFromProtoFormat(LocalResourceStatusPBImpl.java:217) > > at > org.apache.hadoop.yarn.server.nodemanager.api.protocolrecords.impl.pb.LocalResourceStatusPBImpl.getException(LocalResourceStatusPBImpl.java:147) > > at > 
org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService$LocalizerRunner.update(ResourceLocalizationService.java:819) > > at > org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService$LocalizerTracker.processHeartbeat(ResourceLocalizationService.java:491) > > at > org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ResourceLocalizationService.heartbeat(ResourceLocalizationService.java:218) > > at > org.apache.hadoop.yarn.server.nodemanager.api.impl.pb.service.LocalizationProtocolPBServiceImpl.heartbeat(LocalizationProtocolPBServiceImpl.java:46) > > at > org.apache.hadoop.yarn.proto.LocalizationProtocol$LocalizationProtocolService$2.callBlockingMethod(LocalizationProtocol.java:57) > > at > org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:454) > > at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1014) > > at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1741) > > at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1737) > > at java.security.AccessController.doPrivileged(Native Method) > > at javax.security.auth.Subject.doAs(Subject.java:396) > > at > org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1478) > > at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1735) > > 2013-07-29 16:13:06,650 INFO > org.apache.hadoop.yarn.server.nodemanager.containermanager.container.Container: > Container container_1375138534137_0003_01_000001 transitioned from > LOCALIZING to LOCALIZATION_FAILED > > *2013-07-29 16:13:06,650 INFO > org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: > Resource > file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/appTokens > transitioned from DOWNLOADING to INIT* > > 2013-07-29 16:13:06,650 INFO > org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: > 
Resource > file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/job.jar > transitioned from DOWNLOADING to INIT > > 2013-07-29 16:13:06,650 INFO > org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: > Resource > file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/job.splitmetainfo > transitioned from DOWNLOADING to INIT > > 2013-07-29 16:13:06,650 INFO > org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: > Resource > file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/job.split > transitioned from DOWNLOADING to INIT > > 2013-07-29 16:13:06,650 INFO > org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.LocalizedResource: > Resource > file:/nas/scratch/localfs-1/hadoop-yarn/staging/rpaulk/.staging/job_1375138534137_0003/job.xml > transitioned from DOWNLOADING to INIT > > 2013-07-29 16:13:06,652 WARN > org.apache.hadoop.yarn.server.nodemanager.NMAuditLogger: USER=rpaulk > OPERATION=Container Finished - Failed TARGET=ContainerImpl > RESULT=FAILURE DESCRIPTION=Container failed with state: > LOCALIZATION_FAILED APPID=application_1375138534137_0003 > CONTAINERID=container_1375138534137_0003_01_000001 > > 2013-07-29 16:13:06,652 INFO > org.apache.hadoop.yarn.server.nodemanager.containermanager.container.Container: > Container container_1375138534137_0003_01_000001 transitioned from > LOCALIZATION_FAILED to DONE > > 2013-07-29 16:13:06,652 INFO > org.apache.hadoop.yarn.server.nodemanager.containermanager.application.Application: > Removing container_1375138534137_0003_01_000001 from application > application_1375138534137_0003 > > 2013-07-29 16:13:06,652 INFO > org.apache.hadoop.yarn.server.nodemanager.containermanager.logaggregation.AppLogAggregatorImpl: > Considering container container_1375138534137_0003_01_000001 for > log-aggregation > > 2013-07-29 16:13:06,652 INFO > 
> org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor:
> Deleting absolute path :
> /var/lib/hadoop-yarn/cache/yarn/nm-local-dir/usercache/rpaulk/appcache/application_1375138534137_0003/container_1375138534137_0003_01_000001
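A quick way to sanity-check the permission angle: under the
DefaultContainerExecutor the localizer runs as the NodeManager's own user
(yarn), so on a plain local filesystem it can only fetch the submitter's
staging files if every directory on the path is readable and traversable by
yarn. The sketch below simulates that layout in a temp directory (the paths
and job ID are stand-ins, not the real staging path); MapReduce creates
.staging with mode 700, which would shut the yarn user out:

```shell
# Stand-in for /nas/scratch/localfs-1/hadoop-yarn/staging/<user>/.staging
STAGING_ROOT=$(mktemp -d)
JOB_DIR="$STAGING_ROOT/staging/rpaulk/.staging/job_0000000000000_0001"
mkdir -p "$JOB_DIR"
touch "$JOB_DIR/appTokens"

# MapReduce's job submission creates .staging with mode 700 (owner-only),
# so a different user -- such as yarn -- cannot traverse into it.
chmod 700 "$STAGING_ROOT/staging/rpaulk/.staging"

# The owner can still see the file; for any other user the same path would
# fail, matching the FileNotFoundException the localizer reports.
ls -l "$JOB_DIR/appTokens"
```

If that is what is happening here, it would also explain why `sudo -u yarn`
works: the yarn user then owns the staging directory itself.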