From: Anfernee Xu <anfernee.xu@gmail.com>
To: user@hadoop.apache.org
Date: Tue, 24 Jun 2014 10:07:26 -0700
Subject: MR job failed due to java.io.FileNotFoundException, but the path for ${mapreduce.jobhistory.done-dir} is not correct

Hi,

I'm running Hadoop 2.2.0, and occasionally some of my MR jobs fail with the error below. The odd part is that the job ran on 2014-06-24, but the path it complains about points to /2014/06/01. Do you know what's going on here?

2014-06-24 08:04:28.170 -0700 [pool-1-thread-157] java.io.IOException: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.yarn.exceptions.YarnRuntimeException): java.io.FileNotFoundException: File /tmp/hadoop-yarn/staging/history/done/2014/06/01/000059 does not exist.
    at org.apache.hadoop.mapreduce.v2.hs.CachedHistoryStorage.getFullJob(CachedHistoryStorage.java:122)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistory.getJob(JobHistory.java:207)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler$1.run(HistoryClientService.java:200)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler$1.run(HistoryClientService.java:196)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.verifyAndGetJob(HistoryClientService.java:196)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.getJobReport(HistoryClientService.java:228)
    at org.apache.hadoop.mapreduce.v2.api.impl.pb.service.MRClientProtocolPBServiceImpl.getJobReport(MRClientProtocolPBServiceImpl.java:122)
    at org.apache.hadoop.yarn.proto.MRClientProtocol$MRClientProtocolService$2.callBlockingMethod(MRClientProtocol.java:275)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
Caused by: java.io.FileNotFoundException: File /tmp/hadoop-yarn/staging/history/done/2014/06/01/000059 does not exist.
    at org.apache.hadoop.fs.Hdfs$DirListingIterator.<init>(Hdfs.java:205)
    at org.apache.hadoop.fs.Hdfs$DirListingIterator.<init>(Hdfs.java:189)
    at org.apache.hadoop.fs.Hdfs$2.<init>(Hdfs.java:171)
    at org.apache.hadoop.fs.Hdfs.listStatusIterator(Hdfs.java:171)
    at org.apache.hadoop.fs.FileContext$20.next(FileContext.java:1392)
    at org.apache.hadoop.fs.FileContext$20.next(FileContext.java:1387)
    at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
    at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1387)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.scanDirectory(HistoryFileManager.java:655)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.scanDirectoryForHistoryFiles(HistoryFileManager.java:668)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.scanOldDirsForJob(HistoryFileManager.java:825)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.getFileInfo(HistoryFileManager.java:854)
    at org.apache.hadoop.mapreduce.v2.hs.CachedHistoryStorage.getFullJob(CachedHistoryStorage.java:107)
    ... 18 more

    at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:331) ~[thirdeye-action.jar:na]
    at org.apache.hadoop.mapred.ClientServiceDelegate.getJobStatus(ClientServiceDelegate.java:416) ~[thirdeye-action.jar:na]
    at org.apache.hadoop.mapred.TIEYarnRunner.getJobStatus(TIEYarnRunner.java:534) ~[thirdeye-action.jar:na]
    at org.apache.hadoop.mapreduce.Job$1.run(Job.java:314) ~[thirdeye-action.jar:na]
    at org.apache.hadoop.mapreduce.Job$1.run(Job.java:311) ~[thirdeye-action.jar:na]
    at java.security.AccessController.doPrivileged(Native Method) ~[na:1.6.0_23]
    at javax.security.auth.Subject.doAs(Subject.java:396) ~[na:1.6.0_23]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491) ~[thirdeye-action.jar:na]
    at org.apache.hadoop.mapreduce.Job.updateStatus(Job.java:311) ~[thirdeye-action.jar:na]
    at org.apache.hadoop.mapreduce.Job.isComplete(Job.java:599) ~[thirdeye-action.jar:na]
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1294) ~[thirdeye-action.jar:na]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441) [na:1.6.0_23]
    at java.util.concurrent.FutureTask$Sync.innerRunAndReset(FutureTask.java:317) [na:1.6.0_23]
    at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:150) [na:1.6.0_23]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$101(ScheduledThreadPoolExecutor.java:98) [na:1.6.0_23]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.runPeriodic(ScheduledThreadPoolExecutor.java:180) [na:1.6.0_23]
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:204) [na:1.6.0_23]
    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886) [na:1.6.0_23]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908) [na:1.6.0_23]
    at java.lang.Thread.run(Thread.java:662) [na:1.6.0_23]
Caused by: org.apache.hadoop.ipc.RemoteException: java.io.FileNotFoundException: File /tmp/hadoop-yarn/staging/history/done/2014/06/01/000059 does not exist.
    at org.apache.hadoop.mapreduce.v2.hs.CachedHistoryStorage.getFullJob(CachedHistoryStorage.java:122)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistory.getJob(JobHistory.java:207)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler$1.run(HistoryClientService.java:200)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler$1.run(HistoryClientService.java:196)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.verifyAndGetJob(HistoryClientService.java:196)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.getJobReport(HistoryClientService.java:228)
    at org.apache.hadoop.mapreduce.v2.api.impl.pb.service.MRClientProtocolPBServiceImpl.getJobReport(MRClientProtocolPBServiceImpl.java:122)
    at org.apache.hadoop.yarn.proto.MRClientProtocol$MRClientProtocolService$2.callBlockingMethod(MRClientProtocol.java:275)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
Caused by: java.io.FileNotFoundException: File /tmp/hadoop-yarn/staging/history/done/2014/06/01/000059 does not exist.
    at org.apache.hadoop.fs.Hdfs$DirListingIterator.<init>(Hdfs.java:205)
    at org.apache.hadoop.fs.Hdfs$DirListingIterator.<init>(Hdfs.java:189)
    at org.apache.hadoop.fs.Hdfs$2.<init>(Hdfs.java:171)
    at org.apache.hadoop.fs.Hdfs.listStatusIterator(Hdfs.java:171)
    at org.apache.hadoop.fs.FileContext$20.next(FileContext.java:1392)
    at org.apache.hadoop.fs.FileContext$20.next(FileContext.java:1387)
    at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
    at org.apache.hadoop.fs.FileContext.listStatus(FileContext.java:1387)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.scanDirectory(HistoryFileManager.java:655)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.scanDirectoryForHistoryFiles(HistoryFileManager.java:668)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.scanOldDirsForJob(HistoryFileManager.java:825)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.getFileInfo(HistoryFileManager.java:854)
    at org.apache.hadoop.mapreduce.v2.hs.CachedHistoryStorage.getFullJob(CachedHistoryStorage.java:107)
    ... 18 more
    at org.apache.hadoop.ipc.Client.call(Client.java:1347) ~[thirdeye-action.jar:na]
    at org.apache.hadoop.ipc.Client.call(Client.java:1300) ~[thirdeye-action.jar:na]
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206) ~[thirdeye-action.jar:na]
    at $Proxy13.getJobReport(Unknown Source) ~[na:na]
    at org.apache.hadoop.mapreduce.v2.api.impl.pb.client.MRClientProtocolPBClientImpl.getJobReport(MRClientProtocolPBClientImpl.java:133) ~[thirdeye-action.jar:na]
    at sun.reflect.GeneratedMethodAccessor63.invoke(Unknown Source) ~[na:na]
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25) ~[na:1.6.0_23]
    at java.lang.reflect.Method.invoke(Method.java:597) ~[na:1.6.0_23]
    at org.apache.hadoop.mapred.ClientServiceDelegate.invoke(ClientServiceDelegate.java:317) ~[thirdeye-action.jar:na]
    ... 22 common frames omitted

--
--Anfernee
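For context, here is my rough mental model of how that path gets built, based only on the shape of the failing path; this is a sketch under assumptions, not the actual Hadoop code, and the class and method names below are made up. The history server appears to file a finished job under ${mapreduce.jobhistory.done-dir}/YYYY/MM/DD/SERIAL, where the date component comes from a job timestamp and SERIAL buckets jobs by their sequence number in groups of 1000, zero-padded to six digits. If that is right, /2014/06/01/000059 would correspond to jobs 59000-59999 from June 1st, which is why a June 24th job looking there is surprising.

```java
// Hypothetical sketch of the done-dir layout (assumption; not the Hadoop source).
public class DoneDirPath {
    // Build the bucket directory a completed job's history file would live in.
    static String doneDirFor(String doneDir, int year, int month, int day, int jobSeq) {
        // jobSeq / 1000 groups jobs in thousands; %06d zero-pads to six digits,
        // matching the trailing "000059" in the failing path.
        return String.format("%s/%04d/%02d/%02d/%06d",
                doneDir, year, month, day, jobSeq / 1000);
    }

    public static void main(String[] args) {
        // e.g. a job with sequence number 59123 dated 2014-06-01:
        System.out.println(doneDirFor("/tmp/hadoop-yarn/staging/history/done",
                2014, 6, 1, 59123));
        // prints /tmp/hadoop-yarn/staging/history/done/2014/06/01/000059
    }
}
```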