Date: Mon, 27 Jan 2014 17:56:30 -0500
Subject: Strange rpc exception in Yarn
From: Jay Vyas
To: common-user@hadoop.apache.org

Hi folks:

At the **end** of a successful job, I'm getting some strange stack traces... This happens when using Pig; however, judging from the stack trace, it doesn't seem to be Pig-specific. Rather, it appears that the job client is attempting to do something funny.

Has anyone ever seen this sort of exception in YARN? It seems to be related to an IPC call, but the IPC call is throwing an exception in the hasNext(..) implementation in AbstractFileSystem.
ERROR org.apache.hadoop.security.UserGroupInformation - PriviledgedActionException as:roofmonkey (auth:SIMPLE) cause:java.io.IOException: org.apache.hadoop.ipc.RemoteException(java.lang.NullPointerException): java.lang.NullPointerException
    at org.apache.hadoop.fs.AbstractFileSystem$1.hasNext(AbstractFileSystem.java:861)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.scanDirectory(HistoryFileManager.java:656)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.scanDirectoryForHistoryFiles(HistoryFileManager.java:668)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.scanIntermediateDirectory(HistoryFileManager.java:722)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.access$300(HistoryFileManager.java:77)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager$UserLogDir.scanIfNeeded(HistoryFileManager.java:275)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.scanIntermediateDirectory(HistoryFileManager.java:708)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryFileManager.getFileInfo(HistoryFileManager.java:847)
    at org.apache.hadoop.mapreduce.v2.hs.CachedHistoryStorage.getFullJob(CachedHistoryStorage.java:107)
    at org.apache.hadoop.mapreduce.v2.hs.JobHistory.getJob(JobHistory.java:207)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler$1.run(HistoryClientService.java:200)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler$1.run(HistoryClientService.java:196)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.verifyAndGetJob(HistoryClientService.java:196)
    at org.apache.hadoop.mapreduce.v2.hs.HistoryClientService$HSClientProtocolHandler.getJobReport(HistoryClientService.java:228)
    at org.apache.hadoop.mapreduce.v2.api.impl.pb.service.MRClientProtocolPBServiceImpl.getJobReport(MRClientProtocolPBServiceImpl.java:122)
    at org.apache.hadoop.yarn.proto.MRClientProtocol$MRClientProtocolService$2.callBlockingMethod(MRClientProtocol.java:275)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
    at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2053)
    at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
    at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2047)
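For anyone reading the trace: the "PriviledgedActionException as:roofmonkey" line comes from the history server running each RPC handler inside a `Subject.doAs(...)` block, so the work executes on behalf of the requesting user and any exception thrown inside the action is reported against that user. A minimal JDK-only sketch of that pattern (the class name, user string, and action body here are illustrative, not Hadoop code):

```java
import java.security.PrivilegedAction;
import javax.security.auth.Subject;

public class DoAsDemo {
    // Runs an action under Subject.doAs, the same wrapper visible in the
    // trace around HSClientProtocolHandler.verifyAndGetJob(). If the action
    // body threw (as scanIfNeeded()/hasNext() did in the real trace), the
    // failure would surface attributed to the acting user.
    static String runAs(Subject subject, String user) {
        return Subject.doAs(subject, (PrivilegedAction<String>) () ->
                "ran as " + user);
    }

    public static void main(String[] args) {
        // An empty Subject stands in for the SIMPLE-auth user in the log.
        System.out.println(runAs(new Subject(), "roofmonkey"));
    }
}
```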