From: 张伟 <zhangwei.justin@gmail.com>
To: user@hive.apache.org
Date: Tue, 17 Jun 2014 11:52:25 +0800
Subject: Re: Hive-0.13 java.io.FileNotFoundException: HIVE_PLAN not found

Hi Jason,

Thanks a lot for your tips! I finally found the problem. I run Shark 0.9.1, which is compiled against Hive 0.11, on the same cluster. When YARN starts, it picks up the Hive 0.11 jar files, which leads to the error.

I've removed the Shark classpath from yarn.application.classpath in yarn-site.xml, and the error is fixed!

Thank you.

2014-06-17 7:04 GMT+08:00 Jason Dere <jdere@hortonworks.com>:
> Can you confirm you're using Hive 0.13? The stack trace looks more like it was on Hive 0.11.
> Is uberized mode enabled in YARN (mapreduce.job.ubertask.enable)? Could be due to HIVE-5857.
>
> On Jun 16, 2014, at 7:42 AM, 张伟 <zhangwei.justin@gmail.com> wrote:
>
> Hi,
>
> I run Hadoop 2.2.0 + Hive 0.13.0 on a cluster. The WordCount example runs successfully, and creating tables in the Hive CLI works. But when I run a Hive query that launches MapReduce jobs, I keep getting errors like:
>
> Diagnostic Messages for this Task:
> Error: java.lang.RuntimeException: java.io.FileNotFoundException: HIVE_PLAN7b8ea437-8ec3-4c05-af4e-3cd6466dce85 (No such file or directory)
>     at org.apache.hadoop.hive.ql.exec.Utilities.getMapRedWork(Utilities.java:230)
>     at org.apache.hadoop.hive.ql.io.HiveInputFormat.init(HiveInputFormat.java:255)
>     at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:381)
>     at org.apache.hadoop.hive.ql.io.HiveInputFormat.pushProjectionsAndFilters(HiveInputFormat.java:374)
>     at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:540)
>     at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:167)
>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:408)
>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at javax.security.auth.Subject.doAs(Subject.java:415)
>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> Caused by: java.io.FileNotFoundException: HIVE_PLAN7b8ea437-8ec3-4c05-af4e-3cd6466dce85 (No such file or directory)
>     at java.io.FileInputStream.open(Native Method)
>     at java.io.FileInputStream.<init>(FileInputStream.java:146)
>     at java.io.FileInputStream.<init>(FileInputStream.java:101)
>     at org.apache.hadoop.hive.ql.exec.Utilities.getMapRedWork(Utilities.java:221)
>     ... 12 more
>
> FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
> MapReduce Jobs Launched:
> Job 0: Map: 1   HDFS Read: 0 HDFS Write: 0 FAIL
> Total MapReduce CPU Time Spent: 0 msec
>
> According to the error messages above, I find that hive.exec.scratchdir (/tmp/hive-${user.name}), where the Hive plan files are stored, is cleaned up when the query finishes.
> But when I use Hive 0.12, the directory is not cleaned up and all the Hive plan files are kept there. I think this is the main problem causing the errors.
>
> How can I fix it? Looking forward to your reply!
>
>
> CONFIDENTIALITY NOTICE
> NOTICE: This message is intended for the use of the individual or entity to which it is addressed and may contain information that is confidential, privileged and exempt from disclosure under applicable law. If the reader of this message is not the intended recipient, you are hereby notified that any printing, copying, dissemination, distribution, disclosure or forwarding of this communication is strictly prohibited. If you have received this communication in error, please contact the sender immediately and delete it from your system. Thank You.
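[Editor's note, not part of the original thread] One way to spot the kind of jar conflict described above is to split the classpath YARN hands to containers and look for mixed Hive versions. The CLASSPATH value and paths below are hypothetical examples; on a real cluster you would substitute the output of `yarn classpath`.

```shell
# Hypothetical classpath containing both a Shark-bundled Hive 0.11 jar and
# the intended Hive 0.13 jar. On a real cluster: CP="$(yarn classpath)"
CP='/opt/hadoop/share/hadoop/common/*:/opt/shark-0.9.1/lib/hive-exec-0.11.0.jar:/opt/hive-0.13.0/lib/hive-exec-0.13.0.jar'

# Split the colon-separated classpath into one entry per line and keep only
# the hive-exec jars, so a version mismatch stands out immediately.
echo "$CP" | tr ':' '\n' | grep -i 'hive-exec'
```

If two different hive-exec versions appear, a stale jar directory (such as the Shark lib directory in this thread) is likely shadowing the intended one.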
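[Editor's note, not part of the original thread] The fix the poster describes amounts to editing yarn-site.xml so that yarn.application.classpath no longer lists the Shark lib directory. The fragment below is a sketch of what such a property looks like; every path in it is illustrative (the actual entries depend on the installation layout) and none is taken from the thread.

```xml
<!-- yarn-site.xml: sketch of yarn.application.classpath with the Shark lib
     directory (and its bundled Hive 0.11 jars) removed. All paths here are
     hypothetical examples, not the poster's actual configuration. -->
<property>
  <name>yarn.application.classpath</name>
  <value>
    $HADOOP_CONF_DIR,
    $HADOOP_COMMON_HOME/share/hadoop/common/*,
    $HADOOP_COMMON_HOME/share/hadoop/common/lib/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/*,
    $HADOOP_HDFS_HOME/share/hadoop/hdfs/lib/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/*,
    $HADOOP_YARN_HOME/share/hadoop/yarn/lib/*
  </value>
</property>
<!-- removed from the value above: an entry such as /opt/shark-0.9.1/lib/* -->
```

NodeManagers read this property at startup, which matches the poster's observation that the wrong Hive jars were loaded "when YARN starts"; a restart of the NodeManagers is needed after the change.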