From: Prasanth Jayachandran <pjayachandran@hortonworks.com>
To: user@hive.apache.org
Subject: Re: Compiling Hive 0.12.0
Date: Wed, 12 Feb 2014 11:47:43 -0800

Hi Bryan,

HIVE-5991 is a writer bug. From the exception, I see that the failure happens while reading the ORC file; it may be trying to read a corrupted ORC file. Are you trying to read the old ORC file after applying this patch? Since the bug is in the writer, files it produced remain corrupt even after the fix, so the ORC file needs to be regenerated with the patched build. Can you try regenerating the file and see if it happens again? For example, something like the sketch below.
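A minimal HiveQL sketch of one way to regenerate the data, assuming the ORC table was originally loaded from some non-ORC source. The table names 'events_orc' and 'events_staging' are hypothetical, not from this thread:

    -- Hypothetical names: 'events_orc' is the table whose ORC files are
    -- corrupt; 'events_staging' stands in for whatever original (non-ORC)
    -- source it was loaded from. The corrupted files themselves cannot be
    -- read back, so the rows must come from the source again.
    -- Run this with the patched build so the fixed writer creates the files.
    INSERT OVERWRITE TABLE events_orc
    SELECT * FROM events_staging;

INSERT OVERWRITE replaces the files under the table (or partition) directory, so afterwards every ORC file in the table has been written by the patched writer.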
Thanks,
Prasanth Jayachandran

On Feb 12, 2014, at 11:35 AM, Bryan Jeffrey <bryan.jeffrey@gmail.com> wrote:

> Hello.
>
> I am running Hive 0.12.0 & Hadoop 2.2.0. I attempted to apply the fix described in the patch here:
> https://issues.apache.org/jira/secure/attachment/12617931/HIVE-5991.1.patch
>
> I applied the patch and ran 'ant tar' from the src directory. A tar file for distribution was created, and it appeared that the sources recompiled. However, I am still seeing the following errors:
>
> Error: java.io.IOException: java.io.IOException: java.lang.ArrayIndexOutOfBoundsException: 0
>         at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
>         at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
>         at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:304)
>         at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.next(HadoopShimsSecure.java:220)
>         at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:197)
>         at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:183)
>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:52)
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
>         at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at javax.security.auth.Subject.doAs(Subject.java:396)
>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
>         at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
> Caused by: java.io.IOException: java.lang.ArrayIndexOutOfBoundsException: 0
>         at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
>         at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
>         at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:276)
>         at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:101)
>         at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:41)
>         at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.next(HiveContextAwareRecordReader.java:108)
>         at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:302)
>         ... 11 more
> Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
>         at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.readPatchedBaseValues(RunLengthIntegerReaderV2.java:171)
>         at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.readValues(RunLengthIntegerReaderV2.java:54)
>         at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.next(RunLengthIntegerReaderV2.java:287)
>         at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$LongTreeReader.next(RecordReaderImpl.java:473)
>         at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$StructTreeReader.next(RecordReaderImpl.java:1157)
>         at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.next(RecordReaderImpl.java:2196)
>         at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:106)
>         at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:57)
>         at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:274)
>         ... 15 more
>
> I believe that the symptoms match the ticket (HIVE-5991). I altered the 'int' to 'long' as described in the patch. Did I miss a step when recompiling?
>
> Regards,
>
> Bryan Jeffrey
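One rough way to tell a stale corrupted file apart from an incomplete rebuild, sketched with the same hypothetical table names as above: write a small, brand-new ORC table with the patched build and read it back. Fresh files come from the fixed writer, so they should read cleanly, while the pre-patch files would keep failing. This is only a smoke test; a clean read does not prove the patched encoder path was exercised.

    -- Hypothetical smoke test, run on the patched build:
    CREATE TABLE orc_patch_check STORED AS ORC AS
    SELECT * FROM events_staging LIMIT 1000;

    -- Should succeed; the old, pre-patch ORC files would still throw
    -- the ArrayIndexOutOfBoundsException when read.
    SELECT COUNT(*) FROM orc_patch_check;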
NOTICE: This message is = intended for the use of the individual or entity to which it is addressed a= nd may contain information that is confidential, privileged and exempt from= disclosure under applicable law. If the reader of this message is not the = intended recipient, you are hereby notified that any printing, copying, dis= semination, distribution, disclosure or forwarding of this communication is= strictly prohibited. If you have received this communication in error, ple= ase contact the sender immediately and delete it from your system. Thank Yo= u. --Apple-Mail=_CD4C5BDC-93B8-45F6-A809-0BBEB3DC8D28--