Subject: Compiling Hive 0.12.0
From: Bryan Jeffrey <bryan.jeffrey@gmail.com>
To: user@hive.apache.org
Date: Wed, 12 Feb 2014 14:35:52 -0500

Hello,

I am running Hive 0.12.0 and Hadoop 2.2.0. I attempted to apply the fix described in the patch here:

https://issues.apache.org/jira/secure/attachment/12617931/HIVE-5991.1.patch

I applied the patch and ran 'ant tar' from the src directory. A tar file for distribution was created, and the affected classes appeared to recompile.
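For reference, the steps I followed were roughly the following (the tarball name and patch level are approximate, from memory):

    # Unpack the 0.12.0 source release and apply the JIRA patch
    tar -xzf hive-0.12.0.tar.gz
    cd hive-0.12.0/src
    # -p0 assumed here; -p1 if the diff was generated with git
    patch -p0 < /path/to/HIVE-5991.1.patch
    # Rebuild the distribution tarball
    ant tar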
However, I am still seeing the following errors:

Error: java.io.IOException: java.io.IOException: java.lang.ArrayIndexOutOfBoundsException: 0
        at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
        at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:304)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.next(HadoopShimsSecure.java:220)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.moveToNext(MapTask.java:197)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.next(MapTask.java:183)
        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:52)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:429)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:396)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
Caused by: java.io.IOException: java.lang.ArrayIndexOutOfBoundsException: 0
        at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderNextException(HiveIOExceptionHandlerChain.java:121)
        at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderNextException(HiveIOExceptionHandlerUtil.java:77)
        at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:276)
        at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:101)
        at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.doNext(CombineHiveRecordReader.java:41)
        at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.next(HiveContextAwareRecordReader.java:108)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.doNextWithExceptionHandler(HadoopShimsSecure.java:302)
        ... 11 more
Caused by: java.lang.ArrayIndexOutOfBoundsException: 0
        at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.readPatchedBaseValues(RunLengthIntegerReaderV2.java:171)
        at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.readValues(RunLengthIntegerReaderV2.java:54)
        at org.apache.hadoop.hive.ql.io.orc.RunLengthIntegerReaderV2.next(RunLengthIntegerReaderV2.java:287)
        at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$LongTreeReader.next(RecordReaderImpl.java:473)
        at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl$StructTreeReader.next(RecordReaderImpl.java:1157)
        at org.apache.hadoop.hive.ql.io.orc.RecordReaderImpl.next(RecordReaderImpl.java:2196)
        at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:106)
        at org.apache.hadoop.hive.ql.io.orc.OrcInputFormat$OrcRecordReader.next(OrcInputFormat.java:57)
        at org.apache.hadoop.hive.ql.io.HiveContextAwareRecordReader.doNext(HiveContextAwareRecordReader.java:274)
        ... 15 more

I believe that the symptoms match the ticket (HIVE-5991). I altered the 'int' to 'long' as described in the patch. Did I miss a step when recompiling?
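One thing I have not yet ruled out is whether the job is actually picking up the rebuilt hive-exec jar rather than the original one. I was planning to compare the two roughly as follows (paths are assumptions based on the standard 0.12.0 layout):

    # Jar produced by the rebuild (output path assumed from the ant build)
    md5sum build/dist/lib/hive-exec-0.12.0.jar
    # Jar the client installation ships with each job (path assumed)
    md5sum $HIVE_HOME/lib/hive-exec-0.12.0.jar
    # Any additional copies registered via hive.aux.jars.path would
    # also need to be replaced for the patched class to take effect.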
Regards,

Bryan Jeffrey