From: "Chiran Ravani (JIRA)"
To: dev@hive.apache.org
Date: Wed, 18 Oct 2017 06:03:01 +0000 (UTC)
Subject: [jira] [Created] (HIVE-17829) ArrayIndexOutOfBoundsException - HBase-backed tables with Avro schema in Hive2

Chiran Ravani created HIVE-17829:
------------------------------------

             Summary: ArrayIndexOutOfBoundsException - HBase-backed tables with Avro schema in Hive2
                 Key: HIVE-17829
                 URL: https://issues.apache.org/jira/browse/HIVE-17829
             Project: Hive
          Issue Type: Bug
          Components: HBase Handler
    Affects Versions: 2.1.0
            Reporter: Chiran Ravani
            Priority: Critical

Stack trace:
{code}
2017-10-09T09:39:54,804 ERROR [HiveServer2-Background-Pool: Thread-95]: metadata.Table (Table.java:getColsInternal(642)) - Unable to get field from serde: org.apache.hadoop.hive.hbase.HBaseSerDe
java.lang.ArrayIndexOutOfBoundsException: 1
    at java.util.Arrays$ArrayList.get(Arrays.java:3841) ~[?:1.8.0_77]
    at org.apache.hadoop.hive.serde2.BaseStructObjectInspector.init(BaseStructObjectInspector.java:104) ~[hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector.init(LazySimpleStructObjectInspector.java:97) ~[hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.serde2.lazy.objectinspector.LazySimpleStructObjectInspector.<init>(LazySimpleStructObjectInspector.java:77) ~[hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.serde2.lazy.objectinspector.LazyObjectInspectorFactory.getLazySimpleStructObjectInspector(LazyObjectInspectorFactory.java:115) ~[hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.hbase.HBaseLazyObjectFactory.createLazyHBaseStructInspector(HBaseLazyObjectFactory.java:79) ~[hive-hbase-handler-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.hbase.HBaseSerDe.initialize(HBaseSerDe.java:127) ~[hive-hbase-handler-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.serde2.AbstractSerDe.initialize(AbstractSerDe.java:54) ~[hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.serde2.SerDeUtils.initializeSerDe(SerDeUtils.java:531) ~[hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:424) ~[hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.getDeserializer(MetaStoreUtils.java:411) ~[hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializerFromMetaStore(Table.java:279) ~[hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.metadata.Table.getDeserializer(Table.java:261) ~[hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.metadata.Table.getColsInternal(Table.java:639) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.metadata.Table.getCols(Table.java:622) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:833) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:869) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4228) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:347) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1905) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1607) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1354) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1123) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1116) [hive-exec-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:242) [hive-service-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hive.service.cli.operation.SQLOperation.access$800(SQLOperation.java:91) [hive-service-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork$1.run(SQLOperation.java:334) [hive-service-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_77]
    at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_77]
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866) [hadoop-common-2.7.3.2.6.2.0-205.jar:?]
    at org.apache.hive.service.cli.operation.SQLOperation$BackgroundWork.run(SQLOperation.java:348) [hive-service-2.1.0.2.6.2.0-205.jar:2.1.0.2.6.2.0-205]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_77]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_77]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [?:1.8.0_77]
    at java.util.concurrent.FutureTask.run(FutureTask.java:266) [?:1.8.0_77]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_77]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_77]
    at java.lang.Thread.run(Thread.java:745) [?:1.8.0_77]
{code}

Steps to reproduce:

Create the HBase table:
{code}
create 'hbase_avro_table', 'test_col_fam', 'test_col'
{code}

Create the Hive table:
{code}
CREATE EXTERNAL TABLE test_hbase_avro2
ROW FORMAT SERDE 'org.apache.hadoop.hive.hbase.HBaseSerDe'
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES (
  "hbase.columns.mapping" = ":key,test_col_fam:test_col",
  "test_col_fam.test_col.serialization.type" = "avro",
  "test_col_fam.test_col.avro.schema.url" = "hdfs://rpathak-h1.openstacklocal:8020/user/hive/schema.avsc")
TBLPROPERTIES (
  "hbase.table.name" = "hbase_avro_table",
  "hbase.mapred.output.outputtable" = "hbase_avro_table",
  "hbase.struct.autogenerate" = "true",
  "avro.schema.literal" = '{
    "type": "record",
    "name": "test_hbase_avro",
    "fields": [
      { "name": "test_col", "type": "string" }
    ]
  }');
{code}

The same CREATE TABLE statement succeeds on Hive 1.2.1.
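The top two frames of the trace suggest a length mismatch: BaseStructObjectInspector.init walks the column names derived from "hbase.columns.mapping" (two entries, :key and test_col_fam:test_col), while the parallel list of field object inspectors built for the Avro-serialized column apparently ends up with a single entry, so index 1 is out of bounds. A minimal sketch of that pattern, in plain Java; the list contents are illustrative assumptions, not code taken from Hive:

{code}
import java.util.Arrays;
import java.util.List;

// Minimal sketch of the failure pattern implied by the top two stack frames.
// The list contents are hypothetical; this is not Hive source code.
public class InspectorMismatchSketch {
    public static void main(String[] args) {
        // "hbase.columns.mapping" = ":key,test_col_fam:test_col" yields two columns...
        List<String> columnNames = Arrays.asList("key", "test_col");
        // ...but (hypothetically) only one field object inspector gets built
        // for the Avro-serialized column family.
        List<String> inspectors = Arrays.asList("lazyStructObjectInspector");

        for (int i = 0; i < columnNames.size(); i++) {
            // At i == 1 this throws java.lang.ArrayIndexOutOfBoundsException: 1
            // from java.util.Arrays$ArrayList.get, matching the first frame above.
            System.out.println(columnNames.get(i) + " -> " + inspectors.get(i));
        }
    }
}
{code}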
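As a quick sanity check (not part of the original report), the "avro.schema.literal" value can be parsed with the Avro Java API to rule out a malformed schema. The literal above is a well-formed one-field record, so it should parse cleanly, pointing the problem at how HBaseSerDe pairs the schema with the column mapping rather than at the schema itself:

{code}
import org.apache.avro.Schema;

// Hypothetical sanity check (not from the report): parse the same
// "avro.schema.literal" with the Avro Java API (org.apache.avro:avro
// on the classpath) to confirm the schema is well-formed.
public class SchemaLiteralCheck {
    public static void main(String[] args) {
        String literal = "{ \"type\": \"record\", \"name\": \"test_hbase_avro\","
                + " \"fields\": [ { \"name\": \"test_col\", \"type\": \"string\" } ] }";
        Schema schema = new Schema.Parser().parse(literal);
        // Prints the single parsed field, confirming the literal is valid Avro.
        System.out.println(schema.getFields());
    }
}
{code}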