Subject: Hive/Hbase Integration issue
From: Ibrar Ahmed
To: user@hadoop.apache.org
Date: Wed, 13 May 2015 23:50:38 +0500

Hi,

I am creating a table with Hive and getting this error:

[127.0.0.1:10000] hive> CREATE TABLE hbase_table_1(key int, value string)
                      > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
                      > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
                      > TBLPROPERTIES ("hbase.table.name" = "xyz");

[Hive Error]: Query returned non-zero code: 1, cause: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
    at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:147)
    at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:56)
    at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
    at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:288)
    at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:267)
    at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:139)
    at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:134)
    at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:823)
    at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:601)
    at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:365)
    at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:281)
    at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:291)
    at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:554)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
    at com.sun.proxy.$Proxy7.createTable(Unknown Source)
    at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
    at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4194)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:880)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:870)
    at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
    at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
    at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
)

Any help or clue would be appreciated.

--ibrar
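[Archive note] The trace shows the failure happens while Hive's embedded HBase client scans the `hbase:meta` table (HBaseStorageHandler.preCreateTable -> HBaseAdmin.tableExists), before the table is created. `RetriesExhaustedException: Can't get the locations` at that point usually means the client cannot reach the HBase cluster's ZooKeeper ensemble at all. A hedged sketch of properties worth checking in the Hive session, not a confirmed fix; the host names and port below are placeholders, not values from this report:

```sql
-- Point Hive's HBase client at the same ZooKeeper ensemble HBase uses.
-- Replace the placeholder hosts/port with your cluster's actual values.
SET hbase.zookeeper.quorum=zk1.example.com,zk2.example.com,zk3.example.com;
SET hbase.zookeeper.property.clientPort=2181;

-- Then retry the CREATE TABLE from the original message.
```

The same properties can instead be placed in hive-site.xml (or the HBase client's hbase-site.xml on Hive's classpath) so every session picks them up; a defaulted quorum of `localhost` on a host that is not running ZooKeeper would produce exactly this kind of exhausted-retries failure.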