hadoop-user mailing list archives

From: Ibrar Ahmed <ibrar.ah...@gmail.com>
Subject: Re: Hive/Hbase Integration issue
Date: Wed, 13 May 2015 19:27:32 GMT
On Thu, May 14, 2015 at 12:21 AM, Ted Yu <yuzhihong@gmail.com> wrote:

> Is hbase-site.xml on the classpath ?

Yes, it is on the classpath.
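
One quick way to confirm which configuration the Hive side actually picks up is a small standalone check, run with the same classpath HiveServer uses (the class name CheckHBaseConf below is made up for illustration). HBaseConfiguration.create() layers hbase-site.xml from the classpath over hbase-default.xml, so if hbase.zookeeper.quorum comes back as the default "localhost" while your quorum lives elsewhere, the file is not really being seen:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    public class CheckHBaseConf {
        public static void main(String[] args) {
            // Loads hbase-default.xml and, if present on the classpath,
            // hbase-site.xml on top of it.
            Configuration conf = HBaseConfiguration.create();
            // Default is "localhost"; anything else means hbase-site.xml
            // (or an equivalent override) was actually found.
            System.out.println("hbase.zookeeper.quorum = " + conf.get("hbase.zookeeper.quorum"));
            System.out.println("zookeeper.znode.parent = " + conf.get("zookeeper.znode.parent"));
        }
    }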

> BTW please use hbase mailing list for hbase specific questions.
>

OK, just did that :)

>
> Cheers
>
> On Wed, May 13, 2015 at 11:50 AM, Ibrar Ahmed <ibrar.ahmad@gmail.com>
> wrote:
>
>> Hi,
>>
>> I am creating a table using Hive and getting the error below.
>>
>> [127.0.0.1:10000] hive> CREATE TABLE hbase_table_1(key int, value string)
>>                       > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>>                       > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
>>                       > TBLPROPERTIES ("hbase.table.name" = "xyz");
>>
>>
>>
>> [Hive Error]: Query returned non-zero code: 1, cause: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask.
>> MetaException(message:org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
>>     at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:305)
>>     at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:147)
>>     at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:56)
>>     at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
>>     at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:288)
>>     at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:267)
>>     at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:139)
>>     at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:134)
>>     at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:823)
>>     at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:601)
>>     at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:365)
>>     at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:281)
>>     at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:291)
>>     at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:162)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:554)
>>     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:547)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:89)
>>     at com.sun.proxy.$Proxy7.createTable(Unknown Source)
>>     at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:613)
>>     at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4194)
>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:281)
>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:153)
>>     at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1472)
>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1239)
>>     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1057)
>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:880)
>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:870)
>>     at org.apache.hadoop.hive.service.HiveServer$HiveServerHandler.execute(HiveServer.java:198)
>>     at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:644)
>>     at org.apache.hadoop.hive.service.ThriftHive$Processor$execute.getResult(ThriftHive.java:628)
>>     at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
>>     at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
>>     at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:206)
>>     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>>     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>>     at java.lang.Thread.run(Thread.java:745)
>> )
>>
>>
>> Any help or clue would be appreciated.
>>
>> --ibrar
>>
>>
>
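
Since the trace bottoms out in HBaseAdmin.tableExists, it can also help to reproduce just that call outside Hive. Below is a minimal sketch, assuming an HBase 1.0-era client (ConnectionFactory/Admin API) and the same hbase-site.xml on the classpath; "xyz" is the table name from the TBLPROPERTIES above, and the class name HBaseConnectivityTest is made up here. If this fails with the same "RetriesExhaustedException: Can't get the locations", the problem is in the HBase client configuration or ZooKeeper connectivity rather than in Hive itself:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Admin;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;

    public class HBaseConnectivityTest {
        public static void main(String[] args) throws Exception {
            // Picks up hbase-site.xml from the classpath, much as the
            // Hive HBase storage handler does.
            Configuration conf = HBaseConfiguration.create();
            try (Connection connection = ConnectionFactory.createConnection(conf);
                 Admin admin = connection.getAdmin()) {
                // Same check the stack trace ends in: does the target table exist?
                System.out.println("xyz exists: " + admin.tableExists(TableName.valueOf("xyz")));
            }
        }
    }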


-- 
Ibrar Ahmed
