hive-user mailing list archives

From Divya Gehlot <divya.htco...@gmail.com>
Subject Re: hive need access the hdfs of hbase?
Date Thu, 17 Mar 2016 08:46:00 GMT
Hi,
Please check your zookeeper.znode.parent property.
Where is it pointing?
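For reference, these client-side settings can go in hive-site.xml (or be set per-session with `set` in the hive shell). A minimal sketch with placeholder values; `/hbase` is HBase's default znode parent, but some distributions use a different path such as `/hbase-unsecure`, so verify against the HBase cluster's own config:

```xml
<!-- hive-site.xml (placeholder values; verify against the HBase cluster) -->
<property>
  <name>hbase.zookeeper.quorum</name>
  <value>10.24.19.88</value>
</property>
<property>
  <name>zookeeper.znode.parent</name>
  <value>/hbase</value>
</property>
```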

On 17 March 2016 at 15:21, songj songj <songjun435@gmail.com> wrote:

> hi all:
> I have 2 clusters: one is a Hive cluster (2.0.0), the other an HBase
> cluster (1.1.1).
> These two clusters have independent HDFS:
>
> hive cluster:
> <property>
>    <name>fs.defaultFS</name>
>    <value>hdfs://*hive-cluster*</value>
> </property>
>
> hbase cluster:
> <property>
>    <name>fs.defaultFS</name>
>    <value>hdfs://*hbase-cluster*</value>
> </property>
>
> *1)* but when I use the hive shell to access the hbase cluster:
> > set hbase.zookeeper.quorum=10.24.19.88;
> > CREATE EXTERNAL TABLE IF NOT EXISTS pagecounts_hbase (
> >   rowkey STRING, pageviews STRING, bytes STRING)
> > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> > WITH SERDEPROPERTIES ('hbase.columns.mapping' = ':key,cf:c1,cf:c2')
> > TBLPROPERTIES ('hbase.table.name' = 'test');
>
> *2)* then I got this exception:
>
> FAILED: Execution Error, return code 1 from
> org.apache.hadoop.hive.ql.exec.DDLTask.
> MetaException(message:MetaException(message:java.io.IOException:
> java.lang.reflect.InvocationTargetException
>      at
> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:240)
>      at
> org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:420)
>      at
> org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:413)
>      at
> org.apache.hadoop.hbase.client.ConnectionManager.getConnectionInternal(ConnectionManager.java:291)
>      at
> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:222)
>      at
> org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:102)
>      at
> org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:182)
>      at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:608)
>      at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:601)
>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>      at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>      at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>      at java.lang.reflect.Method.invoke(Method.java:606)
>      at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:90)
>      at com.sun.proxy.$Proxy15.createTable(Unknown Source)
>      at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:671)
>      at
> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3973)
>      at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:295)
>      at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>      at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>      at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
>      at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
>      at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
>      at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
>      at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
>      at
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:201)
>      at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:153)
>      at
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:364)
>      at
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:712)
>      at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:631)
>      at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:570)
>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>      at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>      at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>      at java.lang.reflect.Method.invoke(Method.java:606)
>      at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>      at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: java.lang.reflect.InvocationTargetException
>      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native
> Method)
>      at
> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>      at
> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>      at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>      at
> org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
>      ... 36 more
> Caused by: java.lang.ExceptionInInitializerError
>      at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
>      at
> org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
>      at
> org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
>      at
> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:879)
>      at
> org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:635)
>      ... 41 more
> Caused by: java.lang.IllegalArgumentException:
> java.net.UnknownHostException: hbase-cluster
>      at
> org.apache.hadoop.security.SecurityUtil.buildTokenService(SecurityUtil.java:374)
>      at
> org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:312)
>      at
> org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:178)
>      at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:665)
>      at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:601)
>      at
> org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:148)
>      at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2596)
>      at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:91)
>      at
> org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2630)
>      at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2612)
>      at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:370)
>      at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
>      at
> org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:104)
>      at
> org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:238)
>      ... 46 more
> *Caused by: java.net.UnknownHostException: hbase-cluster*
>      ... 60 more
> )
>      at
> org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:106)
>      at
> org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:182)
>      at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:608)
>      at
> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:601)
>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>      at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>      at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>      at java.lang.reflect.Method.invoke(Method.java:606)
>      at
> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:90)
>      at com.sun.proxy.$Proxy15.createTable(Unknown Source)
>      at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:671)
>      at
> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3973)
>      at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:295)
>      at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
>      at
> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:85)
>      at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1604)
>      at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1364)
>      at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1177)
>      at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1004)
>      at org.apache.hadoop.hive.ql.Driver.run(Driver.java:994)
>      at
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:201)
>      at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:153)
>      at
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:364)
>      at
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:712)
>      at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:631)
>      at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:570)
>      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>      at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>      at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>      at java.lang.reflect.Method.invoke(Method.java:606)
>      at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>      at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
>
> *3)* then I modified the hosts file on the Hive machine (*the IP is the
> Hive machine's*):
> echo '10.24.19.32 hbase-cluster' >> /etc/hosts
>
> then everything is ok!
>
> *or* echo '10.24.19.88 hbase-cluster' >> /etc/hosts (*the IP is the HBase
> machine's*)
>
> it is also ok!
>
> *So, why does the Hive cluster need to resolve the host 'hbase-cluster',*
> *which is the HDFS URI of the HBase cluster?*
> *And why does it work whether I bind 'hbase-cluster' to the Hive machine's*
> *IP or the HBase machine's IP?*
>
>
>
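[Editor's note] For readers hitting the same UnknownHostException: the stack trace shows the Hive-side HBase client opening a FileSystem against `hdfs://hbase-cluster`, and since that nameservice is not defined on the Hive side, Hadoop falls back to treating "hbase-cluster" as a hostname. One common alternative to editing /etc/hosts is to make the remote nameservice known in the Hive side's hdfs-site.xml. A hedged sketch, assuming the HBase cluster's HDFS is an HA nameservice named hbase-cluster; the NameNode IDs, addresses, and ports below are hypothetical placeholders, not taken from this thread:

```xml
<!-- hdfs-site.xml on the Hive side (all values hypothetical) -->
<property>
  <name>dfs.nameservices</name>
  <value>hive-cluster,hbase-cluster</value>
</property>
<property>
  <name>dfs.ha.namenodes.hbase-cluster</name>
  <value>nn1,nn2</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.hbase-cluster.nn1</name>
  <value>10.24.19.88:8020</value>
</property>
<property>
  <name>dfs.namenode.rpc-address.hbase-cluster.nn2</name>
  <value>10.24.19.89:8020</value>
</property>
<property>
  <name>dfs.client.failover.proxy.provider.hbase-cluster</name>
  <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
</property>
```

Unlike a hosts-file entry, this lets the client resolve the actual NameNodes behind the nameservice rather than pinning the logical name to a single IP.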
