hive-user mailing list archives

From Alok Kumar <alok...@gmail.com>
Subject Re: Hive-Hbase integration require Hbase in Pseudo distributed??
Date Fri, 02 Dec 2011 12:24:27 GMT
Hi,

Yes, I've used:

$HIVE_HOME/bin/hive --auxpath
$HIVE_HOME/lib/hive-hbase-handler-*.jar,$HIVE_HOME/lib/hbase-*.jar,$HIVE_HOME/lib/zookeeper-*.jar
-hiveconf hbase.master=localhost:60000
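
(One caveat that comes up later in this thread: HMaster can appear in `jps` output while the master is not actually serving. A minimal sketch of checking for it before launching the Hive shell, using canned `jps` output so the snippet stands alone; in a real session you would capture the live output instead:)

```shell
# Sketch: HMaster showed up in jps in this thread even though the master was
# not actually running, so grep for it explicitly as a first sanity check.
# Canned output for illustration; in practice use: jps_output="$(jps)"
jps_output="3244 HMaster
4128 JobTracker
3841 DataNode"

if printf '%s\n' "$jps_output" | grep -q 'HMaster'; then
  echo "HMaster listed"
else
  echo "HMaster missing"
fi
```

(This only confirms the JVM exists; `status` in the HBase shell is still the real test, as Ankit suggests below.)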
------------------------------------------------
Hadoop version : hadoop-0.20.203.0
Hbase version : hbase-0.90.4
Hive version : hive-0.9.0 (built from trunk)
on
Ubuntu 11.10
-----------------------------------------------
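
(For anyone finding this thread later: if you do want HBase running pseudo-distributed on the HDFS from the core-site.xml quoted below, the hbase-site.xml would look roughly like this. This is a sketch, not a verified config: the property names are the standard HBase 0.90 ones, and the localhost values are assumptions matching the `hdfs://localhost:9000` namenode in this thread.)

```xml
<!-- Sketch of hbase-site.xml for pseudo-distributed HBase on local HDFS.
     Values assume the hdfs://localhost:9000 namenode quoted in this thread. -->
<configuration>
    <property>
        <name>hbase.rootdir</name>
        <value>hdfs://localhost:9000/hbase</value>
    </property>
    <property>
        <name>hbase.cluster.distributed</name>
        <value>true</value>
    </property>
    <property>
        <name>hbase.zookeeper.quorum</name>
        <value>localhost</value>
    </property>
</configuration>
```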

Regards,

Alok

On Fri, Dec 2, 2011 at 5:49 PM, Ankit Jain <ankitjaincs06@gmail.com> wrote:

> Hi,
>
> Have you used the following command to start the Hive shell?
>
> $HIVE_HOME/bin/hive --auxpath $HIVE_HOME/lib/hive-hbase-handler-*.jar,$HIVE_HOME/lib/hbase-*.jar,$HIVE_HOME/lib/zookeeper-*.jar
> -hiveconf hbase.master=127.0.0.1:60000
>
> If not, then use the above command.
> Regards,
> Ankit
>
>
> On Fri, Dec 2, 2011 at 5:34 PM, Alok Kumar <alokawi@gmail.com> wrote:
>
>> Hi,
>>
>> // Hadoop core-site.xml
>> <configuration>
>>     <property>
>>         <name>fs.default.name</name>
>>         <value>hdfs://localhost:9000</value>
>>     </property>
>>     <property>
>>         <name>hadoop.tmp.dir</name>
>>         <value>/home/alokkumar/hadoop/tmp</value>
>>     </property>
>>  </configuration>
>>
>> // hbase-site.xml
>> <configuration>
>>     <property>
>>             <name>hbase.rootdir</name>
>>        <!--    <value>hdfs://localhost:9000/hbase</value>-->
>>         <value>file:///home/alokkumar/hbase/</value>
>>     </property>
>> </configuration>
>>
>> With these configs, HBase and Hive are running fine independently:
>> hbase(main):003:0> status
>> 1 servers, 0 dead, 4.0000 average load
>>
>> but I'm still getting this:
>>
>> hive> CREATE TABLE hbase_table_1(key int, value
>> string)
>>     > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>>     > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val");
>> FAILED: Error in metadata:
>> MetaException(message:org.apache.hadoop.hbase.MasterNotRunningException:
>> localhost:45966
>>
>>     at
>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:394)
>>     at
>> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:83)
>>
>> Regards,
>> Alok
>>
>>
>> On Fri, Dec 2, 2011 at 5:14 PM, Ankit Jain <ankitjaincs06@gmail.com> wrote:
>>
>>> HI,
>>>
>>> Can you post the Hbase-site.xml and hadoop core-site.xml properties here.
>>>
>>> Regards,
>>> Ankit
>>>
>>>
>>> On Fri, Dec 2, 2011 at 3:30 PM, Alok Kumar <alokawi@gmail.com> wrote:
>>>
>>>> Hi Ankit,
>>>>
>>>> You were right: my HBase shell/HMaster was not running, though it was
>>>> showing up in jps :)
>>>>
>>>> Now I've started HMaster and the HBase shell is up, but I'm getting the
>>>> error below. Do I need ZooKeeper configured in standalone mode?
>>>>
>>>> hive> CREATE TABLE hbase_table_1(key int, value string)
>>>>     > STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
>>>>     > WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
>>>>     > TBLPROPERTIES ("hbase.table.name" = "xyz");
>>>>
>>>> FAILED: Error in metadata:
>>>> MetaException(message:org.apache.hadoop.hbase.ZooKeeperConnectionException:
>>>> org.apache.hadoop.hbase.ZooKeeperConnectionException:
>>>> org.apache.zookeeper.KeeperException$ConnectionLossException:
>>>> KeeperErrorCode = ConnectionLoss for /hbase
>>>>     at
>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getZooKeeperWatcher(HConnectionManager.java:985)
>>>>     at
>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.setupZookeeperTrackers(HConnectionManager.java:301)
>>>>     at
>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:292)
>>>>     at
>>>> org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:155)
>>>>     at
>>>> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:79)
>>>>
>>>>     at
>>>> org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:74)
>>>>     at
>>>> org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:158)
>>>>     at
>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:375)
>>>>     at
>>>> org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:540)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3473)
>>>>     at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:225)
>>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
>>>>     at
>>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1332)
>>>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1123)
>>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
>>>>     at
>>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:255)
>>>>     at
>>>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:212)
>>>>     at
>>>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>>>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:671)
>>>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:554)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>     at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>     at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>     at java.lang.reflect.Method.invoke(Method.java:597)
>>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>>>> Caused by: org.apache.hadoop.hbase.ZooKeeperConnectionException:
>>>> org.apache.zookeeper.KeeperException$ConnectionLossException:
>>>> KeeperErrorCode = ConnectionLoss for /hbase
>>>>     at
>>>> org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.<init>(ZooKeeperWatcher.java:147)
>>>>     at
>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getZooKeeperWatcher(HConnectionManager.java:983)
>>>>     ... 25 more
>>>> Caused by:
>>>> org.apache.zookeeper.KeeperException$ConnectionLossException:
>>>> KeeperErrorCode = ConnectionLoss for /hbase
>>>>     at
>>>> org.apache.zookeeper.KeeperException.create(KeeperException.java:90)
>>>>     at
>>>> org.apache.zookeeper.KeeperException.create(KeeperException.java:42)
>>>>     at org.apache.zookeeper.ZooKeeper.create(ZooKeeper.java:637)
>>>>     at
>>>> org.apache.hadoop.hbase.zookeeper.ZKUtil.createAndFailSilent(ZKUtil.java:886)
>>>>     at
>>>> org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.<init>(ZooKeeperWatcher.java:133)
>>>>     ... 26 more
>>>>
>>>> )
>>>> FAILED: Execution Error, return code 1 from
>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>
>>>>
>>>> On Fri, Dec 2, 2011 at 2:03 PM, Ankit Jain <ankitjaincs06@gmail.com> wrote:
>>>>
>>>>> I think your HBase master is not running.
>>>>>
>>>>> Open the HBase shell and run the command:
>>>>> hbase> status
>>>>>
>>>>>
>>>>>
>>>>> On Fri, Dec 2, 2011 at 2:00 PM, Alok Kumar <alokawi@gmail.com> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> Does Hive-Hbase integration require Hbase running in
>>>>>> pseudo-distributed mode?
>>>>>>
>>>>>> I've built my Hadoop following this article:
>>>>>> http://www.michael-noll.com/blog/2011/04/14/building-an-hadoop-0-20-x-version-for-hbase-0-90-2/
>>>>>> and have already replaced the HBase jar files accordingly.
>>>>>>
>>>>>> I'm getting this error:
>>>>>>
>>>>>> hive> !jps;
>>>>>> 5469 Jps
>>>>>> 4128 JobTracker
>>>>>> 3371 Main
>>>>>> 4346 TaskTracker
>>>>>> 5330 RunJar
>>>>>> 4059 SecondaryNameNode
>>>>>> 8350 NameNode
>>>>>> 3841 DataNode
>>>>>> 3244 HMaster
>>>>>> hive> create table hbase_table_1(key int, value string) stored by
>>>>>> 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' with
>>>>>> serdeproperties("hbase.columns.mapping" = ":key,cf1:val")
>>>>>> tblproperties ("hbase.table.name" = "xyz");
>>>>>>
>>>>>> FAILED: Error in metadata:
>>>>>> MetaException(message:org.apache.hadoop.hbase.MasterNotRunningException:
>>>>>> localhost:56848
>>>>>>     at
>>>>>> org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:394)
>>>>>>     at
>>>>>> org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:83)
>>>>>>     at
>>>>>> org.apache.hadoop.hive.hbase.HBaseStorageHandler.getHBaseAdmin(HBaseStorageHandler.java:74)
>>>>>>     at
>>>>>> org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:158)
>>>>>>     at
>>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:375)
>>>>>>     at
>>>>>> org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:540)
>>>>>>     at
>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:3473)
>>>>>>     at
>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:225)
>>>>>>     at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:133)
>>>>>>     at
>>>>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>>>>>     at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1332)
>>>>>>     at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1123)
>>>>>>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
>>>>>>     at
>>>>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:255)
>>>>>>     at
>>>>>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:212)
>>>>>>     at
>>>>>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
>>>>>>     at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:671)
>>>>>>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:554)
>>>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>>     at
>>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>>     at
>>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>>     at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>>>>>> )
>>>>>> FAILED: Execution Error, return code 1 from
>>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>>>
>>>>>> Where should I look to fix "*message:org.apache.hadoop.hbase.MasterNotRunningException:
>>>>>> localhost:56848*"?
>>>>>>
>>>>>> --
>>>>>> regards
>>>>>> Alok Kumar
>>>>>>
>>>>>>
>>>>>>
>>>> --
>>>> Alok
>>>>
>>>>
>>>>
>>>
>>
>>
>


-- 
Alok Kumar
