incubator-hcatalog-user mailing list archives

From 徐冰 <janet_0...@126.com>
Subject Re: Re: A bug in hcatalog?
Date Fri, 11 Jan 2013 03:47:07 GMT
Hi, Mithun.
Thanks for your advice!
I have installed hive-0.9.0 in place of hive-0.8.1 to try it out, and create table, load data,
select, etc. all work normally from the Hive CLI.
But when I use hcat to create a table, I still get the same NPEs.
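For reference, even a trivial DDL run through the hcat CLI fails this way; the table below is just an example, not my real schema:

  hcat -e "create table hcat_test (id int, name string) stored as rcfile;"

The same kind of statement runs fine through the plain hive CLI.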

I didn't change the hive-site.xml conf files in $HIVE_HOME/conf or $HCAT_HOME/etc/hcatalog;
they are the same as before.
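To double-check that the two copies really are identical, they can simply be diffed (file locations assumed from the paths above):

  diff $HIVE_HOME/conf/hive-site.xml $HCAT_HOME/etc/hcatalog/hive-site.xml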

Do you have a compiled hcatalog installation file (hcatalog-0.4.0.tar.gz)? If so, could you
mail it to me? Thanks~

Or do you have any other advice?



Best Regards,

Janet




At 2013-01-11 05:51:39, "Mithun Radhakrishnan" <mithun.radhakrishnan@yahoo.com> wrote:

Hello, Janet.


Unless I'm mistaken, HCatalog 0.4 should really be used with Hive 0.9.0/0.9.1. It looks like
you're trying to get this going with Hive 0.8.1. Would you please run against Hive 0.9 and check
if you're still hitting NPEs? This shouldn't be failing at all.
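One quick sanity check, in case it helps: confirm which Hive jars HCatalog is actually picking up, for example

  ls $HIVE_HOME/lib/hive-exec-*.jar $HIVE_HOME/lib/hive-metastore-*.jar

and make sure the version in the jar names is 0.9.x (paths assumed; adjust to your layout).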


Mithun


From: 徐冰 <janet_0302@126.com>
To: hcatalog-user@incubator.apache.org
Sent: Wednesday, January 9, 2013 6:53 PM
Subject: A bug in hcatalog?




Hi~
Over the past few days, I have been trying to install hcatalog on a hadoop cluster of three nodes.
The environment details are as follows:
Hadoop version: hadoop-0.20.2
Hive version: hive-0.8.1
HCatalog version: hcatalog-0.4.0
Ant version: Apache-ant-1.8.4
Forrest version: Apache-forrest-0.9.0
Thrift version: thrift-0.9.0
MySQL version: Mysql-5.1.47
The installation steps I followed are listed below (MapReduce jobs run correctly on the hadoop cluster):
Step 1: Compile hcatalog-src-0.4.0-incubating into the install file hcatalog-0.4.0.tar.gz.
Step 2: Install thrift and test the installation.
Step 3: Install hive and configure the metastore to use mysql as a local metastore rather than a
remote metastore. Then test by loading data into hive from the local filesystem and selecting it back.
All of these operations work correctly, and the data on HDFS and in mysql are all correct.
Step 4: Untar hcatalog-0.4.0.tar.gz and install the hcatalog-0.4.0 server following the steps
given on the apache website. Then I started the hive server service.
Step 5: Start the hcat server, which also works normally. Then I tried to create a table, but it
failed, so I looked in $HIVE_LOG_DIR/hive.log for the error info (the exact commands I ran are sketched after the log below):
2013-01-09 16:51:01,158 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core"
requires "org.eclipse.core.resources" but it cannot be resolved.
2013-01-09 16:51:01,158 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core"
requires "org.eclipse.core.resources" but it cannot be resolved.
2013-01-09 16:51:01,173 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core"
requires "org.eclipse.core.runtime" but it cannot be resolved.
2013-01-09 16:51:01,173 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core"
requires "org.eclipse.core.runtime" but it cannot be resolved.
2013-01-09 16:51:01,178 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core"
requires "org.eclipse.text" but it cannot be resolved.
2013-01-09 16:51:01,178 ERROR DataNucleus.Plugin (Log4JLogger.java:error(115)) - Bundle "org.eclipse.jdt.core"
requires "org.eclipse.text" but it cannot be resolved.
2013-01-09 16:51:24,243 ERROR ql.Driver (SessionState.java:printError(374)) - FAILED: Error
in semantic analysis: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException
org.apache.hadoop.hive.ql.parse.SemanticException: org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.NullPointerException
 at org.apache.hcatalog.cli.SemanticAnalysis.CreateTableHook.authorize(CreateTableHook.java:210)
 at org.apache.hcatalog.cli.SemanticAnalysis.CreateTableHook.postAnalyze(CreateTableHook.java:168)
 at org.apache.hcatalog.cli.SemanticAnalysis.HCatSemanticAnalyzer.postAnalyze(HCatSemanticAnalyzer.java:185)
 at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:379)
 at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:335)
 at org.apache.hadoop.hive.ql.Driver.run(Driver.java:844)
 at org.apache.hcatalog.cli.HCatDriver.run(HCatDriver.java:42)
 at org.apache.hcatalog.cli.HCatCli.processCmd(HCatCli.java:225)
 at org.apache.hcatalog.cli.HCatCli.processLine(HCatCli.java:181)
 at org.apache.hcatalog.cli.HCatCli.main(HCatCli.java:145)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
 at java.lang.reflect.Method.invoke(Method.java:611)
 at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException
 at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1043)
 at org.apache.hcatalog.cli.SemanticAnalysis.CreateTableHook.authorize(CreateTableHook.java:195)
 ... 14 more
Caused by: java.lang.NullPointerException
 at java.lang.Class.forName(Class.java:172)
 at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getClass(HiveMetaStore.java:503)
 at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:385)
 at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.executeWithRetry(HiveMetaStore.java:337)
 at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:482)
 at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:250)
 at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.<init>(HiveMetaStore.java:213)
 at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:110)
 at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2010)
 at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2020)
 at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1039)
 ... 15 more
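For reference, the commands behind steps 3 and 5 were roughly the following (the table names and the data file are just examples, not my real ones):

  # step 3: sanity check of plain hive against the mysql metastore -- these all succeed
  hive -e "create table hive_test (id int, name string) row format delimited fields terminated by ',';"
  hive -e "load data local inpath '/tmp/test.csv' into table hive_test;"
  hive -e "select * from hive_test;"

  # step 5: the same kind of DDL through the hcat CLI is what throws the NPE above
  hcat -e "create table hcat_test (id int, name string) stored as rcfile;"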
 
Here is the hive-site.xml in $HCAT_CONF_DIR/etc/hcatalog/conf:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at
       http://www.apache.org/licenses/LICENSE-2.0
   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->
<configuration>
<property>
  <name>hive.metastore.local</name>
  <value>true</value>
  <description>controls whether to connect to remove metastore server or open a new
metastore server in Hive Client JVM</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://127.0.0.1:3306/metastore</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>root</value>
  <description>username to use against metastore database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>passw0rd</value>
  <description>password to use against metastore database</description>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
  <description>location of default database for the warehouse</description>
</property>
<property>
 <name>hive.semantic.analyzer.factory.Impl</name>
 <value>org.apache.hcatalog.cli.HCatSemanticAnalyzerFactory</value>
</property>
</configuration>
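For completeness, the mysql database referenced by the ConnectionURL above was created beforehand, roughly like this (the exact statement is reproduced from memory):

  mysql -u root -p -e "create database if not exists metastore;"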

Is this a bug in hcatalog-0.4.0?
Or has anyone else met the same problem?
Please contact me if you have an answer or advice on how to fix it. Thanks a lot~
--

Best Regards,
Name: Janet
E-mail: janet_0302@126.com

--

Best Regards,
Name: 徐冰
Tel: 13691355975
E-mail: janet_0302@126.com or xubingbj@cn.ibm.com