hive-user mailing list archives

From demian rosas <demia...@gmail.com>
Subject Re: Problem when trying to connect to hive server using jdbc
Date Wed, 03 Apr 2013 01:48:43 GMT
My linux version is

Linux myhost 2.6.18-308.4.1.0.1.el5xen #1 SMP Tue Apr 17 16:41:30 EDT 2012
x86_64 x86_64 x86_64 GNU/Linux

which is a version of Oracle Linux that is supposed to be Red Hat
compatible.

> echo $HADOOP_MAPRED_HOME
/usr/lib/hadoop-0.20-mapreduce

> echo $HIVE_HOME
/usr/lib/hive/

These variables seem to be fine.

Just a little bit more context:

I am trying to set up a Hive installation using a remote MySQL metastore. I am
using CDH4.2 on a fresh installation. All of this is on a single machine, so I
am using Hadoop in pseudo-distributed mode.

So far Hadoop is working fine. MySQL is working and I can connect to it
using JDBC from a Java application. When I installed Hive for the first
time it was using the embedded metastore, and I was able to define an external
table pointing to an HDFS location and query the data with a Hive query.
Up to that point everything was fine.
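
The MySQL check is basically something like the following minimal sketch (the
class name is just for illustration; the driver class, URL and credentials are
the same ones shown in the hive-site.xml below):

import java.sql.Connection;
import java.sql.DriverManager;

public class MySqlCheck {
  public static void main(String[] args) throws Exception {
    // Load the MySQL JDBC driver named in hive-site.xml
    Class.forName("com.mysql.jdbc.Driver");
    // Open and close a connection with the same URL/user/password as the metastore config
    Connection conn = DriverManager.getConnection(
        "jdbc:mysql://localhost/metastore", "myuser", "mypassword");
    System.out.println("connected: " + !conn.isClosed());
    conn.close();
  }
}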

The problems started when I tried to set up the remote MySQL metastore. I have
followed the instructions provided here:
https://ccp.cloudera.com/display/CDH4DOC/Hive+Installation#HiveInstallation-ConfiguringtheHiveMetastore

Right now I am using the hive-server 1 configuration. These are the properties
in my hive-site.xml file:

<property>
  <name>javax.jdo.option.ConnectionURL</name>

<value>jdbc:mysql://localhost/metastore?createDatabaseIfNotExist=true</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>

<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>myuser</value>
</property>

<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>mypassword</value>
</property>

<property>
  <name>datanucleus.autoCreateSchema</name>
  <value>false</value>
</property>

<property>
  <name>datanucleus.fixedDatastore</name>
  <value>true</value>
</property>

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://my_ipaddress:3306</value>
  <description>IP address (or fully-qualified domain name) and port of the
metastore host</description>
</property>

<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
</property>

<property>
  <name>hive.server.thrift.port</name>
   <value>10000</value>
</property>
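
(For comparison: the hive.metastore.uris value above points at port 3306, which
is the MySQL port. The suggestion earlier in this thread points it at the
metastore Thrift service port instead, typically 9083, along these lines:)

<property>
  <name>hive.metastore.uris</name>
  <value>thrift://my_ipaddress:9083</value>
  <description>IP address (or fully-qualified domain name) and port of the
metastore host</description>
</property>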






On 2 April 2013 18:36, Sanjay Subramanian <
Sanjay.Subramanian@wizecommerce.com> wrote:

>  I just modified the hive batch file to echo some variables. These are
> what the variables are set to when I execute /usr/lib/hive/bin/hive
>
>  HADOOP=/usr/bin/../bin/hadoop
> HADOOP_HOME=/usr/bin/..
> HADOOP_VERSION=2.0.0-cdh4.1.2
>
>  R these variables set correctly for u  ?
> HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce
> HIVE_HOME=/usr/lib/hive
>
>  Which linux  are u running ?
>
>   From: demian rosas <demianrh@gmail.com>
> Reply-To: "user@hive.apache.org" <user@hive.apache.org>
> Date: Tuesday, April 2, 2013 6:18 PM
>
> To: "user@hive.apache.org" <user@hive.apache.org>
> Subject: Re: Problem when trying to connect to hive server using jdbc
>
>   The content of the indicated file around line 179 is
>
>  # Save the regex to a var to workaround quoting incompatabilities
> # between Bash 3.1 and 3.2
> hadoop_version_re="^([[:digit:]]+)\.([[:digit:]]+)(\.([[:digit:]]+))?.*$"
>
>  if [[ "$HADOOP_VERSION" =~ $hadoop_version_re ]]; then
> #-------------------------------------------------------------------------------------->
> This is line 179
>     hadoop_major_ver=${BASH_REMATCH[1]}
>     hadoop_minor_ver=${BASH_REMATCH[2]}
>     hadoop_patch_ver=${BASH_REMATCH[4]}
> else
>     echo "Unable to determine Hadoop version information."
>     echo "'hadoop version' returned:"
>     echo `$HADOOP version`
>     exit 5
> fi
>
>  I do not see anything particularly wrong there.
>
>  What do you think?
>
>
>
>
>
>
>
> On 2 April 2013 18:09, Sanjay Subramanian <
> Sanjay.Subramanian@wizecommerce.com> wrote:
>
>>  In the hive batch file
>> There seem to be some errors around line 179…can u verify
>> sanjay
>>
>>   From: demian rosas <demianrh@gmail.com>
>> Reply-To: "user@hive.apache.org" <user@hive.apache.org>
>>  Date: Tuesday, April 2, 2013 5:56 PM
>>
>> To: "user@hive.apache.org" <user@hive.apache.org>
>> Subject: Re: Problem when trying to connect to hive server using jdbc
>>
>>   Yes,
>>
>>  I restart all Hadoop/Hive services every time I change something.
>>
>>  The content of my /var/log/hive/hive-metastore.log file is
>>
>>
>> -------------------------------------------------------------------------------------------------------------------------------------------------------
>>  Exception in thread "main" java.lang.NoClassDefFoundError:
>> org/apache/hadoop/util/VersionInfo
>> Caused by: java.lang.ClassNotFoundException:
>> org.apache.hadoop.util.VersionInfo
>> at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>> Could not find the main class: org.apache.hadoop.util.VersionInfo.
>> Program will exit.
>> /usr/lib/hive/bin/hive: line 179: conditional binary operator expected
>> /usr/lib/hive/bin/hive: line 179: syntax error near unexpected token `=~'
>> /usr/lib/hive/bin/hive: line 179: `if [[ "$HADOOP_VERSION" =~
>> $hadoop_version_re ]]; then'
>>
>> -------------------------------------------------------------------------------------------------------------------------------------------------------
>>
>>   /var/log/hive/hive-server.log and /var/log/hive/hive-server2.log have
>> the same content. But I think these may be outdated logs.
>>
>>  ll /var/log/hive/
>> total 24
>> 4 drwxr-xr-x  2 hive 4096 Feb 15 12:16 ./
>> 8 drwxr-xr-x 41 root 4096 Apr  1 21:02 ../
>> 4 -rw-r--r--  1 hive  863 Apr  2 17:06 hive-metastore.log
>> 4 -rw-r--r--  1 hive  863 Mar 28 14:01 hive-server.log
>> 4 -rw-r--r--  1 hive  863 Mar 29 19:10 hive-server2.log
>>
>>
>>  I suppose there must be others containing recent operations.
>>
>>
>>
>> On 2 April 2013 17:38, Sanjay Subramanian <
>> Sanjay.Subramanian@wizecommerce.com> wrote:
>>
>>>  I assume u restarted all services
>>> Sequence:
>>> Stop: hive-server2; hive-server1; hive-metastore
>>> Start: hive-metastore; hive-server1; hive-server2
>>>
>>>  We should check /var/log/hive/*.log for any startup errors…we need
>>> to fix those first
>>>
>>>
>>>   From: demian rosas <demianrh@gmail.com>
>>> Reply-To: "user@hive.apache.org" <user@hive.apache.org>
>>>  Date: Tuesday, April 2, 2013 5:30 PM
>>>
>>> To: "user@hive.apache.org" <user@hive.apache.org>
>>> Subject: Re: Problem when trying to connect to hive server using jdbc
>>>
>>>   By the way, I have hive-metastore and hive-server services running
>>>
>>>
>>> On 2 April 2013 17:26, demian rosas <demianrh@gmail.com> wrote:
>>>
>>>> Thanks for the hints.
>>>>
>>>>  I have tried everything suggested.
>>>>
>>>>  Configured hive-server 1, got mysql connector 5.1.22, set hive.server.thrift.port
>>>> in hive-site.xml, and have also double-checked the CLASSPATH.
>>>>
>>>>  My MySQL server is on the same machine as my Hadoop/Hive
>>>> installation.
>>>>
>>>>  I can connect to mysql using jdbc with the credentials set in
>>>> hive-site.xml.
>>>>
>>>>  I fixed hive.metastore.uris to contain the IP address of my machine
>>>> instead of "localhost". Now when I run "show tables" with debug messages in
>>>> the hive console I get this:
>>>>
>>>>
>>>> ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>>>>  hive> show tables;
>>>> 13/04/02 17:10:25 INFO ql.Driver: <PERFLOG method=Driver.run>
>>>> 13/04/02 17:10:25 INFO ql.Driver: <PERFLOG method=TimeToSubmit>
>>>> 13/04/02 17:10:25 INFO ql.Driver: <PERFLOG method=compile>
>>>> 13/04/02 17:10:25 INFO parse.ParseDriver: Parsing command: show tables
>>>> 13/04/02 17:10:25 INFO parse.ParseDriver: Parse Completed
>>>> 13/04/02 17:10:26 INFO ql.Driver: Semantic Analysis Completed
>>>> 13/04/02 17:10:26 INFO exec.ListSinkOperator: Initializing Self 0 OP
>>>> 13/04/02 17:10:26 INFO exec.ListSinkOperator: Operator 0 OP initialized
>>>> 13/04/02 17:10:26 INFO exec.ListSinkOperator: Initialization Done 0 OP
>>>> 13/04/02 17:10:26 INFO ql.Driver: Returning Hive schema:
>>>> Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from
>>>> deserializer)], properties:null)
>>>> 13/04/02 17:10:26 INFO ql.Driver: </PERFLOG method=compile
>>>> start=1364947825805 end=1364947826183 duration=378>
>>>> 13/04/02 17:10:26 INFO ql.Driver: <PERFLOG method=Driver.execute>
>>>> 13/04/02 17:10:26 INFO ql.Driver: Starting command: show tables
>>>> 13/04/02 17:10:26 INFO ql.Driver: </PERFLOG method=TimeToSubmit
>>>> start=1364947825805 end=1364947826197 duration=392>
>>>> 13/04/02 17:10:26 INFO hive.metastore: Trying to connect to metastore
>>>> with URI thrift://10.240.81.72:3306
>>>> 13/04/02 17:10:26 INFO hive.metastore: Waiting 1 seconds before next
>>>> connection attempt.
>>>> 13/04/02 17:10:27 INFO hive.metastore: Connected to metastore.
>>>> 13/04/02 17:10:28 WARN metastore.RetryingMetaStoreClient:
>>>> MetaStoreClient lost connection. Attempting to reconnect.
>>>> org.apache.thrift.transport.TTransportException
>>>>         at
>>>> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
>>>>         at
>>>> org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
>>>>         at
>>>> org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:354)
>>>>         at
>>>> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
>>>>         at
>>>> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
>>>>         at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:412)
>>>>         at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:399)
>>>>         at
>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:736)
>>>>          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>         at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>          at
>>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
>>>>         at $Proxy9.getDatabase(Unknown Source)
>>>>         at
>>>> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
>>>>          at
>>>> org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
>>>>         at
>>>> org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
>>>>         at
>>>> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
>>>>         at
>>>> org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
>>>>         at
>>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>>>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
>>>>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
>>>>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
>>>>         at
>>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>>>>         at
>>>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
>>>>         at
>>>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
>>>>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
>>>>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>         at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>>>>  13/04/02 17:10:29 INFO hive.metastore: Trying to connect to metastore
>>>> with URI thrift://10.240.81.72:3306
>>>> 13/04/02 17:10:29 INFO hive.metastore: Waiting 1 seconds before next
>>>> connection attempt.
>>>> 13/04/02 17:10:30 INFO hive.metastore: Connected to metastore.
>>>> FAILED: Error in metadata:
>>>> org.apache.thrift.transport.TTransportException
>>>> 13/04/02 17:10:31 ERROR exec.Task: FAILED: Error in metadata:
>>>> org.apache.thrift.transport.TTransportException
>>>> org.apache.hadoop.hive.ql.metadata.HiveException:
>>>> org.apache.thrift.transport.TTransportException
>>>>          at
>>>> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1118)
>>>>         at
>>>> org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
>>>>         at
>>>> org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
>>>>         at
>>>> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
>>>>         at
>>>> org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
>>>>         at
>>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>>>         at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
>>>>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
>>>>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
>>>>         at
>>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>>>>         at
>>>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
>>>>         at
>>>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
>>>>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
>>>>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>         at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>>>>  Caused by: org.apache.thrift.transport.TTransportException
>>>>         at
>>>> org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
>>>>         at
>>>> org.apache.thrift.transport.TTransport.readAll(TTransport.java:84)
>>>>         at
>>>> org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:354)
>>>>         at
>>>> org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
>>>>         at
>>>> org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
>>>>         at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_get_database(ThriftHiveMetastore.java:412)
>>>>         at
>>>> org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.get_database(ThriftHiveMetastore.java:399)
>>>>         at
>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.getDatabase(HiveMetaStoreClient.java:736)
>>>>          at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>         at
>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>         at
>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>          at
>>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:74)
>>>>         at $Proxy9.getDatabase(Unknown Source)
>>>>          at
>>>> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
>>>>         ... 18 more
>>>>
>>>>   FAILED: Execution Error, return code 1 from
>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>  13/04/02 17:10:31 ERROR ql.Driver: FAILED: Execution Error, return
>>>> code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
>>>> 13/04/02 17:10:31 INFO ql.Driver: </PERFLOG method=Driver.execute
>>>> start=1364947826183 end=1364947831388 duration=5205>
>>>> 13/04/02 17:10:31 INFO ql.Driver: <PERFLOG method=releaseLocks>
>>>> 13/04/02 17:10:31 INFO ql.Driver: </PERFLOG method=releaseLocks
>>>> start=1364947831388 end=1364947831388 duration=0>
>>>> 13/04/02 17:10:31 INFO exec.ListSinkOperator: 0 finished. closing...
>>>> 13/04/02 17:10:31 INFO exec.ListSinkOperator: 0 forwarded 0 rows
>>>> 13/04/02 17:10:31 INFO ql.Driver: <PERFLOG method=releaseLocks>
>>>> 13/04/02 17:10:31 INFO ql.Driver: </PERFLOG method=releaseLocks
>>>> start=1364947831391 end=1364947831391 duration=0>
>>>> hive>
>>>>
>>>> ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>>>>
>>>>  When I try to connect using jdbc from a java program I get this:
>>>>
>>>>
>>>> ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>>>>
>>>>   log4j:ERROR Could not instantiate class
>>>> [org.apache.hadoop.log.metrics.EventCounter].
>>>> java.lang.ClassNotFoundException:
>>>> org.apache.hadoop.log.metrics.EventCounter
>>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>         at java.lang.Class.forName0(Native Method)
>>>>         at java.lang.Class.forName(Class.java:169)
>>>>         at org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
>>>>          at
>>>> org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:327)
>>>>         at
>>>> org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:124)
>>>>         at
>>>> org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:785)
>>>>         at
>>>> org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
>>>>         at
>>>> org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
>>>>         at
>>>> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
>>>>         at
>>>> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
>>>>         at
>>>> org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
>>>>          at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
>>>>          at
>>>> org.slf4j.impl.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:73)
>>>>         at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:242)
>>>>         at
>>>> org.apache.thrift.transport.TIOStreamTransport.<clinit>(TIOStreamTransport.java:38)
>>>>         at
>>>> org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:110)
>>>>         at
>>>> org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
>>>>          at
>>>> java.sql.DriverManager.getConnection(DriverManager.java:582)
>>>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>>>          at Conn.main(Conn.java:19)
>>>>  log4j:ERROR Could not instantiate appender named "EventCounter".
>>>>  Exception in thread "main" java.lang.NoClassDefFoundError:
>>>> org/apache/hadoop/io/Writable
>>>>         at
>>>> org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:193)
>>>>         at
>>>> org.apache.hadoop.hive.jdbc.HiveStatement.execute(HiveStatement.java:127)
>>>>         at
>>>> org.apache.hadoop.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:126)
>>>>         at
>>>> org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:121)
>>>>         at
>>>> org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
>>>>          at
>>>> java.sql.DriverManager.getConnection(DriverManager.java:582)
>>>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>>>          at Conn.main(Conn.java:19)
>>>> Caused by: java.lang.ClassNotFoundException:
>>>> org.apache.hadoop.io.Writable
>>>>          at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>          ... 8 more
>>>>
>>>> ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>>>>
>>>>  I have included several jars in my CLASSPATH:
>>>>
>>>>
>>>> /usr/lib/hadoop/lib/slf4j-log4j12-1.6.1.jar:/usr/lib/hadoop-hdfs/hadoop-hdfs-2.0.0-cdh4.2.0.jar:/ade/b/4124940812/oracle/jlib/oraclepki.jar:
>>>> /home/drosash/hadoopConnectors/oraloaderHadoop/oraloaderV2forCDH4/oraloader-2.0.1-2/oraloader.jar:/usr/lib/hive/lib:/home/drosash/miscelaneous/mysql-connector-java-5.1.24/mysql-connector-java-5.1.22.jar:
>>>> /usr/lib/hive/lib/hive-exec-0.10.0-cdh4.2.0.jar:/usr/lib/hive/lib/hive-jdbc-0.10.0-cdh4.2.0.jar:/usr/lib/hive/lib/hive-metastore-0.10.0-cdh4.2.0.jar:/usr/lib/hive/lib/hive-service-0.10.0-cdh4.2.0.jar:/usr/lib/hive/lib/libfb303-0.9.0.jar:
>>>> /usr/lib/hadoop-0.20-mapreduce/hadoop-2.0.0-mr1-cdh4.2.0-core.jar:/usr/lib/hive/lib/commons-logging-api-1.0.4.jar:/usr/lib/hive/lib/commons-logging-1.0.4.jar:/usr/lib/hive/lib/commons-configuration-1.6.jar:
>>>> /usr/lib/hadoop/lib/slf4j-api-1.6.1.jar:/usr/lib/hadoop/lib/log4j-1.2.17.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/commons-io-2.4.jar:/usr/bin/hadoop:/usr/lib/hive/lib/commons-collections-3.2.1.jar
>>>>
>>>>  Any other ideas I could try?
>>>>
>>>>  Thank you very much,
>>>> Demian
>>>>
>>>>
>>>> On 1 April 2013 18:47, Sanjay Subramanian <
>>>> Sanjay.Subramanian@wizecommerce.com> wrote:
>>>>
>>>>>  Hi
>>>>>
>>>>>  Have u started hive-metastore service ? And Hive Server service ?
>>>>>
>>>>>  Check this value in hive-site.xml
>>>>> <property>
>>>>>    <name>hive.metastore.uris</name>
>>>>>   <value>thrift://FQDN:9083</value>
>>>>>   <description>IP address (or fully-qualified domain name) and port of
>>>>> the metastore host</description>
>>>>> </property>
>>>>>
>>>>>
>>>>>  I see that u r connecting to hive-server2.  If so then u must start
>>>>> hive-server2 service (u can check it using command line beeline CLI instead
>>>>> of hive CLI)
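>>>>>
>>>>>  For example, roughly along these lines (a sketch, using the hive-server2
>>>>> port configured below; adjust the host and port to your setup):
>>>>>
>>>>>  beeline -u jdbc:hive2://localhost:10786/default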
>>>>>
>>>>>  If u r using Hive-server and hive-server2 they must listen on
>>>>> separate ports
>>>>>
>>>>>  <property>
>>>>>   <name>hive.server2.thrift.port</name>
>>>>>    <value>10786</value>
>>>>> </property>
>>>>>
>>>>>  <property>
>>>>>   <name>hive.server.thrift.port</name>
>>>>>    <value>10000</value>
>>>>> </property>
>>>>>
>>>>>
>>>>>  I suggest u try the hive-server on 10000 first then move on to
>>>>> hive-server2
>>>>>
>>>>>
>>>>>  Thanks
>>>>> sanjay
>>>>>
>>>>>
>>>>>
>>>>>   From: demian rosas <demianrh@gmail.com>
>>>>> Reply-To: "user@hive.apache.org" <user@hive.apache.org>
>>>>>  Date: Monday, April 1, 2013 6:36 PM
>>>>> To: "user@hive.apache.org" <user@hive.apache.org>
>>>>> Subject: Re: Problem when trying to connect to hive server using jdbc
>>>>>
>>>>>   Hi,
>>>>>
>>>>>  Thanks a lot for your answer. I have already done what you indicated.
>>>>>
>>>>>  Actually, I have now been able to get a little bit further.
>>>>>
>>>>>  I am not using YARN. I am running CDH4.2 in pseudo-distributed mode.
>>>>> I want to configure a MySQL metastore (remote mode) on the same machine, so
>>>>> that everything runs on a single machine.
>>>>>
>>>>>  "Does hive CLI start successfully ? U can see debug messages by
>>>>> starting hive CLI this way
>>>>> $HIVE_HOME/bin/hive -hiveconf hive.root.logger=INFO,console"
>>>>>
>>>>>  I can start the hive CLI successfully. When I start it with debug
>>>>> messages enabled and execute "show tables", I get the following:
>>>>>
>>>>>
>>>>>
>>>>> ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>>>>>  hive> show tables;
>>>>> 13/04/01 18:20:40 INFO ql.Driver: <PERFLOG method=Driver.run>
>>>>> 13/04/01 18:20:40 INFO ql.Driver: <PERFLOG method=TimeToSubmit>
>>>>> 13/04/01 18:20:40 INFO ql.Driver: <PERFLOG method=compile>
>>>>> 13/04/01 18:20:40 INFO parse.ParseDriver: Parsing command: show tables
>>>>> 13/04/01 18:20:40 INFO parse.ParseDriver: Parse Completed
>>>>> 13/04/01 18:20:40 INFO ql.Driver: Semantic Analysis Completed
>>>>> 13/04/01 18:20:40 INFO exec.ListSinkOperator: Initializing Self 0 OP
>>>>> 13/04/01 18:20:40 INFO exec.ListSinkOperator: Operator 0 OP initialized
>>>>> 13/04/01 18:20:40 INFO exec.ListSinkOperator: Initialization Done 0 OP
>>>>> 13/04/01 18:20:40 INFO ql.Driver: Returning Hive schema:
>>>>> Schema(fieldSchemas:[FieldSchema(name:tab_name, type:string, comment:from
>>>>> deserializer)], properties:null)
>>>>> 13/04/01 18:20:40 INFO ql.Driver: </PERFLOG method=compile
>>>>> start=1364865640585 end=1364865640913 duration=328>
>>>>> 13/04/01 18:20:40 INFO zookeeper.ZooKeeper: Client
>>>>> environment:zookeeper.version=3.4.5-cdh4.2.0--1, built on 02/15/2013 18:36
>>>>> GMT
>>>>> 13/04/01 18:20:40 INFO zookeeper.ZooKeeper: Client environment:
>>>>> host.name=slc01euu.us.oracle.com
>>>>> 13/04/01 18:20:40 INFO zookeeper.ZooKeeper: Client
>>>>> environment:java.version=1.6.0_37
>>>>> 13/04/01 18:20:40 INFO zookeeper.ZooKeeper: Client
>>>>> environment:java.vendor=Sun Microsystems Inc.
>>>>> 13/04/01 18:20:40 INFO zookeeper.ZooKeeper: Client
>>>>> environment:java.home=/ade_autofs/dd19_db/RDBMS/MAIN/LINUX.X64/130325/jdk6/jre
>>>>> ...
>>>>> ...
>>>>> ...
>>>>>  13/04/01 18:20:40 INFO zookeeper.ZooKeeper: Client
>>>>> environment:os.version=2.6.18-308.4.1.0.1.el5xen
>>>>> 13/04/01 18:20:40 INFO zookeeper.ZooKeeper: Client environment:
>>>>> user.name=drosash
>>>>> 13/04/01 18:20:40 INFO zookeeper.ZooKeeper: Client
>>>>> environment:user.home=/home/drosash
>>>>> 13/04/01 18:20:40 INFO zookeeper.ZooKeeper: Client
>>>>> environment:user.dir=/home/drosash/miscelaneous
>>>>> 13/04/01 18:20:40 INFO zookeeper.ZooKeeper: Initiating client
>>>>> connection, connectString=localhost:2181 sessionTimeout=600000
>>>>> watcher=org.apache.hadoop.hive.ql.lockmgr.zookeeper.ZooKeeperHiveLockManager$DummyWatcher@f35f44e
>>>>> 13/04/01 18:20:40 INFO zookeeper.ClientCnxn: Opening socket connection
>>>>> to server localhost.localdomain/127.0.0.1:2181. Will not attempt to
>>>>> authenticate using SASL (Unable to locate a login configuration)
>>>>> 13/04/01 18:20:40 INFO zookeeper.ClientCnxn: Socket connection
>>>>> established to localhost.localdomain/127.0.0.1:2181, initiating
>>>>> session
>>>>> 13/04/01 18:20:41 INFO zookeeper.ClientCnxn: Session establishment
>>>>> complete on server localhost.localdomain/127.0.0.1:2181, sessionid =
>>>>> 0x13dc843aabf0000, negotiated timeout = 40000
>>>>> 13/04/01 18:20:41 INFO ql.Driver: <PERFLOG
>>>>> method=acquireReadWriteLocks>
>>>>> 13/04/01 18:20:41 INFO ql.Driver: </PERFLOG
>>>>> method=acquireReadWriteLocks start=1364865641054 end=1364865641055
>>>>> duration=1>
>>>>> 13/04/01 18:20:41 INFO ql.Driver: <PERFLOG method=Driver.execute>
>>>>> 13/04/01 18:20:41 INFO ql.Driver: Starting command: show tables
>>>>> 13/04/01 18:20:41 INFO ql.Driver: </PERFLOG method=TimeToSubmit
>>>>> start=1364865640585 end=1364865641070 duration=485>
>>>>> 13/04/01 18:20:41 INFO hive.metastore: Trying to connect to metastore
>>>>> with URI thrift://0.0.0.0.0:3306
>>>>> FAILED: Error in metadata: java.lang.RuntimeException: Unable to
>>>>> instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>>>> 13/04/01 18:20:41 ERROR exec.Task: FAILED: Error in metadata:
>>>>> java.lang.RuntimeException: Unable to instantiate
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>>>> org.apache.hadoop.hive.ql.metadata.HiveException:
>>>>> java.lang.RuntimeException: Unable to instantiate
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>>>>         at
>>>>> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1118)
>>>>>         at
>>>>> org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1103)
>>>>>         at
>>>>> org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2206)
>>>>>         at
>>>>> org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
>>>>>         at
>>>>> org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:138)
>>>>>         at
>>>>> org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:57)
>>>>>         at
>>>>> org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1352)
>>>>>         at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1138)
>>>>>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:951)
>>>>>         at
>>>>> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:259)
>>>>>         at
>>>>> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:216)
>>>>>         at
>>>>> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:412)
>>>>>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:755)
>>>>>         at
>>>>> org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:613)
>>>>>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>>         at
>>>>> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>>>>>         at
>>>>> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>>>>         at org.apache.hadoop.util.RunJar.main(RunJar.java:208)
>>>>> Caused by: java.lang.RuntimeException: Unable to instantiate
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient
>>>>>         at
>>>>> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1084)
>>>>>         at
>>>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:51)
>>>>>         at
>>>>> org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:61)
>>>>>         at
>>>>> org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2140)
>>>>>         at
>>>>> org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2151)
>>>>>         at
>>>>> org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1114)
>>>>>         ... 18 more
>>>>> Caused by: java.lang.reflect.InvocationTargetException
>>>>>         at
>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>>         at
>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>>         at
>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>>         at
>>>>> java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>>>>         at
>>>>> org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1082)
>>>>>         ... 23 more
>>>>> Caused by: java.lang.NullPointerException
>>>>>         at org.apache.thrift.transport.TSocket.open(TSocket.java:168)
>>>>>         at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:277)
>>>>>         at
>>>>> org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:163)
>>>>>         ... 28 more
>>>>>
>>>>>  FAILED: Execution Error, return code 1 from
>>>>> org.apache.hadoop.hive.ql.exec.DDLTask
>>>>> 13/04/01 18:20:41 ERROR ql.Driver: FAILED: Execution Error, return
>>>>> code 1 from org.apache.hadoop.hive.ql.exec.DDLTask
>>>>> 13/04/01 18:20:41 INFO ql.Driver: </PERFLOG method=Driver.execute
>>>>> start=1364865641055 end=1364865641098 duration=43>
>>>>> 13/04/01 18:20:41 INFO ql.Driver: <PERFLOG method=releaseLocks>
>>>>> 13/04/01 18:20:41 INFO ql.Driver: </PERFLOG method=releaseLocks
>>>>> start=1364865641098 end=1364865641098 duration=0>
>>>>> 13/04/01 18:20:41 INFO exec.ListSinkOperator: 0 finished. closing...
>>>>> 13/04/01 18:20:41 INFO exec.ListSinkOperator: 0 forwarded 0 rows
>>>>>  13/04/01 18:20:41 INFO ql.Driver: <PERFLOG method=releaseLocks>
>>>>> 13/04/01 18:20:41 INFO ql.Driver: </PERFLOG method=releaseLocks
>>>>> start=1364865641101 end=1364865641101 duration=0>
>>>>> 13/04/01 18:20:41 INFO zookeeper.ZooKeeper: Session: 0x13dc843aabf0000
>>>>> closed
>>>>> hive> 13/04/01 18:20:41 INFO zookeeper.ClientCnxn: EventThread shut
>>>>> down
>>>>>
>>>>> ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>>>>>
>>>>>  and then nothing else happens.
>>>>>
>>>>>  When I try to connect from a java app (code shown below) using jdbc,
>>>>> I get the following:
>>>>>
>>>>>
>>>>> ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>>>>>  import java.sql.*;
>>>>>
>>>>>  class Conn {
>>>>>   public static void main (String[] args) throws Exception
>>>>>   {
>>>>>    Class.forName ("org.apache.hive.jdbc.HiveDriver");
>>>>>
>>>>>
>>>>>  Connection conn = DriverManager.getConnection
>>>>>                    ("jdbc:hive2://0.0.0.0.0:10000/default", "", "");
>>>>>
>>>>>     try {
>>>>>      Statement stmt = conn.createStatement();
>>>>>      try {
>>>>>        ResultSet rset = stmt.executeQuery("show tables");
>>>>>         try {
>>>>>          while (rset.next())
>>>>>            System.out.println (rset.getString(1));   // Print col 1
>>>>>        }
>>>>>        finally {
>>>>>           try { rset.close(); } catch (Exception ignore) {}
>>>>>        }
>>>>>      }
>>>>>      finally {
>>>>>        try { stmt.close(); } catch (Exception ignore) {}
>>>>>      }
>>>>>    }
>>>>>    finally {
>>>>>      try { conn.close(); } catch (Exception ignore) {}
>>>>>    }
>>>>>   }
>>>>> }
>>>>>
>>>>>
>>>>>
>>>>> ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>>>>>
>>>>>  log4j:ERROR Could not instantiate class
>>>>> [org.apache.hadoop.log.metrics.EventCounter].
>>>>> java.lang.ClassNotFoundException:
>>>>> org.apache.hadoop.log.metrics.EventCounter
>>>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>         at
>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>         at java.lang.Class.forName0(Native Method)
>>>>>         at java.lang.Class.forName(Class.java:169)
>>>>>         at org.apache.log4j.helpers.Loader.loadClass(Loader.java:198)
>>>>>         at
>>>>> org.apache.log4j.helpers.OptionConverter.instantiateByClassName(OptionConverter.java:326)
>>>>>         at
>>>>> org.apache.log4j.helpers.OptionConverter.instantiateByKey(OptionConverter.java:123)
>>>>>         at
>>>>> org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:752)
>>>>>         at
>>>>> org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:735)
>>>>>         at
>>>>> org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:615)
>>>>>         at
>>>>> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:502)
>>>>>         at
>>>>> org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:547)
>>>>>         at
>>>>> org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:483)
>>>>>         at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
>>>>>         at org.apache.log4j.Logger.getLogger(Logger.java:104)
>>>>>         at
>>>>> org.apache.commons.logging.impl.Log4JLogger.getLogger(Log4JLogger.java:229)
>>>>>         at
>>>>> org.apache.commons.logging.impl.Log4JLogger.<init>(Log4JLogger.java:65)
>>>>>         at
>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>>>         at
>>>>> sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:39)
>>>>>         at
>>>>> sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
>>>>>         at
>>>>> java.lang.reflect.Constructor.newInstance(Constructor.java:513)
>>>>>         at
>>>>> org.apache.commons.logging.impl.LogFactoryImpl.newInstance(LogFactoryImpl.java:529)
>>>>>         at
>>>>> org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:235)
>>>>>         at
>>>>> org.apache.commons.logging.impl.LogFactoryImpl.getInstance(LogFactoryImpl.java:209)
>>>>>         at
>>>>> org.apache.commons.logging.LogFactory.getLog(LogFactory.java:351)
>>>>>         at
>>>>> org.apache.hive.service.AbstractService.<clinit>(AbstractService.java:34)
>>>>>         at
>>>>> org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:86)
>>>>>         at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
>>>>>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>>>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>>>>         at Conn.main(Conn.java:16)
>>>>> log4j:ERROR Could not instantiate appender named "EventCounter".
>>>>> Exception in thread "main" java.lang.NoClassDefFoundError:
>>>>> org/apache/hadoop/conf/Configuration
>>>>>         at
>>>>> org.apache.hive.service.cli.thrift.EmbeddedThriftCLIService.<init>(EmbeddedThriftCLIService.java:32)
>>>>>         at
>>>>> org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:86)
>>>>>         at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
>>>>>         at java.sql.DriverManager.getConnection(DriverManager.java:582)
>>>>>         at java.sql.DriverManager.getConnection(DriverManager.java:185)
>>>>>         at Conn.main(Conn.java:16)
>>>>> Caused by: java.lang.ClassNotFoundException:
>>>>> org.apache.hadoop.conf.Configuration
>>>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>         at
>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>         ... 6 more
>>>>>
>>>>> ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>>>>>
>>>>>  do you have an idea about what I am missing?
>>>>>
>>>>>  Thanks a lot in advance,
>>>>> Demian
>>>>>
>>>>>
>>>>> On 1 April 2013 18:08, Sanjay Subramanian <
>>>>> Sanjay.Subramanian@wizecommerce.com> wrote:
>>>>>
>>>>>>  Hi
>>>>>>
>>>>>>  First off, if u r planning to run YARN, then stay with 4.1.2 rather than 4.2.0.
>>>>>>
>>>>>>  I installed 4.2.0 but had to roll back :-( Hit upon this error
>>>>>> https://issues.cloudera.org/browse/DISTRO-461. If u r not using yarn
>>>>>> then it will not affect u.
>>>>>>
>>>>>>  When u install Cloudera Manager, it installs Hive, but hive-server
>>>>>> and hive-metastore you have to install yourself. I have installed CM on
>>>>>> CentOS and Ubuntu and never had to copy around any Hive jars. The only jar
>>>>>> that needs to be copied is the MySQL connector.
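>>>>>>
>>>>>>  For instance, something roughly like the following (the exact jar name and
>>>>>> paths depend on your setup):
>>>>>>
>>>>>>  cp mysql-connector-java-5.1.22-bin.jar /usr/lib/hive/lib/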
>>>>>>
>>>>>>  I am assuming u have set up (
>>>>>> https://ccp.cloudera.com/display/CDH4DOC/Hive+Installation#HiveInstallation-ConfiguringHiveServer2)
>>>>>> all the MySQL-related settings in hive-site.xml, especially
>>>>>> hive.metastore.uris
>>>>>> javax.jdo.option.ConnectionURL
>>>>>> javax.jdo.option.ConnectionDriverName
>>>>>> javax.jdo.option.ConnectionUserName
>>>>>> javax.jdo.option.ConnectionPassword
>>>>>> datanucleus.autoCreateSchema
>>>>>> datanucleus.fixedDatastore
>>>>>>
>>>>>>  Does hive CLI start successfully ? U can see debug messages by
>>>>>> starting hive CLI this way
>>>>>> $HIVE_HOME/bin/hive -hiveconf hive.root.logger=INFO,console
>>>>>>
>>>>>>  In CLI Execute command "Show Tables"
>>>>>>
>>>>>>
>>>>>>
>>>>>>  Check the MySQL server machine
>>>>>> ---------------------------------------
>>>>>> If MySQL is on a different server than the one where Hive is installed, then on
>>>>>> the MySQL box check /etc/hosts:
>>>>>>  <Ipaddress_mysql_server>  FQDV_mysql_server  alias_11
>>>>>> <Ipaddress_hive_box>  FQDV_hive_box  alias_22
>>>>>>
>>>>>>  In my.cnf
>>>>>> bind-address            = 0.0.0.0
>>>>>>
>>>>>>
>>>>>>  Check whether any old PATH or environment variables are pointing to the
>>>>>> wrong locations.
>>>>>>
>>>>>>  Try an older version of the MySQL connector jar…I use 5.1.22
>>>>>>
>>>>>>
>>>>>>  Thanks
>>>>>>
>>>>>>  sanjay
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>   From: demian rosas <demianrh@gmail.com>
>>>>>> Reply-To: "user@hive.apache.org" <user@hive.apache.org>
>>>>>> Date: Monday, April 1, 2013 1:43 PM
>>>>>> To: "user@hive.apache.org" <user@hive.apache.org>
>>>>>> Subject: Problem when trying to connect to hive server using jdbc
>>>>>>
>>>>>>   Hi,
>>>>>>
>>>>>>  I am using Hive from CDH4.2 in a fresh installation. I want to set
>>>>>> up a MySQL metastore.
>>>>>>
>>>>>>  When trying to connect to the Hive server using JDBC I am getting this
>>>>>> error:
>>>>>>
>>>>>>  Exception in thread "main" java.lang.ClassNotFoundException:
>>>>>> org.apache.hive.jdbc.HiveDriver
>>>>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>>>>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>>>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>>>>>>         at
>>>>>> sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>>>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>>>>>>         at java.lang.Class.forName0(Native Method)
>>>>>>         at java.lang.Class.forName(Class.java:169)
>>>>>>         at Conn.main(Conn.java:8)
>>>>>>
>>>>>>
>>>>>>  I have added the Hive JDBC jar (hive-jdbc-0.10.0-cdh4.2.0.jar)
>>>>>> to my CLASSPATH, as well as my MySQL connector jar
>>>>>> (mysql-connector-java-5.1.24-bin.jar).
>>>>>>
>>>>>>  I would appreciate it a lot if you could tell me what I am missing
>>>>>> here.
>>>>>>
>>>>>>  Thanks a lot,
>>>>>> Demian
