hive-user mailing list archives

From "Mich Talebzadeh" <m...@peridale.co.uk>
Subject RE: Trying to run simple Java script against Hive in Eclipse on Windows
Date Thu, 07 Jan 2016 21:51:06 GMT
Thanks Nick,

 

What I did was this:

1.    Copied over $HIVE_HOME/lib from the Linux host to my PC

2.    Copied over $HADOOP_HOME from the Linux host to my PC

3.    In Eclipse, right-click the project and choose Build Path --> Configure Build Path --> Add External JARs

4.    Added all JAR files from hive/lib

5.    Added hadoop-common-2.6.0.jar from $HADOOP_HOME/share/hadoop/httpfs/tomcat/webapps/webhdfs/WEB-INF/lib

Once you have done all that, it works.

 

Effectively, what we are doing here is building a Hive client package for Windows.
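Before opening a connection, it can save time to verify the classpath is actually complete. The sketch below is a minimal probe (the class and method names are mine, not from the thread) that checks whether the two classes implicated in the stack trace further down can be loaded:

```java
// Minimal classpath probe: reports whether the classes the Hive JDBC
// driver needs are visible to the classloader, before any connection
// attempt. The helper class itself is hypothetical; the class names
// checked are the ones from the stack trace in this thread.
public class ClasspathProbe {
    static boolean isLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String[] required = {
            "org.apache.hive.jdbc.HiveDriver",      // from hive/lib
            "org.apache.hadoop.conf.Configuration"  // from hadoop-common
        };
        for (String c : required) {
            System.out.println(c + " -> " + (isLoadable(c) ? "OK" : "MISSING"));
        }
    }
}
```

If the second class reports MISSING, adding hadoop-common (step 5 above) is the fix.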

 

Running: show databases

asehadoop
default
iqhadoop
oraclehadoop
test
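For anyone adapting the client code quoted below, the same connect-and-query logic can be condensed with try-with-resources, so the connection, statement, and result set are closed even when a query fails. This is a sketch assuming the same host, port, and credentials as the quoted code; only the `hiveUrl` helper is my own naming:

```java
// Condensed sketch of the quoted HiveJdbcClient: build the URL the same
// way the original concatenates _hiveserver + _database, then let
// try-with-resources handle cleanup.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class HiveSmokeTest {
    // Mirrors the original's _hiveserver + _database concatenation.
    static String hiveUrl(String host, int port, String database) {
        return "jdbc:hive2://" + host + ":" + port + "/" + database;
    }

    public static void main(String[] args) throws SQLException {
        String url = hiveUrl("rhes564", 10010, "asehadoop");
        try (Connection con = DriverManager.getConnection(url, "hduser", "hduser");
             Statement st = con.createStatement();
             ResultSet rs = st.executeQuery("show databases")) {
            while (rs.next()) {
                System.out.println(rs.getString(1));
            }
        }
    }
}
```

With a JDBC 4 driver jar on the classpath, the explicit Class.forName registration in the original is also no longer required.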

 

 

 

 

Dr Mich Talebzadeh

 

LinkedIn  https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

 

Sybase ASE 15 Gold Medal Award 2008

A Winning Strategy: Running the most Critical Financial Data on ASE 15

http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf

Author of the books "A Practitioner’s Guide to Upgrading to Sybase ASE 15", ISBN 978-0-9563693-0-7.


co-author "Sybase Transact SQL Guidelines Best Practices", ISBN 978-0-9759693-0-4

Publications due shortly:

Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8

Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume one out shortly

 

http://talebzadehmich.wordpress.com

 

NOTE: The information in this email is proprietary and confidential. This message is for the
designated recipient only; if you are not the intended recipient, you should destroy it immediately.
Any information in this message shall not be understood as given or endorsed by Peridale Technology
Ltd, its subsidiaries or their employees, unless expressly so stated. It is the responsibility
of the recipient to ensure that this email is virus free; therefore neither Peridale Ltd,
its subsidiaries nor their employees accept any responsibility.

 

 

-----Original Message-----
From: Nicholas Hakobian [mailto:nicholas.hakobian@rallyhealth.com] 
Sent: 07 January 2016 17:05
To: user@hive.apache.org
Subject: Re: Trying to run simple Java script against Hive in Eclipse on Windows

 

You're probably missing the base Hadoop libraries that the Hive JDBC driver depends on (like
hadoop-common, etc.). I'm not sure about other distributions (or about compiling Hive by hand),
but CDH provides a hive-jdbc-standalone.jar, which is a fat jar containing all the dependencies.
I remember getting similar errors unless I used the standalone jar (or copied over the Hadoop
dependencies).

 

-Nick

 

Nicholas Szandor Hakobian

Data Scientist

Rally Health

nicholas.hakobian@rallyhealth.com

 

On Thu, Jan 7, 2016 at 2:37 AM, Mich Talebzadeh <mich@peridale.co.uk> wrote:

> Hi Java Gurus,
> 
> I have written a simple Java program that works fine when I run it on
> Linux as hduser (the OS owner for Hadoop, Hive etc.).
> 
> When I create a project in Eclipse on Windows, having copied over
> $HIVE_HOME/lib from Linux to Windows and added the JARs as external
> libraries, the execution fails with the following error:

> SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
> SLF4J: Defaulting to no-operation (NOP) logger implementation
> SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
> 
> Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>        at org.apache.hive.jdbc.HiveConnection.createBinaryTransport(HiveConnection.java:478)
>        at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:201)
>        at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:176)
>        at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
>        at java.sql.DriverManager.getConnection(Unknown Source)
>        at java.sql.DriverManager.getConnection(Unknown Source)
>        at hivework.HiveJdbcClient.main(HiveJdbcClient.java:35)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
>        at java.net.URLClassLoader.findClass(Unknown Source)
>        at java.lang.ClassLoader.loadClass(Unknown Source)
>        at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
>        at java.lang.ClassLoader.loadClass(Unknown Source)
>        ... 7 more

> 
> Appreciate any input.
> 
> The code is as follows:
> 

> package hivework;
> 
> import java.sql.SQLException;
> import java.io.*;
> import java.sql.Connection;
> import java.sql.ResultSet;
> import java.sql.Statement;
> import java.sql.DriverManager;
> 
> public class HiveJdbcClient
> {
>   private static String driverName = "org.apache.hive.jdbc.HiveDriver";
>   private static String _username;
>   private static String _password;
>   private static String _hiveserver;
>   private static String _database;
> 
>   public static void main(String[] args) throws SQLException {
>     boolean confirm = false;
>     _hiveserver = "jdbc:hive2://rhes564:10010";
>     _database = "/asehadoop";
>     _username = "hduser";
>     _password = "hduser";
>     int status = 0;
>     String query = null;
>     String TableName = null;
>     Connection _con = null;
>     try {
>       Class.forName(driverName);
>     } catch (ClassNotFoundException e) {
>       // TODO Auto-generated catch block
>       e.printStackTrace();
>       System.exit(1);
>     }
>     ResultSet rs = null;
>     _con = DriverManager.getConnection(_hiveserver+_database, _username, _password);
>     Statement statement = _con.createStatement();
>     query = "show tables";
>     rs = statement.executeQuery(query);
>     rs = statement.getResultSet();
>     //ResultSetMetaData rsmd = rs.getMetaData ();
>     System.out.println("Running: " + query);
>     while (rs.next()) {
>       System.out.println(rs.getString(1));
>     }
>     query = "show databases";
>     rs = statement.executeQuery(query);
>     rs = statement.getResultSet();
>     System.out.println("Running: " + query);
>     while (rs.next())
>     {
>       System.out.println(rs.getString(1));
>     }
> // let us do some DML
> //
>    try
>    {
>     TableName = "michboy";
>     query = "drop table if exists " + TableName;
>     System.out.println("Running: " + query);
>     statement.executeUpdate(query);
>     query = "create table " + TableName + "(col1 int, col2 varchar(30))";
>     System.out.println("Running: " + query);
>     statement.executeUpdate(query);
>     query = "insert into table michboy values (1,'mich')";
>     statement.executeUpdate(query);
>     query = "insert into table michboy values (2,'row2')";
>     statement.executeUpdate(query);
>     query = "insert into table michboy values (3,'row3')";
>     System.out.println("Running: " + query);
>     statement.executeUpdate(query);
>     query = "show tables";
>     rs = statement.executeQuery(query);
>     rs = statement.getResultSet();
>     System.out.println("Running: " + query);
>     while (rs.next())
>     {
>       System.out.println(rs.getString(1));
>     }
>     query = "desc michboy";
>     rs = statement.executeQuery(query);
>     rs = statement.getResultSet();
>     System.out.println("\n Running: " + query + "\n");
>     while (rs.next())
>     {
>       System.out.println(rs.getString(1));
>     }
>     query = "select * from michboy";
>     rs = statement.executeQuery(query);
>     rs = statement.getResultSet();
>     System.out.println("Running: " + query);
>     while (rs.next())
>     {
>       System.out.println(rs.getString(1));
>     }
> //
>    }catch(SQLException se){
>       //Handle errors for JDBC
>       se.printStackTrace();
>    }catch(Exception e){
>       //Handle errors for Class.forName
>       e.printStackTrace();
>    }
> 
> }//end main
> 
> }//HiveJdbcClient
