hive-user mailing list archives

From "Mich Talebzadeh" <m...@peridale.co.uk>
Subject RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
Date Fri, 04 Dec 2015 00:10:46 GMT
Thanks, I downloaded the one suggested.

 

Unfortunately I get the following error when I try to run start-master.sh:

 

hduser@rhes564::/home/hduser> start-master.sh

starting org.apache.spark.deploy.master.Master, logging to /usr/lib/spark/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

failed to launch org.apache.spark.deploy.master.Master:

        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

        ... 6 more

full log in /usr/lib/spark/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

hduser@rhes564::/home/hduser> cat /usr/lib/spark/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

Spark Command: /usr/java/latest/bin/java -cp /usr/lib/spark/sbin/../conf/:/usr/lib/spark/lib/spark-assembly-1.5.2-hadoop2.2.0.jar:/home/hduser/hadoop-2.6.0/etc/hadoop/ -Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip rhes564 --port 7077 --webui-port 8080

========================================

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger

        at java.lang.Class.getDeclaredMethods0(Native Method)

        at java.lang.Class.privateGetDeclaredMethods(Class.java:2521)

        at java.lang.Class.getMethod0(Class.java:2764)

        at java.lang.Class.getMethod(Class.java:1653)

        at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)

        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)

Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger

        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)

        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)

        at java.security.AccessController.doPrivileged(Native Method)

        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

        ... 6 more

 

I suspect this is, as before, due to having fewer jar files in $SPARK_HOME/lib:

 

hduser@rhes564::/usr/lib/spark/lib> ltr

total 207620

-rw-r--r-- 1 hduser hadoop 102573339 Nov  3 18:04 spark-examples-1.5.2-hadoop2.2.0.jar

-rw-r--r-- 1 hduser hadoop 105357751 Nov  3 18:04 spark-assembly-1.5.2-hadoop2.2.0.jar

-rw-r--r-- 1 hduser hadoop   4433953 Nov  3 18:04 spark-1.5.2-yarn-shuffle.jar

 

 

Compared to the prebuilt one:

 

hduser@rhes564::/usr/lib/spark_ori/lib> ltr

total 303876

-rw-r--r-- 1 hduser hadoop 118360126 Nov  3 18:05 spark-examples-1.5.2-hadoop2.6.0.jar

-rw-r--r-- 1 hduser hadoop 183993445 Nov  3 18:05 spark-assembly-1.5.2-hadoop2.6.0.jar

-rw-r--r-- 1 hduser hadoop   4433953 Nov  3 18:05 spark-1.5.2-yarn-shuffle.jar

-rw-r--r-- 1 hduser hadoop   1809447 Nov  3 18:05 datanucleus-rdbms-3.2.9.jar

-rw-r--r-- 1 hduser hadoop   1890075 Nov  3 18:05 datanucleus-core-3.2.10.jar

-rw-r--r-- 1 hduser hadoop    339666 Nov  3 18:05 datanucleus-api-jdo-3.2.6.jar

 

 

 

 

Mich Talebzadeh

 

Sybase ASE 15 Gold Medal Award 2008

A Winning Strategy: Running the most Critical Financial Data on ASE 15

http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf

Author of the books "A Practitioner’s Guide to Upgrading to Sybase ASE 15", ISBN 978-0-9563693-0-7.


co-author "Sybase Transact SQL Guidelines Best Practices", ISBN 978-0-9759693-0-4

Publications due shortly:

Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8

Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume one out shortly

 

http://talebzadehmich.wordpress.com

 

NOTE: The information in this email is proprietary and confidential. This message is for the
designated recipient only, if you are not the intended recipient, you should destroy it immediately.
Any information in this message shall not be understood as given or endorsed by Peridale Technology
Ltd, its subsidiaries or their employees, unless expressly so stated. It is the responsibility
of the recipient to ensure that this email is virus free, therefore neither Peridale Ltd,
its subsidiaries nor their employees accept any responsibility.

 

 

-----Original Message-----
From: Marcelo Vanzin [mailto:vanzin@cloudera.com] 
Sent: 03 December 2015 23:44
To: Mich Talebzadeh <mich@peridale.co.uk>
Cc: user@hive.apache.org
Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

I spoke to Xuefu (Hive dev) and mentioned that this isn't really how it should be done.

 

In the meantime, if you can, you should use a Spark package that does not include Hive classes.
There used to be an explicit one for that, but I can't find it. In the meantime, the tarball
that says "pre-built with user-provided Hadoop" should work for your case.
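Concretely, the two routes are something like this (a sketch: the tarball name follows the pattern on the Spark 1.5.2 download page, and the build flags are from the Spark build docs, so verify both against your versions):

```shell
# Option 1: the "pre-built with user-provided Hadoop" tarball
# (file name pattern as on the Spark 1.5.2 download page):
wget http://archive.apache.org/dist/spark/spark-1.5.2/spark-1.5.2-bin-without-hadoop.tgz

# Option 2: build from source, omitting the -Phive profile so no Hive
# classes end up in the assembly (run from the Spark source root):
./make-distribution.sh --name hadoop2-without-hive --tgz \
    -Phadoop-2.6 -Pyarn -Dhadoop.version=2.6.0
```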

 

On Thu, Dec 3, 2015 at 3:41 PM, Mich Talebzadeh <mich@peridale.co.uk> wrote:

> Just noticed that hive shell in 1.2.1 makes a reference to SPARK_HOME 

> if it finds it

> 

> # add Spark assembly jar to the classpath

> 

> if [[ -n "$SPARK_HOME" ]]

> 

> then

> 

>   sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`

> 

>   CLASSPATH="${CLASSPATH}:${sparkAssemblyPath}"

> 

> fi

> 

> 

> Is this expected?

> 

> 

> 

> Mich Talebzadeh

> 

> 

> 

> From: Mich Talebzadeh [mailto:mich@peridale.co.uk]

> Sent: 03 December 2015 19:46

> To: user@hive.apache.org; 'Marcelo Vanzin' <vanzin@cloudera.com>

> 

> 

> Subject: RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

> 

> 

> 

> Hi,

> 

> 

> 

> This is my CLASSPATH which I have simplified running with Hive 1.2.1 

> and generic build Spark 1.3

> 

> 

> 

> unset CLASSPATH

> 

> CLASSPATH=$HADOOP_HOME/share/hadoop/common/hadoop-common-2.6.0-tests.jar:$HADOOP_HOME/share/hadoop/common/hadoop-common-2.6.0.jar:hadoop-nfs-2.6.0.jar:$HIVE_HOME/lib:${SPARK_HOME}/lib

> 

> 

> 

> echo $CLASSPATH

> 

> export CLASSPATH

> 

> 

> 

> 

> 

> CLASSPATH is now:

> 

> 

> 

> /home/hduser/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/home/hduser/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar:hadoop-nfs-2.6.0.jar:/usr/lib/hive/lib:/usr/lib/spark/lib

> 

> 

> 

> However, I get the error. Does anyone have a working CLASSPATH for this?
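One way to sanity-check a CLASSPATH like the one above is to print the entries that do not exist on disk (a small sketch, not from the thread; the splitting assumes entries contain no spaces):

```shell
#!/bin/sh
# list_missing CLASSPATH: print each colon-separated entry that does not
# exist on disk, so typos and dead paths stand out immediately.
list_missing() {
    old_ifs=$IFS
    IFS=:
    for entry in $1; do
        [ -e "$entry" ] || echo "missing: $entry"
    done
    IFS=$old_ifs
}

# Example with the directories quoted in this mail:
list_missing "/usr/lib/hive/lib:/usr/lib/spark/lib:hadoop-nfs-2.6.0.jar"
```

Note that hadoop-nfs-2.6.0.jar appears as a relative path in the CLASSPATH above, so it only resolves from whatever directory the shell happens to be in.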

> 

> .spark.client.RemoteDriver /usr/lib/hive/lib/hive-exec-1.2.1.jar

> --remote-host rhes564 --remote-port 51642 --conf

> hive.spark.client.connect.timeout=1000 --conf

> hive.spark.client.server.connect.timeout=90000 --conf 

> hive.spark.client.channel.log.level=null --conf

> hive.spark.client.rpc.max.size=52428800 --conf

> hive.spark.client.rpc.threads=8 --conf 

> hive.spark.client.secret.bits=256

> 

> 15/12/03 19:42:51 [stderr-redir-1]: INFO client.SparkClientImpl: Spark 

> assembly has been built with Hive, including Datanucleus jars on 

> classpath

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:

> Ignoring non-spark config property: 

> hive.spark.client.connect.timeout=1000

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:

> Ignoring non-spark config property: hive.spark.client.rpc.threads=8

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:

> Ignoring non-spark config property: 

> hive.spark.client.rpc.max.size=52428800

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:

> Ignoring non-spark config property: hive.spark.client.secret.bits=256

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:

> Ignoring non-spark config property:

> hive.spark.client.server.connect.timeout=90000

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: 

> 15/12/03

> 19:42:52 INFO client.RemoteDriver: Connecting to: rhes564:51642

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: 

> Exception in thread "main" java.lang.NoSuchFieldError:

> SPARK_RPC_CLIENT_CONNECT_TIMEOUT

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:146)

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> java.lang.reflect.Method.invoke(Method.java:606)

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)

> 

> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at

> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

> 

> 

> Mich Talebzadeh

> 

> 

> 

> 

> 

> -----Original Message-----

> From: Mich Talebzadeh [mailto:mich@peridale.co.uk]

> Sent: 03 December 2015 19:02

> To: 'Marcelo Vanzin' <vanzin@cloudera.com>

> Cc: user@hive.apache.org; 'user' <user@spark.apache.org>

> Subject: RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

> 

> 

> 

> Hi Marcelo.

> 

> 

> 

> So this is the approach I am going to take:

> 

> 

> 

> Use Spark 1.3 pre-built.
> 
> Use Hive 1.2.1. Do not copy anything over from the Spark 1.3 libraries
> into the Hive libraries.
> 
> Use Hadoop 2.6.

> 

> 

> 

> There is no need to mess around with the libraries. I will unset my
> CLASSPATH, reset it, and try again.

> 

> Thanks,

> 

> Mich Talebzadeh

> 

> 

> 

> -----Original Message-----

> 

> From: Marcelo Vanzin [mailto:vanzin@cloudera.com]

> 

> Sent: 03 December 2015 18:45

> 

> To: Mich Talebzadeh <mich@peridale.co.uk>

> 

> Cc: user@hive.apache.org; user <user@spark.apache.org>

> 

> Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

> 

> 

> 

> On Thu, Dec 3, 2015 at 10:32 AM, Mich Talebzadeh <mich@peridale.co.uk>
> wrote:

> 

> 

> 

>> hduser@rhes564::/usr/lib/spark/logs> hive --version

> 

>> SLF4J: Found binding in

> 

>> [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]

> 

> 

> 

> As I suggested before, you have Spark's assembly in the Hive classpath.

> That's not the way to configure hive-on-spark; if the documentation 

> you're following tells you to do that, it's wrong.
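For reference, the configuration route the Hive on Spark wiki describes (property names are from that page; the host and paths below are this thread's, so adjust to taste, and treat this as a sketch rather than a tested setup) looks roughly like this in hive-site.xml:

```xml
<!-- hive-site.xml: run Hive queries on Spark without copying any jars -->
<property>
  <name>hive.execution.engine</name>
  <value>spark</value>
</property>
<property>
  <name>spark.master</name>
  <value>spark://rhes564:7077</value>
</property>
<property>
  <name>spark.home</name>
  <value>/usr/lib/spark</value>
</property>
```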

> 

> 

> 

> (And sorry Ted, but please ignore Ted's suggestion. Hive-on-Spark 

> should work fine with Spark 1.3 if it's configured correctly. You 

> really don't want to be overriding Hive classes with the ones shipped 

> in the Spark assembly, regardless of the version of Spark being used.)

> 

> 

> 

> --

> 

> Marcelo

> 

> 

> 

> ---------------------------------------------------------------------

> 

> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org

 

 

 

--

Marcelo

