From: "Mich Talebzadeh" <mich@peridale.co.uk>
To: user@hive.apache.org
Subject: RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
Date: Fri, 4 Dec 2015 00:10:46 -0000
Thanks, downloaded the one suggested.

Unfortunately I get the following error when I try start-master.sh:

hduser@rhes564::/home/hduser> start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/lib/spark/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out
failed to launch org.apache.spark.deploy.master.Master:
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 6 more
full log in /usr/lib/spark/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

hduser@rhes564::/home/hduser> cat /usr/lib/spark/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out
Spark Command: /usr/java/latest/bin/java -cp /usr/lib/spark/sbin/../conf/:/usr/lib/spark/lib/spark-assembly-1.5.2-hadoop2.2.0.jar:/home/hduser/hadoop-2.6.0/etc/hadoop/ -Xms1g -Xmx1g -XX:MaxPermSize=256m org.apache.spark.deploy.master.Master --ip rhes564 --port 7077 --webui-port 8080
========================================
Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger
        at java.lang.Class.getDeclaredMethods0(Native Method)
        at java.lang.Class.privateGetDeclaredMethods(Class.java:2521)
        at java.lang.Class.getMethod0(Class.java:2764)
        at java.lang.Class.getMethod(Class.java:1653)
        at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 6 more

I suspect this is, as before, due to having fewer jar files in $SPARK_HOME/lib:

hduser@rhes564::/usr/lib/spark/lib> ltr
total 207620
-rw-r--r-- 1 hduser hadoop 102573339 Nov  3 18:04 spark-examples-1.5.2-hadoop2.2.0.jar
-rw-r--r-- 1 hduser hadoop 105357751 Nov  3 18:04 spark-assembly-1.5.2-hadoop2.2.0.jar
-rw-r--r-- 1 hduser hadoop   4433953 Nov  3 18:04 spark-1.5.2-yarn-shuffle.jar

Compared to the prebuilt one:

hduser@rhes564::/usr/lib/spark_ori/lib> ltr
total 303876
-rw-r--r-- 1 hduser hadoop 118360126 Nov  3 18:05 spark-examples-1.5.2-hadoop2.6.0.jar
-rw-r--r-- 1 hduser hadoop 183993445 Nov  3 18:05 spark-assembly-1.5.2-hadoop2.6.0.jar
-rw-r--r-- 1 hduser hadoop   4433953 Nov  3 18:05 spark-1.5.2-yarn-shuffle.jar
-rw-r--r-- 1 hduser hadoop   1809447 Nov  3 18:05 datanucleus-rdbms-3.2.9.jar
-rw-r--r-- 1 hduser hadoop   1890075 Nov  3 18:05 datanucleus-core-3.2.10.jar
-rw-r--r-- 1 hduser hadoop    339666 Nov  3 18:05 datanucleus-api-jdo-3.2.6.jar

Mich Talebzadeh

Sybase ASE 15 Gold Medal Award 2008
A Winning Strategy: Running the most Critical Financial Data on ASE 15
http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf
Author of the books "A Practitioner’s Guide to Upgrading to Sybase ASE 15", ISBN 978-0-9563693-0-7.
co-author "Sybase Transact SQL Guidelines Best Practices", ISBN 978-0-9759693-0-4
Publications due shortly:
Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8
Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume one out shortly

http://talebzadehmich.wordpress.com

NOTE: The information in this email is proprietary and confidential. This message is for the designated recipient only; if you are not the intended recipient, you should destroy it immediately. Any information in this message shall not be understood as given or endorsed by Peridale Technology Ltd, its subsidiaries or their employees, unless expressly so stated. It is the responsibility of the recipient to ensure that this email is virus free; therefore neither Peridale Ltd, its subsidiaries nor their employees accept any responsibility.

-----Original Message-----
From: Marcelo Vanzin [mailto:vanzin@cloudera.com]
Sent: 03 December 2015 23:44
To: Mich Talebzadeh
Cc: user@hive.apache.org
Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

I spoke to Xuefu (Hive dev) and mentioned that this isn't really how it should be done.

In the meantime, if you can, you should use a Spark package that does not include Hive classes. There used to be an explicit one for that, but I can't find it. In the meantime, the tarball that says "pre-built with user-provided Hadoop" should work for your case.

On Thu, Dec 3, 2015 at 3:41 PM, Mich Talebzadeh <mich@peridale.co.uk> wrote:
> Just noticed that hive shell in 1.2.1 makes a reference to SPARK_HOME
> if it finds it:
>
> # add Spark assembly jar to the classpath
> if [[ -n "$SPARK_HOME" ]]
> then
>   sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`
>   CLASSPATH="${CLASSPATH}:${sparkAssemblyPath}"
> fi
>
> Is this expected?
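A note on the NoClassDefFoundError above, in case it helps anyone searching the archives: the "pre-built with user-provided Hadoop" package deliberately ships without the Hadoop jars and their dependencies such as slf4j (which would match the much smaller assembly listed above), so nothing logging-related is on the classpath until Spark is pointed at an existing Hadoop installation. A minimal sketch, assuming the SPARK_DIST_CLASSPATH mechanism described in the Spark "Hadoop free" build notes and a hadoop command on the PATH:

# conf/spark-env.sh
# Let a "user-provided Hadoop" Spark build pick up the Hadoop jars
# (and transitive dependencies such as slf4j) from the local install.
export SPARK_DIST_CLASSPATH=$(hadoop classpath)

With that in place, start-master.sh should resolve org/slf4j/Logger from the Hadoop distribution instead of failing at launch.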
> From: Mich Talebzadeh [mailto:mich@peridale.co.uk]
> Sent: 03 December 2015 19:46
> To: user@hive.apache.org; 'Marcelo Vanzin' <vanzin@cloudera.com>
> Subject: RE: Any clue on this error, Exception in thread "main"
> java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
>
> Hi,
>
> This is my CLASSPATH, which I have simplified, running with Hive 1.2.1
> and a generic build of Spark 1.3:
>
> unset CLASSPATH
> CLASSPATH=$HADOOP_HOME/share/hadoop/common/hadoop-common-2.6.0-tests.jar:$HADOOP_HOME/share/hadoop/common/hadoop-common-2.6.0.jar:hadoop-nfs-2.6.0.jar:$HIVE_HOME/lib:${SPARK_HOME}/lib
> echo $CLASSPATH
> export CLASSPATH
>
> CLASSPATH is now:
>
> /home/hduser/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0-tests.jar:/home/hduser/hadoop-2.6.0/share/hadoop/common/hadoop-common-2.6.0.jar:hadoop-nfs-2.6.0.jar:/usr/lib/hive/lib:/usr/lib/spark/lib
>
> However, I get the error. Does anyone have a working CLASSPATH for this?
> .spark.client.RemoteDriver /usr/lib/hive/lib/hive-exec-1.2.1.jar
> --remote-host rhes564 --remote-port 51642 --conf
> hive.spark.client.connect.timeout=1000 --conf
> hive.spark.client.server.connect.timeout=90000 --conf
> hive.spark.client.channel.log.level=null --conf
> hive.spark.client.rpc.max.size=52428800 --conf
> hive.spark.client.rpc.threads=8 --conf
> hive.spark.client.secret.bits=256
>
> 15/12/03 19:42:51 [stderr-redir-1]: INFO client.SparkClientImpl: Spark
> assembly has been built with Hive, including Datanucleus jars on
> classpath
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.rpc.threads=8
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.secret.bits=256
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: Warning:
> Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl: 15/12/03
> 19:42:52 INFO client.RemoteDriver: Connecting to: rhes564:51642
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:
> Exception in thread "main" java.lang.NoSuchFieldError:
> SPARK_RPC_CLIENT_CONNECT_TIMEOUT
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:146)
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> java.lang.reflect.Method.invoke(Method.java:606)
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
> 15/12/03 19:42:52 [stderr-redir-1]: INFO client.SparkClientImpl:        at
> org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>
> -----Original Message-----
> From: Mich Talebzadeh [mailto:mich@peridale.co.uk]
> Sent: 03 December 2015 19:02
> To: 'Marcelo Vanzin' <vanzin@cloudera.com>
> Cc: user@hive.apache.org; 'user' <user@spark.apache.org>
> Subject: RE: Any clue on this error, Exception in thread "main"
> java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
>
> Hi Marcelo.
>
> So this is the approach I am going to take:
>
> Use Spark 1.3 pre-built.
> Use Hive 1.2.1. Do not copy over anything from the Spark 1.3 libraries to add to the Hive libraries.
> Use Hadoop 2.6.
>
> There is no need to mess around with the libraries. I will try
> unsetting my CLASSPATH, resetting it, and trying again.
>
> Thanks,
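A side note on the plan quoted above: with a Spark build that does not bundle the Hive classes, the Hive side should only need a handful of settings rather than any CLASSPATH surgery. A minimal sketch, assuming the standard Hive-on-Spark property names from the getting-started guide and the master URL used earlier in this thread:

# Launch hive with Spark as the execution engine; no jar copying.
# Property names per the Hive-on-Spark getting-started guide.
hive --hiveconf hive.execution.engine=spark \
     --hiveconf spark.master=spark://rhes564:7077

The same properties can also be set per session with "set ..." inside the hive CLI, or permanently in hive-site.xml.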
> -----Original Message-----
> From: Marcelo Vanzin [mailto:vanzin@cloudera.com]
> Sent: 03 December 2015 18:45
> To: Mich Talebzadeh <mich@peridale.co.uk>
> Cc: user@hive.apache.org; user <user@spark.apache.org>
> Subject: Re: Any clue on this error, Exception in thread "main"
> java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
>
> On Thu, Dec 3, 2015 at 10:32 AM, Mich Talebzadeh <mich@peridale.co.uk>
> wrote:
>
>> hduser@rhes564::/usr/lib/spark/logs> hive --version
>> SLF4J: Found binding in
>> [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>
> As I suggested before, you have Spark's assembly in the Hive classpath.
> That's not the way to configure hive-on-spark; if the documentation
> you're following tells you to do that, it's wrong.
>
> (And sorry Ted, but please ignore Ted's suggestion. Hive-on-Spark
> should work fine with Spark 1.3 if it's configured correctly. You
> really don't want to be overriding Hive classes with the ones shipped
> in the Spark assembly, regardless of the version of Spark being used.)
>
> --
> Marcelo

--
Marcelo
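For anyone hitting the same NoSuchFieldError: a quick way to confirm the shadowing Marcelo describes is to look for Hive's RPC client classes inside the Spark assembly itself. A sketch, assuming the jar locations used in this thread; the jar tool ships with the JDK (unzip -l works too):

for j in /usr/lib/spark/lib/spark-assembly-*.jar
do
  # Any hits mean the assembly carries its own (possibly older) copies
  # of the Hive client classes and can shadow those from hive-exec.
  if jar tf "$j" | grep -q 'org/apache/hive/spark/client/rpc'
  then
    echo "Hive RPC classes bundled in: $j"
  fi
done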