From: Judy Nash <judynash@exchange.microsoft.com>
To: Cheng Lian, "user@spark.incubator.apache.org"
Subject: RE: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava
Date: Mon, 24 Nov 2014 22:25:52 +0000
Message-ID: <8ee4315788de4a43b5450f00f201e7c1@DFM-DB3MBX15-08.exchange.corp.microsoft.com>
In-Reply-To: <54700AC6.10305@gmail.com>
Delivered-To: mailing list user@spark.apache.org

Thank you Cheng for responding.

Here is the commit SHA1 on the 1.2 branch I saw this failure in:

commit 6f70e0295572e3037660004797040e026e440dbd
Author: zsxwing <zsxwing@gmail.com>
Date:   Fri Nov 21 00:42:43 2014 -0800

    [SPARK-4472][Shell] Print "Spark context available as sc." only when SparkContext is created...

    ... successfully

    It's weird that printing "Spark context available as sc" when creating SparkContext unsuccessfully.

Let me know if you need anything else.

From: Cheng Lian [mailto:lian.cs.zju@gmail.com]
Sent: Friday, November 21, 2014 8:02 PM
To: Judy Nash; user@spark.incubator.apache.org
Subject: Re: latest Spark 1.2 thrift server fail with NoClassDefFoundError on Guava

Hi Judy, could you please provide the commit SHA1 of the version you're using? Thanks!

On 11/22/14 11:05 AM, Judy Nash wrote:

Hi,

The Thrift server is failing to start for me on the latest Spark 1.2 branch.

I get the error below when I start the thrift server:

Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
        at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314) ...

Here is my setup:
1) Latest Spark 1.2 branch build
2) Build command used:
   mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Phive -Phive-thriftserver -DskipTests clean package
3) Added hive-site.xml to \conf
4) Versions on the box: Hive 0.13, Hadoop 2.4

Is this a real bug or am I doing something wrong?
-----------------------------------
Full Stacktrace:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/base/Preconditions
        at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:314)
        at org.apache.hadoop.conf.Configuration$DeprecationDelta.<init>(Configuration.java:327)
        at org.apache.hadoop.conf.Configuration.<clinit>(Configuration.java:409)
        at org.apache.spark.deploy.SparkHadoopUtil.newConfiguration(SparkHadoopUtil.scala:82)
        at org.apache.spark.deploy.SparkHadoopUtil.<init>(SparkHadoopUtil.scala:42)
        at org.apache.spark.deploy.SparkHadoopUtil$.<init>(SparkHadoopUtil.scala:202)
        at org.apache.spark.deploy.SparkHadoopUtil$.<clinit>(SparkHadoopUtil.scala)
        at org.apache.spark.util.Utils$.getSparkOrYarnConfig(Utils.scala:1784)
        at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:105)
        at org.apache.spark.storage.BlockManager.<init>(BlockManager.scala:180)
        at org.apache.spark.SparkEnv$.create(SparkEnv.scala:292)
        at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:159)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:230)
        at org.apache.spark.sql.hive.thriftserver.SparkSQLEnv$.init(SparkSQLEnv.scala:38)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2$.main(HiveThriftServer2.scala:56)
        at org.apache.spark.sql.hive.thriftserver.HiveThriftServer2.main(HiveThriftServer2.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:353)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.google.common.base.Preconditions
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
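
A quick way to narrow down this kind of NoClassDefFoundError is to check whether Guava's Preconditions class is visible at all on the classpath the thrift server runs with. The snippet below is only a minimal diagnostic sketch and not part of the original thread: the object name GuavaCheck is made up for illustration, and it assumes the class is compiled into a small jar and launched against the same Spark assembly (for example with spark-submit so the driver classpath matches).

// Minimal diagnostic sketch (hypothetical; GuavaCheck is not part of Spark):
// it tries to load the Guava class whose absence causes the failure above
// and reports where it was loaded from, if anywhere.
object GuavaCheck {
  def main(args: Array[String]): Unit = {
    try {
      val cls = Class.forName("com.google.common.base.Preconditions")
      val source = Option(cls.getProtectionDomain.getCodeSource)
        .map(cs => String.valueOf(cs.getLocation))
        .getOrElse("unknown location")
      println(s"Found ${cls.getName} in " + source)
    } catch {
      case _: ClassNotFoundException =>
        // Same root cause as the "Caused by" section of the stack trace above.
        println("com.google.common.base.Preconditions is NOT on the classpath")
    }
  }
}

If the class cannot be loaded this way either, that points at the assembly build or the classpath it is launched with, rather than at the thrift server itself.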