Subject: Spark 2.0.0 against Hadoop 2.7.2 - spark-shell error
From: "Ashish Kumar9" <ashishk4@in.ibm.com>
To: user@spark.apache.org, user@hadoop.apache.org
Date: Thu, 15 Sep 2016 17:53:28 +0530
I have built Spark 2.0.0 against Hadoop 2.7.2 and the build was successful. Hadoop is running fine as well. However, when I run Spark I get the runtime exception below.

Please suggest the required classpath settings. I have included all the Hadoop libraries in the classpath and I still get this error.

[hadoop@sys-77402 sbin]$ cd $SPARK_HOME
[hadoop@sys-77402 spark-2.0.0-bin-spark-2.0.0-hadoop2.7-ppc64le]$ spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:38)
        at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:36)
        at org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:120)
        at org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:236)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2245)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2245)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2245)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:297)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:101)
        ... 47 elided
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 62 more
<console>:14: error: not found: value spark
       import spark.implicits._
              ^
<console>:14: error: not found: value spark
       import spark.sql
              ^
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0
      /_/

Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_65)
Type in expressions to have them evaluated.

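A note for anyone hitting the same ClassNotFoundException: the missing class, org.apache.commons.configuration.Configuration, is a Hadoop dependency rather than a Spark one, so the fix is usually to make Hadoop's own jars visible to Spark. Below is a minimal sketch of that wiring, assuming a standard Hadoop 2.7.x layout where commons-configuration-1.6.jar sits under share/hadoop/common/lib (the /opt path is only illustrative):

```shell
# Illustrative install location; adjust to your environment.
HADOOP_HOME="${HADOOP_HOME:-/opt/hadoop-2.7.2}"

# In Hadoop 2.7.x the class in question ships in this jar:
CC_JAR="$HADOOP_HOME/share/hadoop/common/lib/commons-configuration-1.6.jar"

# Option 1: give Spark the whole Hadoop classpath (e.g. in conf/spark-env.sh).
# `hadoop classpath` prints every directory and jar the Hadoop scripts use;
# guarded here so the snippet still runs where hadoop is not installed.
if command -v hadoop >/dev/null 2>&1; then
  export SPARK_DIST_CLASSPATH="$(hadoop classpath)"
fi

# Option 2: pass just the one jar to the driver:
#   spark-shell --driver-class-path "$CC_JAR"
echo "$CC_JAR"
```

Option 1 is the approach the Spark docs describe for "Hadoop free" builds, and it also covers any further Hadoop-side classes Spark may need at runtime; option 2 is the narrowest fix for this particular error.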