spark-user mailing list archives

From Raghuveer Chanda <>
Subject Spark on Yarn
Date Wed, 21 Oct 2015 10:33:11 GMT
Hi all,

I am trying to run Spark on YARN in the Cloudera QuickStart VM. It already has
Spark 1.3 and Hadoop 2.6.0-cdh5.4.0 installed. (I am not using spark-submit,
since I want to run a different version of Spark.)
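One thing worth checking in this setup (a hedged suggestion, not something from the post itself): when submitting to YARN without spark-submit, the intended assembly jar usually has to be named explicitly via spark.yarn.jar, or the containers may fall back to whatever Spark jars the cluster ships. A sketch of the relevant setting, with a placeholder path:

```
# Assumed config fragment -- the assembly path below is a placeholder,
# not taken from the original post.
spark.yarn.jar  hdfs:///user/cloudera/spark-assembly-1.4.0-hadoop2.6.0.jar
```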

I am able to run Spark 1.3 on YARN, but I get the error below with Spark 1.4.

The log shows it is running Spark 1.4, yet it fails on a method that exists in
1.4 but not in 1.3. Even the fat jar contains the 1.4 class files.

As far as running on YARN goes, the installed Spark version shouldn't matter,
but it still seems to be picking up the other version.
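A quick way to confirm which jar is actually winning on the container classpath (a diagnostic sketch, not from the original post) is to ask the classloader where it resolves a suspect class from. The class name below defaults to java.lang.String so the snippet runs anywhere; in this scenario one would pass a Spark class such as org/apache/spark/SparkConf.class instead:

```java
// Diagnostic sketch: print the location a class is actually loaded from,
// to spot version mix-ups between the fat jar and the cluster's Spark jars.
public class WhichJar {
    public static void main(String[] args) {
        // Resource path of the class to locate; defaults to a JDK class so
        // this runs anywhere. Pass e.g. org/apache/spark/SparkConf.class
        // on a real Spark classpath.
        String cls = args.length > 0 ? args[0] : "java/lang/String.class";
        java.net.URL url = WhichJar.class.getClassLoader().getResource(cls);
        System.out.println(cls + " -> "
                + (url == null ? "NOT FOUND on classpath" : url));
    }
}
```

If the printed path points at the cluster's installed Spark 1.3 jars rather than the fat jar, that would explain the mismatch.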

*Hadoop Version:*
Hadoop 2.6.0-cdh5.4.0
Subversion -r
Compiled by jenkins on 2015-04-21T19:18Z
Compiled with protoc 2.5.0
From source with checksum cd78f139c66c13ab5cee96e15a629025
This command was run using /usr/lib/hadoop/hadoop-common-2.6.0-cdh5.4.0.jar

Log Upload Time:Tue Oct 20 21:58:56 -0700 2015
Log Contents:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
SLF4J: Found binding in
SLF4J: See for an
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/10/20 21:58:50 INFO spark.SparkContext: *Running Spark version 1.4.0*
15/10/20 21:58:53 INFO spark.SecurityManager: Changing view acls to: yarn
15/10/20 21:58:53 INFO spark.SecurityManager: Changing modify acls to: yarn
15/10/20 21:58:53 INFO spark.SecurityManager: SecurityManager:
authentication disabled; ui acls disabled; users with view permissions:
Set(yarn); users with modify permissions: Set(yarn)
*Exception in thread "main" java.lang.NoSuchMethodError:;)J*
at org.apache.spark.util.Utils$.timeStringAsSeconds(Utils.scala:1027)
at org.apache.spark.SparkConf.getTimeAsSeconds(SparkConf.scala:194)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun$1.apply(AkkaUtils.scala:53)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1982)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.rpc.akka.AkkaRpcEnvFactory.create(AkkaRpcEnv.scala:245)
at org.apache.spark.rpc.RpcEnv$.create(RpcEnv.scala:52)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:247)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:188)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:267)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
at com.hortonworks.simpleyarnapp.HelloWorld.main(
15/10/20 21:58:53 INFO util.Utils: Shutdown hook called
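A NoSuchMethodError like the one above typically means the class that was resolved at runtime is an older version than the one the caller was compiled against. Whether the loaded class really has the expected method can be checked with reflection (an illustrative sketch, not from the post; the default class/method names here are stand-ins):

```java
// Sketch: verify at runtime that a loaded class actually declares the
// method the caller expects -- useful for confirming version skew behind
// a NoSuchMethodError. Defaults are illustrative JDK names.
public class MethodCheck {
    public static void main(String[] args) throws Exception {
        String clsName = args.length > 0 ? args[0] : "java.lang.String";
        String wanted  = args.length > 1 ? args[1] : "length";
        Class<?> c = Class.forName(clsName);
        boolean found = java.util.Arrays.stream(c.getMethods())
                .anyMatch(m -> m.getName().equals(wanted));
        System.out.println(clsName + (found ? " has " : " is missing ") + wanted);
    }
}
```

Running it inside the same YARN container classpath against the Spark class named in the error would show whether the 1.3 version of that class is shadowing the 1.4 one.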

Please help :)

Regards and Thanks,
Raghuveer Chanda
