spark-dev mailing list archives

From Timothy Chen <tnac...@gmail.com>
Subject Re: Mesos and No transport is loaded for protocol
Date Mon, 06 Jun 2016 20:58:56 GMT
Hi,

How did you package the spark.tgz, and are you running the same code that you packaged when
you ran spark-submit?

And what do your Spark settings look like?
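For reference, the Mesos-related Spark settings in question typically live in `conf/spark-defaults.conf`. A minimal sketch, using the standard Spark-on-Mesos property names; the master address and executor URI below are placeholders, not values taken from this thread:

```properties
# Hypothetical spark-defaults.conf for running Spark on Mesos.
# <mesos-master-host> and the download URL are placeholders.
spark.master          mesos://<mesos-master-host>:5050
spark.executor.uri    http://d3kbcqa49mib13.cloudfront.net/spark-1.5.1-bin-hadoop2.6.tgz
```

The `spark.executor.uri` package is what each Mesos agent downloads and unpacks to run executors, so it needs to match the Spark version the driver is running.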

Tim


> On Jun 6, 2016, at 12:13 PM, thibaut <thibaut.gensollen@gmail.com> wrote:
> 
> Hi there,
> 
> I am trying to configure Spark to run on top of Mesos, but every time I submit a job it fails. I can see Mesos downloading the spark.tgz correctly, but I get these errors at the end:
> 
> 
> Any ideas? I did not find anything that solves my issue. Is it my cluster? Spark? Both? Thanks in advance.
> Thibaut
> 
> I0606 15:06:35.628329 16520 fetcher.cpp:456] Fetched 'http://d3kbcqa49mib13.cloudfront.net/spark-1.5.1-bin-hadoop2.6.tgz' to '/tmp/mesos/slaves/c58064f7-88b6-438d-b76f-fc28c6cc51a1-S0/frameworks/c58064f7-88b6-438d-b76f-fc28c6cc51a1-0079/executors/3/runs/23913146-d87f-445c-9f6b-f412ad2cbbd7/spark-1.5.1-bin-hadoop2.6.tgz'
> I0606 15:06:35.687414 16527 exec.cpp:143] Version: 0.28.1
> I0606 15:06:35.691270 16540 exec.cpp:217] Executor registered on slave c58064f7-88b6-438d-b76f-fc28c6cc51a1-S0
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> 16/06/06 15:06:36 INFO CoarseGrainedExecutorBackend: Registered signal handlers for [TERM, HUP, INT]
> 16/06/06 15:06:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 16/06/06 15:06:37 INFO SecurityManager: Changing view acls to: thibautg
> 16/06/06 15:06:37 INFO SecurityManager: Changing modify acls to: thibautg
> 16/06/06 15:06:37 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(thibautg); users with modify permissions: Set(thibautg)
> 16/06/06 15:06:37 INFO Slf4jLogger: Slf4jLogger started
> 16/06/06 15:06:38 INFO Remoting: Starting remoting
> 16/06/06 15:06:38 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://driverPropsFetcher@141.213.4.119:56419]
> 16/06/06 15:06:38 INFO Utils: Successfully started service 'driverPropsFetcher' on port 56419.
> Exception in thread "main" akka.remote.RemoteTransportException: No transport is loaded for protocol: [spark], available protocols: [akka.tcp]
> 	at akka.remote.Remoting$.localAddressForRemote(Remoting.scala:87)
> 	at akka.remote.Remoting.localAddressForRemote(Remoting.scala:129)
> 	at akka.remote.RemoteActorRefProvider.rootGuardianAt(RemoteActorRefProvider.scala:338)
> 	at akka.actor.ActorRefFactory$class.actorSelection(ActorRefProvider.scala:318)
> 	at akka.actor.ActorSystem.actorSelection(ActorSystem.scala:272)
> 	at org.apache.spark.rpc.akka.AkkaRpcEnv.asyncSetupEndpointRefByURI(AkkaRpcEnv.scala:216)
> 	at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:98)
> 	at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:162)
> 	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:69)
> 	at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:68)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:422)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
> 	at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:68)
> 	at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:149)
> 	at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:250)
> 	at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
> 
