incubator-mesos-dev mailing list archives

From Du Li <>
Subject how to run other frameworks on Mesos
Date Thu, 20 Jun 2013 19:30:01 GMT

I set up Mesos (0.11.0) on my local cluster running Ubuntu 12.04. Now, without starting the Spark
(0.7.2) daemons, I was able to run Spark jobs directly on Mesos. However, to run Shark (0.7.0),
I still had to start the Spark master/slave daemons; otherwise, Shark exits with an IllegalStateException.
I have modified shark/conf/ to provide the Mesos library path. This seems reasonable,
since Shark eventually decomposes queries into Spark tasks. But just to confirm: do I really
need to start the Spark daemons to run Shark?
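For reference, a sketch of what I believe the Mesos-mode settings should look like. The file name (shark-env.sh), host name, and paths below are my guesses, not verified values; the idea is that in Spark 0.7.x the MASTER URL selects the scheduler backend, and the SparkDeploySchedulerBackend in the trace below is the standalone backend, which needs the Spark daemons:

```shell
# shark/conf/shark-env.sh (assumed file name and paths -- adjust for your install)

# JNI library so Spark can talk to Mesos directly
export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so

# Point at the Mesos master instead of a standalone spark://host:7077 URL,
# so the Mesos scheduler backend is used and no Spark daemons are required
export MASTER=mesos://mesos-master:5050

# Spark installation Shark should run against (assumed location)
export SPARK_HOME=/opt/spark-0.7.2
```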


shark> show tables;
show tables;
FAILED: Hive Internal Error: java.lang.IllegalStateException(Shutdown in progress)
shark> Exception in thread "Thread-1" java.util.concurrent.TimeoutException: Futures timed out after [5000] milliseconds
at akka.dispatch.DefaultPromise.ready(Future.scala:870)
at akka.dispatch.DefaultPromise.result(Future.scala:874)
at akka.dispatch.Await$.result(Future.scala:74)
at spark.deploy.client.Client.stop(Client.scala:117)
at spark.scheduler.cluster.SparkDeploySchedulerBackend.stop(SparkDeploySchedulerBackend.scala:43)
at spark.scheduler.cluster.ClusterScheduler.stop(ClusterScheduler.scala:254)
at spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:724)
at spark.SparkContext.stop(SparkContext.scala:532)
at shark.SharkEnv$.stop(SharkEnv.scala:115)
at shark.SharkCliDriver$$anon$
