livy-user mailing list archives

From Junaid Nasir <jna...@an10.io>
Subject Livy with DC/OS Mesos
Date Wed, 25 Oct 2017 10:55:47 GMT
Hi,
I am trying to run Livy with DC/OS Mesos. It tries to create the driver, but that
fails with the error below. Is there a way I can provide it with the spark-internal file?
I am unable to find what this spark-internal file does, and I cannot find it
anywhere. The only thing I found related to it is in ContextLauncher.java, line
232: `launcher.setAppResource("spark-internal");`

Failed to fetch spark-internal
I1025 10:41:45.202145 27901 fetcher.cpp:531] Fetcher Info: {"cache_directory":"\/tmp\/mesos\/fetch\/slaves\/babc8653-5a13-4fca-9fd9-fe2870f9ad50-S14","items":[{"action":"BYPASS_CACHE","uri":{"cache":false,"extract":true,"value":"spark-internal"}}],"sandbox_directory":"\/var\/lib\/mesos\/slave\/slaves\/babc8653-5a13-4fca-9fd9-fe2870f9ad50-S14\/frameworks\/babc8653-5a13-4fca-9fd9-fe2870f9ad50-0003\/executors\/driver-20171025104145-0034\/runs\/9976b5de-0ed7-428e-a23c-59056c6f29ed"}
I1025 10:41:45.204314 27901 fetcher.cpp:442] Fetching URI 'spark-internal'
I1025 10:41:45.204329 27901 fetcher.cpp:283] Fetching directly into the sandbox directory
I1025 10:41:45.204346 27901 fetcher.cpp:220] Fetching URI 'spark-internal'
Failed to fetch 'spark-internal': A relative path was passed for the resource but the Mesos framework home was not specified. Please either provide this config option or avoid using a relative path


spark-submit itself works fine; I am able to run a job with the following command:

bin/spark-submit --class org.apache.spark.examples.SparkPi \
  --master mesos://10.128.0.23:22401 \
  --deploy-mode cluster \
  --supervise \
  --executor-memory 1G \
  --total-executor-cores 1 \
  --conf spark.mesos.executor.docker.image=mesosphere/spark:2.0.1-2.2.0-1-hadoop-2.6 \
  --conf spark.executor.home=/opt/spark/dist \
  https://downloads.mesosphere.com/spark/examples/pi.py 30
In Livy I have set the Spark master to the same IP and set cluster mode. I am
sending a POST request to create the session with these parameters:

data = {'kind': 'spark',
        'conf': {'spark.mesos.executor.docker.image': 'mesosphere/spark:2.0.1-2.2.0-1-hadoop-2.6',
                 'spark.executor.home': '/opt/spark/dist'}}
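
For reference, here is roughly how I am sending that request (a minimal sketch
using Python's requests library; the Livy host is a placeholder for my setup,
and 8998 is the default Livy port):

import json
import requests

# Placeholder Livy endpoint; adjust host/port for the actual deployment.
LIVY_URL = 'http://<livy-host>:8998'

data = {
    'kind': 'spark',
    'conf': {
        'spark.mesos.executor.docker.image': 'mesosphere/spark:2.0.1-2.2.0-1-hadoop-2.6',
        'spark.executor.home': '/opt/spark/dist',
    },
}

# POST /sessions asks Livy to create a new interactive session; the
# driver launch that follows is where the spark-internal fetch fails.
resp = requests.post(LIVY_URL + '/sessions',
                     data=json.dumps(data),
                     headers={'Content-Type': 'application/json'})
print(resp.status_code, resp.json())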
Any help would be highly appreciated.