mesos-user mailing list archives

From haosdent <haosd...@gmail.com>
Subject Re: Spark job run periodically through Chronos
Date Sun, 08 May 2016 17:03:19 GMT
The error suggests that your classpath for Spark is incorrect, so I think this
is a Spark usage problem.
You could take a look at
https://github.com/apache/spark/blob/master/bin/spark-class to check
whether the jars exist under "${SPARK_HOME}/jars", or get more information
by sending another email to user@spark.apache.org.
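As a minimal sketch of that check (the helper name and paths are assumptions, not from the thread; note that Spark 1.6 ships a single assembly jar under "$SPARK_HOME/lib", while current master keeps separate jars under "$SPARK_HOME/jars"):

```shell
# Hypothetical helper: report which candidate jar directories under a Spark
# install actually contain jars. A spark-class script builds its launch
# classpath from one of these, so both should be inspected.
find_spark_jars() {
  local home="$1" dir result=""
  for dir in "$home/jars" "$home/lib"; do
    # ls fails quietly when the glob matches nothing
    if ls "$dir"/*.jar >/dev/null 2>&1; then
      result="${result:+$result }$dir"
    fi
  done
  printf '%s\n' "$result"
}

# The class from the error message should then appear in one of the jars, e.g.:
#   unzip -l "$SPARK_HOME"/lib/spark-assembly-*.jar | grep launcher/Main
```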

On Wed, May 4, 2016 at 9:55 PM, Cecile, Adam <Adam.Cecile@hitec.lu> wrote:

> Hello,
>
>
> I went with the all-in-one TGZ built via Maven and it went fine, except the
> job cannot be run inside Mesos...
>
> Of course the path is correct and this command works as expected on my own
> computer... Can you see anything weird? Is mesos-slave not really exporting
> SPARK_HOME? A Java issue with the colons in the path?
>
>
> Thanks in advance,
>
>
> Adam.
>
>
> + bash -x spark-dist/spark-1.6.1-bin-hadoop2.6/bin/spark-submit --help
> + '[' -z '' ']'
> +++ dirname spark-dist/spark-1.6.1-bin-hadoop2.6/bin/spark-submit
> ++ cd spark-dist/spark-1.6.1-bin-hadoop2.6/bin/..
> ++ pwd
> + export SPARK_HOME=/tmp/mesos/slaves/9cb693bd-573e-44f3-9b99-a623a4b597ab-S2/frameworks/52c3180b-a1eb-4c79-8c8e-2599d59edcde-0000/executors/ct:1462369865658:0:radar-analysis-via-spark:/runs/4bb42868-fa7b-43a9-a7ca-5ccc8cf92d01/spark-dist/spark-1.6.1-bin-hadoop2.6
> + SPARK_HOME=/tmp/mesos/slaves/9cb693bd-573e-44f3-9b99-a623a4b597ab-S2/frameworks/52c3180b-a1eb-4c79-8c8e-2599d59edcde-0000/executors/ct:1462369865658:0:radar-analysis-via-spark:/runs/4bb42868-fa7b-43a9-a7ca-5ccc8cf92d01/spark-dist/spark-1.6.1-bin-hadoop2.6
> + export PYTHONHASHSEED=0
> + PYTHONHASHSEED=0
> + exec /tmp/mesos/slaves/9cb693bd-573e-44f3-9b99-a623a4b597ab-S2/frameworks/52c3180b-a1eb-4c79-8c8e-2599d59edcde-0000/executors/ct:1462369865658:0:radar-analysis-via-spark:/runs/4bb42868-fa7b-43a9-a7ca-5ccc8cf92d01/spark-dist/spark-1.6.1-bin-hadoop2.6/bin/spark-class org.apache.spark.deploy.SparkSubmit --help
> Error: Could not find or load main class org.apache.spark.launcher.Main
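One plausible reading of that error, consistent with Adam's own question about the path: the Chronos task name puts literal ':' characters into the executor sandbox directory, and ':' is also the Java classpath separator on Unix. A sketch with a hypothetical path (not the real sandbox path above) shows how such an entry gets split:

```shell
# Hypothetical classpath entry mimicking a Chronos executor sandbox, where the
# colons come from the task name "ct:<timestamp>:<attempt>:<job>":
cp_entry='/tmp/sandbox/ct:1462369865658:0:myjob:/spark/lib/launcher.jar'

# Split it the way the JVM splits -cp arguments on Unix:
IFS=':' read -ra parts <<< "$cp_entry"
printf '%s\n' "${parts[@]}"

# Each printed line is treated as a separate classpath entry, none of which
# points at the real jar, hence "Could not find or load main class".
```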
>
>
> ------------------------------
> *From:* Cecile, Adam
> *Sent:* Wednesday, 4 May 2016 09:13
> *To:* user
> *Subject:* RE: Spark job run periodically through Chronos
>
>
> That's another option indeed ;)
> ------------------------------
> *From:* Shuai Lin <linshuai2012@gmail.com>
> *Sent:* Wednesday, 4 May 2016 09:12
> *To:* user
> *Subject:* Re: Spark job run periodically through Chronos
>
> Why not use Docker? You can build a Docker image for your app, and the
> image can be based on https://hub.docker.com/r/mesosphere/spark/.
>
> On Wed, May 4, 2016 at 2:53 PM, Cecile, Adam <Adam.Cecile@hitec.lu> wrote:
>
>> Hello,
>>
>>
>> I need to run a Spark job every N minutes and I'm wondering what the
>> easiest/proper way to do that is. I was thinking about using Chronos, but
>> if I do so, it seems I'll have to use spark-submit anyway.
>>
>>
>> Should I:
>>
>> * Bundle the spark-submit tool along with my app
>>
>> * Make sure the spark-submit command is available on every mesos-slave
>>
>> * Forget all this and do something better
>>
>>
>> Thanks in advance,
>>
>>
>> Best regards, Adam.
>>
>>
>>
>>
>
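To make the Docker route concrete for the original "every N minutes" question, here is a hypothetical Chronos job definition (the host name, image tag, masters URL, jar path, and resource numbers are all assumptions to adjust for your cluster) that runs spark-submit inside a mesosphere/spark container on a 10-minute schedule:

```shell
# Sketch of a Chronos job spec: an ISO 8601 repeating schedule (R/.../PT10M),
# a Docker container based on the mesosphere/spark image, and the spark-submit
# command to run inside it. All concrete values here are placeholders.
chronos_job='{
  "name": "spark-every-10-min",
  "schedule": "R/2016-05-04T00:00:00Z/PT10M",
  "container": {"type": "DOCKER", "image": "mesosphere/spark:1.6.1"},
  "cmd": "/opt/spark/bin/spark-submit --master mesos://zk://zk1:2181/mesos --class com.example.MyJob /opt/app/my-job.jar",
  "cpus": 1,
  "mem": 1024
}'

# Submit it to the Chronos REST API (hypothetical host):
#   curl -X POST -H 'Content-Type: application/json' \
#        -d "$chronos_job" http://chronos.example.com:4400/scheduler/iso8601
echo "$chronos_job"
```

This sidesteps both options from the original question: spark-submit lives in the image, so nothing needs to be bundled with the app or pre-installed on every mesos-slave.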


-- 
Best Regards,
Haosdent Huang
