spark-issues mailing list archives

From "Littlestar (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (SPARK-6461) spark.executorEnv.PATH in spark-defaults.conf is not passed to Mesos
Date Mon, 23 Mar 2015 09:30:11 GMT

    [ https://issues.apache.org/jira/browse/SPARK-6461?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14375468#comment-14375468 ]

Littlestar edited comment on SPARK-6461 at 3/23/15 9:29 AM:
------------------------------------------------------------

Each Mesos slave node has Java and a Hadoop DataNode installed.

I have now added the following settings to mesos-master-env.sh and mesos-slave-env.sh:

export MESOS_JAVA_HOME=/home/test/jdk
export MESOS_HADOOP_HOME=/home/test/hadoop-2.4.0
export MESOS_HADOOP_CONF_DIR=/home/test/hadoop-2.4.0/etc/hadoop
export MESOS_PATH=/home/test/jdk/bin:/home/test/hadoop-2.4.0/sbin:/home/test/hadoop-2.4.0/bin:/sbin:/bin:/usr/sbin:/usr/bin

With these settings I get:

/usr/bin/env: bash: No such file or directory

thanks.
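For reference, the executor-side settings this report says are not forwarded would look like the following in spark-defaults.conf (a sketch only; the property names are the ones listed in the issue below, and the /home/test paths mirror the locations used above):

```
spark.executorEnv.JAVA_HOME    /home/test/jdk
spark.executorEnv.HADOOP_HOME  /home/test/hadoop-2.4.0
spark.executorEnv.PATH         /home/test/jdk/bin:/home/test/hadoop-2.4.0/bin:/sbin:/bin:/usr/sbin:/usr/bin
```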



was (Author: cnstar9988):
Each Mesos slave node has Java and a Hadoop DataNode installed.

I also added the following settings to mesos-master-env.sh and mesos-slave-env.sh:

export MESOS_JAVA_HOME=/home/test/jdk
export MESOS_HADOOP_HOME=/home/test/hadoop-2.4.0
export MESOS_HADOOP_CONF_DIR=/home/test/hadoop-2.4.0/etc/hadoop
export MESOS_PATH=/home/test/jdk/bin:/home/test/hadoop-2.4.0/sbin:/home/test/hadoop-2.4.0/bin:/sbin:/bin:/usr/sbin:/usr/bin

With these settings I get:

/usr/bin/env: bash: No such file or directory

thanks.


> spark.executorEnv.PATH in spark-defaults.conf is not passed to Mesos
> ------------------------------------------------------------------
>
>                 Key: SPARK-6461
>                 URL: https://issues.apache.org/jira/browse/SPARK-6461
>             Project: Spark
>          Issue Type: Bug
>          Components: Scheduler
>    Affects Versions: 1.3.0
>            Reporter: Littlestar
>
> I ran Spark 1.3.0's ./run-example SparkPi on Mesos, but it failed.
> The following spark.executorEnv.* settings in spark-defaults.conf are not passed to the Mesos executors:
> spark.executorEnv.PATH
> spark.executorEnv.HADOOP_HOME
> spark.executorEnv.JAVA_HOME
> E0323 14:24:36.400635 11355 fetcher.cpp:109] HDFS copyToLocal failed: hadoop fs -copyToLocal 'hdfs://192.168.1.9:54310/home/test/spark-1.3.0-bin-2.4.0.tar.gz' '/home/mesos/work_dir/slaves/20150323-100710-1214949568-5050-3453-S3/frameworks/20150323-133400-1214949568-5050-15440-0007/executors/20150323-100710-1214949568-5050-3453-S3/runs/915b40d8-f7c4-428a-9df8-ac9804c6cd21/spark-1.3.0-bin-2.4.0.tar.gz'
> sh: hadoop: command not found



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

