spark-dev mailing list archives

From Zhan Zhang <zzh...@hortonworks.com>
Subject Re: spark-shell 1.5 doesn't seem to work in local mode
Date Sun, 20 Sep 2015 03:29:22 GMT
It does not matter whether you start Spark in local mode or any other mode. If you have an hdfs-site.xml
somewhere and the Spark configuration points to that config, you will read/write to HDFS.
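A quick way to see what the shell will actually do is to ask the Hadoop configuration that Spark loaded. This is just a sketch for spark-shell (the paths are made up); `sc` is the SparkContext the shell creates for you.

// Paste into spark-shell; `sc` is the SparkContext the shell provides.

// Which filesystem do unqualified paths resolve against?
// "file:///" means the local filesystem; an "hdfs://..." value means an
// hdfs-site.xml / core-site.xml was picked up (e.g. via HADOOP_CONF_DIR).
println(sc.hadoopConfiguration.get("fs.defaultFS", "file:///"))

// A fully qualified URI always wins over the default, so you can force
// the local filesystem explicitly (these paths are hypothetical):
val lines = sc.textFile("file:///tmp/input.txt")
lines.saveAsTextFile("file:///tmp/output")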

Thanks.

Zhan Zhang

________________________________________
From: Madhu <madhu@madhu.com>
Sent: Saturday, September 19, 2015 12:14 PM
To: dev@spark.apache.org
Subject: Re: spark-shell 1.5 doesn't seem to work in local mode

Thanks guys.

I do have HADOOP_INSTALL set, but Spark 1.4.1 did not seem to mind.
Seems like there's a difference in behavior between 1.5.0 and 1.4.1 for some
reason.

To the best of my knowledge, I just downloaded each tgz and untarred them in
/opt. I adjusted my PATH to point to one or the other, but that should be
about it.

Does 1.5.0 pick up HADOOP_INSTALL?
Wouldn't spark-shell --master local override that?
1.5 seemed to completely ignore --master local
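
(A quick sanity check to separate the two questions; paste into spark-shell, which provides `sc`.)

// Confirm which master the shell actually connected to;
// in local mode this should print "local" or "local[*]".
println(sc.master)

// Check whether Hadoop-related variables are visible to the shell.
// If HADOOP_CONF_DIR points at a directory containing hdfs-site.xml,
// unqualified paths can resolve to HDFS even with --master local.
println(sys.env.get("HADOOP_CONF_DIR"))
println(sys.env.get("HADOOP_INSTALL"))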



--
Madhu
https://www.linkedin.com/in/msiddalingaiah

View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/spark-shell-1-5-doesn-t-seem-to-work-in-local-mode-tp14212p14217.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: dev-unsubscribe@spark.apache.org
For additional commands, e-mail: dev-help@spark.apache.org




