spark-dev mailing list archives

From Reynold Xin <r...@databricks.com>
Subject Re: [Newbie] spark conf
Date Fri, 10 Feb 2017 21:36:46 GMT
You can put them in Spark's own conf/spark-defaults.conf file.
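For example, a minimal sketch (assuming you read from S3 via the s3a
connector; the key values below are placeholders). Spark copies any property
prefixed with spark.hadoop. into the Hadoop Configuration, so core-site.xml
entries can move over like this:

    spark.hadoop.fs.s3a.access.key    YOUR_ACCESS_KEY
    spark.hadoop.fs.s3a.secret.key    YOUR_SECRET_KEY

Other application settings can live in the same file as long as they start
with the spark. prefix (anything else is ignored with a warning), and your
app can read them back from its SparkConf.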

On Fri, Feb 10, 2017 at 10:35 PM, Sam Elamin <hussam.elamin@gmail.com>
wrote:

> Hi All,
>
>
> Really newbie question here, folks: I have properties such as my AWS access
> and secret keys in Hadoop's core-site.xml, among other properties, but
> that's the only reason I have Hadoop installed, which seems like overkill.
>
> Is there an equivalent of core-site.xml for Spark, so I don't have to
> reference HADOOP_CONF_DIR in my spark-env.sh?
>
> I know I can export environment variables for the AWS credentials, but what
> about other properties that my application might want to use?
>
> Regards
> Sam
