spark-user mailing list archives

From Marcelo Vanzin <van...@cloudera.com>
Subject Re: [Spark-SQL]: Unable to propagate hadoop configuration after SparkContext is initialized
Date Tue, 27 Oct 2015 18:05:03 GMT
On Tue, Oct 27, 2015 at 10:43 AM, Jerry Lam <chilinglam@gmail.com> wrote:
> Has anyone experienced issues setting the Hadoop configuration after the
> SparkContext is initialized? I'm using Spark 1.5.1.
>
> I'm trying to use s3a, which requires the access and secret keys to be set
> in the Hadoop configuration. I tried to set the properties on the Hadoop
> configuration from the SparkContext.
>
> sc.hadoopConfiguration.set("fs.s3a.access.key", AWSAccessKeyId)
> sc.hadoopConfiguration.set("fs.s3a.secret.key", AWSSecretKey)

Try setting "spark.hadoop.fs.s3a.access.key" and
"spark.hadoop.fs.s3a.secret.key" in your SparkConf before creating the
SparkContext.
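
For example, a minimal sketch (AWSAccessKeyId / AWSSecretKey are the
placeholders from your snippet, read here from environment variables for
illustration):

    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholders from the quoted snippet; any credential source works.
    val AWSAccessKeyId = sys.env("AWS_ACCESS_KEY_ID")
    val AWSSecretKey = sys.env("AWS_SECRET_ACCESS_KEY")

    val conf = new SparkConf()
      .setAppName("s3a-example")
      .setMaster("local[*]") // just for a local test
      // Spark copies every "spark.hadoop.*" property into the Hadoop
      // Configuration it builds, so these become fs.s3a.* settings before
      // the first S3A FileSystem instance is created.
      .set("spark.hadoop.fs.s3a.access.key", AWSAccessKeyId)
      .set("spark.hadoop.fs.s3a.secret.key", AWSSecretKey)

    val sc = new SparkContext(conf)

That way the keys are already in place when executors build their Hadoop
configuration, instead of being set only on the driver's copy afterwards.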

-- 
Marcelo


