spark-user mailing list archives

From Steve Loughran <ste...@hortonworks.com>
Subject Re: HiveContext: Unable to load AWS credentials from any provider in the chain
Date Thu, 09 Jun 2016 09:19:54 GMT

On 9 Jun 2016, at 06:17, Daniel Haviv <daniel.haviv@veracity-group.com> wrote:

Hi,
I've set these properties both in core-site.xml and hdfs-site.xml with no luck.

Thank you.
Daniel


That's not good.

I'm afraid I don't know what version of s3a is in the Cloudera release; I can see that the
Amazon stuff has been shaded, but I don't know about the Hadoop side and its auth.
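
One thing worth trying: set the keys programmatically and see whether the "unable to load
credentials" error goes away; that rules out the XML files simply not being loaded. A minimal
sketch, assuming the spark-shell's sc and placeholder key values:

  // Sketch only: set the s3a credentials straight on the Hadoop
  // configuration Spark uses, bypassing core-site.xml/hdfs-site.xml.
  // ACCESS_KEY / SECRET_KEY are placeholders for your real values.
  sc.hadoopConfiguration.set("fs.s3a.access.key", "ACCESS_KEY")
  sc.hadoopConfiguration.set("fs.s3a.secret.key", "SECRET_KEY")

  // HiveContext/SQLContext reads go through the same configuration.
  val lines = sc.textFile("s3a://your-bucket/some/path/")
  lines.take(5).foreach(println)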

Another thing: can you try using s3n rather than s3a? I do think s3a is now better (and will be
*really* good soon), but as s3n has been around for a long time, it's the baseline for functionality.
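
For that test, note that s3n uses differently named credential properties than s3a. A sketch,
again assuming the spark-shell's sc and placeholder keys:

  // Sketch only: s3n reads its own property names, not the fs.s3a.* ones.
  sc.hadoopConfiguration.set("fs.s3n.awsAccessKeyId", "ACCESS_KEY")
  sc.hadoopConfiguration.set("fs.s3n.awsSecretAccessKey", "SECRET_KEY")

  // Same data, read through the older s3n connector instead.
  val rdd = sc.textFile("s3n://your-bucket/some/path/")
  rdd.take(5).foreach(println)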

And I've just created some homework to do better logging of what's going on in the s3a driver,
though that bit of startup code in Spark might interfere. https://issues.apache.org/jira/browse/HADOOP-13252
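
Until that lands, you can at least turn up whatever logging the s3a code already has. A sketch
assuming log4j 1.x on the classpath (which Spark ships with):

  // Sketch only: raise the s3a package to DEBUG via the log4j 1.x API.
  import org.apache.log4j.{Level, Logger}
  Logger.getLogger("org.apache.hadoop.fs.s3a").setLevel(Level.DEBUG)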


There's not much else I can do, I'm afraid, not without patching your Hadoop source and rebuilding
things.

-Steve



