hadoop-hdfs-user mailing list archives

From "Phillips, Caleb" <Caleb.Phill...@nrel.gov>
Subject fs.s3a.endpoint not working
Date Tue, 22 Dec 2015 20:39:32 GMT
Hi All,

New to this list. Looking for a bit of help:

I'm having trouble connecting Hadoop to an S3-compatible (non-AWS) object store.

This issue was discussed, but left unresolved, in this thread:


And here, on Cloudera's forums (the second post is mine):


I'm running Hadoop 2.6.3 with Java 1.8.0_65 on a Linux host. Using Hadoop, I'm able to connect
to S3 on AWS and, e.g., list/put/get files.

However, when I point the fs.s3a.endpoint configuration directive at my non-AWS S3-compatible
object store, it appears to still point at (and authenticate against) AWS.
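For reference, the configuration takes roughly this shape in core-site.xml (the endpoint hostname and credentials below are placeholders, not my actual values):

```xml
<!-- core-site.xml: fs.s3a.* settings for a non-AWS S3-compatible store.
     Hostname and keys are placeholders. -->
<property>
  <name>fs.s3a.endpoint</name>
  <value>objectstore.example.com</value>
</property>
<property>
  <name>fs.s3a.access.key</name>
  <value>MY_ACCESS_KEY</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>MY_SECRET_KEY</value>
</property>
```

With that in place, a command like `hadoop fs -ls s3a://mybucket/` still resolves to AWS rather than the configured endpoint.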

I've checked and double-checked my credentials and configuration using both Python's boto
library and the s3cmd tool, both of which connect to this non-AWS object store just fine.

Any help would be much appreciated. Thanks!

Caleb Phillips, Ph.D.
Data Scientist | Computational Science Center

National Renewable Energy Laboratory (NREL)
15013 Denver West Parkway | Golden, CO 80401
303-275-4297 | caleb.phillips@nrel.gov

To unsubscribe, e-mail: user-unsubscribe@hadoop.apache.org
For additional commands, e-mail: user-help@hadoop.apache.org
