Stupid question, but I assume you're using a URL that starts with s3a:// and that your custom endpoint supports the S3 API that s3a expects?
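
(For reference, an s3a path looks like the following; the bucket name here is just a placeholder:)

    s3a://my-bucket/path/to/object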

William Watson
Lead Software Engineer

On Thu, Jan 14, 2016 at 1:57 PM, Alexander Pivovarov <apivovarov@gmail.com> wrote:

http://www.jets3t.org/toolkit/configuration.html

On Jan 14, 2016 10:56 AM, "Alexander Pivovarov" <apivovarov@gmail.com> wrote:

Add a jets3t.properties file with s3service.s3-endpoint=<endpoint> to the /etc/hadoop/conf folder.

The folder containing the file must be on the HADOOP_CLASSPATH.

The JetS3t library used by Hadoop looks for this file on the classpath.
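
For example (the endpoint hostname below is a placeholder), the file might look like:

    # jets3t.properties -- dropped into /etc/hadoop/conf
    s3service.s3-endpoint=objects.example.com

with that folder on the classpath, e.g.:

    export HADOOP_CLASSPATH=/etc/hadoop/conf:$HADOOP_CLASSPATH

Note that JetS3t backs the older s3/s3n filesystems, while s3a uses the AWS SDK, so this setting may not affect s3a:// URLs.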

On Dec 22, 2015 12:39 PM, "Phillips, Caleb" <Caleb.Phillips@nrel.gov> wrote:
Hi All,

New to this list. Looking for a bit of help:

I'm having trouble connecting Hadoop to an S3-compatible (non-AWS) object store.

This issue was discussed, but left unresolved, in this thread:

https://mail-archives.apache.org/mod_mbox/spark-user/201507.mbox/%3CCA+0W_Au5Es_fLUgZMGwkkgA3JyA1ASi3u+isJCuYmfnTvNkGuQ@mail.gmail.com%3E

And here, on Cloudera's forums (the second post is mine):

https://community.cloudera.com/t5/Data-Ingestion-Integration/fs-s3a-endpoint-ignored-in-hdfs-site-xml/m-p/33694#M1180

I'm running Hadoop 2.6.3 with Java 1.8 (update 65) on a Linux host. Using Hadoop, I'm able to connect to S3 on AWS and, e.g., list/put/get files.
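
(E.g., via commands along these lines; the bucket name is a placeholder:)

    hadoop fs -ls s3a://my-bucket/
    hadoop fs -put local.txt s3a://my-bucket/
    hadoop fs -get s3a://my-bucket/local.txt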

However, when I point the fs.s3a.endpoint configuration directive at my non-AWS S3-compatible object store, it appears to still point at (and authenticate against) AWS.
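
(For reference, setting the endpoint in hdfs-site.xml looks roughly like this; the endpoint hostname is a placeholder and the keys are elided:)

    <property>
      <name>fs.s3a.endpoint</name>
      <value>objects.example.com</value>
    </property>
    <property>
      <name>fs.s3a.access.key</name>
      <value>...</value>
    </property>
    <property>
      <name>fs.s3a.secret.key</name>
      <value>...</value>
    </property>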

I've checked and double-checked my credentials and configuration using both Python's boto library and the s3cmd tool, both of which connect to this non-AWS data store just fine.

Any help would be much appreciated. Thanks!

--
Caleb Phillips, Ph.D.
Data Scientist | Computational Science Center

National Renewable Energy Laboratory (NREL)
15013 Denver West Parkway | Golden, CO 80401
303-275-4297 | caleb.phillips@nrel.gov
