hadoop-common-dev mailing list archives

From "antoine (JIRA)" <j...@apache.org>
Subject [jira] [Created] (HADOOP-15858) S3A: property fs.s3a.endpoint is either ignored or treated incorrectly for custom s3 endpoint
Date Tue, 16 Oct 2018 11:03:00 GMT
antoine created HADOOP-15858:
--------------------------------

             Summary: S3A: property fs.s3a.endpoint is either ignored or treated incorrectly
for custom s3 endpoint
                 Key: HADOOP-15858
                 URL: https://issues.apache.org/jira/browse/HADOOP-15858
             Project: Hadoop Common
          Issue Type: Bug
          Components: fs/s3
    Affects Versions: 2.7.7, 2.9.1
         Environment: Hadoop 2.7.7 and 2.9.1

Java openJDK 8:  8u181-b13-0ubuntu0.18.04.1

Swift S3 API

 
            Reporter: antoine


I'm trying to connect to an internal Swift server using the S3A capability of Hadoop. The server works with Python boto; it contains one bucket named {{test}}, and in that bucket there is one file, {{test.file}}.

So far it has been impossible for me to reach the server properly: each time I try, Hadoop either ignores fs.s3a.endpoint or treats it incorrectly.

core-site.xml:
{code:xml}
<configuration>
  <property>
    <name>fs.s3a.access.key</name>
    <description>mykey</description>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <description>mysecret</description>
  </property>
  <property>
    <name>fs.s3a.endpoint</name>
    <description>my.endpoint.fr:8080</description>
  </property>
  <property>
    <name>fs.s3a.connection.ssl.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>fs.s3a.path.style.access</name>
    <description>true</description>
  </property>
  <property>
    <name>fs.s3a.impl</name>
    <description>org.apache.hadoop.fs.s3a.S3AFileSystem</description>
  </property>
</configuration>
{code}
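(Note: in Hadoop's configuration format, the setting itself belongs in a {{<value>}} element; {{<description>}} is documentation only and is not read as the property's value. For comparison, a sketch of the same settings with each value moved into a {{<value>}} tag, keeping the placeholder values from above:)

```xml
<configuration>
  <property>
    <name>fs.s3a.access.key</name>
    <value>mykey</value>
  </property>
  <property>
    <name>fs.s3a.secret.key</name>
    <value>mysecret</value>
  </property>
  <property>
    <name>fs.s3a.endpoint</name>
    <value>my.endpoint.fr:8080</value>
  </property>
  <property>
    <name>fs.s3a.connection.ssl.enabled</name>
    <value>true</value>
  </property>
  <property>
    <name>fs.s3a.path.style.access</name>
    <value>true</value>
  </property>
</configuration>
```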
To debug this issue, I've tried using [cloudstore|https://github.com/steveloughran/cloudstore/releases], a tool that helps debug Hadoop filesystems.
 When I try to list my bucket {{test}} as:
 {{s3a://test/}}

I can see that it is connecting to:

[https://test.s3.amazonaws.com/]

meaning the request for {{test}} goes to the official Amazon S3 endpoint, ignoring my settings.

When trying to list my bucket {{test}} using this URL:
 {{s3a://my.endpoint.fr:8080/test/}}

I can see that it is connecting to:

[https://my.endpoint.fr/]

meaning it ignores the port I set in the fs.s3a.endpoint configuration, which of course doesn't work because my server listens on port 8080.



I've also tried setting {{fs.s3a.path.style.access}} to false; the result is pretty much the same.

I'm sorry if this is not a bug, but any help or consideration would be very much appreciated.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

