hadoop-common-user mailing list archives

From mohamedhafez <mohamedha...@google.com>
Subject Re: Not able to back up to S3
Date Thu, 17 Apr 2008 18:19:22 GMT

If I try to specify the access key ID and secret access key as part of the
S3 URL, I get the following error:

root@ip-10-251-110-134:~# hadoop distcp /dijkstra.log s3://1W27ZBE2AKDVVFZB9T02:FEQbLfFVh+kF7VdTnw%2fPSqed8Joez+ummWtmmuq5@new_bucket_mohamedhafez/
With failures, global counters are inaccurate; consider running with -i
Copy failed: java.lang.IllegalArgumentException: AWS Access Key ID and
Secret Access Key must be specified as the username or password
(respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or
fs.s3.awsSecretAccessKey properties (respectively).
        at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.initialize(Jets3tFileSystemStore.java:101)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy1.initialize(Unknown Source)
        at org.apache.hadoop.fs.s3.S3FileSystem.initialize(S3FileSystem.java:78)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:166)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
        at org.apache.hadoop.util.CopyFiles.setup(CopyFiles.java:672)
        at org.apache.hadoop.util.CopyFiles.copy(CopyFiles.java:475)
        at org.apache.hadoop.util.CopyFiles.run(CopyFiles.java:550)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.util.CopyFiles.main(CopyFiles.java:563)

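For clarity, the URL form I'm using follows the pattern the exception asks
for, with the access key ID as the username and the secret access key as
the password (placeholder values below, not my real keys):

    # placeholders for illustration; the real command uses my actual key ID and secret
    hadoop distcp /dijkstra.log s3://<AWS_ACCESS_KEY_ID>:<AWS_SECRET_ACCESS_KEY>@new_bucket_mohamedhafez/
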
When I put the ID and secret in the config file instead, I get the following error:

root@ip-10-251-110-134:~# hadoop distcp /dijkstra.log s3://new_bucket_mohamedhafez/
08/04/17 18:17:12 WARN httpclient.RestS3Service: Unable to access bucket: null
org.jets3t.service.S3ServiceException: Cannot connect to S3 Service with a null path
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.setupConnection(RestS3Service.java:616)
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.performRestHead(RestS3Service.java:483)
        at org.jets3t.service.impl.rest.httpclient.RestS3Service.isBucketAccessible(RestS3Service.java:714)
        at org.jets3t.service.S3Service.createBucket(S3Service.java:499)
        at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.createBucket(Jets3tFileSystemStore.java:136)
        at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.initialize(Jets3tFileSystemStore.java:129)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy1.initialize(Unknown Source)
        at org.apache.hadoop.fs.s3.S3FileSystem.initialize(S3FileSystem.java:78)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:166)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
        at org.apache.hadoop.util.CopyFiles.setup(CopyFiles.java:672)
        at org.apache.hadoop.util.CopyFiles.copy(CopyFiles.java:475)
        at org.apache.hadoop.util.CopyFiles.run(CopyFiles.java:550)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.util.CopyFiles.main(CopyFiles.java:563)
With failures, global counters are inaccurate; consider running with -i
Copy failed: org.apache.hadoop.fs.s3.S3Exception: org.jets3t.service.S3ServiceException:
The action Create Bucket cannot be performed with an invalid bucket:
S3Bucket [name=null,creationDate=null,owner=null] Metadata={}
        at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.createBucket(Jets3tFileSystemStore.java:141)
        at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.initialize(Jets3tFileSystemStore.java:129)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at $Proxy1.initialize(Unknown Source)
        at org.apache.hadoop.fs.s3.S3FileSystem.initialize(S3FileSystem.java:78)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:166)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:175)
        at org.apache.hadoop.util.CopyFiles.setup(CopyFiles.java:672)
        at org.apache.hadoop.util.CopyFiles.copy(CopyFiles.java:475)
        at org.apache.hadoop.util.CopyFiles.run(CopyFiles.java:550)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.util.CopyFiles.main(CopyFiles.java:563)
Caused by: org.jets3t.service.S3ServiceException: The action Create Bucket
cannot be performed with an invalid bucket:
S3Bucket [name=null,creationDate=null,owner=null] Metadata={}
        at org.jets3t.service.S3Service.assertValidBucket(S3Service.java:420)
        at org.jets3t.service.S3Service.createBucket(S3Service.java:653)
        at org.jets3t.service.S3Service.createBucket(S3Service.java:506)
        at org.apache.hadoop.fs.s3.Jets3tFileSystemStore.createBucket(Jets3tFileSystemStore.java:136)
        ... 17 more

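For reference, the config entries I mean are the two properties named in
the first exception; what I added to my Hadoop config (hadoop-site.xml
here, though the file name may differ by setup) looks like this, with
placeholder values:

    <!-- placeholder values, not my real keys; property names are from the exception above -->
    <property>
      <name>fs.s3.awsAccessKeyId</name>
      <value>YOUR_AWS_ACCESS_KEY_ID</value>
    </property>
    <property>
      <name>fs.s3.awsSecretAccessKey</name>
      <value>YOUR_AWS_SECRET_ACCESS_KEY</value>
    </property>
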
I get the same error whether or not I replace the / in the secret key with %2f.
s3sync from the local filesystem works just fine.

Thanks,
Mohamed

Tom White wrote:
> 
> The bucket doesn't need formatting, and Hadoop creates buckets
> automatically if they don't already exist. Can you post the error
> message you are getting, please?
> 
> Tom
> 
> On 17/04/2008, mohamedhafez <mohamedhafez@google.com> wrote:
>>
>>  Hi, I am trying to back up data to S3 from HDFS using distcp, but it
>>  fails complaining of a null bucket. The bucket does exist, and I can
>>  access it with s3sync from the local filesystem. Can anyone help me
>>  with this? Does the bucket need to be formatted in some way first?
>>  Is there some command in Hadoop to create a bucket it can use?
> 
> 
> -- 
> Blog: http://www.lexemetech.com/
> 
> 
