hadoop-common-user mailing list archives

From tim robertson <timrobertson...@gmail.com>
Subject Re: Help with Hadoop/Hbase on s3
Date Fri, 07 Aug 2009 17:28:58 GMT
Pointing out the obvious, but something somewhere is trying to create a
bucket that has already been created.

Sorry, but I don't think I can help further. Perhaps change
s3://testbucket to s3://testbucket2, just to be sure it is not that you
have created it in another process by accident? Note that the error
message itself says the bucket namespace is shared by all S3 users, so a
generic name like "testbucket" may already be owned by someone else.
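
For what it's worth, a minimal sketch of the fix in hbase-site.xml (the
bucket name below is just a hypothetical example; pick anything unlikely
to already be claimed by another S3 user):

```
 <property>
   <name>hbase.rootdir</name>
   <value>s3://ananth-hbase-20090807</value>
   <description>The directory shared by region servers.
   </description>
 </property>
```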

Cheers

Tim


On Fri, Aug 7, 2009 at 6:51 PM, Ananth T.
Sarathy <ananth.t.sarathy@gmail.com> wrote:
> Tim,
>  that got me a little further! Thanks...
>
> But now I get a different error:
>
> hbase-site.xml
>
> <configuration>
>   <property>
>  <name>hbase.master</name>
>    <value>174.129.15.236:60000</value>
>    <description>The host and port that the HBase master runs at.
>    A value of 'local' runs the master and a regionserver in
>    a single process.
>    </description>
>  </property>
>
>  <property>
>   <name>hbase.rootdir</name>
>   <value>s3://testbucket</value>
>   <description>The directory shared by region servers.
>   </description>
>  </property>
> </configuration>
>
> I copied a hadoop-site.xml with my access and secret key to my conf/ in
> hbase.... I also tried using the s3://id:access@bucket form and that didn't
> work.
>
> Fri Aug  7 12:47:45 EDT 2009 Starting master on ip-10-244-131-228
> ulimit -n 1024
> 2009-08-07 12:47:45,850 INFO org.apache.hadoop.hbase.master.HMaster:
> vmName=Java HotSpot(TM) Client VM, vmVendor=Sun Microsystems Inc.,
> vmVersion=14.1-b02
> 2009-08-07 12:47:45,850 INFO org.apache.hadoop.hbase.master.HMaster:
> vmInputArguments=[-Xmx1000m, -XX:+HeapDumpOnOutOfMemoryError,
> -Dhbase.log.dir=/usr/hbase-0.19.2/bin/../logs,
> -Dhbase.log.file=hbase-root-master-ip-10-244-131-228.log,
> -Dhbase.home.dir=/usr/hbase-0.19.2/bin/.., -Dhbase.id.str=root,
> -Dhbase.root.logger=INFO,DRFA,
> -Djava.library.path=/usr/hbase-0.19.2/bin/../lib/native/Linux-i386-32]
> 2009-08-07 12:47:48,535 ERROR org.apache.hadoop.hbase.master.HMaster: Can
> not start master
> org.apache.hadoop.fs.s3.S3Exception: org.jets3t.service.S3ServiceException:
> S3 PUT failed for '/' XML Error Message: <?xml version="1.0"
> encoding="UTF-8"?><Error><Code>BucketAlreadyExists</Code><Message>The
> requested bucket name is not available. The bucket namespace is shared by
> all users of the system. Please select a different name and try
> again.</Message><BucketName>testbucket</BucketName><RequestId>C0C7F562713BDE97</RequestId><HostId>ifY4rPOqmasjPkH+EiTS3LsgRzuDcbUTHy+y8p4HMnJWN1kUXCUe+FvYSZhIlYHg</HostId></Error>
>        at
> org.apache.hadoop.fs.s3.Jets3tFileSystemStore.createBucket(Jets3tFileSystemStore.java:108)
>        at
> org.apache.hadoop.fs.s3.Jets3tFileSystemStore.initialize(Jets3tFileSystemStore.java:96)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>        at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>        at $Proxy0.initialize(Unknown Source)
>        at
> org.apache.hadoop.fs.s3.S3FileSystem.initialize(S3FileSystem.java:76)
>        at
> org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1367)
>        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:56)
>        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1379)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:215)
>        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:120)
>        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:186)
>        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:156)
>        at
> org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:96)
>        at
> org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:78)
>        at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1013)
>        at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1057)
> Caused by: org.jets3t.service.S3ServiceException: S3 PUT failed for '/' XML
> Error Message: <?xml version="1.0"
> encoding="UTF-8"?><Error><Code>BucketAlreadyExists</Code><Message>The
> requested bucket name is not available. The bucket namespace is shared by
> all users of the system. Please select a different name and try
> again.</Message><BucketName>testbucket</BucketName><RequestId>C0C7F562713BDE97</RequestId><HostId>ifY4rPOqmasjPkH+EiTS3LsgRzuDcbUTHy+y8p4HMnJWN1kUXCUe+FvYSZhIlYHg</HostId></Error>
>        at
> org.jets3t.service.impl.rest.httpclient.RestS3Service.performRequest(RestS3Service.java:416)
>        at
> org.jets3t.service.impl.rest.httpclient.RestS3Service.performRestPut(RestS3Service.java:800)
>        at
> org.jets3t.service.impl.rest.httpclient.RestS3Service.createObjectImpl(RestS3Service.java:1399)
>        at
> org.jets3t.service.impl.rest.httpclient.RestS3Service.createBucketImpl(RestS3Service.java:1270)
>        at org.jets3t.service.S3Service.createBucket(S3Service.java:1558)
>        at org.jets3t.service.S3Service.createBucket(S3Service.java:1257)
>        at org.jets3t.service.S3Service.createBucket(S3Service.java:1284)
>        at
> org.apache.hadoop.fs.s3.Jets3tFileSystemStore.createBucket(Jets3tFileSystemStore.java:103)
>        ... 20 more
>
>
> Ananth T Sarathy
>
>
> On Fri, Aug 7, 2009 at 11:02 AM, tim robertson <timrobertson100@gmail.com> wrote:
>
>> Do you need to add the Amazon S3 toolkit on the HBase classpath
>> directly to use S3 as a store?
>>
>> http://developer.amazonwebservices.com/connect/entry.jspa?externalID=617&categoryID=47
>>
>> I'm guessing based on the "java.lang.NoClassDefFoundError:
>> org/jets3t/service/S3ServiceException"
>>
>> Cheers
>>
>> Tim
>>
>>
>> On Fri, Aug 7, 2009 at 4:50 PM, Ananth T.
>> Sarathy <ananth.t.sarathy@gmail.com> wrote:
>> > I can't seem to get HBase to run using the Hadoop I have connected to
>> > my S3 bucket.
>> >
>> > Running
>> > Hbase 0.19.2
>> > Hadoop  0.19.2
>> >
>> > Hadoop-site.xml
>> > <configuration>
>> >
>> > <property>
>> >  <name>fs.default.name</name>
>> >  <value>s3://hbase</value>
>> > </property>
>> >
>> > <property>
>> >  <name>fs.s3.awsAccessKeyId</name>
>> >  <value>ID</value>
>> > </property>
>> >
>> > <property>
>> >  <name>fs.s3.awsSecretAccessKey</name>
>> >  <value>SECRET</value>
>> > </property>
>> > </configuration>
>> >
>> > and it seems to start up no problem
>> >
>> > my hbase-site.xml
>> >
>> > <configuration>
>> >    <property>
>> >  <name>hbase.master</name>
>> >     <value>174.129.15.236:60000</value>
>> >     <description>The host and port that the HBase master runs at.
>> >     A value of 'local' runs the master and a regionserver in
>> >     a single process.
>> >     </description>
>> >   </property>
>> >
>> >  <property>
>> >    <name>hbase.rootdir</name>
>> >    <value>s3://hbase</value>
>> >    <description>The directory shared by region servers.
>> >    </description>
>> >  </property>
>> >
>> > </configuration>
>> >
>> >
>> > keeps giving me
>> >
>> > 2009-08-06 17:20:44,526 ERROR org.apache.hadoop.hbase.master.HMaster: Can
>> > not start master
>> > java.lang.NoClassDefFoundError: org/jets3t/service/S3ServiceException
>> >        at
>> >
>> org.apache.hadoop.fs.s3.S3FileSystem.createDefaultStore(S3FileSystem.java:84)
>> >        at
>> > org.apache.hadoop.fs.s3.S3FileSystem.initialize(S3FileSystem.java:74)
>> >        at
>> > org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1367)
>> >        at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:56)
>> >        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1379)
>> >        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:215)
>> >        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:120)
>> >        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:186)
>> >        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:156)
>> >        at
>> >
>> org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:96)
>> >        at
>> >
>> org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:78)
>> >        at
>> org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:1013)
>> >        at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1057)
>> > Caused by: java.lang.ClassNotFoundException:
>> > org.jets3t.service.S3ServiceException
>> >        at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>> >        at java.security.AccessController.doPrivileged(Native Method)
>> >        at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>> >        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>> >        at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
>> >        at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
>> >
>> >
>> > What am I doing wrong here?
>> >
>> > Ananth T Sarathy
>> >
>>
>
