hadoop-mapreduce-user mailing list archives

From Dhiraj <jar...@gmail.com>
Subject S3 with Hadoop 2.5.0 - Not working
Date Wed, 10 Sep 2014 07:45:07 GMT
Hi,

I have downloaded hadoop-2.5.0 and am trying to get it working for s3
backend *(single-node in a pseudo-distributed mode)*.
I have made changes to the core-site.xml according to
https://wiki.apache.org/hadoop/AmazonS3

I have a backend object store running on my machine that supports S3.

I get the following message when I try to start the daemons:
*Incorrect configuration: namenode address dfs.namenode.servicerpc-address
or dfs.namenode.rpc-address is not configured.*


root@ubuntu:/build/hadoop/hadoop-2.5.0# ./sbin/start-dfs.sh
Incorrect configuration: namenode address dfs.namenode.servicerpc-address
or dfs.namenode.rpc-address is not configured.
Starting namenodes on []
localhost: starting namenode, logging to
/build/hadoop/hadoop-2.5.0/logs/hadoop-root-namenode-ubuntu.out
localhost: starting datanode, logging to
/build/hadoop/hadoop-2.5.0/logs/hadoop-root-datanode-ubuntu.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to
/build/hadoop/hadoop-2.5.0/logs/hadoop-root-secondarynamenode-ubuntu.out
root@ubuntu:/build/hadoop/hadoop-2.5.0#

The daemons don't start after the above.
I get the same error if I add the property "fs.defaultFS" and set its value
to the s3 bucket, but if I change the defaultFS to *hdfs://* it works fine
and I am able to launch the daemons.
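My working theory (an assumption on my part) is that start-dfs.sh can only derive the namenode address from fs.defaultFS when it is an hdfs:// URI, so with an s3 defaultFS the RPC address would have to be set explicitly. A sketch of that workaround in hdfs-site.xml, with host and port as placeholder guesses:

```xml
<!-- hdfs-site.xml: hypothetical workaround sketch -->
<configuration>
    <property>
        <!-- Explicit namenode RPC address, since the scripts cannot
             derive it from a non-hdfs fs.defaultFS -->
        <name>dfs.namenode.rpc-address</name>
        <value>localhost:9000</value>
    </property>
</configuration>
```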

my core-site.xml:
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>s3://bucket1</value>
    </property>
    <property>
        <name>fs.s3.awsAccessKeyId</name>
        <value>abcd</value>
    </property>
    <property>
        <name>fs.s3.awsSecretAccessKey</name>
        <value>1234</value>
    </property>
</configuration>
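For comparison, I could also keep the defaultFS on hdfs:// so the daemons start, and address the bucket explicitly through the s3n connector instead. This is only a sketch, assuming the s3n scheme works in 2.5.0; the hdfs address is a placeholder and the keys are the same dummies as above:

```xml
<configuration>
    <!-- Keep HDFS as the default filesystem so the daemons can start -->
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
    <!-- s3n credentials; the bucket is then reached via s3n:// paths -->
    <property>
        <name>fs.s3n.awsAccessKeyId</name>
        <value>abcd</value>
    </property>
    <property>
        <name>fs.s3n.awsSecretAccessKey</name>
        <value>1234</value>
    </property>
</configuration>
```

With that in place, something like "hadoop fs -ls s3n://bucket1/" should hit the bucket directly while the namenode still starts.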


I am able to list the buckets and their contents via s3cmd and boto, but I am
unable to get an s3-backed configuration started via Hadoop.

Also, in the core-default.xml listed on the website, I don't see an
implementation entry for s3:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/core-default.xml

There is an fs.s3.impl entry up to the 1.2.1 release. So does the 2.5.0
release support s3, or do I need to do anything else?
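If the S3FileSystem class is still shipped in 2.5.0 even though core-default.xml no longer declares it (this is my assumption), then declaring the mapping explicitly in core-site.xml might restore the scheme:

```xml
<configuration>
    <!-- Explicitly map the s3:// scheme to its FileSystem class,
         since core-default.xml no longer lists fs.s3.impl -->
    <property>
        <name>fs.s3.impl</name>
        <value>org.apache.hadoop.fs.s3.S3FileSystem</value>
    </property>
</configuration>
```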

cheers,
Dhiraj
