hadoop-common-user mailing list archives

From "Tom White" <tom.e.wh...@gmail.com>
Subject Re: Namenode Exceptions with S3
Date Thu, 10 Jul 2008 20:55:45 GMT
> I get (where the all-caps portions are the actual values...):
>
> 2008-07-01 19:05:17,540 ERROR org.apache.hadoop.dfs.NameNode:
> java.lang.NumberFormatException: For input string:
> "AWS_SECRET_ACCESS_KEY@HDFS_BUCKET"
>        at java.lang.NumberFormatException.forInputString(NumberFormatException.java:48)
>        at java.lang.Integer.parseInt(Integer.java:447)
>        at java.lang.Integer.parseInt(Integer.java:497)
>        at org.apache.hadoop.net.NetUtils.createSocketAddr(NetUtils.java:128)
>        at org.apache.hadoop.dfs.NameNode.initialize(NameNode.java:121)
>        at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:178)
>        at org.apache.hadoop.dfs.NameNode.<init>(NameNode.java:164)
>        at org.apache.hadoop.dfs.NameNode.createNameNode(NameNode.java:848)
>        at org.apache.hadoop.dfs.NameNode.main(NameNode.java:857)
>
> These exceptions are taken from the namenode log.  The datanode logs
> show the same exceptions.

If you make the default filesystem S3, then you can't run the HDFS
daemons: the namenode tries to parse fs.default.name as a host:port
address, which is why you see the NumberFormatException above. If you
want to run HDFS and also use an S3 filesystem, set the default
filesystem to an hdfs:// URI and use s3:// URIs to reference S3
filesystems.
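
For example, a sketch of hadoop-site.xml (the namenode host, port, and
key values here are placeholders):

```xml
<configuration>
  <!-- Default filesystem must be an hdfs:// URI for the HDFS daemons -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
  <!-- S3 credentials go in these properties, not in the s3:// URI -->
  <property>
    <name>fs.s3.awsAccessKeyId</name>
    <value>YOUR_ACCESS_KEY_ID</value>
  </property>
  <property>
    <name>fs.s3.awsSecretAccessKey</name>
    <value>YOUR_SECRET_ACCESS_KEY</value>
  </property>
</configuration>
```

With that in place, jobs can read and write S3 paths like
s3://mybucket/path while HDFS remains the default filesystem.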

Hope this helps.

Tom
