hadoop-common-dev mailing list archives

From "Bill Habermaas (JIRA)" <j...@apache.org>
Subject [jira] Updated: (HADOOP-5191) After creation and startup of the hadoop namenode on AIX or Solaris, you will only be allowed to connect to the namenode via hostname but not IP.
Date Mon, 09 Mar 2009 13:10:50 GMT

     [ https://issues.apache.org/jira/browse/HADOOP-5191?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Bill Habermaas updated HADOOP-5191:
-----------------------------------

    Attachment: TestHadoopHDFS.java

This is a very simple program that writes data to HDFS. The lines below are an excerpt from the source file. If I use a hostname that matches the namenode's machine, then FileSystem.get works. If I use an IP address instead, it fails. Why is this considered an error?

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

// Connecting by IP address fails; an equivalent hostname URL succeeds.
static public String filePath = "hdfs://10.120.12.81:9000/test/datafile";

String file = filePath;

Configuration conf = new Configuration();
FileSystem fs;
try {
	fs = FileSystem.get(new URI(file), conf);
} catch (Exception e) {
	e.printStackTrace();
}
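As an aside, the mismatch reported in the exception can be reproduced without a cluster. The sketch below is hypothetical (it is not Hadoop's actual source): it assumes checkPath compares the requested URI's scheme and authority textually against the filesystem's configured URI, so an IP-based authority never equals a hostname-based fs.default.name even when both reach the same namenode. The class name CheckPathSketch is made up for illustration.

```java
import java.net.URI;

// Hypothetical sketch of a textual scheme/authority comparison like the one
// FileSystem.checkPath appears to perform, based on the exception message.
public class CheckPathSketch {
    static void checkPath(URI fsUri, URI path) {
        String expected = fsUri.getScheme() + "://" + fsUri.getAuthority();
        String actual = path.getScheme() + "://" + path.getAuthority();
        // A string comparison cannot know that the IP and the hostname
        // resolve to the same machine, so the IP form is rejected.
        if (!expected.equalsIgnoreCase(actual)) {
            throw new IllegalArgumentException(
                "Wrong FS: " + path + ", expected: " + fsUri);
        }
    }

    public static void main(String[] args) {
        URI fsUri = URI.create("hdfs://p520aix61.mydomain.com:9000");
        // Matching authority: accepted.
        checkPath(fsUri, URI.create("hdfs://p520aix61.mydomain.com:9000/testdata"));
        // IP authority: rejected, even though it is the same namenode.
        try {
            checkPath(fsUri, URI.create("hdfs://10.120.16.68:9000/testdata"));
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Resolving both authorities through DNS before comparing (as later patches for this issue do) would make the two forms equivalent.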
  

> After creation and startup of the hadoop namenode on AIX or Solaris, you will only be allowed to connect to the namenode via hostname but not IP.
> -------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HADOOP-5191
>                 URL: https://issues.apache.org/jira/browse/HADOOP-5191
>             Project: Hadoop Core
>          Issue Type: Bug
>          Components: dfs
>    Affects Versions: 0.19.1
>         Environment: AIX 6.1 or Solaris
>            Reporter: Bill Habermaas
>            Assignee: Bill Habermaas
>            Priority: Minor
>         Attachments: 5191-1.patch, hadoop-5191.patch, TestHadoopHDFS.java
>
>
> After creation and startup of the hadoop namenode on AIX or Solaris, you will only be allowed to connect to the namenode via hostname but not IP.
> fs.default.name=hdfs://p520aix61.mydomain.com:9000
> Hostname for the box is p520aix61 and the IP is 10.120.16.68
> If you use the following URL, "hdfs://10.120.16.68:9000", to connect to the namenode, the exception below occurs. You can only connect successfully if "hdfs://p520aix61.mydomain.com:9000" is used.
> Exception in thread "Thread-0" java.lang.IllegalArgumentException: Wrong FS: hdfs://10.120.16.68:9000/testdata, expected: hdfs://p520aix61.mydomain.com:9000
> 	at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:320)
> 	at org.apache.hadoop.dfs.DistributedFileSystem.checkPath(DistributedFileSystem.java:84)
> 	at org.apache.hadoop.dfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:122)
> 	at org.apache.hadoop.dfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:390)
> 	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:667)
> 	at TestHadoopHDFS.run(TestHadoopHDFS.java:116)

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.

