hadoop-general mailing list archives

From Saurabh Nanda <saurabhna...@gmail.com>
Subject Re: Wrong FS error
Date Thu, 09 Jul 2009 04:15:18 GMT
> >
> > A lot of my Hadoop jobs (actually Cloudbase queries) are showing the
> > following kind of errors in the job tracker:
> >
> > java.lang.IllegalArgumentException: Wrong FS:
> > hdfs://master-hadoop.local/user/ct-admin/cloudbase/index/UID_INDEX/index_new/metadata/1,
> > expected: hdfs://master-hadoop
>
> P.S. It also sounds like you have no internal DNS and are relying on
> /etc/hosts entries that might simply be wrong. Check the host names on each
> machine and make sure nslookup shows identical results everywhere.



You are right. I am using /etc/hosts and my Hadoop machines do not have
proper DNS entries. However, why should that matter if I am using IP
addresses in the configuration files?
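
(If I understand the error correctly, the client checks that the authority in a
fully-qualified path string-matches the authority of the FileSystem it is handed
to, so a hostname in one place and an IP address in the other is already enough
to trip it. A rough sketch of the mismatch, with a made-up path:)

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WrongFsDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // default filesystem as set in hadoop-site.xml
        conf.set("fs.default.name", "hdfs://172.16.37.56:8020");
        FileSystem fs = FileSystem.get(conf);

        // fully-qualified path whose authority (host name) does not match
        // the authority the FileSystem was created with (IP address)
        Path p = new Path("hdfs://master-hadoop.local/user/ct-admin/somefile");
        fs.exists(p);  // -> IllegalArgumentException: Wrong FS: ..., expected: hdfs://172.16.37.56:8020
    }
}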

Relevant entries from hadoop-site.xml:
fs.default.name=hdfs://172.16.37.56:8020
mapred.job.tracker=172.16.37.56:8021
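
(That is the shorthand; in the actual hadoop-site.xml these are the usual
property blocks, something like:)

<property>
  <name>fs.default.name</name>
  <value>hdfs://172.16.37.56:8020</value>
</property>
<property>
  <name>mapred.job.tracker</name>
  <value>172.16.37.56:8021</value>
</property>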

conf/slaves:
172.16.37.56
172.16.37.72

/etc/hosts on 172.16.37.56:
127.0.0.1    master-hadoop    localhost.localdomain    localhost
172.16.37.72 slave1-hadoop

/etc/hosts on 172.16.37.72:
127.0.0.1    slave1-hadoop    localhost.localdomain    localhost
172.16.37.56 master-hadoop
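
(My guess is that the machine's own hostname should not sit on the 127.0.0.1
line, and that both machines should carry the same entries, something like the
following on each — but I am not sure:)

127.0.0.1      localhost.localdomain   localhost
172.16.37.56   master-hadoop.local     master-hadoop
172.16.37.72   slave1-hadoop.local     slave1-hadoop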

How should I go about fixing this?

Saurabh.
-- 
http://nandz.blogspot.com
http://foodieforlife.blogspot.com
