hadoop-common-user mailing list archives

From "Palleti, Pallavi" <pallavi.pall...@corp.aol.com>
Subject RE: HDFS is not responding after restart
Date Thu, 28 May 2009 04:00:17 GMT
Hi all,
   
The issue turned out to be that one of the machines previously marked as a dead node (dead because
it was unreachable) had become reachable again, but had nothing in its home directory; essentially,
the Hadoop installation directory was missing. The namenode didn't report any error about this; it
simply got stuck and never came up. There was no way to tell from the logs that this might be the
cause. Any idea how to detect this kind of issue?
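
One rough way we could catch this in future is a pre-start sanity check over conf/slaves; a
sketch only (the HADOOP_HOME path below is an assumption, adjust to the real install path):

  # Verify every node listed in conf/slaves actually has the Hadoop install
  # directory before starting the cluster. HADOOP_HOME is a placeholder path.
  HADOOP_HOME=/home/hadoop/hadoop   # hypothetical install path
  for host in $(cat "$HADOOP_HOME/conf/slaves"); do
    ssh "$host" "test -d $HADOOP_HOME" \
      || echo "WARNING: $host is reachable but has no Hadoop install at $HADOOP_HOME"
  done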

Thanks
Pallavi

-----Original Message-----
From: Pallavi Palleti [mailto:pallavi.palleti@corp.aol.com] 
Sent: Wednesday, May 27, 2009 2:05 PM
To: core-user@hadoop.apache.org
Subject: HDFS is not responding after restart

Hi all,

  We have a cluster of 50 machines and we had to restart Hadoop for some reason. After the
restart, the jobtracker is up and its UI looks perfectly fine, but the DFS UI is stuck. The
namenode log says it has reached the 0.990 threshold and that safe mode will be turned off in
0 seconds, yet even after waiting two hours it never turned off. Looking at the datanode logs,
every datanode is verifying its data blocks and reporting bad blocks to the namenode. On most
machines the verification succeeds, but some blocks are still being processed. Any idea how long
we need to wait for the cluster to come up? Also, hadoop dfs -report is not responding. Any idea
what the issue might be? Would it be OK to forcibly switch off safe mode?
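
For concreteness, these are the commands I would run to check safe mode and, if necessary, force
it off (the stock dfsadmin/fsck tools, run from the namenode as the HDFS superuser):

  bin/hadoop dfsadmin -safemode get      # report whether safe mode is still on
  bin/hadoop dfsadmin -safemode leave    # force the namenode out of safe mode
  bin/hadoop fsck / -blocks -locations   # afterwards, check for missing or corrupt blocks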


Thanks in Advance,
Pallavi 
 