hadoop-common-user mailing list archives

From Tali K <ncherr...@hotmail.com>
Subject Help: 1) Hadoop processes are still running after we stopped Hadoop. 2) How to exclude a dead node?
Date Tue, 07 Dec 2010 18:40:16 GMT

1) When we stopped Hadoop, we checked all the nodes and found that 2 or 3 java/Hadoop processes
were still running on each node. So we went to each node and did a 'killall java' - in some
cases we had to do 'killall -9 java'.
My question: why is this happening, and what would you recommend to make sure no Hadoop
processes are left running after stopping Hadoop with stop-all.sh?
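
For reference, what we do by hand is roughly the loop below - just a rough sketch, assuming
passwordless ssh from the master, that conf/slaves lists one hostname per line, and that the
JDK's jps is on the PATH on each node:

  for host in $(cat $HADOOP_HOME/conf/slaves); do
      echo "== $host =="
      ssh $host 'jps'                      # see which daemons survived stop-all.sh
      ssh $host 'killall java'             # first attempt at cleanup
      ssh $host 'killall -9 java || true'  # some processes only went away with -9
  done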
 
2) We also have a dead node. We removed this node from $HADOOP_HOME/conf/slaves. This file
is supposed to tell the namenode
which machines act as datanodes/tasktrackers.
We started Hadoop again and were surprised to still see the dead node in the Hadoop report
("$HADOOP_HOME/bin/hadoop dfsadmin -report | less").
Only after blocking the dead node and restarting Hadoop did it stop showing up in the report.
Any recommendations on how to deal with dead nodes?
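
In case it matters, what we tried as "blocking" was roughly along the lines of the
exclude-file mechanism sketched below (a rough sketch only; the file path and hostname are
placeholders, and the property name may differ between Hadoop versions):

  <!-- in conf/hdfs-site.xml: point the namenode at an exclude file (placeholder path) -->
  <property>
    <name>dfs.hosts.exclude</name>
    <value>/path/to/hadoop/conf/excludes</value>
  </property>

  # conf/excludes lists one hostname per line (placeholder name):
  #   deadnode01
  # then have the namenode re-read it without a full restart:
  $HADOOP_HOME/bin/hadoop dfsadmin -refreshNodes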