hadoop-common-user mailing list archives

From Grant Ingersoll <gsing...@apache.org>
Subject Safe Mode
Date Thu, 26 Oct 2006 16:59:38 GMT
Hi,

What does this mean:

org.apache.hadoop.ipc.RemoteException:
org.apache.hadoop.dfs.SafeModeException: Cannot delete /tmp/hadoop-<my user name>/mapred/system. Name node is in safe mode.
Safe mode will be turned off automatically.
        at org.apache.hadoop.dfs.FSNamesystem.delete(FSNamesystem.java:761)
        at org.apache.hadoop.dfs.NameNode.delete(NameNode.java:322)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:385)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:514)

The directory (hadoop-<my user name>) doesn't even exist on either of the machines I am using. I get this after calling start-all.sh on my machine. My slaves file contains only localhost.

Does this mean my FS has been corrupted? Looking at the code, the delete is failing inside the NameNode. If I run ./hadoop namenode -format, everything then works. The only thing I can think of that might be related is that one of my worker nodes went to sleep overnight while the server thread was still running (idle).
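[Editor's note, not part of the original message: for anyone hitting the same SafeModeException, a minimal sketch of inspecting safe mode by hand with the dfsadmin tool, assuming the hadoop script is on the cluster's PATH, would look something like the following. The exact subcommands are from the dfsadmin -safemode interface; reformatting the namenode, as described above, wipes the filesystem and should be a last resort.]

```shell
# Ask the NameNode whether it is currently in safe mode.
bin/hadoop dfsadmin -safemode get

# Block until the NameNode leaves safe mode on its own
# (it normally does once enough DataNode blocks have reported in).
bin/hadoop dfsadmin -safemode wait

# Force it out of safe mode manually, only if it never leaves by itself,
# e.g. because a DataNode holding replicas is down or asleep.
bin/hadoop dfsadmin -safemode leave
```

These commands require a running cluster, so they are shown here only as a sketch.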

Thanks,
Grant

