hadoop-common-user mailing list archives

From "Shengkai Zhu" <geniusj...@gmail.com>
Subject Re: Failed to repeat the Quickstart guide for Pseudo-distributed operation
Date Wed, 09 Jul 2008 01:42:31 GMT
After you format the namenode a second time, your datanodes and namenode may be left in an inconsistent state, i.e., with incompatible namespace IDs: the datanode's storage still carries the namespace ID from the first format, so it cannot register with the freshly formatted namenode, which is why writes fail with "could only be replicated to 0 nodes".
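
One way to get back to a consistent state is to wipe the datanode's storage directory before re-formatting, so it can register under the new namespace ID. A rough sketch, assuming the default hadoop.tmp.dir of /tmp/hadoop-${user.name} and the bundled start/stop scripts (note this destroys everything currently stored in HDFS):

bin/stop-all.sh                        # stop all daemons first
rm -rf /tmp/hadoop-${USER}/dfs/data    # old datanode storage; adjust the path if dfs.data.dir is set elsewhere
bin/hadoop namenode -format            # re-format HDFS
bin/start-all.sh                       # bring the daemons back up
bin/hadoop dfsadmin -report            # should now show one live datanode before you retry the -put

If you want to confirm the diagnosis first, the datanode log under logs/ will typically complain about incompatible namespaceIDs after the second format.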

On 7/2/08, Xuan Dzung Doan <doanxuandung@yahoo.com> wrote:
>
> I was exactly following the Hadoop 0.16.4 quickstart guide to run a
> pseudo-distributed operation on my Fedora 8 machine. The first time I did
> it, everything ran successfully (formatted a new HDFS, started the Hadoop
> daemons, then ran the grep example). A moment later, I decided to redo
> everything. Reformatting the HDFS and starting the daemons seemed to have
> no problem, but on the homepage of the namenode's web interface
> (http://localhost:50070/), when I clicked "Browse the filesystem", it said
> the following:
>
>
> HTTP ERROR: 404
> /browseDirectory.jsp
> RequestURI=/browseDirectory.jsp
> Then, when I tried to copy files into HDFS to re-run the grep example, it
> failed with the following long list of exceptions (it looks like some
> replication or block-allocation issue):
>
> # bin/hadoop dfs -put conf input
>
> 08/06/29 09:38:42 INFO dfs.DFSClient:
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File
> /user/root/input/hadoop-env.sh could only be replicated to 0 nodes, instead
> of 1
>        at
> org.apache.hadoop.dfs.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1127)
>        at org.apache.hadoop.dfs.NameNode.addBlock(NameNode.java:312)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:409)
>        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:901)
>
>        at org.apache.hadoop.ipc.Client.call(Client.java:512)
>        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:198)
>        at org.apache.hadoop.dfs.$Proxy0.addBlock(Unknown Source)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
>        at
> org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
>        at org.apache.hadoop.dfs.$Proxy0.addBlock(Unknown Source)
>        at
> org.apache.hadoop.dfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:2074)
>        at
> org.apache.hadoop.dfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:1967)
>        at
> org.apache.hadoop.dfs.DFSClient$DFSOutputStream.access$1500(DFSClient.java:1487)
>        at
> org.apache.hadoop.dfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:1601)
>
> 08/06/29 09:38:42 WARN dfs.DFSClient: NotReplicatedYetException sleeping
> /user/root/input/hadoop-env.sh retries left 4
> 08/06/29 09:38:42 INFO dfs.DFSClient:
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File
> /user/root/input/hadoop-env.sh could only be replicated to 0 nodes, instead
> of 1
>
> 08/06/29 09:38:42 WARN dfs.DFSClient: NotReplicatedYetException sleeping
> /user/root/input/hadoop-env.sh retries left 3
> 08/06/29 09:38:43 INFO dfs.DFSClient:
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File
> /user/root/input/hadoop-env.sh could only be replicated to 0 nodes, instead
> of 1
>
> 08/06/29 09:38:43 WARN dfs.DFSClient: NotReplicatedYetException sleeping
> /user/root/input/hadoop-env.sh retries left 2
> 08/06/29 09:38:44 INFO dfs.DFSClient:
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File
> /user/root/input/hadoop-env.sh could only be replicated to 0 nodes, instead
> of 1
>
> 08/06/29 09:38:44 WARN dfs.DFSClient: NotReplicatedYetException sleeping
> /user/root/input/hadoop-env.sh retries left 1
> 08/06/29 09:38:48 WARN dfs.DFSClient: DataStreamer Exception:
> org.apache.hadoop.ipc.RemoteException: java.io.IOException: File
> /user/root/input/hadoop-env.sh could only be replicated to 0 nodes, instead
> of 1
>        at
> org.apache.hadoop.dfs.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1127)
>        at org.apache.hadoop.dfs.NameNode.addBlock(NameNode.java:312)
>        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>        at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>        at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>        at java.lang.reflect.Method.invoke(Method.java:597)
>        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:409)
>        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:901)
>
> 08/06/29 09:38:48 WARN dfs.DFSClient: Error Recovery for block null bad
> datanode[0]
> put: Could not get block locations. Aborting...
>
> I tried repeating the steps several times from the beginning, but I hit
> the same issue every time. Any idea what the problem is and how I might
> resolve it?
>
> Thanks a bunch,
> David.
>
