hadoop-common-user mailing list archives

From jason hadoop <jason.had...@gmail.com>
Subject Re: Cannot copy from local file system to DFS
Date Sat, 07 Feb 2009 16:21:12 GMT
Please examine the web console for the namenode.

The URL for this should be http://*namenodehost*:50070/

This page lists which datanodes are successfully connected to the
namenode.
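
A shell alternative to the web console is `bin/hadoop dfsadmin -report`, which prints the live-datanode count. The excerpt below is a hypothetical saved copy of such a report (the file path and contents are made up for illustration), used so the check is runnable without a live cluster:

```shell
# Hypothetical excerpt of `bin/hadoop dfsadmin -report` output, saved to a
# file so the check below can run without a cluster:
cat > /tmp/dfs-report.txt <<'EOF'
Total raw bytes: 0 (0 KB)
Datanodes available: 0
EOF

# Pull out the live-datanode count; 0 here confirms the symptom.
grep 'Datanodes available' /tmp/dfs-report.txt
```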

If the number is 0, then no datanodes are running, or none were able to
connect to the namenode at startup.
The common reasons for this are configuration errors, installation
errors, or network connectivity problems: firewalls blocking ports, or
DNS lookup failures (either no result or an incorrect address returned)
for the namenode hostname on the datanodes.
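
A quick sketch of those last two checks, run from a datanode (the hostname below is a placeholder — substitute the value your datanodes use for the namenode, and note the `/dev/tcp` trick is bash-specific):

```shell
# Placeholder for the example; on a real datanode, use the namenode
# hostname from your fs.default.name setting instead.
NAMENODE_HOST=localhost

# DNS check: does the hostname resolve, and to the address you expect?
getent hosts "$NAMENODE_HOST" || echo "DNS lookup failed for $NAMENODE_HOST"

# Firewall/port check (bash-only /dev/tcp pseudo-device): is the
# namenode web UI port reachable from this machine?
if (exec 3<>"/dev/tcp/$NAMENODE_HOST/50070") 2>/dev/null; then
    echo "port 50070 reachable"
else
    echo "port 50070 blocked or nothing listening"
fi
```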

At this point you will need to investigate the log files for the datanodes
to make an assessment of what has happened.
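
For example, grepping the datanode logs for the usual suspects. The sample log lines below are hypothetical, written to a temp file so the command is runnable as-is; on a real node, point the grep at logs/hadoop-*-datanode-*.log instead:

```shell
# Hypothetical datanode log lines, written out so the grep is runnable;
# replace the path with your real datanode log file.
cat > /tmp/sample-datanode.log <<'EOF'
2009-02-07 10:00:01,000 INFO org.apache.hadoop.dfs.DataNode: STARTUP_MSG
2009-02-07 10:00:02,000 ERROR org.apache.hadoop.dfs.DataNode: java.net.ConnectException: Connection refused
EOF

# Surface startup failures, refused connections, and namespace-ID
# mismatches (a common aftermath of reformatting the namenode).
grep -E 'ERROR|FATAL|ConnectException|Incompatible namespaceIDs' /tmp/sample-datanode.log
```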


On Sat, Feb 7, 2009 at 6:17 AM, Rasit OZDAS <rasitozdas@gmail.com> wrote:

> Hi, Mithila,
>
> "File /user/mithila/test/20417.txt could only be replicated to 0
> nodes, instead of 1"
>
> I think your datanode isn't working properly.
> please take a look at log file of your datanode (logs/*datanode*.log).
>
> If there is no error in that log file: I've heard that Hadoop can
> sometimes mark a datanode as "BAD" and refuse to send blocks to that
> node; this could be the cause.
> (List, please correct me if I'm wrong!)
>
> Hope this helps,
> Rasit
>
> 2009/2/6 Mithila Nagendra <mnagendr@asu.edu>:
> > Hey all
> > I was trying to run the word count example on one of the hadoop systems I
> > installed, but when I try to copy the text files from the local file
> system
> > to the DFS, it throws up the following exception:
> >
> > [mithila@node02 hadoop]$ jps
> > 8711 JobTracker
> > 8805 TaskTracker
> > 8901 Jps
> > 8419 NameNode
> > 8642 SecondaryNameNode
> > [mithila@node02 hadoop]$ cd ..
> > [mithila@node02 mithila]$ ls
> > hadoop  hadoop-0.17.2.1.tar  hadoop-datastore  test
> > [mithila@node02 mithila]$ hadoop/bin/hadoop dfs -copyFromLocal test test
> > 09/02/06 11:26:26 INFO dfs.DFSClient: org.apache.hadoop.ipc.RemoteException:
> > java.io.IOException: File /user/mithila/test/20417.txt could only be
> > replicated to 0 nodes, instead of 1
> >        at org.apache.hadoop.dfs.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1145)
> >        at org.apache.hadoop.dfs.NameNode.addBlock(NameNode.java:300)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:446)
> >        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:896)
> >
> >        at org.apache.hadoop.ipc.Client.call(Client.java:557)
> >        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:212)
> >        at org.apache.hadoop.dfs.$Proxy0.addBlock(Unknown Source)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> >        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> >        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> >        at java.lang.reflect.Method.invoke(Method.java:597)
> >        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
> >        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
> >        at org.apache.hadoop.dfs.$Proxy0.addBlock(Unknown Source)
> >        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:2335)
> >        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2220)
> >        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.access$1700(DFSClient.java:1702)
> >        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:1842)
> >
> > 09/02/06 11:26:26 WARN dfs.DFSClient: NotReplicatedYetException sleeping
> > /user/mithila/test/20417.txt retries left 4
> > 09/02/06 11:26:27 INFO dfs.DFSClient: org.apache.hadoop.ipc.RemoteException:
> > [same exception and stack trace as above]
> > 09/02/06 11:26:27 WARN dfs.DFSClient: NotReplicatedYetException sleeping
> > /user/mithila/test/20417.txt retries left 3
> > 09/02/06 11:26:28 INFO dfs.DFSClient: org.apache.hadoop.ipc.RemoteException:
> > [same exception and stack trace as above]
> > 09/02/06 11:26:28 WARN dfs.DFSClient: NotReplicatedYetException sleeping
> > /user/mithila/test/20417.txt retries left 2
> > 09/02/06 11:26:29 INFO dfs.DFSClient: org.apache.hadoop.ipc.RemoteException:
> > [same exception and stack trace as above]
> > 09/02/06 11:26:29 WARN dfs.DFSClient: NotReplicatedYetException sleeping
> > /user/mithila/test/20417.txt retries left 1
> > 09/02/06 11:26:32 WARN dfs.DFSClient: DataStreamer Exception:
> > org.apache.hadoop.ipc.RemoteException: java.io.IOException: File
> > /user/mithila/test/20417.txt could only be replicated to 0 nodes, instead of 1
> >        [same stack trace as above]
> >
> > 09/02/06 11:26:32 WARN dfs.DFSClient: Error Recovery for block null bad
> > datanode[0]
> > copyFromLocal: Could not get block locations. Aborting...
> > Exception closing file /user/mithila/test/20417.txt
> > java.io.IOException: Could not get block locations. Aborting...
> >        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.processDatanodeError(DFSClient.java:2081)
> >        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.access$1300(DFSClient.java:1702)
> >        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:1818)
> >
> > Does anyone have an idea as to what could be wrong? I'm fresh out of ideas!
> >
> > Thanks!
> > Mithila
> >
>
>
>
> --
> M. Raşit ÖZDAŞ
>
