hadoop-mapreduce-user mailing list archives

From Lemon Cheng <lemon...@gmail.com>
Subject Query about "hadoop dfs -cat" in hadoop-0.20.2
Date Fri, 17 Jun 2011 12:11:36 GMT
Hi,

I am using hadoop-0.20.2. After calling ./start-all.sh, I can run
"hadoop dfs -ls" without any problem. However, when I run
"hadoop dfs -cat /usr/lemon/wordcount/input/file01", I get the error
shown below. I have searched for this problem on the web, but I could
not find a solution. Can anyone give a suggestion?
Many thanks.



11/06/17 19:27:12 INFO hdfs.DFSClient: No node available for block: blk_7095683278339921538_1029 file=/usr/lemon/wordcount/input/file01
11/06/17 19:27:12 INFO hdfs.DFSClient: Could not obtain block blk_7095683278339921538_1029 from any node: java.io.IOException: No live nodes contain current block
11/06/17 19:27:15 INFO hdfs.DFSClient: No node available for block: blk_7095683278339921538_1029 file=/usr/lemon/wordcount/input/file01
11/06/17 19:27:15 INFO hdfs.DFSClient: Could not obtain block blk_7095683278339921538_1029 from any node: java.io.IOException: No live nodes contain current block
11/06/17 19:27:18 INFO hdfs.DFSClient: No node available for block: blk_7095683278339921538_1029 file=/usr/lemon/wordcount/input/file01
11/06/17 19:27:18 INFO hdfs.DFSClient: Could not obtain block blk_7095683278339921538_1029 from any node: java.io.IOException: No live nodes contain current block
11/06/17 19:27:21 WARN hdfs.DFSClient: DFS Read: java.io.IOException: Could not obtain block: blk_7095683278339921538_1029 file=/usr/lemon/wordcount/input/file01
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
        at java.io.DataInputStream.read(DataInputStream.java:83)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:47)
        at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:85)
        at org.apache.hadoop.fs.FsShell.printToStdout(FsShell.java:114)
        at org.apache.hadoop.fs.FsShell.access$100(FsShell.java:49)
        at org.apache.hadoop.fs.FsShell$1.process(FsShell.java:352)
        at org.apache.hadoop.fs.FsShell$DelayedExceptionThrowing.globAndProcess(FsShell.java:1898)
        at org.apache.hadoop.fs.FsShell.cat(FsShell.java:346)
        at org.apache.hadoop.fs.FsShell.doall(FsShell.java:1543)
        at org.apache.hadoop.fs.FsShell.run(FsShell.java:1761)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:1880)
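
The message "No live nodes contain current block" seems to mean the NameNode
does not see any live DataNode holding that block. If it helps with diagnosis,
I can collect the output of commands like the following and post it here (just
a rough sketch; I am assuming the standard dfsadmin and fsck tools that ship
with 0.20.2, and a single-node setup):

  # Does the NameNode report any live DataNodes?
  hadoop dfsadmin -report

  # Is the block for this file actually stored on any node?
  hadoop fsck /usr/lemon/wordcount/input/file01 -files -blocks -locations

  # Which Hadoop daemons are actually running on this machine?
  jps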


Regards,
Lemon
