hadoop-general mailing list archives

From Pierre ANCELOT <pierre...@gmail.com>
Subject Re: Could not obtain block blk_
Date Fri, 07 May 2010 10:09:13 GMT
All my nodes are up, and I can read the file through the DFS client...
Hadoop 0.20.2
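
For reference, here is a minimal sketch of the same read done through the
FileSystem API (the class name and argument handling are placeholders, not
part of our job); it goes through the same DFSClient read path as the failing
map task, so a block with no live replica fails in the same way. From the
shell, hadoop fs -cat <path> is the equivalent check.

import java.io.InputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

// Streams an HDFS file to stdout through the DFSClient read path.
// Pass the file path as the first argument.
public class ReadCheck {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();   // picks up core-site.xml / hdfs-site.xml
    FileSystem fs = FileSystem.get(conf);       // DistributedFileSystem when fs.default.name points at HDFS
    InputStream in = fs.open(new Path(args[0]));
    try {
      // A block held by no live datanode fails here with
      // "Could not obtain block ...", just like in the task log.
      IOUtils.copyBytes(in, System.out, conf, false);
    } finally {
      in.close();
    }
  }
}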

On Fri, May 7, 2010 at 12:09 PM, Steve Loughran <stevel@apache.org> wrote:

> Pierre ANCELOT wrote:
>
>> Hello, we're having some issues with this...
>> Any ideas, please?
>>
>> 2010-05-07 05:59:46,998 INFO org.apache.hadoop.metrics.jvm.JvmMetrics: Cannot initialize JVM Metrics with processName=MAP, sessionId= - already initialized
>> 2010-05-07 05:59:54,846 INFO org.apache.hadoop.hdfs.DFSClient: Could not obtain block blk_6180178384047700588_1429320 from any node: java.io.IOException: No live nodes contain current block
>> 2010-05-07 06:00:05,493 INFO org.apache.hadoop.hdfs.DFSClient: Could not obtain block blk_6180178384047700588_1429320 from any node: java.io.IOException: No live nodes contain current block
>> 2010-05-07 06:00:10,358 INFO org.apache.hadoop.hdfs.DFSClient: Could not obtain block blk_6180178384047700588_1429320 from any node: java.io.IOException: No live nodes contain current block
>> 2010-05-07 06:00:14,945 WARN org.apache.hadoop.hdfs.DFSClient: DFS Read:
>>
>
> As it says, "No live nodes contain current block".
> This means no live datanode has a copy of that block of the file. Either
> some nodes have gone offline, a disk has failed, or the block was never
> replicated.
>
>
>
>  java.io.IOException: Could not obtain block:
>> blk_6180178384047700588_1429320
>> file=/zzz/Dispatch/input/zzz_24ED346D41543586AE7342997C07F5C8_2010_5_6_mr_24ED346D41543586AE7342997C07F5C8_201056.txt_128
>>    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
>>    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
>>    at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
>>    at java.io.DataInputStream.read(DataInputStream.java:83)
>>    at org.apache.hadoop.util.LineReader.readLine(LineReader.java:134)
>>    at org.apache.hadoop.mapreduce.lib.input.LineRecordReader.nextKeyValue(LineRecordReader.java:97)
>>    at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:423)
>>    at org.apache.hadoop.mapreduce.MapContext.nextKeyValue(MapContext.java:67)
>>    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:143)
>>    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:621)
>>    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
>>    at org.apache.hadoop.mapred.Child.main(Child.java:170)
>>
>>
>
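
To follow up on the replication point above: here is a rough sketch (my own
illustration, not something from the job) that asks the namenode which
datanodes it currently lists for each block of the file, via
FileSystem.getFileBlockLocations. An empty host list for a block matches the
"No live nodes contain current block" condition.

import java.util.Arrays;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Prints, for each block of the given file, the datanodes the namenode
// currently reports as holding a replica.
public class BlockHosts {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    FileStatus stat = fs.getFileStatus(new Path(args[0]));
    BlockLocation[] blocks = fs.getFileBlockLocations(stat, 0, stat.getLen());
    for (BlockLocation blk : blocks) {
      // An empty host list means no live datanode has a replica of that block.
      System.out.println("offset " + blk.getOffset()
          + " length " + blk.getLength()
          + " hosts " + Arrays.toString(blk.getHosts()));
    }
  }
}

From the command line, hadoop fsck <path> -files -blocks -locations reports
the same information, along with under-replicated and corrupt block counts.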


-- 
http://www.neko-consulting.com
Ego sum quis ego servo
"Je suis ce que je prot├Ęge"
"I am what I protect"
