hbase-user mailing list archives

From "Michael Dagaev" <michael.dag...@gmail.com>
Subject Re: Hbase Client Failure
Date Tue, 09 Dec 2008 13:20:15 GMT
    Andy, you are right.

    I checked this issue and found that the host had been rebooted
without the Hadoop name node being shut down first. That explains
the file damage.

This is our test environment, so we formatted the DFS and continued testing.

Thank you for your cooperation,
M.
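For anyone hitting the same "Could not obtain block" error, a minimal sketch of the checks and cleanup discussed in this thread, assuming a Hadoop 0.18-era test cluster with the Hadoop scripts on the PATH (command names may differ in later versions):

```shell
# Check whether any datanodes are offline -- a missing block often just
# means the nodes holding its replicas are down, not that data is lost.
hadoop dfsadmin -report

# Scan the filesystem for missing or corrupt blocks, including the ROOT
# region's data file named in the stack trace.
hadoop fsck /hbase -files -blocks -locations

# Always stop DFS cleanly before rebooting the host, so the name node
# can close its edits log properly:
stop-dfs.sh

# Test environments only: if the metadata is already damaged, reformat
# the DFS and start over. THIS DESTROYS ALL DATA IN DFS.
hadoop namenode -format
start-dfs.sh
```

The first two commands are read-only and safe to run before anything else; the format step is only reasonable on a throwaway test cluster, as in this case.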

On Tue, Dec 9, 2008 at 2:40 PM, Andrew Purtell <apurtell@yahoo.com> wrote:
> Hello Michael,
>
> There has either been file damage to the data file of the
> ROOT region or some of your DFS datanodes are offline. File
> damage may occur at the DFS layer if the DFS daemons are
> placed under too much stress.
>
> Can you say more about the circumstances under which this
> happened? Why was a restart necessary?
>
>   - Andy
>
>
>> From: Michael Dagaev <michael.dagaev@gmail.com>
>> Subject: Hbase Client Failure
>> To: hbase-user@hadoop.apache.org
>> Date: Tuesday, December 9, 2008, 3:53 AM
>> Hi, all
>>
>>     After an HBase (0.18.1) restart, our client failed with an
>> IOException -- Could not obtain block:
>> blk_-6097614652807114983_50488
>> file=/hbase/-ROOT-/70236052/info/mapfiles/6743958588342007607/data
>> (see the full trace below). The HBase shell failed with an
>> exception too.
>>
>> Do you know what the root cause of this exception might be?
>>
>> Thank you for your cooperation,
>> M.
>>
>> P.S. The exception stack:
>>
>> Caused by: java.io.IOException: java.io.IOException: Could
>> not obtain block: blk_-6097614652807114983_50488
>> file=/hbase/-ROOT-/70236052/info/mapfiles/6743958588342007607/data
>>         at org.apache.hadoop.dfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1462)
>>         at org.apache.hadoop.dfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1312)
>>         at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1417)
>>         at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1369)
>>         at java.io.DataInputStream.readInt(DataInputStream.java:370)
>>         at org.apache.hadoop.io.SequenceFile$Reader.readRecordLength(SequenceFile.java:1898)
>>         at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1928)
>>         at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1833)
>>         at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1879)
>>         at org.apache.hadoop.io.MapFile$Reader.next(MapFile.java:516)
>>         at org.apache.hadoop.hbase.regionserver.HStore.rowAtOrBeforeFromMapFile(HStore.java:1504)
>>         at org.apache.hadoop.hbase.regionserver.HStore.getRowKeyAtOrBefore(HStore.java:1476)
>>         at org.apache.hadoop.hbase.regionserver.HRegion.getClosestRowBefore(HRegion.java:1250)
>>         at org.apache.hadoop.hbase.regionserver.HRegionServer.getClosestRowBefore(HRegionServer.java:1062)
>>         at sun.reflect.GeneratedMethodAccessor10.invoke(Unknown Source)
>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>         at java.lang.reflect.Method.invoke(Method.java:597)
>>         at org.apache.hadoop.hbase.ipc.HbaseRPC$Server.call(HbaseRPC.java:554)
>>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:888)
