hbase-user mailing list archives

From stack <st...@duboce.net>
Subject Re: StackOverFlow Error in HBase
Date Thu, 27 Mar 2008 19:36:59 GMT
You are using a filter?  If so, tell us more about it.
St.Ack

David Alves wrote:
> Hi guys 
>
> 	I'm using HBase to keep data that is later indexed.
> 	The data is indexed in chunks, so the cycle is: get XXXX records, index
> them, check for more records, etc.
> 	When I tried candidate-2 instead of the old 0.16.0 (which I
> switched to due to the regionservers becoming unresponsive) I got the
> error at the end of this email, well into an indexing job.
> 	Do you have any idea why? Am I doing something wrong?
>
> David Alves
>
> java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException:
> java.io.IOException: java.lang.StackOverflowError
>         at java.io.DataInputStream.readFully(DataInputStream.java:178)
>         at java.io.DataInputStream.readLong(DataInputStream.java:399)
>         at org.apache.hadoop.dfs.DFSClient$BlockReader.readChunk(DFSClient.java:735)
>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:234)
>         at org.apache.hadoop.fs.FSInputChecker.fill(FSInputChecker.java:176)
>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:193)
>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:157)
>         at org.apache.hadoop.dfs.DFSClient$BlockReader.read(DFSClient.java:658)
>         at org.apache.hadoop.dfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1130)
>         at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1166)
>         at java.io.DataInputStream.readFully(DataInputStream.java:178)
>         at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:56)
>         at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:90)
>         at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1829)
>         at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1729)
>         at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1775)
>         at org.apache.hadoop.io.MapFile$Reader.next(MapFile.java:461)
>         at org.apache.hadoop.hbase.HStore$StoreFileScanner.getNext(HStore.java:2350)
>         at org.apache.hadoop.hbase.HAbstractScanner.next(HAbstractScanner.java:256)
>         at org.apache.hadoop.hbase.HStore$HStoreScanner.next(HStore.java:2561)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1807)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> ...
>   
>
>   
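[Editor's note: a minimal sketch of the chunked scan loop David describes, assuming the
0.16-era HBase client API (HTable.obtainScanner / HScannerInterface). The table name,
column, chunk size, and the indexRow/flushIndex helpers are hypothetical stand-ins, not
taken from the thread.]

    import java.io.IOException;
    import java.util.SortedMap;
    import java.util.TreeMap;

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HScannerInterface;
    import org.apache.hadoop.hbase.HStoreKey;
    import org.apache.hadoop.hbase.HTable;
    import org.apache.hadoop.io.Text;

    public class ChunkedIndexer {
      // Hypothetical chunk size: how many rows to hand to the indexer at a time.
      private static final int CHUNK_SIZE = 1000;

      public static void main(String[] args) throws IOException {
        HBaseConfiguration conf = new HBaseConfiguration();
        // Hypothetical table and column names.
        HTable table = new HTable(conf, new Text("documents"));
        Text[] columns = { new Text("content:") };

        // Open one scanner from the start of the table and walk it in chunks.
        HScannerInterface scanner = table.obtainScanner(columns, new Text(""));
        try {
          HStoreKey key = new HStoreKey();
          SortedMap<Text, byte[]> row = new TreeMap<Text, byte[]>();
          int inChunk = 0;
          while (scanner.next(key, row)) {
            indexRow(key.getRow(), row);   // hand the row to the indexer (stand-in)
            if (++inChunk == CHUNK_SIZE) {
              flushIndex();                // index the accumulated chunk, then keep scanning
              inChunk = 0;
            }
            row.clear();
          }
          flushIndex();                    // index the final partial chunk
        } finally {
          scanner.close();
        }
      }

      private static void indexRow(Text rowKey, SortedMap<Text, byte[]> row) { /* ... */ }
      private static void flushIndex() { /* ... */ }
    }

The chunk boundary in this sketch is purely client-side; the scanner stays open across
chunks rather than being reopened for each batch.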

