hbase-issues mailing list archives

From "Liang Xie (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HBASE-9644) Regionserver throws java.lang.NoClassDefFoundError: Ljava/lang/InternalError exception while decompressing hfileblock
Date Tue, 24 Sep 2013 10:29:03 GMT

    [ https://issues.apache.org/jira/browse/HBASE-9644?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13776176#comment-13776176
] 

Liang Xie commented on HBASE-9644:
----------------------------------

bq. "why does the HBase checksum not catch this problem while HDFS does?"
IIRC, HBase's checksum implementation has a fall-back mechanism: if verification fails at the
HBase level, it resorts to HDFS's checksum. It should also print a warning log when that
happens, though the logging code probably needs a double-check. Btw, you never saw any weird
checksum log in HBase's logfile, right?
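The fall-back idea described above can be sketched as follows. This is a hypothetical illustration under stated assumptions, not HBase's actual HFileBlock code: the class and method names are made up, and a single whole-block CRC32 stands in for HBase's real per-chunk checksums.

```java
import java.util.zip.CRC32;

public class ChecksumFallbackSketch {
    // Hypothetical helper: verify a block against an expected checksum.
    // (HBase actually checksums per chunk; one CRC32 keeps the sketch short.)
    static boolean hbaseChecksumOk(byte[] data, long expected) {
        CRC32 crc = new CRC32();
        crc.update(data, 0, data.length);
        return crc.getValue() == expected;
    }

    public static void main(String[] args) {
        byte[] block = "hfile-block-bytes".getBytes();
        CRC32 crc = new CRC32();
        crc.update(block, 0, block.length);
        long stored = crc.getValue();

        // Simulate a corrupt stored checksum to trigger the fall-back path.
        if (!hbaseChecksumOk(block, stored + 1)) {
            // This is the warning the comment says should appear in the log;
            // the real code would then retry the read with HDFS-level
            // checksum verification enabled instead of failing outright.
            System.out.println("WARN: hbase checksum failed, falling back to hdfs checksum");
        }
        System.out.println("hbase checksum ok: " + hbaseChecksumOk(block, stored));
    }
}
```

The point of the fall-back is that a bad HBase-level checksum should never be silent: either the warning is logged and HDFS verification takes over, or the logging path itself needs fixing.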
                
> Regionserver throws java.lang.NoClassDefFoundError: Ljava/lang/InternalError exception
while decompressing hfileblock
> ---------------------------------------------------------------------------------------------------------------------
>
>                 Key: HBASE-9644
>                 URL: https://issues.apache.org/jira/browse/HBASE-9644
>             Project: HBase
>          Issue Type: Bug
>          Components: HFile, regionserver
>    Affects Versions: 0.94.10
>         Environment: Linux 2.6.32-el5.x86_64
>            Reporter: Victor Xu
>
> The regionserver threw a "java.lang.NoClassDefFoundError: Ljava/lang/InternalError" exception
while decompressing an HFile block.
> The exception detail is:
> 2013-09-15 05:44:03,612 ERROR org.apache.hadoop.hbase.regionserver.HRegionServer: 
> java.lang.NoClassDefFoundError: Ljava/lang/InternalError
>         at org.apache.hadoop.io.compress.snappy.SnappyDecompressor.decompressBytesDirect(Native
Method)
>         at org.apache.hadoop.io.compress.snappy.SnappyDecompressor.decompress(SnappyDecompressor.java:238)
>         at org.apache.hadoop.io.compress.BlockDecompressorStream.decompress(BlockDecompressorStream.java:87)
>         at org.apache.hadoop.io.compress.DecompressorStream.read(DecompressorStream.java:83)
>         at java.io.BufferedInputStream.read1(BufferedInputStream.java:256)
>         at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
>         at org.apache.hadoop.io.IOUtils.readFully(IOUtils.java:192)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$AbstractFSReader.decompress(HFileBlock.java:1461)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockDataInternal(HFileBlock.java:1890)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlock$FSReaderV2.readBlockData(HFileBlock.java:1703)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2.readBlock(HFileReaderV2.java:342)
>         at org.apache.hadoop.hbase.io.hfile.HFileBlockIndex$BlockIndexReader.loadDataBlockWithScanInfo(HFileBlockIndex.java:254)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:484)
>         at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$AbstractScannerV2.seekTo(HFileReaderV2.java:505)
>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seekAtOrAfter(StoreFileScanner.java:220)
>         at org.apache.hadoop.hbase.regionserver.StoreFileScanner.seek(StoreFileScanner.java:140)
>         at org.apache.hadoop.hbase.regionserver.StoreScanner.<init>(StoreScanner.java:131)
>         at org.apache.hadoop.hbase.regionserver.Store.getScanner(Store.java:2208)
>         at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.<init>(HRegion.java:3807)
>         at org.apache.hadoop.hbase.regionserver.HRegion.instantiateRegionScanner(HRegion.java:1825)
>         at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:1817)
>         at org.apache.hadoop.hbase.regionserver.HRegion.getScanner(HRegion.java:1794)
>         at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:4828)
>         at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:4802)
>         at org.apache.hadoop.hbase.regionserver.HRegionServer.get(HRegionServer.java:2196)
>         at sun.reflect.GeneratedMethodAccessor48.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:597)
>         at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Server.call(WritableRpcEngine.java:320)
>         at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1426)
> Caused by: java.lang.ClassNotFoundException: Ljava.lang.InternalError
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
>         ... 30 more
> There are two problems here:
> 1. Why does the Snappy library use the class name 'Ljava/lang/InternalError' instead of
'java/lang/InternalError'? This points at the snappy native code; there may be a bug in it.
> 2. When I tried to read the HFile block with HDFS tools, the local block replica failed the
file checksum and the read fell back to another replica on a remote datanode. So the question
is: why does the HBase checksum not catch this problem while HDFS does?
> This is my hbase-site.xml checksum configuration:
>   <property>
>     <name>dfs.client.read.shortcircuit</name>
>     <value>true</value>
>     <description></description>
>   </property>
>   <property>
>     <name>dfs.client.read.shortcircuit.skip.checksum</name>
>     <value>true</value>
>     <description></description>
>   </property>
>   <property>
>     <name>hbase.regionserver.checksum.verify</name>
>     <value>true</value>
>     <description></description>
>   </property>
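On problem 1 above: 'Ljava/lang/InternalError' looks like the JNI type-descriptor form of the class name, whereas JNI FindClass (and the Java-side loaders it delegates to) expect the plain internal name 'java/lang/InternalError'. If the native Snappy error path passes the descriptor form when trying to raise InternalError, class loading fails, which would surface exactly as the NoClassDefFoundError / ClassNotFoundException pair in the stack trace. A quick Java check (a made-up demo class, not from the Snappy code) shows the descriptor-style name is indeed not loadable:

```java
public class DescriptorNameDemo {
    public static void main(String[] args) throws Exception {
        // The descriptor-style name (dotted here, since Class.forName takes
        // dots rather than JNI's slashes): no class has this binary name.
        try {
            Class.forName("Ljava.lang.InternalError");
            System.out.println("descriptor form loaded (unexpected)");
        } catch (ClassNotFoundException e) {
            System.out.println("descriptor form: ClassNotFoundException");
        }
        // The proper binary name loads fine.
        Class<?> c = Class.forName("java.lang.InternalError");
        System.out.println("binary name: " + c.getName());
    }
}
```

If that is what the native code does, the fix would be a one-character change in the class-name string passed to FindClass, not anything in HBase itself.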

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
