hbase-user mailing list archives

From Ted Yu <yuzhih...@gmail.com>
Subject Re: regionserver shutdown abnormally
Date Thu, 24 Dec 2015 06:17:50 GMT
Which HBase release are you using?

After a brief search, it looks like a Chinese character might be present in
the region name or in a config value.

Can you double-check?
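One quick way to do that check is to scan the config files (and any dumped region names) for non-ASCII bytes. A minimal sketch — the file path is only an example, not something from this thread:

```python
def find_non_ascii(path):
    """Return (line_number, decoded_line) pairs for lines containing
    bytes outside the ASCII range (e.g. a stray Chinese character)."""
    hits = []
    with open(path, "rb") as f:
        for lineno, raw in enumerate(f, start=1):
            # In binary mode, iterating a line yields ints; > 0x7F means non-ASCII.
            if any(b > 0x7F for b in raw):
                hits.append((lineno, raw.decode("utf-8", errors="replace").rstrip()))
    return hits

# Example (path is an assumption; adjust to your deployment):
# find_non_ascii("/etc/hbase/conf/hbase-site.xml")
```

Region names can be inspected the same way after scanning the meta table from the HBase shell (`hbase:meta` on newer releases, `.META.` on older ones) and saving the output to a file.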

On Wed, Dec 23, 2015 at 10:04 PM, yaoxiaohua <yaoxiaohua@outlook.com> wrote:

> Hi,
>
> The node 172.19.206.142 is running a datanode and a regionserver. The
> regionserver now shuts down abnormally from time to time; some of the
> relevant log output follows.
>
> Could you help me analyze why?
>
>
>
> 2015-12-05 10:20:41,570 WARN  [PostOpenDeployTasks:b7b84410963cbc1484827ceca3439658] handler.OpenRegionHandler: Exception running postOpenDeployTasks; region=b7b84410963cbc1484827ceca3439658
> java.lang.ArrayIndexOutOfBoundsException: Array index out of range: 0
>         at sun.nio.cs.UTF_8$Encoder.encode(UTF_8.java:652)
>         at java.lang.StringCoding.encode(StringCoding.java:899)
>         at java.lang.String.getBytes(String.java:2226)
>         at org.apache.hadoop.hbase.util.Bytes.toBytes(Bytes.java:502)
>         at org.apache.hadoop.hbase.catalog.MetaEditor.addLocation(MetaEditor.java:572)
>         at org.apache.hadoop.hbase.catalog.MetaEditor.updateLocation(MetaEditor.java:462)
>         at org.apache.hadoop.hbase.catalog.MetaEditor.updateRegionLocation(MetaEditor.java:442)
>         at org.apache.hadoop.hbase.regionserver.HRegionServer.postOpenDeployTasks(HRegionServer.java:1787)
>         at org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler$PostOpenDeployTasksThread.run(OpenRegionHandler.java:325)
>
> 2015-12-05 10:29:39,312 WARN  [PostOpenDeployTasks:7aa570f1699ad8121e5adf3ffbf79787] handler.OpenRegionHandler: Exception running postOpenDeployTasks; region=7aa570f1699ad8121e5adf3ffbf79787
> java.lang.ArrayIndexOutOfBoundsException
>         at sun.nio.cs.UTF_8$Encoder.encode(UTF_8.java:652)
>         at java.lang.StringCoding.encode(StringCoding.java:899)
>         at java.lang.String.getBytes(String.java:2226)
>         at org.apache.hadoop.hbase.util.Bytes.toBytes(Bytes.java:502)
>         at org.apache.hadoop.hbase.CompoundConfiguration$1.get(CompoundConfiguration.java:182)
>         at org.apache.hadoop.hbase.CompoundConfiguration.get(CompoundConfiguration.java:284)
>         at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:872)
>         at org.apache.hadoop.conf.Configuration.getInt(Configuration.java:1089)
>         at org.apache.hadoop.hbase.regionserver.DefaultStoreFileManager.getStoreCompactionPriority(DefaultStoreFileManager.java:132)
>         at org.apache.hadoop.hbase.regionserver.HStore.getCompactPriority(HStore.java:1841)
>         at org.apache.hadoop.hbase.regionserver.CompactSplitThread$CompactionRunner.<init>(CompactSplitThread.java:418)
>         at org.apache.hadoop.hbase.regionserver.CompactSplitThread.requestCompactionInternal(CompactSplitThread.java:316)
>         at org.apache.hadoop.hbase.regionserver.CompactSplitThread.requestSystemCompaction(CompactSplitThread.java:286)
>
> 2015-12-05 10:29:41,890 WARN  [StoreFileOpenerThread-F-1] hdfs.BlockReaderFactory: I/O error constructing remote block reader.
> java.io.EOFException: Premature EOF: no length prefix available
>         at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:1986)
>         at org.apache.hadoop.hdfs.RemoteBlockReader2.newBlockReader(RemoteBlockReader2.java:395)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReader(BlockReaderFactory.java:786)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:665)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:325)
>         at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:567)
>         at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:793)
>         at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:840)
>         at java.io.DataInputStream.readFully(DataInputStream.java:207)
>         at org.apache.hadoop.hbase.io.hfile.FixedFileTrailer.readFromStream(FixedFileTrailer.java:391)
>         at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:537)
>         at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:580)
>
> 2015-12-05 10:29:41,891 WARN  [StoreFileOpenerThread-F-1] hdfs.DFSClient: Failed to connect to /172.19.206.142:50011 for block, add to deadNodes and continue. java.io.EOFException: Premature EOF: no length prefix available
> java.io.EOFException: Premature EOF: no length prefix available
>         at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:1986)
>         at org.apache.hadoop.hdfs.RemoteBlockReader2.newBlockReader(RemoteBlockReader2.java:395)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReader(BlockReaderFactory.java:786)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:665)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:325)
>         at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:567)
>         at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:793)
>         at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:840)
>         at java.io.DataInputStream.readFully(DataInputStream.java:207)
>         at org.apache.hadoop.hbase.io.hfile.FixedFileTrailer.readFromStream(FixedFileTrailer.java:391)
>         at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:537)
>         at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:580)
>
> 2015-12-05 14:03:20,801 WARN  [StoreFileOpenerThread-F-1] hdfs.BlockReaderFactory: I/O error constructing remote block reader.
> java.io.EOFException: Premature EOF: no length prefix available
>         at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:1986)
>         at org.apache.hadoop.hdfs.RemoteBlockReader2.newBlockReader(RemoteBlockReader2.java:395)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReader(BlockReaderFactory.java:786)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:665)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:325)
>         at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:567)
>         at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:793)
>         at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:840)
>         at java.io.DataInputStream.readFully(DataInputStream.java:207)
>         at org.apache.hadoop.hbase.io.hfile.FixedFileTrailer.readFromStream(FixedFileTrailer.java:391)
>         at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:537)
>         at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:580)
>         at org.apache.hadoop.hbase.regionserver.StoreFile$Reader.<init>(StoreFile.java:1019)
>
> 2015-12-05 14:48:33,859 WARN  [StoreFileOpenerThread-F-1] hdfs.BlockReaderFactory: I/O error constructing remote block reader.
> java.io.EOFException: Premature EOF: no length prefix available
>         at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:1986)
>         at org.apache.hadoop.hdfs.RemoteBlockReader2.newBlockReader(RemoteBlockReader2.java:395)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReader(BlockReaderFactory.java:786)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.getRemoteBlockReaderFromTcp(BlockReaderFactory.java:665)
>         at org.apache.hadoop.hdfs.BlockReaderFactory.build(BlockReaderFactory.java:325)
>         at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:567)
>         at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:793)
>         at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:840)
>         at java.io.DataInputStream.readFully(DataInputStream.java:207)
>         at org.apache.hadoop.hbase.io.hfile.FixedFileTrailer.readFromStream(FixedFileTrailer.java:391)
>         at org.apache.hadoop.hbase.io.hfile.HFile.pickReaderVersion(HFile.java:537)
>         at org.apache.hadoop.hbase.io.hfile.HFile.createReader(HFile.java:580)
>         at org.apache.hadoop.hbase.regionserver.StoreFile$Reader.<init>(StoreFile.java:1019)
>         at org.apache.hadoop.hbase.regionserver.StoreFileInfo.open(StoreFileInfo.java:211)
>
> 2015-12-05 16:02:51,026 WARN  [RS_CLOSE_REGION-px42pub:60020-0] regionserver.HRegionServer: Unable to report fatal error to master
> java.lang.ArrayIndexOutOfBoundsException
>         at sun.nio.cs.UTF_8$Decoder.decode(UTF_8.java:396)
>         at java.lang.StringCoding.decode(StringCoding.java:810)
>         at java.lang.String.<init>(String.java:2212)
>         at org.apache.hadoop.hbase.util.Bytes.toString(Bytes.java:380)
>         at org.apache.hadoop.hbase.ServerName.parseVersionedServerName(ServerName.java:316)
>         at org.apache.hadoop.hbase.regionserver.HRegionServer.abort(HRegionServer.java:1832)
>         at org.apache.hadoop.hbase.regionserver.handler.CloseRegionHandler.process(CloseRegionHandler.java:159)
>         at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:128)
>         at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1177)
>         at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
>         at java.lang.Thread.run(Thread.java:857)
>
> 2015-12-05 16:02:51,026 INFO  [RS_CLOSE_REGION-px42pub:60020-0] regionserver.HRegionServer: STOPPED: Unrecoverable exception while closing region GPRS_201510,fca0,1435662970705.e908e82b92df6407941becdca7703e62., still finishing close
>
> 2015-12-05 16:02:43,411 INFO  [Thread-844] hdfs.DFSClient: Exception in createBlockOutputStream
> java.io.EOFException: Premature EOF: no length prefix available
>         at org.apache.hadoop.hdfs.protocolPB.PBHelper.vintPrefixed(PBHelper.java:1986)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.createBlockOutputStream(DFSOutputStream.java:1344)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.nextBlockOutputStream(DFSOutputStream.java:1271)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:525)
>
>
>
>
>  Best Regards,
>
> Evan
>
>
