hbase-user mailing list archives

From Manimekalai K <kmanimeka...@gmail.com>
Subject How to avoid InterruptedIOException in HBase put operation
Date Wed, 19 Sep 2018 10:32:27 GMT
While putting data into HBase using the HTable.put method, I occasionally end
up with the exception below. However, the data has actually been written to
HBase: when I issue a get for that particular rowkey, it is present.

I have searched the logs of both the HMaster and the HRegionServers around
the same time to identify the issue, but was unable to find anything.

All hbase.client.* configurations have their default values.

Please help me fine-tune the HBase configuration in order to avoid this
exception.
Hadoop distribution: Apache
Version: HBase 1.2.6
Cluster size: 12 nodes
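For reference, these are the client-side settings I believe are relevant, all
still at their defaults on my client. This hbase-site.xml sketch shows them
with what I understand to be the 1.2.x default values; please correct me if
different values would help here:

```xml
<!-- hbase-site.xml (client side). Values below are the 1.2.x defaults
     as I understand them, shown only so it is clear what I have not
     changed; they are not a recommendation. -->
<property>
  <name>hbase.rpc.timeout</name>
  <value>60000</value>            <!-- per-RPC timeout, ms -->
</property>
<property>
  <name>hbase.client.operation.timeout</name>
  <value>1200000</value>          <!-- whole-operation timeout, ms -->
</property>
<property>
  <name>hbase.client.retries.number</name>
  <value>35</value>               <!-- client retry count -->
</property>
```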

java.io.InterruptedIOException: #17209, interrupted. currentNumberOfTask=1
    at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1764)
    at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1734)
    at org.apache.hadoop.hbase.client.AsyncProcess.waitForAllPreviousOpsAndReset(AsyncProcess.java:1810)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:240)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:190)
    at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1434)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1018)
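As a workaround I am considering simply retrying the put when this exception
occurs, since an HBase Put is idempotent (re-writing the same rowkey and cells
just overwrites them) and in my case the data had already been written anyway.
A minimal sketch in plain Java; `PutOperation` here is a hypothetical stand-in
for the real HTable.put call, not an HBase API:

```java
import java.io.InterruptedIOException;

public class RetryingPut {

    // Hypothetical stand-in for the real HTable.put(Put) call.
    @FunctionalInterface
    interface PutOperation {
        void run() throws InterruptedIOException;
    }

    // Retries the put up to maxAttempts times. The interrupt flag is
    // cleared before each retry and restored before giving up, so callers
    // can still observe the interruption. Returns the attempt that succeeded.
    static int putWithRetry(PutOperation put, int maxAttempts)
            throws InterruptedIOException {
        InterruptedIOException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                put.run();
                return attempt;
            } catch (InterruptedIOException e) {
                last = e;
                Thread.interrupted();  // clear the flag before retrying
            }
        }
        Thread.currentThread().interrupt();  // restore flag for the caller
        throw last;
    }

    public static void main(String[] args) throws Exception {
        int[] calls = {0};
        // Simulated put that fails once with the exception I am seeing,
        // then succeeds on the second attempt.
        int attempts = putWithRetry(() -> {
            if (++calls[0] == 1) {
                throw new InterruptedIOException("#17209, interrupted.");
            }
        }, 3);
        System.out.println("succeeded after " + attempts + " attempts");
    }
}
```

Is a retry loop like this safe here, or does the interruption indicate a
deeper problem I should fix in the configuration instead?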

Please help me solve this.

The same exception has been reported by someone else, but in that thread
there is no explanation of which configurations need to be checked in order
to avoid it.

*Manimekalai K*
