hbase-issues mailing list archives

From "Lucas Resch (JIRA)" <j...@apache.org>
Subject [jira] [Created] (HBASE-19201) BulkLoading in HBaseContext in hbase-spark does not close connection
Date Tue, 07 Nov 2017 13:55:02 GMT
Lucas Resch created HBASE-19201:
-----------------------------------

             Summary: BulkLoading in HBaseContext in hbase-spark does not close connection
                 Key: HBASE-19201
                 URL: https://issues.apache.org/jira/browse/HBASE-19201
             Project: HBase
          Issue Type: Bug
          Components: hbase
    Affects Versions: 1.1.12
         Environment: I was using the CDH 5.11.1 version, but I checked on the newest
branch and the problem persists.
            Reporter: Lucas Resch


The hbase-spark module provides an HBaseContext with utility functions for bulk loading
data into HBase. I tried using these functions in a streaming context, but after a while
ZooKeeper denies further connections because the maximum number of connections per client
is exhausted.
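
For illustration only (this is not the hbase-spark code), a minimal sketch of the failure
mode: each streaming micro-batch effectively creates its own Connection via
ConnectionFactory, and because none of them are closed, the ZooKeeper sessions they hold
pile up until the per-client limit (ZooKeeper's maxClientCnxns) is reached.

    import org.apache.hadoop.hbase.HBaseConfiguration
    import org.apache.hadoop.hbase.client.ConnectionFactory

    object ConnectionLeakSketch {
      def main(args: Array[String]): Unit = {
        val conf = HBaseConfiguration.create()
        // Stand-in for one bulkLoad call per micro-batch: each call creates a
        // fresh Connection (and with it a ZooKeeper session) that is never
        // closed, so leaked sessions accumulate until ZooKeeper refuses new
        // connections from this client.
        (1 to 100).foreach { _ =>
          val conn = ConnectionFactory.createConnection(conf)
          // ... bulk-load work would happen here ...
          // conn.close() is never called
        }
      }
    }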

The issue appears to be in HBaseContext itself: the functions bulkLoad and bulkLoadThinRows
open a connection via the ConnectionFactory but never close that connection.

I copied the relevant code into a new Scala project, added a conn.close() at the end of the
function, and the problem is gone (see the sketch below).
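
As a sketch of that change (the names below are placeholders, not the actual HBaseContext
internals), the connection can be closed in a finally block so it is released even if the
bulk load fails:

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.hbase.client.{Connection, ConnectionFactory}

    object BulkLoadConnectionFix {
      // Hedged sketch: `body` stands in for the existing HFile-writing logic in
      // bulkLoad / bulkLoadThinRows. The point is only that the Connection
      // obtained from ConnectionFactory is closed in a finally block instead of
      // being leaked.
      def withConnection[T](config: Configuration)(body: Connection => T): T = {
        val conn = ConnectionFactory.createConnection(config)
        try {
          body(conn)
        } finally {
          conn.close() // the missing call this issue reports
        }
      }
    }

With the connection closed per call, the streaming job no longer accumulates ZooKeeper
connections across batches.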

It seems no one else has run into this before. I'm guessing that's because almost no one
uses these functions in a streaming context, and a one-time call with RDDs might never
reach the upper limit on connections.



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
