hbase-dev mailing list archives

From "Yuexin Zhang (JIRA)" <j...@apache.org>
Subject [jira] [Created] (HBASE-18570) use hbase-spark without HBaseContext runs into NPE
Date Fri, 11 Aug 2017 09:27:00 GMT
Yuexin Zhang created HBASE-18570:
------------------------------------

             Summary: use hbase-spark without HBaseContext runs into NPE
                 Key: HBASE-18570
                 URL: https://issues.apache.org/jira/browse/HBASE-18570
             Project: HBase
          Issue Type: Improvement
          Components: hbase
    Affects Versions: 1.2.0
            Reporter: Yuexin Zhang
            Priority: Minor


I recently ran into the same issue as described on Stack Overflow:

https://stackoverflow.com/questions/38865558/sparksql-dataframes-does-not-work-in-spark-shell-and-application#

If we don't explicitly initialize an HBaseContext and don't set the hbase.use.hbase.context option
to false, it runs into an NPE at:
{code}
    val wrappedConf = new SerializableConfiguration(hbaseContext.config)
{code}

https://github.com/apache/hbase/blob/master/hbase-spark/src/main/scala/org/apache/hadoop/hbase/spark/DefaultSource.scala#L140
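For reference, a minimal workaround sketch (untested, assuming a spark-shell session where {{sc}} and {{sqlContext}} are predefined, and a hypothetical table {{t1}}): explicitly creating an HBaseContext before calling the DataFrame API registers it in LatestHBaseContextCache, so the lookup in DefaultSource no longer returns null:
{code}
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.spark.HBaseContext

// Constructing the HBaseContext caches it in LatestHBaseContextCache.latest
val conf = HBaseConfiguration.create()
new HBaseContext(sc, conf)

// The read now finds a non-null context in the cache
val df = sqlContext.read
  .format("org.apache.hadoop.hbase.spark")
  .option("hbase.columns.mapping", "rowKey STRING :key, col STRING cf1:col")
  .option("hbase.table", "t1")
  .load()
{code}
But requiring users to know this initialization detail is fragile, hence the suggestion below.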

Should we safeguard with a null check on hbaseContext?

Something like:
{code}
// create or get the latest HBaseContext
val hbaseContext: HBaseContext =
  if (useHBaseContext && LatestHBaseContextCache.latest != null) {
    LatestHBaseContextCache.latest
  } else {
    val config = HBaseConfiguration.create()
    configResources.split(",").foreach(r => config.addResource(r))
    new HBaseContext(sqlContext.sparkContext, config)
  }
{code}






--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
