hbase-user mailing list archives

From Soumitra Kumar <kumar.soumi...@gmail.com>
Subject Re: How to add HBase dependencies and conf with spark-submit?
Date Wed, 15 Oct 2014 14:39:28 GMT
I am writing to HBase; the following are my options:

export SPARK_CLASSPATH=/opt/cloudera/parcels/CDH/lib/hbase/hbase-protocol.jar

spark-submit \
    --jars /opt/cloudera/parcels/CDH/lib/hbase/hbase-protocol.jar,\
/opt/cloudera/parcels/CDH/lib/hbase/hbase-common.jar,\
/opt/cloudera/parcels/CDH/lib/hbase/hbase-client.jar,\
/opt/cloudera/parcels/CDH/lib/hbase/lib/htrace-core.jar \

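For reference, here is a minimal sketch of the kind of write path those jars support (not from the original message; the table name "mytable" and the column family/qualifier "cf"/"col" are placeholders, and hbase-site.xml is assumed to be on the classpath, e.g. via the SPARK_CLASSPATH export above):

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Put
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableOutputFormat
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.mapreduce.Job
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("HBaseWriteSketch"))

// Picks up hbase-site.xml (ZooKeeper quorum, etc.) from the classpath
val hbaseConf = HBaseConfiguration.create()
hbaseConf.set(TableOutputFormat.OUTPUT_TABLE, "mytable")  // placeholder table name

// The Job object only carries the output-format configuration
val job = Job.getInstance(hbaseConf)
job.setOutputFormatClass(classOf[TableOutputFormat[ImmutableBytesWritable]])

// Turn (rowKey, value) pairs into Puts and write them to the table
val puts = sc.parallelize(Seq(("row1", "v1"), ("row2", "v2"))).map { case (k, v) =>
  val put = new Put(Bytes.toBytes(k))
  put.add(Bytes.toBytes("cf"), Bytes.toBytes("col"), Bytes.toBytes(v))  // placeholder family/qualifier
  (new ImmutableBytesWritable(Bytes.toBytes(k)), put)
}
puts.saveAsNewAPIHadoopDataset(job.getConfiguration)

(The separate hbase-protocol.jar entry is commonly needed so that HBase's protobuf-generated classes are visible to both the driver and the executors.)
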
----- Original Message -----
From: "Fengyun RAO" <raofengyun@gmail.com>
To: user@spark.apache.org, user@hbase.apache.org
Sent: Wednesday, October 15, 2014 6:29:21 AM
Subject: Re: How to add HBase dependencies and conf with spark-submit?


+user@hbase 



2014-10-15 20:48 GMT+08:00 Fengyun RAO <raofengyun@gmail.com>:



We use Spark 1.1 and HBase 0.98.1-cdh5.1.0, and need to read and write an HBase table in
a Spark program.



I notice there are spark.driver.extraClassPath and spark.executor.extraClassPath
properties for managing the extra classpath, or even the deprecated SPARK_CLASSPATH.


The problem is: which classpath entries or jars should we append?
I could simply add the whole `hbase classpath`, which is huge,
but that leads to dependency conflicts, e.g. HBase uses guava-12 while Spark uses guava-14.
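For context, a minimal read sketch along the same lines (not from the original message; "mytable" is a placeholder and hbase-site.xml is assumed to be on the classpath):

import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.Result
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("HBaseReadSketch"))

val hbaseConf = HBaseConfiguration.create()
hbaseConf.set(TableInputFormat.INPUT_TABLE, "mytable")  // placeholder table name

// Each record is (row key, Result); count() is just a trivial action to force the read
val rows = sc.newAPIHadoopRDD(hbaseConf,
  classOf[TableInputFormat],
  classOf[ImmutableBytesWritable],
  classOf[Result])
println(rows.count())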



