spark-dev mailing list archives

From Stephen Boesch <java...@gmail.com>
Subject Extra libs for bin/spark-shell - specifically for hbase
Date Fri, 15 Aug 2014 23:07:24 GMT
Although this has been discussed a number of times here, I am still unclear
on how to add user jars to the spark-shell:

a) for importing classes for use directly within the shell interpreter

b) for invoking SparkContext commands with closures referencing
user-supplied classes contained within jars.

Like earlier posts on this topic, I have gone through:

  - updating bin/spark-env.sh
  - SPARK_CLASSPATH
  - SPARK_SUBMIT_OPTS
  - creating conf/spark-defaults.conf and adding
    spark.executor.extraClassPath
  - --driver-class-path
  - etc.

Hopefully there is something along the lines of a single entry added to
some classpath somewhere, like this:

  SPARK_CLASSPATH / --driver-class-path / spark.executor.extraClassPath
  (or whatever the correct option is) =
  $HBASE_HOME/*:$HBASE_HOME/lib/*:$SPARK_CLASSPATH
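As a concrete sketch of the kind of invocation I am after (the HBase and
jar paths are illustrative, and I am not certain these are the right
options to combine):

```shell
# Driver side: put the HBase jars on the spark-shell driver's classpath.
# --driver-class-path accepts standard JVM classpath syntax, including
# directory wildcards; adjust the path to your HBase install.
bin/spark-shell \
  --driver-class-path "$HBASE_HOME/lib/*" \
  --jars /path/to/my-user-code.jar   # hypothetical user jar for closures

# Executor side: the equivalent entry in conf/spark-defaults.conf
# (path illustrative; must exist on the worker nodes):
#
#   spark.executor.extraClassPath  /opt/hbase/lib/*
```

Note that --jars takes a comma-separated list of jar files and ships them
to the executors, whereas --driver-class-path only affects the driver JVM.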

Any ideas here?

thanks
