spark-dev mailing list archives

From Stephen Boesch <>
Subject Extra libs for bin/spark-shell - specifically for hbase
Date Fri, 15 Aug 2014 23:07:24 GMT
Although this has been discussed a number of times here, I am still unclear
on how to add user jars to spark-shell:

a) for importing classes for use directly within the shell interpreter

b) for invoking SparkContext commands with closures referencing
user-supplied classes contained within jars.

As in other posts on this topic, I have already tried:

 - updating bin/
 - creating conf/spark-defaults.conf and adding

Hopefully there is something along the lines of a single entry added
to some classpath somewhere, like this:

   SPARK_CLASSPATH/driver-class-path/spark.executor.extraClassPath (or
whatever the correct option is..)  =
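For reference, the settings named above are typically supplied in one of the following ways; this is a sketch only, and the hbase jar paths shown are placeholders, not the poster's actual paths:

```shell
# Option 1: pass jars at launch; they are shipped to the driver and executors,
# covering both shell imports and closures run on the cluster
bin/spark-shell --jars /path/to/hbase-client.jar,/path/to/hbase-common.jar

# Option 2: extend only the driver's classpath (enough for imports in the
# shell interpreter, but not for closures executed on executors)
bin/spark-shell --driver-class-path /path/to/hbase-client.jar

# Option 3: set the equivalent entries once in conf/spark-defaults.conf
# (placeholder paths)
#   spark.driver.extraClassPath    /path/to/hbase-client.jar
#   spark.executor.extraClassPath  /path/to/hbase-client.jar
```

The distinction between the driver-only and executor-visible settings maps directly onto cases (a) and (b) above.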

Any ideas here?

