incubator-spark-user mailing list archives

From Marek Wiewiorka <marek.wiewio...@gmail.com>
Subject Re: Adding external jar to spark-shell classpath using ADD_JARS
Date Tue, 04 Feb 2014 21:31:28 GMT
Try adding these jars to SPARK_CLASSPATH as well.
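A sketch of what that invocation could look like (paths copied from your message; my assumption about the 0.9-era shell is that ADD_JARS ships the jar to the executors via the SparkContext, while SPARK_CLASSPATH is what puts it on the driver's classpath so the import resolves inside the REPL):

```shell
# Hedged sketch, not a verified fix: set both variables when launching the shell.
# Assumption: ADD_JARS makes the jar available to executors (SparkContext.addJar),
# while SPARK_CLASSPATH adds it to the driver/REPL classpath so that
# `import twitter4j.Status` compiles in the shell.
JAR=/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar

MASTER="spark://n001:7077" \
ADD_JARS="$JAR" \
SPARK_CLASSPATH="$JAR" \
SPARK_MEM="24G" ./spark-shell
```

(For what it's worth, later Spark releases replaced these environment variables with the `--jars` flag and the `spark.driver.extraClassPath` setting.)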


2014-02-04 Soumya Simanta <soumya.simanta@gmail.com>:

> Hi,
>
> I have a Spark cluster, and I want to use classes from a 3rd-party jar in my
> shell.
>
> I'm starting my spark shell using the following command.
>
>
> MASTER="spark://n001:7077"
> ADD_JARS=/home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar
> SPARK_MEM="24G" ./spark-shell
>
>
> I also see the following in the logs.
>
> 14/02/04 16:09:25 INFO SparkContext: Added JAR
> /home/myuserid/twitter4j/lib/twitter4j-core-3.0.5.jar at
> http://10.27.112.32:59460/jars/twitter4j-core-3.0.5.jar with timestamp
> 1391548165483
>
>
> However, when I try to import one of the classes in that jar file, I get
> the following error.
>
>
> scala> import twitter4j.Status
>
> <console>:10: error: not found: value twitter4j
>
>        import twitter4j.Status
>
>               ^
>
