spark-user mailing list archives

From Michael Armbrust <mich...@databricks.com>
Subject Re: Unable to find org.apache.spark.sql.catalyst.ScalaReflection class
Date Sun, 01 Mar 2015 01:32:59 GMT
I think it's possible that the problem is that the Scala compiler is not
being loaded by the primordial classloader (but instead by some child
classloader), and thus the Scala reflection mirror fails to initialize
when it can't find it. Unfortunately, the only solution I know of is
to load all the required jars when the JVM starts.
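One way to do that (a sketch; the paths and jar names below are hypothetical, adjust them to your installation) is to put the Scala compiler and reflection jars on the driver's classpath at JVM startup via `--driver-class-path`, rather than relying on `--jars`, which adds jars through a child classloader at runtime:

```shell
# Hypothetical paths; --driver-class-path is applied when the driver JVM
# starts, so the primordial classloader can see scala-compiler/scala-reflect.
spark-submit \
  --driver-class-path /path/to/scala-compiler-2.11.5.jar:/path/to/scala-reflect-2.11.5.jar \
  --class com.example.MyApp \
  myapp-assembly.jar
```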

On Sat, Feb 28, 2015 at 5:26 PM, Ashish Nigam <ashnigamtech@gmail.com>
wrote:

> Also, can the Scala version play any role here?
> I am using Scala 2.11.5, but all the Spark packages have a dependency on
> Scala 2.11.2.
> Just wanted to make sure that the Scala version is not an issue here.
>
> On Sat, Feb 28, 2015 at 9:18 AM, Ashish Nigam <ashnigamtech@gmail.com>
> wrote:
>
>> Hi,
>> I wrote a very simple program in scala to convert an existing RDD to
>> SchemaRDD.
>> But createSchemaRDD function is throwing exception
>>
>> Exception in thread "main" scala.ScalaReflectionException: class
>> org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with primordial
>> classloader with boot classpath [.....] not found
>>
>>
>> Here's more info on the versions I am using -
>>
>> <scala.binary.version>2.11</scala.binary.version>
>>     <spark.version>1.2.1</spark.version>
>>     <scala.version>2.11.5</scala.version>
>>
>> Please let me know how I can resolve this problem.
>>
>> Thanks
>> Ashish
>>
>
>
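On the version question raised above: 2.11.5 and 2.11.2 are on the same binary-compatible 2.11.x line, so a mismatch there is unlikely to be the cause, but it is still worth keeping the project's `scala.version` and the `_2.11` artifact suffix consistent. A sketch of the relevant Maven configuration, based on the properties quoted in the thread (the dependency shown is illustrative):

```xml
<!-- Illustrative: keep scala.version on the same 2.11.x line as the
     _2.11 Spark artifacts, and derive the suffix from one property. -->
<properties>
  <scala.binary.version>2.11</scala.binary.version>
  <scala.version>2.11.5</scala.version>
  <spark.version>1.2.1</spark.version>
</properties>

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_${scala.binary.version}</artifactId>
  <version>${spark.version}</version>
</dependency>
```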
