hive-dev mailing list archives

From Xiaomin Zhang <zhangxiao...@gmail.com>
Subject Re: Hive on spark "com.esotericsoftware.kryo.Kryo cannot be cast to org.apache.hive.com.esotericsoftware.kryo.Kryo"
Date Fri, 16 Jan 2015 03:54:22 GMT
Hi,
Please make sure your Spark build does not include the Hive profile.
This ClassCastException is caused by a conflict: the spark-assembly.jar
contains another copy of Hive.
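To see why the two copies collide, here is a minimal, self-contained analogue (the class names below are stand-ins I made up, not the real Kryo classes): two classes compiled from the same source but living in different packages are unrelated types to the JVM, so casting one to the other fails at runtime.

```java
// Demo.java — a miniature reproduction of the failure mode.
// SparkKryo plays the role of com.esotericsoftware.kryo.Kryo (unshaded, from
// spark-assembly.jar); HiveShadedKryo plays the role of the relocated copy
// org.apache.hive.com.esotericsoftware.kryo.Kryo bundled with Hive.
class SparkKryo {}
class HiveShadedKryo {}

public class Demo {
    public static void main(String[] args) {
        Object fromSpark = new SparkKryo();
        try {
            // The JVM treats the two classes as unrelated types, so this cast fails
            // even though both were compiled from identical source.
            HiveShadedKryo k = (HiveShadedKryo) fromSpark;
        } catch (ClassCastException e) {
            System.out.println("caught ClassCastException");
        }
    }
}
```

Hive relocates Kryo under the org.apache.hive prefix to avoid exactly this kind of version clash, which is why that prefix shows up in the error message when an unshaded copy ends up on the same classpath.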

Best Regards.
Xiaomin


On Tue, Jan 13, 2015 at 6:06 PM, Damien Carol <dcarol@blitzbs.com> wrote:

> Hello,
>
> I have some problems with Hive on Spark.
>
> I'm using the Spark branch with a standalone Spark cluster, but I can't run
> any query.
>
> I get this error: java.lang.ClassCastException:
> com.esotericsoftware.kryo.Kryo cannot be cast to
> org.apache.hive.com.esotericsoftware.kryo.Kryo
>
> I wonder if it's because Kryo is shaded in the Hive jars.
>
> Any advice on understanding what's going on?
>
> Thanks in advance for any help.
>
> Regards,
>
> Here is the complete stack trace:
>
> 2015-01-13 10:55:06,604 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(454)) - java.lang.ClassCastException: com.esotericsoftware.kryo.Kryo cannot be cast to org.apache.hive.com.esotericsoftware.kryo.Kryo
> 2015-01-13 10:55:06,604 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(454)) -        at org.apache.hadoop.hive.ql.exec.spark.KryoSerializer.deserialize(KryoSerializer.java:49)
> 2015-01-13 10:55:06,605 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(454)) -        at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient$JobStatusJob.call(RemoteHiveSparkClient.java:211)
> 2015-01-13 10:55:06,605 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(454)) -        at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:298)
> 2015-01-13 10:55:06,605 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(454)) -        at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:269)
> 2015-01-13 10:55:06,605 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(454)) -        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
> 2015-01-13 10:55:06,605 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(454)) -        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 2015-01-13 10:55:06,605 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(454)) -        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 2015-01-13 10:55:06,606 INFO  [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(454)) -        at java.lang.Thread.run(Thread.java:744)
>
>
>
> Damien CAROL
>
>    - tel: +33 (0)4 74 96 88 14
>    - email: dcarol@blitzbs.com
>
> BLITZ BUSINESS SERVICE
>
