hive-dev mailing list archives

From "Xuefu Zhang (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-8854) Guava dependency conflict between hive driver and remote spark context[Spark Branch]
Date Fri, 14 Nov 2014 15:04:34 GMT

    [ https://issues.apache.org/jira/browse/HIVE-8854?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14212338#comment-14212338 ]

Xuefu Zhang commented on HIVE-8854:
-----------------------------------

{quote}
In the Hive Spark branch, we updated Hive's Guava version from 11 to 14. Is that still necessary
after Spark has shaded its Guava dependency?
{quote}
I wasn't aware that we had updated the Guava version in the Spark branch. This would be a problem
when we merge back to trunk, so we probably need to revert it and then solve the problem.

And yes, it doesn't make sense to keep both versions, one of which is shaded.
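
For context, "shaded" here means Spark's build rewrites Guava's package names inside Spark's own bytecode, so Spark resolves its relocated Guava 14 copy while Hive keeps plain Guava 11 on the driver classpath. Below is a minimal sketch of the unshaded failure mode; the relocated prefix shown in the trailing comment is hypothetical, the real name is whatever Spark's shade rules define.
{code:java}
// Hive and an unshaded Spark both link against the same class names,
// so whichever Guava jar comes first on the driver classpath wins.
import com.google.common.base.Optional;

public class UnshadedCaller {
    public static void main(String[] args) {
        // Guava 11 declares Optional's constructor private; Guava 14's
        // top-level Absent expects the package-private constructor it
        // was compiled against. Load Absent from the 14.0.1 jar while
        // Optional resolves from the 11.0.2 jar, and class
        // initialization fails with the IllegalAccessError quoted below.
        Optional<String> value = Optional.absent();
        System.out.println(value.isPresent());
    }
}

// After shading, Spark's bytecode references a relocated copy instead,
// e.g. sparkshaded.com.google.common.base.Optional (hypothetical
// prefix), so the two Guava versions can coexist without colliding.
{code}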

> Guava dependency conflict between hive driver and remote spark context[Spark Branch]
> ------------------------------------------------------------------------------------
>
>                 Key: HIVE-8854
>                 URL: https://issues.apache.org/jira/browse/HIVE-8854
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Chengxiang Li
>              Labels: Spark-M3
>         Attachments: hive-dirver-classloader-info.output
>
>
> The Hive driver loads Guava 11.0.2 from Hadoop/Tez, while the remote Spark context depends on Guava 14.0.1. As a result, JobMetrics deserialization fails on the Hive driver side, since Absent is used in Metrics. Here is the Hive driver log:
> {noformat}
> java.lang.IllegalAccessError: tried to access method com.google.common.base.Optional.<init>()V from class com.google.common.base.Absent
>         at com.google.common.base.Absent.<init>(Absent.java:35)
>         at com.google.common.base.Absent.<clinit>(Absent.java:33)
>         at sun.misc.Unsafe.ensureClassInitialized(Native Method)
>         at sun.reflect.UnsafeFieldAccessorFactory.newFieldAccessor(UnsafeFieldAccessorFactory.java:43)
>         at sun.reflect.ReflectionFactory.newFieldAccessor(ReflectionFactory.java:140)
>         at java.lang.reflect.Field.acquireFieldAccessor(Field.java:1057)
>         at java.lang.reflect.Field.getFieldAccessor(Field.java:1038)
>         at java.lang.reflect.Field.getLong(Field.java:591)
>         at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1663)
>         at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:72)
>         at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:480)
>         at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:468)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:468)
>         at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:365)
>         at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:602)
>         at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
>         at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
>         at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
>         at akka.serialization.JavaSerializer$$anonfun$1.apply(Serializer.scala:136)
>         at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
>         at akka.serialization.JavaSerializer.fromBinary(Serializer.scala:136)
>         at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
>         at scala.util.Try$.apply(Try.scala:161)
>         at akka.serialization.Serialization.deserialize(Serialization.scala:98)
>         at akka.remote.serialization.MessageContainerSerializer.fromBinary(MessageContainerSerializer.scala:63)
>         at akka.serialization.Serialization$$anonfun$deserialize$1.apply(Serialization.scala:104)
>         at scala.util.Try$.apply(Try.scala:161)
>         at akka.serialization.Serialization.deserialize(Serialization.scala:98)
>         at akka.remote.MessageSerializer$.deserialize(MessageSerializer.scala:23)
>         at akka.remote.DefaultMessageDispatcher.payload$lzycompute$1(Endpoint.scala:58)
>         at akka.remote.DefaultMessageDispatcher.payload$1(Endpoint.scala:58)
>         at akka.remote.DefaultMessageDispatcher.dispatch(Endpoint.scala:76)
>         at akka.remote.EndpointReader$$anonfun$receive$2.applyOrElse(Endpoint.scala:937)
>         at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
>         at akka.remote.EndpointActor.aroundReceive(Endpoint.scala:415)
>         at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
>         at akka.actor.ActorCell.invoke(ActorCell.scala:487)
>         at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
>         at akka.dispatch.Mailbox.run(Mailbox.scala:220)
>         at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>         at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>         at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>         at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>         at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> {noformat}
> And here is the remote Spark context log:
> {noformat}
> 2014-11-13 17:16:28,481 INFO  [task-result-getter-1]: scheduler.TaskSetManager (Logging.scala:logInfo(59)) - Finished task 0.0 in stage 1.0 (TID 1) in 439 ms on node14-4 (1/1)
> 2014-11-13 17:16:28,482 INFO  [sparkDriver-akka.actor.default-dispatcher-8]: scheduler.DAGScheduler (Logging.scala:logInfo(59)) - Stage 1 (foreachAsync at RemoteHiveSparkClient.java:121) finished in 0.452 s
> 2014-11-13 17:16:28,482 INFO  [task-result-getter-1]: scheduler.TaskSchedulerImpl (Logging.scala:logInfo(59)) - Removed TaskSet 1.0, whose tasks have all completed, from pool
> 2014-11-13 17:16:28,486 INFO  [08592e9f-19a2-413d-bc48-c871259c4d2e-akka.actor.default-dispatcher-4]: remote.RemoteActorRefProvider$RemoteDeadLetterActorRef (Slf4jLogger.scala:apply$mcV$sp(74)) - Message [org.apache.hive.spark.client.Protocol$JobMetrics] from Actor[akka://08592e9f-19a2-413d-bc48-c871259c4d2e/user/RemoteDriver#-893697064] to Actor[akka://08592e9f-19a2-413d-bc48-c871259c4d2e/deadLetters] was not delivered. [3] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
> 2014-11-13 17:16:28,494 INFO  [08592e9f-19a2-413d-bc48-c871259c4d2e-akka.actor.default-dispatcher-4]: remote.RemoteActorRefProvider$RemoteDeadLetterActorRef (Slf4jLogger.scala:apply$mcV$sp(74)) - Message [org.apache.hive.spark.client.Protocol$JobResult] from Actor[akka://08592e9f-19a2-413d-bc48-c871259c4d2e/user/RemoteDriver#-893697064] to Actor[akka://08592e9f-19a2-413d-bc48-c871259c4d2e/deadLetters] was not delivered. [4] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
> {noformat}
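
A quick way to confirm which Guava copy the driver actually resolved, along the lines of the attached hive-dirver-classloader-info.output (GuavaClasspathCheck is an illustrative name, not a Hive class):
{code:java}
import com.google.common.base.Optional;

public class GuavaClasspathCheck {
    public static void main(String[] args) {
        // Which jar wins for Optional on this classpath; in the setup
        // described above it is guava-11.0.2.jar, pulled in via
        // Hadoop/Tez.
        System.out.println("Optional loaded from: "
                + Optional.class.getProtectionDomain()
                        .getCodeSource().getLocation());
        // The concrete class behind Optional.absent(). Deserializing a
        // JobMetrics message forces Guava 14's top-level Absent to
        // initialize against that Optional, which is what raises the
        // IllegalAccessError in the driver log above.
        System.out.println("absent() implemented by: "
                + Optional.absent().getClass().getName());
    }
}
{code}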



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
