hive-issues mailing list archives

From "Owen O'Malley (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-12783) fix the unit test failures in TestSparkClient and TestSparkSessionManagerImpl
Date Thu, 07 Jan 2016 18:23:40 GMT

    [ https://issues.apache.org/jira/browse/HIVE-12783?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15087817#comment-15087817 ]

Owen O'Malley commented on HIVE-12783:
--------------------------------------

The big difference I'm seeing is a bunch of errors in spark-client/target/tmp/log/hive.log:

{code}
2016-01-07T09:51:49,242 INFO  [Driver[]]: spark.SparkEnv (Logging.scala:logInfo(59)) - Registering OutputCommitCoordinator
2016-01-07T09:51:49,365 ERROR [Driver[]]: spark.SparkContext (Logging.scala:logError(96)) - Error initializing SparkContext.
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
        at java.lang.ClassLoader.checkCerts(ClassLoader.java:952) ~[?:1.7.0_45]
        at java.lang.ClassLoader.preDefineClass(ClassLoader.java:666) ~[?:1.7.0_45]
        at java.lang.ClassLoader.defineClass(ClassLoader.java:794) ~[?:1.7.0_45]
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) ~[?:1.7.0_45]
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449) ~[?:1.7.0_45]
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71) ~[?:1.7.0_45]
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361) ~[?:1.7.0_45]
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355) ~[?:1.7.0_45]
        at java.security.AccessController.doPrivileged(Native Method) ~[?:1.7.0_45]
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354) ~[?:1.7.0_45]
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425) ~[?:1.7.0_45]
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308) ~[?:1.7.0_45]
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358) ~[?:1.7.0_45]
        at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.spark-project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:110) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:101) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:78) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:62) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59) ~[scala-library-2.10.4.jar:?]
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47) ~[scala-library-2.10.4.jar:?]
        at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:62) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:61) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:74) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:190) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:141) ~[spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:457) [spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:61) [spark-core_2.10-1.5.0.jar:1.5.0]
        at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:156) [classes/:?]
        at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:500) [classes/:?]
        at org.apache.hive.spark.client.SparkClientImpl$2.run(SparkClientImpl.java:217) [classes/:?]
        at java.lang.Thread.run(Thread.java:744) [?:1.7.0_45]
{code}

Any ideas about what is happening? The patch in question moved most of the ORC classes into a new jar named hive-orc.
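For what it's worth, the JVM raises this particular SecurityException when two jars supply classes for the same package and their signing certificates differ (e.g. a signed servlet-api jar plus an unsigned one that also bundles `javax.servlet` classes), so the usual first step is to find every classpath entry that carries the offending class. A generic diagnostic for that, independent of this patch (the class name `WhichJar` and its methods are illustrative, not part of Hive):

```java
import java.io.IOException;
import java.net.URL;
import java.security.CodeSource;
import java.util.Collections;
import java.util.List;

// Classpath diagnostics for "signer information does not match" errors:
// the ClassLoader throws that SecurityException when a package already holds
// classes from one (signed) jar and a second, differently-signed or unsigned
// jar tries to define more classes in the same package.
public class WhichJar {

    // Which classpath entry defined this class; bootstrap classes have no CodeSource.
    static String locationOf(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        return src == null ? "<bootstrap>" : src.getLocation().toString();
    }

    // Every classpath entry containing the given resource, e.g.
    // "javax/servlet/FilterRegistration.class". More than one hit means the
    // class is packaged in multiple jars, which is what triggers the conflict.
    static List<URL> copiesOf(String resource) throws IOException {
        return Collections.list(ClassLoader.getSystemClassLoader().getResources(resource));
    }

    public static void main(String[] args) throws IOException {
        // Demonstrated on this class itself; on the failing test classpath one
        // would query "javax/servlet/FilterRegistration.class" instead.
        System.out.println(locationOf(WhichJar.class));
        System.out.println(copiesOf("WhichJar.class"));
    }
}
```

If `copiesOf("javax/servlet/FilterRegistration.class")` reports more than one jar inside the spark-client test JVM, the fix is typically a Maven exclusion so only one copy of the servlet API survives.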

> fix the unit test failures in TestSparkClient and TestSparkSessionManagerImpl
> -----------------------------------------------------------------------------
>
>                 Key: HIVE-12783
>                 URL: https://issues.apache.org/jira/browse/HIVE-12783
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Test
>    Affects Versions: 2.0.0
>            Reporter: Pengcheng Xiong
>            Priority: Blocker
>
> This includes
> {code}
> org.apache.hive.spark.client.TestSparkClient.testSyncRpc
> org.apache.hive.spark.client.TestSparkClient.testJobSubmission
> org.apache.hive.spark.client.TestSparkClient.testMetricsCollection
> org.apache.hive.spark.client.TestSparkClient.testCounters
> org.apache.hive.spark.client.TestSparkClient.testRemoteClient
> org.apache.hive.spark.client.TestSparkClient.testAddJarsAndFiles
> org.apache.hive.spark.client.TestSparkClient.testSimpleSparkJob
> org.apache.hive.spark.client.TestSparkClient.testErrorJob
> org.apache.hadoop.hive.ql.exec.spark.session.TestSparkSessionManagerImpl.testMultiSessionMultipleUse
> org.apache.hadoop.hive.ql.exec.spark.session.TestSparkSessionManagerImpl.testSingleSessionMultipleUse
> {code}
> All of them passed on my laptop. cc'ing [~szehon], [~xuefuz], could you please take a look? Shall we ignore them? Thanks.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
