spark-issues mailing list archives

From "Dongjoon Hyun (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-25337) HiveExternalCatalogVersionsSuite + Scala 2.12 = NoSuchMethodError: org.apache.spark.sql.execution.datasources.FileFormat.$init$(Lorg/apache/spark/sql/execution/datasources/FileFormat;)
Date Wed, 05 Sep 2018 03:36:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-25337?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16603865#comment-16603865 ]

Dongjoon Hyun commented on SPARK-25337:
---------------------------------------

[~srowen]. I reproduced this locally. The failure occurs while executing the old Spark versions inside
`beforeAll`, so the suite is marked as `aborted`. The root cause of the failure is a corrupted
classpath for some reason. I'm still investigating.
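
For reference, the `aborted` status follows from ScalaTest's `BeforeAndAfterAll` behavior: an exception thrown in `beforeAll` aborts the whole suite before any test runs, which matches the *** ABORTED *** output in the description. A minimal sketch of that mechanism, with a hypothetical suite and command rather than the actual HiveExternalCatalogVersionsSuite code:

{code:scala}
import org.scalatest.BeforeAndAfterAll
import org.scalatest.funsuite.AnyFunSuite

import scala.sys.process._

// Hypothetical suite: setup shells out to an old Spark distribution,
// mirroring how the real suite invokes spark-submit in beforeAll.
class OldSparkSetupSuite extends AnyFunSuite with BeforeAndAfterAll {

  override def beforeAll(): Unit = {
    super.beforeAll()
    // Run the external command; `.!` returns its exit code.
    val exitCode = Seq("./bin/spark-submit", "--name", "prepare testing tables").!
    // Any exception thrown here (e.g. on a non-zero exit code) makes
    // ScalaTest report the whole suite as *** ABORTED *** instead of
    // reporting individual test failures.
    require(exitCode == 0, s"spark-submit returned with exit code $exitCode.")
  }

  test("tables written by the old Spark version are readable") {
    // assertions against the prepared warehouse would go here
  }
}
{code}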

> HiveExternalCatalogVersionsSuite + Scala 2.12 = NoSuchMethodError: org.apache.spark.sql.execution.datasources.FileFormat.$init$(Lorg/apache/spark/sql/execution/datasources/FileFormat;)
> ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-25337
>                 URL: https://issues.apache.org/jira/browse/SPARK-25337
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Sean Owen
>            Priority: Major
>
> Observed in the Scala 2.12 pull request builder consistently now. I don't see this failing in the main 2.11 builds, so I assume it's 2.12-related, but it's kind of hard to see how.
> CC [~sadhen]
> {code:java}
> org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite *** ABORTED ***
> Exception encountered when invoking run on a nested suite - spark-submit returned with exit code 1.
> Command line: './bin/spark-submit' '--name' 'prepare testing tables' '--master' 'local[2]' '--conf' 'spark.ui.enabled=false' '--conf' 'spark.master.rest.enabled=false' '--conf' 'spark.sql.warehouse.dir=/home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/sql/hive/target/tmp/warehouse-37386cdb-c0fb-405d-9442-8f0044b81643' '--conf' 'spark.sql.test.version.index=0' '--driver-java-options' '-Dderby.system.home=/home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/sql/hive/target/tmp/warehouse-37386cdb-c0fb-405d-9442-8f0044b81643' '/home/jenkins/workspace/spark-master-test-maven-hadoop-2.7-ubuntu-scala-2.12/sql/hive/target/tmp/test7888487003559759098.py'
> ...
> 2018-09-04 20:00:04.949 - stdout>   File "/private/tmp/test-spark/spark-2.1.3/python/lib/pyspark.zip/pyspark/sql/session.py", line 545, in sql
> 2018-09-04 20:00:04.949 - stdout>   File "/private/tmp/test-spark/spark-2.1.3/python/lib/py4j-0.10.7-src.zip/py4j/java_gateway.py", line 1257, in __call__
> 2018-09-04 20:00:04.949 - stdout>   File "/private/tmp/test-spark/spark-2.1.3/python/lib/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
> 2018-09-04 20:00:04.949 - stdout>   File "/private/tmp/test-spark/spark-2.1.3/python/lib/py4j-0.10.7-src.zip/py4j/protocol.py", line 328, in get_return_value
> 2018-09-04 20:00:04.95 - stdout> py4j.protocol.Py4JJavaError: An error occurred while calling o27.sql.
> 2018-09-04 20:00:04.95 - stdout> : java.util.ServiceConfigurationError: org.apache.spark.sql.sources.DataSourceRegister: Provider org.apache.spark.sql.hive.execution.HiveFileFormat could not be instantiated
> {code}
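
For context on the `FileFormat.$init$` signature in the error: in Scala 2.12, a trait with concrete members compiles to a JVM interface plus a static `$init$` method, and every class mixing in the trait gets a call to that `$init$` inserted into its constructor. If the implementing class is loaded against a binary-incompatible copy of the trait (for example, one built with a different Scala version on the classpath), that inserted call fails at instantiation time with a NoSuchMethodError like the one above. A minimal sketch of the shape involved, using hypothetical names rather than Spark's actual classes:

{code:scala}
// Hypothetical sketch of the trait/implementation shape involved.
// In Scala 2.12 this trait compiles to an interface plus a static
// $init$(instance) method that initializes the concrete field; the
// constructor of any class mixing it in calls that $init$. A
// binary-incompatible trait on the classpath makes that call throw
// NoSuchMethodError when the class is instantiated.
trait FileFormatLike {
  // A concrete val is what forces field-initialization code into $init$.
  val shortName: String = "generic"
  def describe(): String = s"format: $shortName"
}

class HiveFileFormatLike extends FileFormatLike {
  override val shortName: String = "hive"
}

object TraitInitDemo {
  def main(args: Array[String]): Unit = {
    // Instantiation is the point where the generated
    // FileFormatLike.$init$(this) call runs.
    println(new HiveFileFormatLike().describe())
  }
}
{code}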



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
