hive-dev mailing list archives

From "Hive QA (JIRA)" <>
Subject [jira] [Commented] (HIVE-8959) SparkSession is not closed until JVM exit.[Spark Branch]
Date Tue, 25 Nov 2014 12:37:12 GMT


Hive QA commented on HIVE-8959:

{color:red}Overall{color}: -1 at least one test failed

Here are the results of testing the latest attachment:

{color:red}ERROR:{color} -1 due to 2 failed/errored test(s), 7181 tests executed
*Failed tests:*

Test results:
Console output:
Test logs:

Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 2 tests failed

This message is automatically generated.

ATTACHMENT ID: 12683509 - PreCommit-HIVE-SPARK-Build

> SparkSession is not closed until JVM exit.[Spark Branch]
> --------------------------------------------------------
>                 Key: HIVE-8959
>                 URL:
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Chengxiang Li
>            Assignee: Chengxiang Li
>              Labels: Spark-M3
>         Attachments: HIVE-8959.1-spark.patch
> During unit tests, the SparkSession is closed by a Runtime shutdown hook, which means it is
> not closed until the JVM exits. In the unit test suite, each qfile, as a single test case, resets
> the SessionState, which leads to a new SparkSession being created for each qfile. Since each
> RemoteSparkClient is tied to a specific SparkSession, more and more executors are launched
> during the unit tests until they are blocked for lack of resources.
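The leak pattern described above (cleanup deferred to a JVM shutdown hook while the test harness keeps creating fresh sessions) and the likely fix can be sketched as follows. This is a minimal illustrative sketch, not the actual Hive/Spark API: `MockSparkSession` and `SessionManager` are hypothetical names standing in for a SparkSession that holds remote executors and for the SessionState reset logic.

```java
// Hypothetical sketch of the fix: close the previous session eagerly when
// session state is reset, instead of relying on a JVM shutdown hook that
// only fires at exit. Names here are illustrative, not the Hive API.
public class SessionLeakSketch {

    /** Stand-in for a SparkSession holding remote executor resources. */
    static class MockSparkSession implements AutoCloseable {
        private boolean open = true;
        boolean isOpen() { return open; }
        @Override
        public void close() { open = false; } // release executors here
    }

    /** Closes the current session before handing out a new one. */
    static class SessionManager {
        private MockSparkSession current;
        MockSparkSession reset() {
            if (current != null) {
                current.close(); // eager cleanup on reset, no shutdown hook
            }
            current = new MockSparkSession();
            return current;
        }
    }

    public static void main(String[] args) {
        SessionManager mgr = new SessionManager();
        MockSparkSession first = mgr.reset();   // e.g. first qfile test
        MockSparkSession second = mgr.reset();  // next qfile resets state
        System.out.println(first.isOpen());     // old session is now closed
        System.out.println(second.isOpen());    // only the new one is open
    }
}
```

Without the eager `close()` in `reset()`, every session created during the suite would stay open (and keep its executors) until JVM exit, which is the accumulation the report describes.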

This message was sent by Atlassian JIRA
