hive-issues mailing list archives

From "Hive QA (JIRA)" <>
Subject [jira] [Commented] (HIVE-20519) Remove 30m min value for hive.spark.session.timeout
Date Thu, 13 Sep 2018 21:05:00 GMT


Hive QA commented on HIVE-20519:

Here are the results of testing the latest attachment:

{color:green}SUCCESS:{color} +1 due to 2 test(s) being added or modified.

{color:red}ERROR:{color} -1 due to 1 failed/errored test(s), 14945 tests executed
*Failed tests:*

Test results:
Console output:
Test logs:

Executing org.apache.hive.ptest.execution.TestCheckPhase
Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.YetusPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 1 tests failed

This message is automatically generated.

ATTACHMENT ID: 12939492 - PreCommit-HIVE-Build

> Remove 30m min value for hive.spark.session.timeout
> ---------------------------------------------------
>                 Key: HIVE-20519
>                 URL:
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Assignee: Sahil Takiar
>            Priority: Major
>         Attachments: HIVE-20519.1.patch
> In HIVE-14162 we added the config {{hive.spark.session.timeout}}, which provided a way
> to time out Spark sessions that are active for a long period of time. The config has a lower
> bound of 30m which we should remove. It should be possible for users to configure this value
> so the HoS session is closed as soon as the query is complete.
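With the lower bound removed, a user could set the timeout to a value well below 30m so the Spark session shuts down shortly after the last query finishes. A minimal sketch of the intended usage, assuming the config continues to accept Hive's usual time-unit suffixes (the specific value here is illustrative, not from the patch):

```sql
-- Assumes the 30m minimum for hive.spark.session.timeout has been removed.
-- Release the Hive-on-Spark session's cluster resources soon after the
-- query completes, rather than holding them for at least 30 minutes:
SET hive.spark.session.timeout=30s;
```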

This message was sent by Atlassian JIRA
