hive-issues mailing list archives

From "Jimmy Xiang (JIRA)" <>
Subject [jira] [Commented] (HIVE-12538) After set spark related config, SparkSession never get reused
Date Tue, 01 Dec 2015 21:10:11 GMT


Jimmy Xiang commented on HIVE-12538:

bq. Not quite follow. Is there anything special in the operation conf for SparkSession? And when
to set "isSparkConfigUpdated = false"?
We can set it to false for the session-level conf only, so this flag in the operation-level conf
is always ignored.
Things are actually a little tricky. If we use the session-level conf, we could miss some
non-spark-related settings in the operation-level conf.
If we use the operation-level conf, we could miss some spark-related settings in the session-level
conf.
Instead of just maintaining an isSparkConfigUpdated flag, we should probably keep a separate
map that temporarily stores the changed spark-related settings.
This map can be reset when SparkUtilities#getSparkSession() is invoked.
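
The idea above can be sketched as follows. This is a minimal illustration, not Hive's actual code: the class name SessionConf and its methods are hypothetical stand-ins for the session-level conf and for SparkUtilities#getSparkSession(), and the "spark." prefix check is a simplification of how Hive identifies spark-related settings.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: instead of a single isSparkConfigUpdated flag, the
// session-level conf keeps a map of spark-related settings changed since the
// last time a SparkSession was obtained.
class SessionConf {
    private final Map<String, String> conf = new HashMap<>();
    // Spark-related overrides made since the last getSparkSession()-style call.
    private final Map<String, String> changedSparkConf = new HashMap<>();

    void set(String key, String value) {
        conf.put(key, value);
        if (key.startsWith("spark.")) {
            changedSparkConf.put(key, value);
        }
    }

    // True only while spark-related changes are pending, so an unchanged conf
    // lets the existing SparkSession be reused instead of starting a new one.
    boolean sparkConfUpdated() {
        return !changedSparkConf.isEmpty();
    }

    // Called from the equivalent of SparkUtilities#getSparkSession(): hand the
    // pending changes to the caller and reset the map.
    Map<String, String> drainChangedSparkConf() {
        Map<String, String> pending = new HashMap<>(changedSparkConf);
        changedSparkConf.clear();
        return pending;
    }
}
```

With this shape, "set spark.yarn.queue=QueueA;" marks the conf as updated exactly once; after the next session lookup drains the map, subsequent queries see no pending spark changes and reuse the session.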

> After set spark related config, SparkSession never get reused
> -------------------------------------------------------------
>                 Key: HIVE-12538
>                 URL:
>             Project: Hive
>          Issue Type: Bug
>          Components: Spark
>    Affects Versions: 1.3.0
>            Reporter: Nemon Lou
>            Assignee: Nemon Lou
>         Attachments: HIVE-12538.1.patch, HIVE-12538.patch
> Hive on Spark yarn-cluster mode.
> After setting "set spark.yarn.queue=QueueA;",
> run the query "select count(*) from test" 3 times and you will find 3 different yarn applications.
> Two of the yarn applications are in the FINISHED & SUCCEEDED state, and one is in the
> RUNNING & UNDEFINED state, waiting for the next job.
> And if you submit one more "select count(*) from test", the third one will move to the
> FINISHED & SUCCEEDED state and a new yarn application will start up.

This message was sent by Atlassian JIRA
