spark-reviews mailing list archives

From: GavinGavinNo1 <...@git.apache.org>
Subject: [GitHub] spark pull request: [SPARK-10529][SQL] When creating multiple HiveC...
Date: Sat, 12 Sep 2015 02:02:40 GMT
Github user GavinGavinNo1 commented on the pull request:

    https://github.com/apache/spark/pull/8713#issuecomment-139702061
  
    Thank you very much for your comment. I don't think I've understood what you mean by the ability
to connect to multiple metastores. One HiveContext can only connect to one metastore, right? Or do
you mean creating multiple HiveContexts, with one SparkContext in one JVM, to connect to multiple
metastores? If so, in theory it will lead to the same JVM OOM problem.
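    For concreteness, here is a minimal Scala sketch of the pattern I mean (Spark 1.4.x API; the
object name and queries are placeholders, not code from this PR): several HiveContexts sharing one
SparkContext in a single JVM, each holding its own session state.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object MultiHiveContextSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("multi-hivecontext-sketch"))

        // Two HiveContexts in one JVM, both backed by the same SparkContext.
        // Each instance keeps its own Hive session state and cached metadata,
        // which is where the memory growth I am worried about comes from.
        val hc1 = new HiveContext(sc)
        val hc2 = new HiveContext(sc)

        hc1.sql("SHOW TABLES").collect().foreach(println)
        hc2.sql("SHOW TABLES").collect().foreach(println)

        sc.stop()
      }
    }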
    We formerly used Spark 1.3.1, which, as you know, does not support dynamic allocation in
standalone mode. We have several apps, and each one launches scheduled tasks using a HiveContext.
Because of limited hardware resources, we must stop the SparkContext to release CPU and memory
when a task is done. Spark 1.4.1 brings many new features and we want to switch to it, but the
problems mentioned in my issue cause us a lot of trouble.
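    To illustrate the per-task lifecycle described above, here is a rough sketch (the object, method,
and query names are made up for illustration): each scheduled run brings up a SparkContext, does its
HiveContext work, and then stops the context so the standalone cluster gets its cores and memory
back for the next app.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object ScheduledHiveTaskSketch {
      // One scheduled run: start a SparkContext, run the Hive query,
      // then stop the context so the standalone master can hand the
      // CPU cores and memory back to other apps.
      def runOnce(query: String): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("scheduled-hive-task"))
        try {
          val hc = new HiveContext(sc)
          hc.sql(query).collect().foreach(println)
        } finally {
          sc.stop()
        }
      }

      def main(args: Array[String]): Unit = {
        runOnce("SHOW TABLES")
      }
    }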



