spark-issues mailing list archives

From "Marcelo Vanzin (JIRA)" <>
Subject [jira] [Updated] (SPARK-2098) All Spark processes should support spark-defaults.conf, config file
Date Tue, 10 Jun 2014 20:35:02 GMT


Marcelo Vanzin updated SPARK-2098:

    Issue Type: Improvement  (was: Bug)

> All Spark processes should support spark-defaults.conf, config file
> -------------------------------------------------------------------
>                 Key: SPARK-2098
>                 URL:
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Marcelo Vanzin
> SparkSubmit supports the idea of a config file to set SparkConf configurations. This
> is handy because you can easily set a site-wide configuration file, and power users can
> use their own when needed, or resort to JVM properties or other means of overriding configs.
> It would be nice if all Spark processes (e.g. master / worker / history server) also
> supported something like this. For daemon processes this is particularly interesting because
> it makes it easy to decouple starting the daemon (e.g. some /etc/init.d script packaged by
> some distribution) from configuring that daemon. Right now you have to set environment
> variables to modify the configuration of those daemons, which is not very friendly to the
> above scenario.
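For context, the config file the issue refers to uses Spark's documented `spark-defaults.conf` format: one whitespace-separated `key value` pair per line, with `#` comments and blank lines ignored. The snippet below is an illustrative sketch of parsing that format, not Spark's actual loader (SparkSubmit does this internally on the JVM side); the `load_spark_defaults` function name and sample values are made up for this example.

```python
# Sketch: parse spark-defaults.conf-style contents into a dict.
# Format (per the Spark configuration docs): "key value" per line,
# whitespace-separated; '#' comment lines and blank lines are ignored.

def load_spark_defaults(text):
    """Return a dict mapping config keys to values."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        # Key is the first token; everything after the first run of
        # whitespace is the value.
        parts = line.split(None, 1)
        key = parts[0]
        value = parts[1].strip() if len(parts) > 1 else ""
        props[key] = value
    return props

sample = """
# site-wide defaults (hypothetical values)
spark.master            spark://master:7077
spark.eventLog.enabled  true
spark.executor.memory   4g
"""
conf = load_spark_defaults(sample)
```

A daemon started from an init script could read such a site-wide file at startup, which is exactly the decoupling of launch from configuration the issue asks for.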

This message was sent by Atlassian JIRA
