spark-issues mailing list archives

From "Nicholas Chammas (JIRA)" <j...@apache.org>
Subject [jira] [Created] (SPARK-1458) Add programmatic way to determine Spark version
Date Wed, 09 Apr 2014 22:23:17 GMT
Nicholas Chammas created SPARK-1458:
---------------------------------------

             Summary: Add programmatic way to determine Spark version
                 Key: SPARK-1458
                 URL: https://issues.apache.org/jira/browse/SPARK-1458
             Project: Spark
          Issue Type: New Feature
          Components: PySpark, Spark Core
    Affects Versions: 0.9.0
            Reporter: Nicholas Chammas
            Priority: Minor


As discussed [here|http://apache-spark-user-list.1001560.n3.nabble.com/programmatic-way-to-tell-Spark-version-td1929.html],
I think it would be nice if there were a way to programmatically determine what version of
Spark you are running.

The potential use cases are not that important, but they include:
# Branching your code based on what version of Spark is running (see the sketch after this list).
# Checking your version without having to quit and restart the Spark shell.
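
A minimal sketch of what use case 1 might look like, assuming a hypothetical {{version}} attribute on SparkContext (no such attribute exists as of 0.9.0):

{code:python}
# Sketch only: assumes a proposed `sc.version` attribute that does not
# exist in Spark 0.9.0; it would return the running version as a string.
from pyspark import SparkContext

sc = SparkContext("local", "version-check-example")

if sc.version.startswith("0.9"):
    print("Running a 0.9.x release; taking the old code path")
else:
    print("Running Spark " + sc.version)
{code}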

Right now in PySpark, I believe the only way to determine your version is by firing up the
Spark shell and looking at the startup banner.
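
For use case 2, the same hypothetical attribute could be inspected from an already-running PySpark shell, without quitting and restarting it:

{code}
>>> sc.version    # proposed attribute; would return the version string
'0.9.0'
{code}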



--
This message was sent by Atlassian JIRA
(v6.2#6252)
