spark-issues mailing list archives

From "Nicholas Chammas (JIRA)" <>
Subject [jira] [Created] (SPARK-1458) Add programmatic way to determine Spark version
Date Wed, 09 Apr 2014 22:23:17 GMT
Nicholas Chammas created SPARK-1458:

             Summary: Add programmatic way to determine Spark version
                 Key: SPARK-1458
             Project: Spark
          Issue Type: New Feature
          Components: PySpark, Spark Core
    Affects Versions: 0.9.0
            Reporter: Nicholas Chammas
            Priority: Minor

As discussed [here|], I think it would be nice if there were a way to programmatically determine which version of Spark you are running.

The potential use cases are not that important, but they include:
# Branching your code based on what version of Spark is running.
# Checking your version without having to quit and restart the Spark shell.
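Use case 1 could look something like the following sketch. It assumes a hypothetical accessor that returns the version as a string (no such accessor exists yet; the ticket requests one); the branching logic itself is just standard version-tuple comparison:

```python
# Sketch: branching on a Spark version string.
# `get_spark_version()` is a hypothetical stand-in for the programmatic
# accessor this ticket proposes; here it just returns a fixed string.
def get_spark_version():
    return "0.9.0"

def parse_version(version):
    # "0.9.0" -> (0, 9, 0), so versions compare correctly as tuples
    return tuple(int(part) for part in version.split(".")[:3])

if parse_version(get_spark_version()) >= (1, 0, 0):
    # code path for newer Spark
    pass
else:
    # fallback for older Spark
    pass
```

Comparing parsed tuples rather than raw strings avoids mistakes like `"0.10.0" < "0.9.0"` under lexicographic ordering.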

Right now in PySpark, I believe the only way to determine which version you are running is to fire up the Spark shell and look at the startup banner.

This message was sent by Atlassian JIRA
