spark-issues mailing list archives

From "Patrick Wendell (JIRA)" <>
Subject [jira] [Commented] (SPARK-1458) Expose sc.version in PySpark
Date Thu, 10 Jul 2014 19:53:05 GMT


Patrick Wendell commented on SPARK-1458:

[~nchammas] I updated the JIRA title to reflect the scope. We should just add this in PySpark;
it should be an easy fix!

> Expose sc.version in PySpark
> ----------------------------
>                 Key: SPARK-1458
>                 URL:
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark, Spark Core
>    Affects Versions: 0.9.0
>            Reporter: Nicholas Chammas
>            Priority: Minor
> As discussed [here|],
I think it would be nice if there were a way to programmatically determine which version of
Spark you are running.
> The potential use cases are not that important, but they include:
> # Branching your code based on what version of Spark is running.
> # Checking your version without having to quit and restart the Spark shell.
> Right now in PySpark, I believe the only way to determine your version is by firing up
the Spark shell and looking at the startup banner.
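The first use case above (branching on the running Spark version) can be sketched in plain Python. This is a hypothetical illustration assuming a `SparkContext` exposes its version as a string such as `"0.9.0"` (e.g. via the proposed `sc.version`); the helper names `parse_version` and `supports_feature` are invented for the example.

```python
# Hypothetical sketch for SPARK-1458: branch application code on the
# Spark version, assuming it is available as a string like "0.9.0".

def parse_version(version):
    """Turn a version string like '0.9.0' into a comparable tuple (0, 9, 0)."""
    return tuple(int(part) for part in version.split(".")[:3])

def supports_feature(version, minimum="1.0.0"):
    """Return True if the given Spark version is at least `minimum`."""
    return parse_version(version) >= parse_version(minimum)

# With a real SparkContext one would call: supports_feature(sc.version)
print(supports_feature("0.9.0"))  # prints False: 0.9.0 is older than 1.0.0
print(supports_feature("1.1.0"))  # prints True
```

Tuple comparison makes `"0.10.0"` correctly sort above `"0.9.0"`, which a plain string comparison would get wrong.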

This message was sent by Atlassian JIRA
