spark-issues mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-20440) Allow SparkR session and context to have delayed binding
Date Sat, 22 Apr 2017 23:21:04 GMT

    [ https://issues.apache.org/jira/browse/SPARK-20440?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15980163#comment-15980163 ]

Apache Spark commented on SPARK-20440:
--------------------------------------

User 'vijoshi' has created a pull request for this issue:
https://github.com/apache/spark/pull/17731

> Allow SparkR session and context to have delayed binding
> --------------------------------------------------------
>
>                 Key: SPARK-20440
>                 URL: https://issues.apache.org/jira/browse/SPARK-20440
>             Project: Spark
>          Issue Type: Improvement
>          Components: SparkR
>    Affects Versions: 2.1.0
>            Reporter: Vinayak Joshi
>
> It would be useful if users could do something like this without first invoking {{sparkR.session()}}:
> {code}
> delayedAssign(".sparkRsession", { sparkR.session(..) }, assign.env=SparkR:::.sparkREnv)
> {code}
> This would help providers of interactive environments that bootstrap Spark for their
> users. User code does not always include SparkR, so the possibility of lazy semantics
> for setting up a SparkSession/SparkContext would be very useful (see the sketch after
> this quoted description).
> Note that the SparkR API does not have a single entry object (such as the Scala/Python
> SparkSession classes), so SparkR is currently the only environment where such lazy
> setup is difficult to achieve; this enhancement will make it easier.
> The changes required are minor and do not affect the external API or functionality in
> any way. I will attach a PR with the changes needed for consideration shortly.
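
For context: the mechanism requested above is base R's delayedAssign(), which binds a
name to a promise that is evaluated only on first access. Below is a minimal base-R
sketch of those promise semantics only, not the actual SparkR change; "bootstrap_env"
and "connect" are hypothetical names used for illustration:

{code}
# Minimal base-R sketch of lazy binding via a promise.
# "bootstrap_env" and "connect" are hypothetical names for illustration only.
bootstrap_env <- new.env()

connect <- function() {
  message("expensive setup runs now")  # e.g. starting a Spark session
  "session-object"                     # stand-in for the real session object
}

# Bind ".session" lazily: connect() is NOT evaluated at this point.
delayedAssign(".session", connect(), assign.env = bootstrap_env)

exists(".session", envir = bootstrap_env)  # TRUE: the binding already exists
bootstrap_env$.session  # first access forces the promise; setup runs once
bootstrap_env$.session  # cached value; setup does not run again
{code}

Applied to SparkR as the description suggests, the same pattern would let an
environment provider pre-bind ".sparkRsession" in SparkR:::.sparkREnv so that
sparkR.session() runs only if and when user code actually touches it.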



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
