spark-issues mailing list archives

From "Ajay Saini (JIRA)" <>
Subject [jira] [Updated] (SPARK-21542) Helper functions for custom Python Persistence
Date Wed, 26 Jul 2017 23:11:00 GMT


Ajay Saini updated SPARK-21542:
    Component/s:     (was: ML)

> Helper functions for custom Python Persistence
> ----------------------------------------------
>                 Key: SPARK-21542
>                 URL:
>             Project: Spark
>          Issue Type: New Feature
>          Components: PySpark
>    Affects Versions: 2.2.0
>            Reporter: Ajay Saini
> Currently, there is no way to easily persist JSON-serializable parameters in Python only.
> All parameters in Python are persisted by converting them to Java objects and using the Java
> persistence implementation. In order to facilitate the creation of custom Python-only pipeline
> stages, it would be good to have a Python-only persistence framework so that these stages
> do not need to be implemented in Scala for persistence.
> This task involves:
> - Adding implementations for DefaultParamsReadable, DefaultParamsWritable, DefaultParamsReader,
> and DefaultParamsWriter in pyspark.
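
The helpers requested above could be sketched in plain Python as follows. This is a minimal illustration, not the eventual PySpark API: the class names mirror the Scala helpers named in the ticket, but `MyTransformer`, `extractParamMap`, and `setParams` are hypothetical stand-ins for a Python-only pipeline stage with JSON-serializable params.

```python
import json
import os


class DefaultParamsWriter:
    """Sketch of a Python-only writer: dumps a stage's params to metadata.json."""

    def __init__(self, instance):
        self.instance = instance

    def save(self, path):
        os.makedirs(path, exist_ok=True)
        metadata = {
            "class": type(self.instance).__name__,
            # Assumes the stage exposes its params as a JSON-serializable dict.
            "paramMap": self.instance.extractParamMap(),
        }
        with open(os.path.join(path, "metadata.json"), "w") as f:
            json.dump(metadata, f)


class DefaultParamsReader:
    """Sketch of a Python-only reader: rebuilds a stage from metadata.json."""

    def __init__(self, cls):
        self.cls = cls

    def load(self, path):
        with open(os.path.join(path, "metadata.json")) as f:
            metadata = json.load(f)
        # Reconstruct the stage and re-apply the persisted params.
        return self.cls().setParams(**metadata["paramMap"])


class MyTransformer:
    """Hypothetical Python-only stage used only to demonstrate the round trip."""

    def __init__(self, threshold=0.5):
        self.threshold = threshold

    def extractParamMap(self):
        return {"threshold": self.threshold}

    def setParams(self, threshold=0.5):
        self.threshold = threshold
        return self
```

A save/load round trip would then avoid any conversion to Java objects, since everything is serialized as JSON on the Python side.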

This message was sent by Atlassian JIRA

