spark-issues mailing list archives

From "Shivaram Venkataraman (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-15585) Don't use null in data source options to indicate default value
Date Fri, 27 May 2016 05:28:12 GMT

    [ https://issues.apache.org/jira/browse/SPARK-15585?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15303523#comment-15303523 ]

Shivaram Venkataraman commented on SPARK-15585:
-----------------------------------------------

I am not sure I completely understand the question. The way the options get passed in R [1]
is that we create a hash map and fill it in with anything passed in by the user. `NULL` is
a reserved word in R (note that it is in all caps), and it gets deserialized / passed as
`null` to Scala.

[1] https://github.com/apache/spark/blob/c82883239eadc4615a3aba907cd4633cb7aed26e/R/pkg/R/SQLContext.R#L658
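For illustration, here is a minimal Scala sketch of the effect on the JVM side. The map and names are assumptions for illustration, not the actual SparkR serialization code; the point is that a key present with a `null` value is not the same as an absent key, so downstream code never falls back to the default.

```scala
// A minimal sketch (plain mutable map, hypothetical keys): an option that was
// NULL in R can arrive on the JVM side as a null value stored under the key,
// which is different from the key being absent.
import scala.collection.mutable

object NullOptionSketch {
  def main(args: Array[String]): Unit = {
    val options = new mutable.HashMap[String, String]()
    options.put("quote", null) // user passed NULL in R
    options.put("sep", ",")    // user passed an explicit value

    // Code that only distinguishes "key present" from "key absent" sees
    // Some(null) here and never applies the default.
    println(options.get("quote"))             // Some(null)
    println(options.getOrElse("quote", "\"")) // null -- the default is not used
  }
}
```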

> Don't use null in data source options to indicate default value
> ---------------------------------------------------------------
>
>                 Key: SPARK-15585
>                 URL: https://issues.apache.org/jira/browse/SPARK-15585
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Reynold Xin
>            Priority: Critical
>
> See email: http://apache-spark-developers-list.1001551.n3.nabble.com/changed-behavior-for-csv-datasource-and-quoting-in-spark-2-0-0-SNAPSHOT-td17704.html
> We'd need to change the csv/json/parquet/... functions on Python's DataFrameReader/DataFrameWriter
> to take the actual default option values as function parameters, rather than setting
> them to None. CSVOptions.getChar (and JSONOptions, etc.) can then return null when the
> value is null, rather than replacing it with the default value.
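
For illustration, a minimal Scala sketch of the accessor behavior the description asks for. The method name is borrowed from the issue text, but the signature and body are assumptions rather than the actual CSVOptions code.

```scala
// Hypothetical sketch: pass a null option value through instead of silently
// replacing it with the default; only fall back to the default when the key
// is absent.
object GetCharSketch {
  def getChar(parameters: Map[String, String], paramName: String, default: Char): Character = {
    parameters.get(paramName) match {
      case None       => default // option not supplied: use the default
      case Some(null) => null    // option explicitly set to null: keep null
      case Some(value) if value.length == 1 => value.charAt(0)
      case Some(value) =>
        throw new RuntimeException(s"$paramName cannot be more than one character")
    }
  }

  def main(args: Array[String]): Unit = {
    val opts: Map[String, String] = Map("quote" -> null)
    println(getChar(opts, "quote", '"'))      // null, not the default
    println(getChar(Map.empty, "quote", '"')) // " (default used when the key is absent)
  }
}
```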




