spark-issues mailing list archives

From "Shivaram Venkataraman (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-16883) SQL decimal type is not properly cast to number when collecting SparkDataFrame
Date Tue, 09 Aug 2016 23:36:20 GMT

    [ https://issues.apache.org/jira/browse/SPARK-16883?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15414446#comment-15414446 ]

Shivaram Venkataraman commented on SPARK-16883:
-----------------------------------------------

Yeah, I see that. The change I am proposing is to add another case inside writeObject to convert
DecimalType values to doubles, using something like the function in https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/Cast.scala#L344

I guess the question is whether this change is more intrusive than the other one and how it will
impact handling more types in the future. It'll be great if you can try out one or both
of them and open a PR for more discussion.
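
For concreteness, here is a minimal sketch of what that extra case could look like. This is an illustrative stand-in, not the actual SerDe.scala patch: the object and helper names are hypothetical, and the conversion simply mirrors the decimal-to-double behavior in Cast.scala.

{code}
import java.io.DataOutputStream

// Illustrative sketch only: a stand-in for the extra `case` proposed for
// writeObject. The helper name is hypothetical; the conversion mirrors the
// Decimal -> DoubleType cast in Cast.scala.
object DecimalSerDeSketch {
  def writeDecimalAsDouble(dos: DataOutputStream, v: java.math.BigDecimal): Unit = {
    // Collapse the decimal to a plain double so the R side receives a numeric
    // scalar instead of an opaque object reference.
    dos.writeDouble(v.doubleValue())
  }
}
{code}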

> SQL decimal type is not properly cast to number when collecting SparkDataFrame
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-16883
>                 URL: https://issues.apache.org/jira/browse/SPARK-16883
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 2.0.0
>            Reporter: Hossein Falaki
>
> To reproduce, run the following code. As you can see, "y" is collected as a list of values rather than a numeric vector.
> {code}
> registerTempTable(createDataFrame(iris), "iris")
> str(collect(sql("select cast('1' as double) as x, cast('2' as decimal) as y from iris limit 5")))
> 'data.frame':	5 obs. of  2 variables:
>  $ x: num  1 1 1 1 1
>  $ y:List of 5
>   ..$ : num 2
>   ..$ : num 2
>   ..$ : num 2
>   ..$ : num 2
>   ..$ : num 2
> {code}


