spark-issues mailing list archives

From "Cheng Lian (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-10533) DataFrame filter is not handling float/double with Scientific Notation 'e' / 'E'
Date Tue, 03 Nov 2015 14:31:28 GMT

     [ https://issues.apache.org/jira/browse/SPARK-10533?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Cheng Lian resolved SPARK-10533.
--------------------------------
       Resolution: Fixed
    Fix Version/s: 1.6.0

Issue resolved by pull request 9085
[https://github.com/apache/spark/pull/9085]

> DataFrame filter is not handling float/double with Scientific Notation 'e' / 'E'  
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-10533
>                 URL: https://issues.apache.org/jira/browse/SPARK-10533
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.1
>            Reporter: Rishabh Bhardwaj
>            Assignee: Adrian Wang
>            Priority: Minor
>              Labels: newbie
>             Fix For: 1.6.0
>
>
> In the DataFrame filter operation, when a comparison uses a float in scientific notation (e.g. 2.0e2), the comparison constant is not converted as expected (to 200.0 in this case).
> For example:
>  val df = sqlContext.createDataFrame(Seq(("a",1.0),("b",2.0),("c",3.0)))
> df.filter("_2 < 2.0e1").show
> +--+---+
> |_1| _2|
> +--+---+
> | a|1.0|
> +--+---+ 
> It should return all three records from the DataFrame, but it returns only the records less than 2.0.
> It seems the filter is comparing against just the mantissa/coefficient (2.0) and ignoring the exponent.
> On the other hand, sqlContext handles the above case and gives the desired output:
> df.registerTempTable("df")
> sqlContext.sql("select * from df where `_2` < 2.0e1").show
> +--+---+
> |_1| _2|
> +--+---+
> | a|1.0|
> | b|2.0|
> | c|3.0|
> +--+---+ 
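On affected versions (before the 1.6.0 fix from pull request 9085), one workaround sketch is to build the predicate with the Column API instead of a filter string, so the scientific-notation literal is a plain Scala Double and the string expression parser never has to read "2.0e1". This assumes the same toy DataFrame df from the report above:

    import org.apache.spark.sql.functions.col

    // 2.0e1 is parsed by the Scala compiler as the Double 20.0,
    // bypassing the expression parser that mishandles 'e'/'E'.
    df.filter(col("_2") < 2.0e1).show()
    // +--+---+
    // |_1| _2|
    // +--+---+
    // | a|1.0|
    // | b|2.0|
    // | c|3.0|
    // +--+---+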



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

