spark-issues mailing list archives

From "Sean Owen (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-11257) Spark dataframe negate filter conditions
Date Thu, 22 Oct 2015 11:14:27 GMT

     [ https://issues.apache.org/jira/browse/SPARK-11257?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-11257.
-------------------------------
    Resolution: Not A Problem

> Spark dataframe negate filter conditions
> ----------------------------------------
>
>                 Key: SPARK-11257
>                 URL: https://issues.apache.org/jira/browse/SPARK-11257
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.0
>         Environment: Fedora 21 core i5
>            Reporter: Lokesh Kumar
>              Labels: bug
>
> I am trying to apply a negated filter condition to a DataFrame, as shown below:
> !(`Ship Mode` LIKE '%Truck%')
> This throws the following exception:
> Exception in thread "main" java.lang.RuntimeException: [1.3] failure: identifier expected
> (!(`Ship Mode` LIKE '%Truck%'))
>   ^
>     at scala.sys.package$.error(package.scala:27)
>     at org.apache.spark.sql.catalyst.SqlParser.parseExpression(SqlParser.scala:47)
>     at org.apache.spark.sql.DataFrame.filter(DataFrame.scala:748)
>     at Main.main(Main.java:73)
> Whereas the same kind of negated filter condition works fine in MySQL. Please find below:
> mysql> select count(*) from audit_log where !(operation like '%Log%' or operation like '%Proj%');
> +----------+
> | count(*) |
> +----------+
> |      129 |
> +----------+
> 1 row in set (0.05 sec)
> Can anyone please let me know if this is planned to be fixed in Spark DataFrames in a future release?
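The "Not A Problem" resolution suggests the Spark 1.5 SQL expression parser simply does not accept MySQL's `!` operator in an expression string; the usual workarounds (an assumption on my part, not stated in this thread) are to write standard SQL `NOT (...)` in the filter string, or to negate a `Column` through the DataFrame API instead of string parsing. The boolean identity behind the reporter's MySQL query can be sketched in plain Python, using hypothetical audit-log rows:

```python
# Hypothetical rows standing in for the reporter's audit_log table.
rows = [
    {"operation": "LoginAttempt"},
    {"operation": "ProjUpdate"},
    {"operation": "DataExport"},
    {"operation": "Logout"},
    {"operation": "Backup"},
]

def matches(op: str) -> bool:
    # operation LIKE '%Log%' OR operation LIKE '%Proj%'
    return "Log" in op or "Proj" in op

# !(A OR B): keep only rows where the combined predicate is false...
negated = [r for r in rows if not matches(r["operation"])]

# ...which, by De Morgan's law, is the same as (NOT A) AND (NOT B) --
# the form a parser without a '!' operator can still express with NOT.
demorgan = [r for r in rows
            if "Log" not in r["operation"] and "Proj" not in r["operation"]]

assert negated == demorgan
print(len(negated))  # count of rows matching neither pattern
```

The same rewrite applies to the original filter: `NOT (\`Ship Mode\` LIKE '%Truck%')` should parse where `!(...)` does not, since `NOT` is standard SQL while `!` is a MySQL extension.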



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

