spark-issues mailing list archives

From "Josh Rosen (JIRA)" <j...@apache.org>
Subject [jira] [Resolved] (SPARK-17252) Performing arithmetic in VALUES can lead to ClassCastException / MatchErrors during query parsing
Date Fri, 26 Aug 2016 18:04:22 GMT

     [ https://issues.apache.org/jira/browse/SPARK-17252?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen resolved SPARK-17252.
--------------------------------
       Resolution: Fixed
    Fix Version/s: 2.0.1

> Performing arithmetic in VALUES can lead to ClassCastException / MatchErrors during query parsing
> -------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-17252
>                 URL: https://issues.apache.org/jira/browse/SPARK-17252
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Josh Rosen
>             Fix For: 2.0.1
>
>
> The following example fails with a ClassCastException:
> {code}
> create table t(d double);
> insert into t VALUES (1 * 1.0);
> {code}
>  Here's the error:
> {code}
> java.lang.ClassCastException: org.apache.spark.sql.types.Decimal cannot be cast to java.lang.Integer
> 	at scala.runtime.BoxesRunTime.unboxToInt(BoxesRunTime.java:106)
> 	at scala.math.Numeric$IntIsIntegral$.times(Numeric.scala:57)
> 	at org.apache.spark.sql.catalyst.expressions.Multiply.nullSafeEval(arithmetic.scala:207)
> 	at org.apache.spark.sql.catalyst.expressions.BinaryExpression.eval(Expression.scala:416)
> 	at org.apache.spark.sql.catalyst.expressions.CreateStruct$$anonfun$eval$2.apply(complexTypeCreator.scala:198)
> 	at org.apache.spark.sql.catalyst.expressions.CreateStruct$$anonfun$eval$2.apply(complexTypeCreator.scala:198)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> 	at scala.collection.immutable.List.foreach(List.scala:318)
> 	at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
> 	at scala.collection.AbstractTraversable.map(Traversable.scala:105)
> 	at org.apache.spark.sql.catalyst.expressions.CreateStruct.eval(complexTypeCreator.scala:198)
> 	at org.apache.spark.sql.catalyst.expressions.UnaryExpression.eval(Expression.scala:320)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitInlineTable$1$$anonfun$39.apply(AstBuilder.scala:677)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitInlineTable$1$$anonfun$39.apply(AstBuilder.scala:674)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> 	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
> 	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
> 	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
> 	at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
> 	at scala.collection.AbstractTraversable.map(Traversable.scala:105)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitInlineTable$1.apply(AstBuilder.scala:674)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitInlineTable$1.apply(AstBuilder.scala:658)
> 	at org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:96)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.visitInlineTable(AstBuilder.scala:658)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.visitInlineTable(AstBuilder.scala:43)
> 	at org.apache.spark.sql.catalyst.parser.SqlBaseParser$InlineTableContext.accept(SqlBaseParser.java:9358)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.visitChildren(AstBuilder.scala:57)
> 	at org.apache.spark.sql.catalyst.parser.SqlBaseBaseVisitor.visitInlineTableDefault1(SqlBaseBaseVisitor.java:608)
> 	at org.apache.spark.sql.catalyst.parser.SqlBaseParser$InlineTableDefault1Context.accept(SqlBaseParser.java:7073)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.visitChildren(AstBuilder.scala:57)
> 	at org.apache.spark.sql.catalyst.parser.SqlBaseBaseVisitor.visitQueryTermDefault(SqlBaseBaseVisitor.java:580)
> 	at org.apache.spark.sql.catalyst.parser.SqlBaseParser$QueryTermDefaultContext.accept(SqlBaseParser.java:6895)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.typedVisit(AstBuilder.scala:47)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.plan(AstBuilder.scala:83)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitSingleInsertQuery$1.apply(AstBuilder.scala:158)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitSingleInsertQuery$1.apply(AstBuilder.scala:162)
> 	at org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:96)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.visitSingleInsertQuery(AstBuilder.scala:157)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.visitSingleInsertQuery(AstBuilder.scala:43)
> 	at org.apache.spark.sql.catalyst.parser.SqlBaseParser$SingleInsertQueryContext.accept(SqlBaseParser.java:6500)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.typedVisit(AstBuilder.scala:47)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.plan(AstBuilder.scala:83)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitQuery$1.apply(AstBuilder.scala:89)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitQuery$1.apply(AstBuilder.scala:88)
> 	at org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:96)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.visitQuery(AstBuilder.scala:88)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.visitQuery(AstBuilder.scala:43)
> 	at org.apache.spark.sql.catalyst.parser.SqlBaseParser$QueryContext.accept(SqlBaseParser.java:4751)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.visitChildren(AstBuilder.scala:57)
> 	at org.apache.spark.sql.catalyst.parser.SqlBaseBaseVisitor.visitStatementDefault(SqlBaseBaseVisitor.java:48)
> 	at org.apache.spark.sql.catalyst.parser.SqlBaseParser$StatementDefaultContext.accept(SqlBaseParser.java:992)
> 	at org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visit(AbstractParseTreeVisitor.java:42)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitSingleStatement$1.apply(AstBuilder.scala:64)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder$$anonfun$visitSingleStatement$1.apply(AstBuilder.scala:64)
> 	at org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:96)
> 	at org.apache.spark.sql.catalyst.parser.AstBuilder.visitSingleStatement(AstBuilder.scala:63)
> 	at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:54)
> 	at org.apache.spark.sql.catalyst.parser.AbstractSqlParser$$anonfun$parsePlan$1.apply(ParseDriver.scala:53)
> 	at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parse(ParseDriver.scala:82)
> 	at org.apache.spark.sql.execution.SparkSqlParser.parse(SparkSqlParser.scala:46)
> 	at org.apache.spark.sql.catalyst.parser.AbstractSqlParser.parsePlan(ParseDriver.scala:53)
> 	at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:582)
> 	at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:682)
> {code}
> It's surprising to me that this error is occurring during query parsing. My hunch is that we're performing expression evaluation too early and need to run more analysis and type promotion rules prior to trying to evaluate the expressions here.
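> For illustration, a minimal Catalyst-level sketch of the suspected failure mode (assuming the parser eagerly evaluates the still-unresolved Multiply built over the raw literals, before any coercion rules have run):
> {code}
> import org.apache.spark.sql.catalyst.expressions.{Literal, Multiply}
> import org.apache.spark.sql.types.Decimal
>
> // Assumed shape of the expression visitInlineTable evaluates: no analyzer or
> // type-coercion rules have run, so Literal(1) is still IntegerType while
> // Literal(Decimal("1.0")) is DecimalType. Multiply derives its Numeric from the
> // left operand's IntegerType, and unboxToInt then fails on the Decimal operand,
> // matching the stack trace above.
> val unresolved = Multiply(Literal(1), Literal(Decimal("1.0")))
> unresolved.eval()  // java.lang.ClassCastException: Decimal cannot be cast to java.lang.Integer
> {code}
> If the analyzer's type-promotion rules ran first, the integer operand would presumably be cast to a compatible decimal type before evaluation, which is the ordering suggested above.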



