spark-issues mailing list archives

From "Davy Song (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-19081) spark sql use HIVE UDF throw exception when return a Map value
Date Wed, 18 Jan 2017 01:23:26 GMT

    [ https://issues.apache.org/jira/browse/SPARK-19081?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15827199#comment-15827199
] 

Davy Song commented on SPARK-19081:
-----------------------------------

Thanks Takeshi, you are right. I tested a UDF returning a Map on Databricks' spark-sql 2.0
and hit the following exception:

com.databricks.backend.common.rpc.DatabricksExceptions$SQLExecutionException: org.apache.spark.sql.AnalysisException:
Map type in java is unsupported because JVM type erasure makes spark fail to catch key and
value types in Map<>; line 2 pos 12 at org.apache.spark.sql.hive.HiveInspectors$class.javaClassToDataType(HiveInspectors.scala:234)
at org.apache.spark.sql.hive.HiveSimpleUDF.javaClassToDataType(hiveUDFs.scala:41) at org.apache.spark.sql.hive.HiveSimpleUDF.dataType$lzycompute(hiveUDFs.scala:71)
at org.apache.spark.sql.hive.HiveSimpleUDF.dataType(hiveUDFs.scala:71) at org.apache.spark.sql.hive.HiveSessionCatalog$$anonfun$makeFunctionBuilder$1.apply(HiveSessionCatalog.scala:126)
at org.apache.spark.sql.hive.HiveSessionCatalog$$anonfun$makeFunctionBuilder$1.apply(HiveSessionCatalog.scala:122)
at org.apache.spark.sql.catalyst.analysis.SimpleFunctionRegistry.lookupFunction(FunctionRegistry.scala:87)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupFunction(SessionCatalog.scala:853)
at org.apache.spark.sql.hive.HiveSessionCatalog.org$apache$spark$sql$hive$HiveSessionCatalog$$super$lookupFunction(HiveSessionCatalog.scala:186)
at 
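The AnalysisException above points at type erasure: Spark's HiveSimpleUDF resolves the UDF's return type from the raw `Class` of `evaluate()`, and for a `Map<String, String>` that raw class is just `java.util.Map`, with no key or value types attached. A minimal, self-contained sketch of the distinction (the `evaluate` method here is a hypothetical stand-in, not Spark or Hive code):

```java
import java.lang.reflect.Method;
import java.util.Collections;
import java.util.Map;

// Illustrates why the Hive UDF path fails for Map return types:
// the raw Class of evaluate()'s return type carries no key/value
// type parameters, matching the "interface java.util.Map" in the
// scala.MatchError above.
public class ErasureDemo {

    // Stand-in for a Hive UDF's evaluate method (hypothetical example;
    // a real UDF would take org.apache.hadoop.io.Text, not String).
    public Map<String, String> evaluate(String url) {
        return Collections.singletonMap("host", url);
    }

    public static void main(String[] args) throws Exception {
        Method m = ErasureDemo.class.getMethod("evaluate", String.class);

        // The erased raw class: just the Map interface, no type arguments.
        // This is all javaClassToDataType sees, so it cannot build a MapType.
        System.out.println(m.getReturnType());
        // prints: interface java.util.Map

        // The generic signature does survive in the method metadata,
        // but the simple-UDF code path does not consult it.
        System.out.println(m.getGenericReturnType());
        // prints: java.util.Map<java.lang.String, java.lang.String>
    }
}
```

Because of this, the usual workaround is to write the UDF as a Hive GenericUDF, which declares its return type explicitly through an ObjectInspector in `initialize()` instead of relying on reflection over `evaluate()`.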

> spark sql use HIVE UDF throw exception when return a Map value
> --------------------------------------------------------------
>
>                 Key: SPARK-19081
>                 URL: https://issues.apache.org/jira/browse/SPARK-19081
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.0
>            Reporter: Davy Song
>
> I have hit a problem like https://issues.apache.org/jira/browse/SPARK-3582,
> but with a Map return type rather than a Map parameter; my evaluate function returns a Map:
> public Map<String, String> evaluate(Text url) {...}
> When running spark-sql with this UDF, I get the following exception:
> scala.MatchError: interface java.util.Map (of class java.lang.Class)
>         at org.apache.spark.sql.hive.HiveInspectors$class.javaClassToDataType(HiveInspectors.scala:175)
>         at org.apache.spark.sql.hive.HiveSimpleUdf.javaClassToDataType(hiveUdfs.scala:112)
>         at org.apache.spark.sql.hive.HiveSimpleUdf.dataType$lzycompute(hiveUdfs.scala:144)
>         at org.apache.spark.sql.hive.HiveSimpleUdf.dataType(hiveUdfs.scala:144)
>         at org.apache.spark.sql.catalyst.expressions.Alias.toAttribute(namedExpressions.scala:133)
>         at org.apache.spark.sql.catalyst.plans.logical.Project$$anonfun$output$1.apply(basicOperators.scala:25)
>         at org.apache.spark.sql.catalyst.plans.logical.Project$$anonfun$output$1.apply(basicOperators.scala:25)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>         at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
>         at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>         at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>         at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
>         at scala.collection.AbstractTraversable.map(Traversable.scala:105)
>         at org.apache.spark.sql.catalyst.plans.logical.Project.output(basicOperators.scala:25)
>         at org.apache.spark.sql.catalyst.plans.logical.InsertIntoTable.resolved$lzycompute(basicOperators.scala:149)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

