spark-issues mailing list archives

From "Sean Owen (JIRA)" <>
Subject [jira] [Commented] (SPARK-22314) Accessing Hive UDFs defined without 'USING JAR' from Spark
Date Thu, 19 Oct 2017 17:48:00 GMT


Sean Owen commented on SPARK-22314:

What is the problem here? Either way you must specify the JAR location. In the first case
you use a Hive mechanism that isn't what Spark uses.
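To make the distinction concrete, a sketch of the two forms of the statement (the class name and HDFS path below are placeholders, not taken from the ticket):

```sql
-- Visible to Spark: the JAR location is part of the function definition
-- stored in the metastore, so Spark can locate and load the UDF class itself.
CREATE FUNCTION my_upper AS 'com.example.hive.udf.MyUpper'
USING JAR 'hdfs:///path/to/udfs/my-udfs.jar';

-- Not visible to Spark: the class is resolved through Hive's auxiliary
-- jars mechanism (e.g. hive.reloadable.aux.jars.path), which Spark's
-- Hive integration does not consult.
CREATE FUNCTION my_upper AS 'com.example.hive.udf.MyUpper';
```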

> Accessing Hive UDFs defined without 'USING JAR' from Spark 
> -----------------------------------------------------------
>                 Key: SPARK-22314
>                 URL:
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.2.0
>            Reporter: Matyas Orhidi
> When defining UDF functions in Hive it is possible to load the UDF jar(s) from a shared
location e.g. from hive.reloadable.aux.jars.path, and then use the CREATE FUNCTION statement:
> {{CREATE FUNCTION <your_function_name> AS '<fully_qualified_class_name>';}}
> These UDFs do not work from Spark unless you create the Hive UDF with the
> {{CREATE FUNCTION <your_function_name> AS '<fully_qualified_class_name>'
USING JAR 'hdfs:///<path/to/jar/in/hdfs>';}}
> command instead.

This message was sent by Atlassian JIRA

