hive-issues mailing list archives

From "Peter Cseh (JIRA)" <>
Subject [jira] [Commented] (HIVE-15767) Hive On Spark is not working on secure clusters from Oozie
Date Thu, 13 Jul 2017 12:21:00 GMT


Peter Cseh commented on HIVE-15767:

The Spark driver will get the correct tokens from the parent application - they are in the local
folder created for its container. I'm not sure exactly how it gets them, but they are there.
The driver will pick them up from the correct container_tokens file via the HADOOP_TOKEN_FILE_LOCATION
environment variable or something along those lines. The issue is that Hadoop's TokenCache also looks
for the mapreduce.job.credentials.binary property, even though it is not needed here, and that invalid
reference causes the job to fail.
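In other words, the fix amounts to not forwarding mapreduce.job.credentials.binary into the Spark configuration, so the driver falls back to the token file named by HADOOP_TOKEN_FILE_LOCATION. A minimal, stdlib-only sketch of that filtering step (the class and method names are hypothetical, and java.util.Properties stands in for Hadoop's Configuration):

```java
import java.util.Properties;

public class SparkConfFilter {
    // The property that TokenCache resolves to a container-local path,
    // which is invalid once the Spark driver runs in its own container.
    static final String CREDENTIALS_BINARY = "mapreduce.job.credentials.binary";

    // Copy every property except mapreduce.job.credentials.binary, so the
    // driver relies on HADOOP_TOKEN_FILE_LOCATION for its tokens instead.
    public static Properties filterForSpark(Properties hiveConf) {
        Properties sparkConf = new Properties();
        for (String name : hiveConf.stringPropertyNames()) {
            if (!CREDENTIALS_BINARY.equals(name)) {
                sparkConf.setProperty(name, hiveConf.getProperty(name));
            }
        }
        return sparkConf;
    }
}
```

This is only an illustration of the idea; the actual patch touches RemoteHiveSparkClient, which works on Hadoop Configuration objects rather than Properties.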

> Hive On Spark is not working on secure clusters from Oozie
> ----------------------------------------------------------
>                 Key: HIVE-15767
>                 URL:
>             Project: Hive
>          Issue Type: Bug
>          Components: Spark
>    Affects Versions: 1.2.1, 2.1.1
>            Reporter: Peter Cseh
>            Assignee: Peter Cseh
>         Attachments: HIVE-15767-001.patch, HIVE-15767-002.patch
> When a HiveAction is launched form Oozie with Hive On Spark enabled, we're getting errors:
> {noformat}
> Caused by: Exception reading file:/yarn/nm/usercache/yshi/appcache/application_1485271416004_0022/container_1485271416004_0022_01_000002/container_tokens
>         at
>         at
> {noformat}
> This is caused by passing the {{mapreduce.job.credentials.binary}} property to the Spark
> configuration in RemoteHiveSparkClient.

This message was sent by Atlassian JIRA
