phoenix-issues mailing list archives

From "Chinmay Kulkarni (JIRA)" <>
Subject [jira] [Commented] (PHOENIX-5124) PropertyPolicyProvider should not evaluate default hbase config properties
Date Thu, 14 Feb 2019 07:15:00 GMT


Chinmay Kulkarni commented on PHOENIX-5124:

[~tdsilva] When I run a Spark job using the Phoenix-Spark connector with a version of Phoenix
containing this patch, I still run into _PropertyNotAllowedException_. Some properties are not
being removed; each corresponds to one of the following cases:
# Properties whose values depend on other properties, for example "mapreduce.cluster.local.dir"
-> "${hadoop.tmp.dir}/mapred/local" or "yarn.timeline-service.address" -> "${yarn.timeline-service.hostname}:10200".
The properties they depend on are either already part of the passed properties map or are system properties.
# Apart from this, PhoenixConfigurationUtil adds properties such as "phoenix.input.class"
and ""

We need to include these conditions when removing standard HBase/Phoenix properties from the
properties map.
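To illustrate the first case, here is a minimal sketch of how properties with unresolved Hadoop-style variable substitution could be detected and stripped before the map reaches the policy check. This is not Phoenix's actual implementation; the class and method names (VariableSubstitutionFilter, stripUnresolved) are hypothetical:

```java
import java.util.Map;
import java.util.Properties;
import java.util.regex.Pattern;

// Hypothetical sketch, not Phoenix API: drop properties whose values still
// contain unresolved variable references such as "${hadoop.tmp.dir}/mapred/local".
public class VariableSubstitutionFilter {
    // Matches Hadoop-style substitution tokens like ${hadoop.tmp.dir}
    private static final Pattern VAR_PATTERN = Pattern.compile("\\$\\{[^}]+\\}");

    static boolean containsUnresolvedVariable(String value) {
        return value != null && VAR_PATTERN.matcher(value).find();
    }

    static Properties stripUnresolved(Properties props) {
        Properties filtered = new Properties();
        for (Map.Entry<Object, Object> e : props.entrySet()) {
            // Keep only entries whose values are fully resolved
            if (!containsUnresolvedVariable(String.valueOf(e.getValue()))) {
                filtered.put(e.getKey(), e.getValue());
            }
        }
        return filtered;
    }

    public static void main(String[] args) {
        Properties p = new Properties();
        p.setProperty("mapreduce.cluster.local.dir", "${hadoop.tmp.dir}/mapred/local");
        p.setProperty("phoenix.input.class", "org.example.MyInput");
        Properties f = stripUnresolved(p);
        System.out.println(f.containsKey("mapreduce.cluster.local.dir")); // false
        System.out.println(f.containsKey("phoenix.input.class"));         // true
    }
}
```

A real fix would also need to handle the second case (properties injected by PhoenixConfigurationUtil), which a value-pattern check alone cannot catch.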
On another note, is it possible to avoid passing all of these properties when creating a Phoenix
connection for M/R or Spark jobs? Perhaps as configuration passed directly from the Spark driver
to the workers?

> PropertyPolicyProvider should not evaluate default hbase config properties
> --------------------------------------------------------------------------
>                 Key: PHOENIX-5124
>                 URL:
>             Project: Phoenix
>          Issue Type: Bug
>            Reporter: Thomas D'Silva
>            Assignee: Thomas D'Silva
>            Priority: Major
>             Fix For: 4.15.0, 5.1.0
>         Attachments: PHOENIX-5124-4.x-HBase-1.3-v2.patch, PHOENIX-5124-4.x-HBase-1.3.patch
>          Time Spent: 20m
>  Remaining Estimate: 0h

This message was sent by Atlassian JIRA
