phoenix-issues mailing list archives

From "Hadoop QA (JIRA)" <>
Subject [jira] [Commented] (PHOENIX-5197) Use extraOptions to set the configuration for Spark Workers
Date Tue, 09 Apr 2019 06:08:00 GMT


Hadoop QA commented on PHOENIX-5197:

{color:red}-1 overall{color}.  Here are the results of testing the latest attachment
  against master branch at commit 2ba4d8b0c718e5431120028dea16cc90bfb08dd7.
  ATTACHMENT ID: 12965283

    {color:green}+1 @author{color}.  The patch does not contain any @author tags.

    {color:green}+1 tests included{color}.  The patch appears to include 7 new or modified tests.
    {color:red}-1 patch{color}.  The patch command could not apply the patch.

Console output:

This message is automatically generated.

> Use extraOptions to set the configuration for Spark Workers
> -----------------------------------------------------------
>                 Key: PHOENIX-5197
>                 URL:
>             Project: Phoenix
>          Issue Type: Improvement
>    Affects Versions: 5.0.0, 4.15.0
>            Reporter: Chinmay Kulkarni
>            Assignee: Chinmay Kulkarni
>            Priority: Major
>         Attachments: PHOENIX-5197-v1.patch, PHOENIX-5197.patch
>          Time Spent: 10m
>  Remaining Estimate: 0h
> We need a standardized way to set configurations for Spark drivers and workers when reading and writing data.
> Quoting from my offline discussion with []:
> _"if we use phoenixTableAsRDD to create the rdd then we can pass in a config_
>  _if we use the standard df.write.format() we can't pass in a conf_
>  _maybe the best way to do this is to put config in the option map when we call read/write.format and then set those in the config"_
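The approach quoted above — carrying worker configuration through the same options map passed to read/write — can be sketched in plain Python. This is only an illustration of the technique under discussion; the `phoenix.conf.` prefix and the `split_options` helper are hypothetical and not part of the Phoenix API:

```python
def split_options(options, conf_prefix="phoenix.conf."):
    """Separate plain connector options from configuration overrides.

    Keys starting with conf_prefix (a hypothetical convention) are
    stripped of the prefix and collected into a configuration dict
    that would be applied on the workers; the remaining keys stay
    as ordinary data-source options.
    """
    conf = {}
    plain = {}
    for key, value in options.items():
        if key.startswith(conf_prefix):
            conf[key[len(conf_prefix):]] = value
        else:
            plain[key] = value
    return plain, conf


# A caller would pass everything through one options map, e.g.:
opts = {
    "table": "MY_TABLE",
    "zkUrl": "localhost:2181",
    "phoenix.conf.hbase.rpc.timeout": "120000",
}
plain, conf = split_options(opts)
# plain -> {"table": "MY_TABLE", "zkUrl": "localhost:2181"}
# conf  -> {"hbase.rpc.timeout": "120000"}
```

With a convention like this, df.write.format(...).options(...) needs no separate conf argument: configuration rides along in the option map and is split out before the job is configured.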

This message was sent by Atlassian JIRA
