hive-dev mailing list archives

From "Marcelo Vanzin (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-9178) Create a separate API for remote Spark Context RPC other than job submission [Spark Branch]
Date Wed, 14 Jan 2015 05:45:35 GMT

    [ https://issues.apache.org/jira/browse/HIVE-9178?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14276514#comment-14276514 ]

Marcelo Vanzin commented on HIVE-9178:
--------------------------------------

[~chengxiang li] ah, good catch. This method:

{code}
    private void handle(ChannelHandlerContext ctx, SyncJobRequest msg) throws Exception {
{code}

should actually return the result of the RPC instead of void. I'll update the patch
tomorrow and add a unit test (d'oh).
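
For illustration, a minimal sketch of the shape the fix could take (the {{msg.job}} field, the {{jc}} JobContext, and the return type are assumptions, not the actual patch):

{code}
    // Hypothetical sketch only: field names (msg.job, jc) and the return type are
    // assumptions. The key change is that the handler returns the job's result so
    // the RPC layer can ship it back to the caller instead of replying with void.
    private Serializable handle(ChannelHandlerContext ctx, SyncJobRequest msg) throws Exception {
      return msg.job.call(jc);
    }
{code}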

> Create a separate API for remote Spark Context RPC other than job submission [Spark Branch]
> -------------------------------------------------------------------------------------------
>
>                 Key: HIVE-9178
>                 URL: https://issues.apache.org/jira/browse/HIVE-9178
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Xuefu Zhang
>            Assignee: Marcelo Vanzin
>         Attachments: HIVE-9178.1-spark.patch, HIVE-9178.1-spark.patch, HIVE-9178.2-spark.patch
>
>
> Based on discussions in HIVE-8972, it seems to make sense to create a separate API for
> RPCs, such as addJar and getExecutorCounter. These jobs are different from a query submission
> in that they don't need to be queued in the backend and can be executed right away.
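
For context, one hypothetical shape such an RPC-only API could take (the interface and method names below are illustrative assumptions, not taken from the patch or from HIVE-8972):

{code}
// Illustrative sketch only; names and signatures are assumptions.
// Calls here are dispatched to the remote Spark context immediately rather
// than being placed on the job-submission queue.
public interface SparkClientRpc {
  // Ship a jar to the remote Spark context right away.
  java.util.concurrent.Future<Void> addJar(java.net.URI jar);

  // Read executor-side counters without submitting a queued job.
  java.util.concurrent.Future<java.util.Map<String, Long>> getExecutorCounters();
}
{code}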



