hive-dev mailing list archives

From "Hive QA (JIRA)" <>
Subject [jira] [Commented] (HIVE-9178) Create a separate API for remote Spark Context RPC other than job submission [Spark Branch]
Date Wed, 14 Jan 2015 03:29:34 GMT


Hive QA commented on HIVE-9178:

{color:red}Overall{color}: -1 at least one test failed

Here are the results of testing the latest attachment:

{color:red}ERROR:{color} -1 due to 3 failed/errored test(s), 7307 tests executed
*Failed tests:*

Test results:
Console output:
Test logs:

Executing org.apache.hive.ptest.execution.PrepPhase
Executing org.apache.hive.ptest.execution.ExecutionPhase
Executing org.apache.hive.ptest.execution.ReportingPhase
Tests exited with: TestsFailedException: 3 tests failed

This message is automatically generated.

ATTACHMENT ID: 12692117 - PreCommit-HIVE-SPARK-Build

> Create a separate API for remote Spark Context RPC other than job submission [Spark Branch]
> -------------------------------------------------------------------------------------------
>                 Key: HIVE-9178
>                 URL:
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Xuefu Zhang
>            Assignee: Marcelo Vanzin
>         Attachments: HIVE-9178.1-spark.patch, HIVE-9178.1-spark.patch, HIVE-9178.2-spark.patch
> Based on discussions in HIVE-8972, it seems to make sense to create a separate API for
> RPCs, such as addJar and getExecutorCounter. These jobs differ from query submission in
> that they don't need to be queued in the backend and can be executed right away.
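
A rough sketch of what such a split could look like on the client interface follows; this is illustrative only, not the committed HIVE-9178 API, and the interface and type names (RemoteSparkClient, Job) are hypothetical:

{code:java}
import java.io.Serializable;
import java.net.URI;
import java.util.concurrent.Future;

// Illustrative sketch only -- not the actual patch. The point is the
// separation: job submissions are queued in the backend, while the
// RPC-style calls below are dispatched to the remote context right away.
public interface RemoteSparkClient {

  // Queued path: the job waits its turn in the backend before running.
  <T extends Serializable> Future<T> submit(Job<T> job);

  // Direct RPC path: executed immediately, bypassing the job queue.
  Future<Void> addJar(URI jar);
  Future<Integer> getExecutorCounter();

  // Hypothetical payload executed inside the remote Spark context.
  interface Job<T extends Serializable> extends Serializable {
    T call() throws Exception;
  }
}
{code}

The distinction is purely in dispatch: both paths return futures, but the RPC path is not serialized behind long-running queries, so utility calls like addJar can complete while submitted jobs are still waiting in the queue.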

This message was sent by Atlassian JIRA
