From: lirui@apache.org
To: commits@hive.apache.org
Reply-To: hive-dev@hive.apache.org
Mailing-List: contact commits-help@hive.apache.org; run by ezmlm
Subject: hive git commit: HIVE-15054: Hive insertion query execution fails on Hive on Spark (Aihua Xu via Rui Li)
Date: Mon, 7 Nov 2016 03:25:27 +0000 (UTC)

Repository: hive
Updated Branches:
  refs/heads/master 876ecdc6e -> 0951c9c64


HIVE-15054: Hive insertion query execution fails on Hive on Spark (Aihua Xu via Rui Li)

Project: http://git-wip-us.apache.org/repos/asf/hive/repo
Commit: http://git-wip-us.apache.org/repos/asf/hive/commit/0951c9c6
Tree: http://git-wip-us.apache.org/repos/asf/hive/tree/0951c9c6
Diff: http://git-wip-us.apache.org/repos/asf/hive/diff/0951c9c6

Branch: refs/heads/master
Commit: 0951c9c6443fda41e9e4ab5f8302a043f564a5d8
Parents: 876ecdc
Author: Aihua Xu
Authored: Mon Nov 7 11:25:17 2016 +0800
Committer: Rui Li
Committed: Mon Nov 7 11:25:17 2016 +0800

----------------------------------------------------------------------
 .../hive/ql/exec/spark/HivePairFlatMapFunction.java | 10 +++++++---
 1 file changed, 7 insertions(+), 3 deletions(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/hive/blob/0951c9c6/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HivePairFlatMapFunction.java
----------------------------------------------------------------------
diff --git a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HivePairFlatMapFunction.java b/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HivePairFlatMapFunction.java
index 7df626b..4f1b7d7 100644
--- a/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HivePairFlatMapFunction.java
+++ b/ql/src/java/org/apache/hadoop/hive/ql/exec/spark/HivePairFlatMapFunction.java
@@ -70,10 +70,14 @@ public abstract class HivePairFlatMapFunction implements PairFlatMapFun
       taskAttemptIdBuilder.append("r_");
     }
 
-    // Spark task attempt id is increased by Spark context instead of task, which may introduce
-    // unstable qtest output, since non Hive features depends on this, we always set it to 0 here.
+    // Hive requires this TaskAttemptId to be unique. MR's TaskAttemptId is composed
+    // of "attempt_timestamp_jobNum_m/r_taskNum_attemptNum". The counterpart for
+    // Spark should be "attempt_timestamp_stageNum_m/r_partitionId_attemptNum".
+    // When there're multiple attempts for a task, Hive will rely on the partitionId
+    // to figure out if the data are duplicate or not when collecting the final outputs
+    // (see org.apache.hadoop.hive.ql.exec.Utils.removeTempOrDuplicateFiles)
     taskAttemptIdBuilder.append(taskIdFormat.format(TaskContext.get().partitionId()))
-        .append("_0");
+        .append("_").append(TaskContext.get().attemptNumber());
     String taskAttemptIdStr = taskAttemptIdBuilder.toString();
     jobConf.set("mapred.task.id", taskAttemptIdStr);
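For illustration, the ID format described in the patched comment can be sketched as a standalone snippet. This is a hypothetical helper, not the actual Hive class: the method name, the zero-padding widths, and the timestamp value are assumptions chosen to resemble MR-style attempt IDs, and the de-duplication step referenced in the commit comment is only mentioned, not implemented here.

```java
import java.text.NumberFormat;

// Hypothetical sketch of the "attempt_timestamp_stageNum_m/r_partitionId_attemptNum"
// format from the commit comment. Padding widths (4 for stage, 6 for partition)
// are assumptions modeled on MR-style IDs, not taken from the Hive source.
public class TaskAttemptIdSketch {

  private static final NumberFormat STAGE_FORMAT = NumberFormat.getInstance();
  private static final NumberFormat TASK_FORMAT = NumberFormat.getInstance();
  static {
    STAGE_FORMAT.setGroupingUsed(false);
    STAGE_FORMAT.setMinimumIntegerDigits(4);
    TASK_FORMAT.setGroupingUsed(false);
    TASK_FORMAT.setMinimumIntegerDigits(6);
  }

  static String buildTaskAttemptId(String timestamp, int stageNum, boolean isMap,
                                   int partitionId, int attemptNumber) {
    StringBuilder b = new StringBuilder("attempt_");
    b.append(timestamp).append("_");
    b.append(STAGE_FORMAT.format(stageNum)).append("_");
    b.append(isMap ? "m_" : "r_");
    // Before HIVE-15054 the trailing attempt number was hard-coded to "_0";
    // the patch appends the real Spark attempt number instead, so retried
    // attempts of the same partition get distinct IDs while still sharing
    // the partitionId that the de-dup step keys on.
    b.append(TASK_FORMAT.format(partitionId)).append("_").append(attemptNumber);
    return b.toString();
  }

  public static void main(String[] args) {
    // Two attempts of the same map partition differ only in the last field.
    System.out.println(buildTaskAttemptId("1478489117000", 3, true, 12, 0));
    System.out.println(buildTaskAttemptId("1478489117000", 3, true, 12, 1));
  }
}
```

With the pre-patch behavior, both calls above would have produced the identical string ending in `_0`, which is why a retried Spark attempt could collide with the original when its output was registered under `mapred.task.id`.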