Delivered-To: mailing list issues@hive.apache.org
Mailing-List: contact issues-help@hive.apache.org; run by ezmlm
Reply-To: dev@hive.apache.org
Date: Thu, 9 Mar 2017 22:26:38 +0000 (UTC)
From: "Ashutosh Chauhan (JIRA)"
To: issues@hive.apache.org
Subject: [jira] [Commented] (HIVE-15289) Flaky test: TestSparkCliDriver.org.apache.hadoop.hive.cli.TestSparkCliDriver (setup)

[ https://issues.apache.org/jira/browse/HIVE-15289?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15903979#comment-15903979 ]

Ashutosh Chauhan commented on HIVE-15289:
-----------------------------------------

likely same root cause as HIVE-15165

> Flaky test: TestSparkCliDriver.org.apache.hadoop.hive.cli.TestSparkCliDriver (setup)
> ------------------------------------------------------------------------------------
>
>                 Key: HIVE-15289
>                 URL: https://issues.apache.org/jira/browse/HIVE-15289
>             Project: Hive
>          Issue Type: Sub-task
>            Reporter: Anthony Hsu
>
> In recent PreCommit builds, TestSparkCliDriver has failed during setup with errors like the following:
>
> From https://builds.apache.org/job/PreCommit-HIVE-Build/2292/testReport/:
> {noformat}
> Failed during createSources processLine with code=3
> ...
> Job failed with java.io.IOException: Failed to create local dir in /tmp/blockmgr-be4539eb-7896-4903-89c9-7ae1c48faa24/01.
> 	at org.apache.spark.storage.DiskBlockManager.getFile(DiskBlockManager.scala:70)
> 	at org.apache.spark.storage.DiskStore.contains(DiskStore.scala:124)
> 	at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$getCurrentBlockStatus(BlockManager.scala:379)
> 	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:959)
> 	at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:910)
> 	at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:866)
> 	at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:910)
> 	at org.apache.spark.storage.BlockManager.putIterator(BlockManager.scala:700)
> 	at org.apache.spark.storage.BlockManager.putSingle(BlockManager.scala:1213)
> 	at org.apache.spark.broadcast.TorrentBroadcast.writeBlocks(TorrentBroadcast.scala:103)
> 	at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:86)
> 	at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:34)
> 	at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:56)
> 	at org.apache.spark.SparkContext.broadcast(SparkContext.scala:1370)
> 	at org.apache.spark.rdd.HadoopRDD.<init>(HadoopRDD.scala:125)
> 	at org.apache.spark.SparkContext$$anonfun$hadoopRDD$1.apply(SparkContext.scala:965)
> 	at org.apache.spark.SparkContext$$anonfun$hadoopRDD$1.apply(SparkContext.scala:961)
> 	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> 	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
> 	at org.apache.spark.SparkContext.withScope(SparkContext.scala:682)
> 	at org.apache.spark.SparkContext.hadoopRDD(SparkContext.scala:961)
> 	at org.apache.spark.api.java.JavaSparkContext.hadoopRDD(JavaSparkContext.scala:412)
> 	at org.apache.hadoop.hive.ql.exec.spark.SparkPlanGenerator.generateMapInput(SparkPlanGenerator.java:205)
> 	at org.apache.hadoop.hive.ql.exec.spark.SparkPlanGenerator.generateParentTran(SparkPlanGenerator.java:145)
> 	at org.apache.hadoop.hive.ql.exec.spark.SparkPlanGenerator.generate(SparkPlanGenerator.java:117)
> 	at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient$JobStatusJob.call(RemoteHiveSparkClient.java:339)
> 	at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:358)
> 	at org.apache.hive.spark.client.RemoteDriver$JobWrapper.call(RemoteDriver.java:323)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> {noformat}
>
> From https://builds.apache.org/job/PreCommit-HIVE-Build/2291/testReport/:
> {noformat}
> Failed during createSources processLine with code=1
> ...
> Failed to monitor Job[ 11] with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(java.util.concurrent.TimeoutException)'
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)