Date: Wed, 3 Dec 2014 22:14:12 +0000 (UTC)
From: "Chao (JIRA)"
To: hive-dev@hadoop.apache.org
Reply-To: dev@hive.apache.org
Subject: [jira] [Resolved] (HIVE-8981) Not a directory error in mapjoin_hook.q [Spark Branch]

     [ https://issues.apache.org/jira/browse/HIVE-8981?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Chao resolved HIVE-8981.
------------------------
       Resolution: Not a Problem
    Fix Version/s: spark-branch

Closing it now, since the error message is expected.
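For reference, the "not a directory" message at the bottom of the quoted trace comes from the map-join small-table loading path: before deserializing the hash table container, the loader checks that the HashTable-Stage-1 path in the scratch directory is a directory it can list. The sketch below is illustrative only; the class and method names are assumed, not taken from MapJoinTableContainerSerDe itself.

{code:java}
import java.io.IOException;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical sketch of the directory check behind the quoted error.
// HashTableDirCheck and checkAndList are illustrative names, not Hive's.
public class HashTableDirCheck {

  public static void checkAndList(FileSystem fs, Path folder) throws IOException {
    FileStatus status = fs.getFileStatus(folder);
    if (!status.isDirectory()) {
      // This is the condition that surfaces in the log as
      // "HiveException: Error, not a directory: file:/.../HashTable-Stage-1/..."
      throw new IOException("Error, not a directory: " + folder);
    }
    for (FileStatus file : fs.listStatus(folder)) {
      // Load each serialized small-table file under the directory here.
    }
  }
}
{code}

Under that reading, the message in the mapjoin_hook.q log is this check firing as designed rather than a test failure.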
> Not a directory error in mapjoin_hook.q [Spark Branch]
> ------------------------------------------------------
>
>                 Key: HIVE-8981
>                 URL: https://issues.apache.org/jira/browse/HIVE-8981
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>    Affects Versions: spark-branch
>         Environment: Using remote-spark context with spark-master=local-cluster [2,2,1024]
>            Reporter: Szehon Ho
>            Assignee: Chao
>             Fix For: spark-branch
>
>
> Hits the following exception:
> {noformat}
> 2014-11-26 15:17:11,728 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - 14/11/26 15:17:11 WARN TaskSetManager: Lost task 0.0 in stage 8.0 (TID 18, 172.16.3.52): java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: Error while trying to create table container
> 2014-11-26 15:17:11,728 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.spark.SparkMapRecordHandler.processRow(SparkMapRecordHandler.java:160)
> 2014-11-26 15:17:11,728 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.spark.HiveMapFunctionResultList.processNextRecord(HiveMapFunctionResultList.java:47)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.spark.HiveMapFunctionResultList.processNextRecord(HiveMapFunctionResultList.java:28)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.spark.HiveBaseFunctionResultList$ResultIterator.hasNext(HiveBaseFunctionResultList.java:96)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:41)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at scala.collection.Iterator$class.foreach(Iterator.scala:727)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.spark.rdd.AsyncRDDActions$$anonfun$foreachAsync$2.apply(AsyncRDDActions.scala:115)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.spark.rdd.AsyncRDDActions$$anonfun$foreachAsync$2.apply(AsyncRDDActions.scala:115)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.spark.SparkContext$$anonfun$30.apply(SparkContext.scala:1390)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.spark.SparkContext$$anonfun$30.apply(SparkContext.scala:1390)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.spark.scheduler.Task.run(Task.scala:56)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:196)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at java.lang.Thread.run(Thread.java:744)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: Error while trying to create table container
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.spark.HashTableLoader.load(HashTableLoader.java:100)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.MapJoinOperator.loadHashTable(MapJoinOperator.java:193)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.MapJoinOperator.cleanUpInputFileChangedOp(MapJoinOperator.java:219)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1051)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1055)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1055)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1055)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:486)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.spark.SparkMapRecordHandler.processRow(SparkMapRecordHandler.java:149)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - ... 16 more
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error while trying to create table container
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.persistence.MapJoinTableContainerSerDe.load(MapJoinTableContainerSerDe.java:154)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.spark.HashTableLoader.load(HashTableLoader.java:97)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - ... 24 more
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Error, not a directory: file:/home/szehon/repos/apache-hive-git/hive/itests/qtest-spark/target/tmp/scratchdir/szehon/34689ef1-da29-422f-9409-f358480e03b9/hive_2014-11-26_15-17-11_015_4372638719563218766-1/-mr-10002/HashTable-Stage-1/MapJoin-mapfile31--.hashtable
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - at org.apache.hadoop.hive.ql.exec.persistence.MapJoinTableContainerSerDe.load(MapJoinTableContainerSerDe.java:105)
> 2014-11-26 15:17:11,729 INFO [stderr-redir-1]: client.SparkClientImpl (SparkClientImpl.java:run(364)) - ... 25 more
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)