hive-dev mailing list archives

From "Chao (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-8578) Investigate test failures related to HIVE-8545 [Spark Branch]
Date Mon, 27 Oct 2014 17:19:35 GMT

    [ https://issues.apache.org/jira/browse/HIVE-8578?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14185443#comment-14185443 ]

Chao commented on HIVE-8578:
----------------------------

I think the error may be caused by a missing plan file (either {{map.xml}} or {{reduce.xml}}).
For instance:

{noformat}
2014-10-24 14:41:30,112 DEBUG executor.Executor (Logging.scala:logDebug(63)) - Task 100's epoch is 43
2014-10-24 14:41:30,112 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting local block broadcast_139
2014-10-24 14:41:30,113 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Level for block broadcast_139 is StorageLevel(true, true, false, true, 1)
2014-10-24 14:41:30,113 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting block broadcast_139 from memory
2014-10-24 14:41:30,116 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting local block broadcast_142
2014-10-24 14:41:30,116 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Level for block broadcast_142 is StorageLevel(true, true, false, true, 1)
2014-10-24 14:41:30,116 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting block broadcast_142 from memory
2014-10-24 14:41:30,116 DEBUG executor.Executor (Logging.scala:logDebug(63)) - Task 102's epoch is 43
2014-10-24 14:41:30,117 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting local block broadcast_139
2014-10-24 14:41:30,117 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Level for block broadcast_139 is StorageLevel(true, true, false, true, 1)
2014-10-24 14:41:30,117 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting block broadcast_139 from memory
2014-10-24 14:41:30,118 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting local block broadcast_142
2014-10-24 14:41:30,118 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Level for block broadcast_142 is StorageLevel(true, true, false, true, 1)
2014-10-24 14:41:30,118 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting block broadcast_142 from memory
2014-10-24 14:41:30,118 DEBUG executor.Executor (Logging.scala:logDebug(63)) - Task 101's epoch is 43
2014-10-24 14:41:30,118 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting local block broadcast_140
2014-10-24 14:41:30,119 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Level for block broadcast_140 is StorageLevel(true, true, false, true, 1)
2014-10-24 14:41:30,119 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting block broadcast_140 from memory
2014-10-24 14:41:30,119 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting local block broadcast_141
2014-10-24 14:41:30,119 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Level for block broadcast_141 is StorageLevel(true, true, false, true, 1)
2014-10-24 14:41:30,120 DEBUG storage.BlockManager (Logging.scala:logDebug(63)) - Getting block broadcast_141 from memory
2014-10-24 14:41:30,120 INFO  rdd.HadoopRDD (Logging.scala:logInfo(59)) - Input split: Paths:/home/hiveptest/54.177.7.181-hiveptest-1/apache-svn-spark-source/itests/qtest-spark/target/warehouse/src/kv1.txt:0+5812InputFormatClass: org.apache.hadoop.mapred.TextInputFormat
2014-10-24 14:41:30,120 DEBUG rdd.HadoopRDD (Logging.scala:logDebug(63)) - Re-using user-broadcasted JobConf
2014-10-24 14:41:30,121 INFO  exec.Utilities (Utilities.java:getBaseWork(363)) - PLAN PATH = file:/home/hiveptest/54.177.7.181-hiveptest-1/apache-svn-spark-source/itests/qtest-spark/target/tmp/scratchdir/hiveptest/171c6c35-9d46-45b1-8d6a-da6b58dcdbd8/hive_2014-10-24_14-41-29_094_2948446759305742203-1/-mr-10002/d349d6e0-d1b2-482b-9120-6b02b93bd85e/map.xml
2014-10-24 14:41:30,121 INFO  exec.Utilities (Utilities.java:getBaseWork(377)) - local path = file:/home/hiveptest/54.177.7.181-hiveptest-1/apache-svn-spark-source/itests/qtest-spark/target/tmp/scratchdir/hiveptest/171c6c35-9d46-45b1-8d6a-da6b58dcdbd8/hive_2014-10-24_14-41-29_094_2948446759305742203-1/-mr-10002/d349d6e0-d1b2-482b-9120-6b02b93bd85e/map.xml
2014-10-24 14:41:30,121 INFO  exec.Utilities (Utilities.java:getBaseWork(389)) - Open file to read in plan: file:/home/hiveptest/54.177.7.181-hiveptest-1/apache-svn-spark-source/itests/qtest-spark/target/tmp/scratchdir/hiveptest/171c6c35-9d46-45b1-8d6a-da6b58dcdbd8/hive_2014-10-24_14-41-29_094_2948446759305742203-1/-mr-10002/d349d6e0-d1b2-482b-9120-6b02b93bd85e/map.xml
2014-10-24 14:41:30,121 INFO  exec.Utilities (Utilities.java:getBaseWork(425)) - File not found: File file:/home/hiveptest/54.177.7.181-hiveptest-1/apache-svn-spark-source/itests/qtest-spark/target/tmp/scratchdir/hiveptest/171c6c35-9d46-45b1-8d6a-da6b58dcdbd8/hive_2014-10-24_14-41-29_094_2948446759305742203-1/-mr-10002/d349d6e0-d1b2-482b-9120-6b02b93bd85e/map.xml does not exist
2014-10-24 14:41:30,122 INFO  exec.Utilities (Utilities.java:getBaseWork(426)) - No plan file found: file:/home/hiveptest/54.177.7.181-hiveptest-1/apache-svn-spark-source/itests/qtest-spark/target/tmp/scratchdir/hiveptest/171c6c35-9d46-45b1-8d6a-da6b58dcdbd8/hive_2014-10-24_14-41-29_094_2948446759305742203-1/-mr-10002/d349d6e0-d1b2-482b-9120-6b02b93bd85e/map.xml
2014-10-24 14:41:30,124 ERROR executor.Executor (Logging.scala:logError(96)) - Exception in task 2.0 in stage 84.0 (TID 102)
java.io.IOException: java.lang.reflect.InvocationTargetException
	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
	at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:300)
	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:247)
	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:371)
	at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:652)
	at org.apache.spark.rdd.HadoopRDD$$anon$1.<init>(HadoopRDD.scala:220)
	at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:211)
	at org.apache.spark.rdd.HadoopRDD.compute(HadoopRDD.scala:100)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
	at org.apache.spark.rdd.UnionRDD.compute(UnionRDD.scala:86)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:68)
	at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:41)
	at org.apache.spark.scheduler.Task.run(Task.scala:56)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:181)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.GeneratedConstructorAccessor156.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:286)
	... 21 more
Caused by: java.lang.NullPointerException
	at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat$CombineHiveInputSplit.<init>(CombineHiveInputFormat.java:102)
	at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:47)
	... 25 more
{noformat}

The NPE at the bottom is caused by the missing plan file: {{Utilities.getBaseWork}} cannot find {{map.xml}} (note the "No plan file found" line above), so the record reader is set up without a plan, and the {{CombineHiveInputSplit}} constructor then hits the NPE at {{CombineHiveInputFormat.java:102}}.
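
To make that failure mode concrete, below is a minimal, self-contained Java sketch. This is not Hive's actual code: {{MissingPlanDemo}} and {{loadMapWork}} are hypothetical stand-ins for the record-reader setup and for {{Utilities.getBaseWork}}, which (per the log above) reports "No plan file found" when {{map.xml}} is absent.

{code:java}
import java.io.File;
import java.util.HashMap;
import java.util.Map;

public class MissingPlanDemo {

  // Hypothetical stand-in for Hive's MapWork plan object.
  static class MapWork {
    Map<String, String> pathToAliases = new HashMap<>();
  }

  // Stand-in for Utilities.getBaseWork(): returns null when the plan file
  // is missing, logging the miss instead of failing fast.
  static MapWork loadMapWork(String planPath) {
    if (!new File(planPath).exists()) {
      System.out.println("No plan file found: " + planPath);
      return null; // the miss is swallowed here ...
    }
    return new MapWork(); // (deserialization of map.xml elided)
  }

  public static void main(String[] args) {
    MapWork work = loadMapWork("/tmp/scratchdir/missing/map.xml");
    // ... and only blows up later, far from the cause, mirroring the NPE at
    // CombineHiveInputSplit.<init>(CombineHiveInputFormat.java:102).
    work.pathToAliases.get("some/input/path"); // NullPointerException
  }
}
{code}

Because the plan lookup returns null instead of throwing, the stack trace points at the split constructor rather than at whatever removed or failed to write {{map.xml}}, so the trace alone doesn't identify the root cause.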

> Investigate test failures related to HIVE-8545 [Spark Branch]
> -------------------------------------------------------------
>
>                 Key: HIVE-8578
>                 URL: https://issues.apache.org/jira/browse/HIVE-8578
>             Project: Hive
>          Issue Type: Test
>          Components: Spark
>            Reporter: Chao
>
> In HIVE-8545, there are a few test failures, for instance {{multi_insert_lateral_view.q}}
> and {{ppr_multi_insert.q}}. They appear to happen at random and are not reproducible locally.
> We need to track down the root cause and fix it in this JIRA.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
