Date: Fri, 14 Sep 2018 21:43:13 +0000 (UTC)
From: Apache Jenkins Server
To: commits@beam.apache.org
Reply-To: dev@beam.apache.org
Subject: Build failed in Jenkins: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle #1481
X-Jenkins-Job: beam_PostCommit_Java_ValidatesRunner_Spark_Gradle
X-Jenkins-Result: FAILURE

See

Changes:

[pablo] Adding a custom coder test

[github] Adding license

------------------------------------------
[...truncated 28.47 MB...]
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 107 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ResultStage 547 (MapPartitionsRDD[2773] at map at TranslationUtils.java:129) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 547.0 with 4 tasks
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 547.0 (TID 463, localhost, executor driver, partition 0, PROCESS_LOCAL, 8308 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 547.0 (TID 464, localhost, executor driver, partition 1, PROCESS_LOCAL, 8308 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 547.0 (TID 465, localhost, executor driver, partition 2, PROCESS_LOCAL, 8308 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 547.0 (TID 466, localhost, executor driver, partition 3, PROCESS_LOCAL, 8308 bytes)
[Executor task launch worker for task 463] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 547.0 (TID 463)
[Executor task launch worker for task 464] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 547.0 (TID 464)
[Executor task launch worker for task 465] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 547.0 (TID 465)
[Executor task launch worker for task 466] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 547.0 (TID 466)
[Executor task launch worker for task 466] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 463] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 463] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 466] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 463] INFO org.apache.spark.storage.BlockManager - Found block rdd_2441_0 locally
[Executor task launch worker for task 466] INFO org.apache.spark.storage.BlockManager - Found block rdd_2441_3 locally
[Executor task launch worker for task 466] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2756_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 463] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2756_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 464] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 465] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 464] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 465] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_0 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 465] INFO org.apache.spark.storage.BlockManager - Found block rdd_2441_2 locally
[Executor task launch worker for task 464] INFO org.apache.spark.storage.BlockManager - Found block rdd_2441_1 locally
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_3 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 465] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2756_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 464] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2756_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_2 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-3] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2756_1 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 463] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 547.0 (TID 463). 59881 bytes result sent to driver
[Executor task launch worker for task 466] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 547.0 (TID 466). 59881 bytes result sent to driver
[Executor task launch worker for task 465] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 547.0 (TID 465). 59881 bytes result sent to driver
[Executor task launch worker for task 464] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 547.0 (TID 464). 59881 bytes result sent to driver
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 547.0 (TID 463) in 16 ms on localhost (executor driver) (1/4)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 547.0 (TID 466) in 15 ms on localhost (executor driver) (2/4)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 547.0 (TID 465) in 15 ms on localhost (executor driver) (3/4)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 547.0 (TID 464) in 16 ms on localhost (executor driver) (4/4)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 547.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 547 (foreach at UnboundedDataset.java:80) finished in 0.026 s
[streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 34 finished: foreach at UnboundedDataset.java:80, took 0.094279 s
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1536960829000 ms.2 from job set of time 1536960829000 ms
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Starting job streaming job 1536960829000 ms.3 from job set of time 1536960829000 ms
[streaming-job-executor-0] INFO org.apache.spark.SparkContext - Starting job: foreach at UnboundedDataset.java:80
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2787 (mapToPair at GroupCombineFunctions.java:54)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Registering RDD 2815 (mapToPair at GroupCombineFunctions.java:54)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Got job 35 (foreach at UnboundedDataset.java:80) with 4 output partitions
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Final stage: ResultStage 575 (foreach at UnboundedDataset.java:80)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Parents of final stage: List(ShuffleMapStage 556, ShuffleMapStage 574, ShuffleMapStage 564, ShuffleMapStage 561, ShuffleMapStage 565, ShuffleMapStage 562, ShuffleMapStage 563, ShuffleMapStage 573, ShuffleMapStage 570)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List(ShuffleMapStage 573)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 572 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:54), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_108 stored as values in memory (estimated size 161.0 KB, free 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_108_piece0 stored as bytes in memory (estimated size 35.2 KB, free 13.5 GB)
[dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_108_piece0 in memory on localhost:36991 (size: 35.2 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 108 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ShuffleMapStage 572 (MapPartitionsRDD[2787] at mapToPair at GroupCombineFunctions.java:54) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 572.0 with 4 tasks
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 572.0 (TID 467, localhost, executor driver, partition 0, PROCESS_LOCAL, 8297 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 572.0 (TID 468, localhost, executor driver, partition 1, PROCESS_LOCAL, 8297 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 572.0 (TID 469, localhost, executor driver, partition 2, PROCESS_LOCAL, 8297 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 572.0 (TID 470, localhost, executor driver, partition 3, PROCESS_LOCAL, 8297 bytes)
[Executor task launch worker for task 467] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 572.0 (TID 467)
[Executor task launch worker for task 468] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 572.0 (TID 468)
[Executor task launch worker for task 469] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 572.0 (TID 469)
[Executor task launch worker for task 470] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 572.0 (TID 470)
[Executor task launch worker for task 470] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_3 locally
[Executor task launch worker for task 468] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_1 locally
[Executor task launch worker for task 467] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_0 locally
[Executor task launch worker for task 469] INFO org.apache.spark.storage.BlockManager - Found block rdd_2559_2 locally
[Executor task launch worker for task 470] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 572.0 (TID 470). 59509 bytes result sent to driver
[Executor task launch worker for task 468] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 572.0 (TID 468). 59466 bytes result sent to driver
[Executor task launch worker for task 467] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 572.0 (TID 467). 59509 bytes result sent to driver
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 572.0 (TID 470) in 12 ms on localhost (executor driver) (1/4)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 572.0 (TID 468) in 12 ms on localhost (executor driver) (2/4)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 572.0 (TID 467) in 13 ms on localhost (executor driver) (3/4)
[Executor task launch worker for task 469] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 572.0 (TID 469). 59466 bytes result sent to driver
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 572.0 (TID 469) in 15 ms on localhost (executor driver) (4/4)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 572.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 572 (mapToPair at GroupCombineFunctions.java:54) finished in 0.022 s
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage 575, ShuffleMapStage 573)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ShuffleMapStage 573 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:54), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_109 stored as values in memory (estimated size 198.5 KB, free 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_109_piece0 stored as bytes in memory (estimated size 44.9 KB, free 13.5 GB)
[dispatcher-event-loop-1] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_109_piece0 in memory on localhost:36991 (size: 44.9 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 109 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 5 missing tasks from ShuffleMapStage 573 (MapPartitionsRDD[2815] at mapToPair at GroupCombineFunctions.java:54) (first 15 tasks are for partitions Vector(0, 1, 2, 3, 4))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 573.0 with 5 tasks
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 573.0 (TID 471, localhost, executor driver, partition 0, PROCESS_LOCAL, 8436 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 573.0 (TID 472, localhost, executor driver, partition 1, PROCESS_LOCAL, 8436 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 573.0 (TID 473, localhost, executor driver, partition 2, PROCESS_LOCAL, 8436 bytes)
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 573.0 (TID 474, localhost, executor driver, partition 3, PROCESS_LOCAL, 8436 bytes)
[Executor task launch worker for task 471] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 573.0 (TID 471)
[Executor task launch worker for task 472] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 573.0 (TID 472)
[Executor task launch worker for task 473] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 573.0 (TID 473)
[Executor task launch worker for task 474] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 573.0 (TID 474)
[Executor task launch worker for task 471] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 471] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 472] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 471] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_0 locally
[Executor task launch worker for task 472] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 1 ms
[Executor task launch worker for task 473] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 473] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 472] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_1 locally
[Executor task launch worker for task 474] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 4 blocks
[Executor task launch worker for task 474] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 473] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_2 locally
[Executor task launch worker for task 474] INFO org.apache.spark.storage.BlockManager - Found block rdd_2484_3 locally
[Executor task launch worker for task 471] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 472] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_0 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 473] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 474] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2799_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_1 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_2 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2799_3 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 472] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 573.0 (TID 472). 59939 bytes result sent to driver
[Executor task launch worker for task 471] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 573.0 (TID 471). 59939 bytes result sent to driver
[Executor task launch worker for task 474] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 573.0 (TID 474). 59939 bytes result sent to driver
[Executor task launch worker for task 473] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 573.0 (TID 473). 59939 bytes result sent to driver
[dispatcher-event-loop-1] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 4.0 in stage 573.0 (TID 475, localhost, executor driver, partition 4, PROCESS_LOCAL, 7968 bytes)
[Executor task launch worker for task 475] INFO org.apache.spark.executor.Executor - Running task 4.0 in stage 573.0 (TID 475)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 573.0 (TID 471) in 15 ms on localhost (executor driver) (1/5)
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 573.0 (TID 473) in 15 ms on localhost (executor driver) (2/5)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 573.0 (TID 472) in 15 ms on localhost (executor driver) (3/5)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 573.0 (TID 474) in 16 ms on localhost (executor driver) (4/5)
[Executor task launch worker for task 475] INFO org.apache.spark.executor.Executor - Finished task 4.0 in stage 573.0 (TID 475). 59423 bytes result sent to driver
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 4.0 in stage 573.0 (TID 475) in 13 ms on localhost (executor driver) (5/5)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 573.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ShuffleMapStage 573 (mapToPair at GroupCombineFunctions.java:54) finished in 0.034 s
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - looking for newly runnable stages
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - running: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - waiting: Set(ResultStage 575)
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - failed: Set()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting ResultStage 575 (MapPartitionsRDD[2844] at map at TranslationUtils.java:129), which has no missing parents
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_110 stored as values in memory (estimated size 236.0 KB, free 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.storage.memory.MemoryStore - Block broadcast_110_piece0 stored as bytes in memory (estimated size 56.2 KB, free 13.5 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added broadcast_110_piece0 in memory on localhost:36991 (size: 56.2 KB, free: 13.5 GB)
[dag-scheduler-event-loop] INFO org.apache.spark.SparkContext - Created broadcast 110 from broadcast at DAGScheduler.scala:1039
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Submitting 4 missing tasks from ResultStage 575 (MapPartitionsRDD[2844] at map at TranslationUtils.java:129) (first 15 tasks are for partitions Vector(0, 1, 2, 3))
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Adding task set 575.0 with 4 tasks
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 0.0 in stage 575.0 (TID 476, localhost, executor driver, partition 0, PROCESS_LOCAL, 8308 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 1.0 in stage 575.0 (TID 477, localhost, executor driver, partition 1, PROCESS_LOCAL, 8308 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 2.0 in stage 575.0 (TID 478, localhost, executor driver, partition 2, PROCESS_LOCAL, 8308 bytes)
[dispatcher-event-loop-3] INFO org.apache.spark.scheduler.TaskSetManager - Starting task 3.0 in stage 575.0 (TID 479, localhost, executor driver, partition 3, PROCESS_LOCAL, 8308 bytes)
[Executor task launch worker for task 476] INFO org.apache.spark.executor.Executor - Running task 0.0 in stage 575.0 (TID 476)
[Executor task launch worker for task 477] INFO org.apache.spark.executor.Executor - Running task 1.0 in stage 575.0 (TID 477)
[Executor task launch worker for task 478] INFO org.apache.spark.executor.Executor - Running task 2.0 in stage 575.0 (TID 478)
[Executor task launch worker for task 479] INFO org.apache.spark.executor.Executor - Running task 3.0 in stage 575.0 (TID 479)
[Executor task launch worker for task 476] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 476] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 476] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_0 locally
[Executor task launch worker for task 476] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_0 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 477] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 479] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[Executor task launch worker for task 478] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Getting 0 non-empty blocks out of 5 blocks
[dispatcher-event-loop-0] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_0 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 479] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 477] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 478] INFO org.apache.spark.storage.ShuffleBlockFetcherIterator - Started 0 remote fetches in 0 ms
[Executor task launch worker for task 479] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_3 locally
[Executor task launch worker for task 477] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_1 locally
[Executor task launch worker for task 478] INFO org.apache.spark.storage.BlockManager - Found block rdd_2512_2 locally
[Executor task launch worker for task 477] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_1 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 479] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_3 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[Executor task launch worker for task 478] INFO org.apache.spark.storage.memory.MemoryStore - Block rdd_2827_2 stored as bytes in memory (estimated size 4.0 B, free 13.5 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_3 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_1 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[dispatcher-event-loop-2] INFO org.apache.spark.storage.BlockManagerInfo - Added rdd_2827_2 in memory on localhost:36991 (size: 4.0 B, free: 13.5 GB)
[Executor task launch worker for task 476] INFO org.apache.spark.executor.Executor - Finished task 0.0 in stage 575.0 (TID 476). 59881 bytes result sent to driver
[Executor task launch worker for task 479] INFO org.apache.spark.executor.Executor - Finished task 3.0 in stage 575.0 (TID 479). 59881 bytes result sent to driver
[Executor task launch worker for task 477] INFO org.apache.spark.executor.Executor - Finished task 1.0 in stage 575.0 (TID 477). 59881 bytes result sent to driver
[Executor task launch worker for task 478] INFO org.apache.spark.executor.Executor - Finished task 2.0 in stage 575.0 (TID 478). 59881 bytes result sent to driver
[task-result-getter-3] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 0.0 in stage 575.0 (TID 476) in 15 ms on localhost (executor driver) (1/4)
[task-result-getter-0] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 1.0 in stage 575.0 (TID 477) in 15 ms on localhost (executor driver) (2/4)
[task-result-getter-1] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 3.0 in stage 575.0 (TID 479) in 15 ms on localhost (executor driver) (3/4)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSetManager - Finished task 2.0 in stage 575.0 (TID 478) in 16 ms on localhost (executor driver) (4/4)
[task-result-getter-2] INFO org.apache.spark.scheduler.TaskSchedulerImpl - Removed TaskSet 575.0, whose tasks have all completed, from pool
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - ResultStage 575 (foreach at UnboundedDataset.java:80) finished in 0.026 s
[streaming-job-executor-0] INFO org.apache.spark.scheduler.DAGScheduler - Job 35 finished: foreach at UnboundedDataset.java:80, took 0.091752 s
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Finished job streaming job 1536960829000 ms.3 from job set of time 1536960829000 ms
[JobScheduler] INFO org.apache.spark.streaming.scheduler.JobScheduler - Total delay: 5.458 s for time 1536960829000 ms (execution: 0.381 s)
[Test worker] INFO org.apache.spark.streaming.scheduler.JobScheduler - Stopped JobScheduler
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@61d49c32{/streaming,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@63dce2f1{/streaming/batch,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.spark_project.jetty.server.handler.ContextHandler - Stopped o.s.j.s.ServletContextHandler@6f8be5a{/static/streaming,null,UNAVAILABLE,@Spark}
[Test worker] INFO org.apache.spark.streaming.StreamingContext - StreamingContext stopped successfully
[Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@5bc609ac{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
[Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4040
[dispatcher-event-loop-3] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
[Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
[Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
[Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
[dispatcher-event-loop-2] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
[Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext
Gradle Test Executor 283 finished executing tests.
> Task :beam-runners-spark:validatesRunnerStreaming
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-4] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-dc991f1e-c50e-481b-af81-f61cf8005b7b

org.apache.beam.runners.spark.translation.streaming.StreamingSourceMetricsTest > testUnboundedSourceMetrics STANDARD_ERROR
    [Test worker] INFO org.spark_project.jetty.server.AbstractConnector - Stopped Spark@7e3999be{HTTP/1.1,[http/1.1]}{127.0.0.1:4041}
    [Test worker] INFO org.apache.spark.ui.SparkUI - Stopped Spark web UI at http://localhost:4041
    [dispatcher-event-loop-2] INFO org.apache.spark.MapOutputTrackerMasterEndpoint - MapOutputTrackerMasterEndpoint stopped!
    [Test worker] INFO org.apache.spark.storage.memory.MemoryStore - MemoryStore cleared
    [Test worker] INFO org.apache.spark.storage.BlockManager - BlockManager stopped
    [Test worker] INFO org.apache.spark.storage.BlockManagerMaster - BlockManagerMaster stopped
    [dispatcher-event-loop-2] INFO org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint - OutputCommitCoordinator stopped!
    [Test worker] INFO org.apache.spark.SparkContext - Successfully stopped SparkContext

Gradle Test Executor 286 finished executing tests.

> Task :beam-runners-spark:validatesRunnerStreaming FAILED
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Shutdown hook called
[Thread-5] INFO org.apache.spark.util.ShutdownHookManager - Deleting directory /tmp/spark-7ea51197-590d-43ac-b3eb-3b838a133413

14 tests completed, 2 failed
Finished generating test XML results (0.099 secs) into: 
Generating HTML test report...
Finished generating test html results (0.096 secs) into: 
:beam-runners-spark:validatesRunnerStreaming (Thread[Task worker for ':' Thread 7,5,main]) completed. Took 10 mins 26.595 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-runners-spark:validatesRunnerStreaming'.
> There were failing tests. See the report at: file://

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 15m 57s
40 actionable tasks: 36 executed, 4 from cache

Publishing build scan...
https://gradle.com/s/55u3kbegyq7wc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure