From commits-return-59947-archive-asf-public=cust-asf.ponee.io@beam.apache.org Tue Mar 6 12:09:24 2018
Date: Tue, 6 Mar 2018 11:09:13 +0000 (UTC)
From: Apache Jenkins Server
To: commits@beam.apache.org
Reply-To: dev@beam.apache.org
Message-ID: <1432025541.3315.1520334553194.JavaMail.jenkins@jenkins-master.apache.org>
Subject: Build failed in Jenkins: beam_PostRelease_NightlySnapshot #88
X-Jenkins-Job: beam_PostRelease_NightlySnapshot
X-Jenkins-Result: FAILURE

See Changes:
[ehudm] Don't cache pubsub subscription prematurely.
[ehudm] Add Python lint check for calls to unittest.main.
[github] Fixing formatting bug in filebasedsink.py.
[github] Fix lint issue.
[mariagh] Add TestClock to test
[daniel.o.programmer] [BEAM-3126] Fixing incorrect function call in bundle processor.
[samuel.waggoner] [BEAM-3777] allow UDAFs to be indirect subclasses of CombineFn
------------------------------------------
[...truncated 3.60 MB...]
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
    at org.apache.spark.scheduler.Task.run(Task.scala:108)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logError
SEVERE: Task 1 in stage 0.0 failed 1 times; aborting job
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 2.0 in stage 0.0 (TID 2) on localhost, executor driver: java.lang.NoSuchMethodError (org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;) [duplicate 1]
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 0.0 in stage 0.0 (TID 0) on localhost, executor driver: java.lang.NoSuchMethodError (org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;) [duplicate 2]
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Lost task 3.0 in stage 0.0 (TID 3) on localhost, executor driver: java.lang.NoSuchMethodError (org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;) [duplicate 3]
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Removed TaskSet 0.0, whose tasks have all completed, from pool
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Cancelling stage 0
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: ShuffleMapStage 0 (mapToPair at GroupCombineFunctions.java:184) failed in 2.561 s due to Job aborted due to stage failure: Task 1 in stage 0.0 failed 1 times, most recent failure: Lost task 1.0 in stage 0.0 (TID 1, localhost, executor driver): java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;
    at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call(MultiDoFnFunction.java:137)
    at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call(MultiDoFnFunction.java:58)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:186)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply(JavaRDDLike.scala:186)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:797)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:287)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
    at org.apache.spark.scheduler.Task.run(Task.scala:108)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:338)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Job 0 failed: collect at BoundedDataset.java:87, took 2.914829 s
Mar 06, 2018 11:06:09 AM org.spark_project.jetty.server.AbstractConnector doStop
INFO: Stopped Spark@7e442deb{HTTP/1.1,[http/1.1]}{127.0.0.1:4040}
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Stopped Spark web UI at http://127.0.0.1:4040
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: MapOutputTrackerMasterEndpoint stopped!
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: MemoryStore cleared
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManager stopped
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: BlockManagerMaster stopped
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: OutputCommitCoordinator stopped!
Mar 06, 2018 11:06:09 AM org.apache.spark.internal.Logging$class logInfo
INFO: Successfully stopped SparkContext
[WARNING] org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;
    at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom (SparkPipelineResult.java:68)
    at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish (SparkPipelineResult.java:99)
    at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish (SparkPipelineResult.java:87)
    at org.apache.beam.examples.WordCount.main (WordCount.java:187)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
    at java.lang.Thread.run (Thread.java:748)
Caused by: java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner;
    at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call (MultiDoFnFunction.java:137)
    at org.apache.beam.runners.spark.translation.MultiDoFnFunction.call (MultiDoFnFunction.java:58)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply (JavaRDDLike.scala:186)
    at org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$7$1.apply (JavaRDDLike.scala:186)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply (RDD.scala:797)
    at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply (RDD.scala:797)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.rdd.MapPartitionsRDD.compute (MapPartitionsRDD.scala:38)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint (RDD.scala:323)
    at org.apache.spark.rdd.RDD.iterator (RDD.scala:287)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask (ShuffleMapTask.scala:96)
    at org.apache.spark.scheduler.ShuffleMapTask.runTask (ShuffleMapTask.scala:53)
    at org.apache.spark.scheduler.Task.run (Task.scala:108)
    at org.apache.spark.executor.Executor$TaskRunner.run (Executor.scala:338)
    at java.util.concurrent.ThreadPoolExecutor.runWorker (ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run (ThreadPoolExecutor.java:624)
    at java.lang.Thread.run (Thread.java:748)
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:45 min
[INFO] Finished at: 2018-03-06T11:06:09Z
[INFO] Final Memory: 89M/1207M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.6.0:java (default-cli) on project word-count-beam: An exception occured while executing the Java class.
java.lang.NoSuchMethodError: org.apache.beam.runners.core.DoFnRunners.simpleRunner(Lorg/apache/beam/sdk/options/PipelineOptions;Lorg/apache/beam/sdk/transforms/DoFn;Lorg/apache/beam/runners/core/SideInputReader;Lorg/apache/beam/runners/core/DoFnRunners$OutputManager;Lorg/apache/beam/sdk/values/TupleTag;Ljava/util/List;Lorg/apache/beam/runners/core/StepContext;Lorg/apache/beam/sdk/values/WindowingStrategy;)Lorg/apache/beam/runners/core/DoFnRunner; -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] Failed command

:runners:spark:runQuickstartJavaSpark FAILED

Mar 06, 2018 11:06:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:22.185Z: (d0918e729e906fcd): Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Close
Mar 06, 2018 11:06:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:22.249Z: (d0918e729e906395): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
Mar 06, 2018 11:06:22 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:22.386Z: (d0918e729e906a90): Executing operation WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/GroupByKey/Read+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues+WordCount.CountWords/Count.PerElement/Combine.perKey(Count)/Combine.GroupedValues/Extract+MapElements/Map+WriteCounts/WriteFiles/RewindowIntoGlobal/Window.Assign+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
Mar 06, 2018 11:06:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:35.942Z: (f72b707b5e07eff): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
Mar 06, 2018 11:06:36 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:36.022Z: (d0918e729e9060f6): Executing operation WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+WriteCounts/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/ParDo(ToIsmRecordForGlobalWindow)
Mar 06, 2018 11:06:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:38.115Z: (d0918e729e906394): Executing operation s12-u31
Mar 06, 2018 11:06:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:38.331Z: (f72b707b5e070aa): Executing operation WriteCounts/WriteFiles/GatherTempFileResults/View.AsList/CreateDataflowView
Mar 06, 2018 11:06:38 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:38.505Z: (45f02a316e2e3a4a): Executing operation WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Create.Values/Read(CreateSource)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Reify.ReifyView/ParDo(Anonymous)+WriteCounts/WriteFiles/GatherTempFileResults/Reify.ReifyViewInGlobalWindow/Values/Values/Map+WriteCounts/WriteFiles/FinalizeTempFileBundles/Finalize
Mar 06, 2018 11:06:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:40.292Z: (6ef79b57b2e29c72): Cleaning up.
Mar 06, 2018 11:06:40 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:06:40.372Z: (6ef79b57b2e29b28): Stopping worker pool...
Mar 06, 2018 11:09:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:08:58.123Z: (9b13f9a2b6d6b331): Autoscaling: Resized worker pool from 1 to 0.
Mar 06, 2018 11:09:00 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-03-06T11:08:58.141Z: (9b13f9a2b6d6be77): Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
Mar 06, 2018 11:09:06 AM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-03-06_03_04_13-1941969681040247338 finished with status DONE.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 06:15 min
[INFO] Finished at: 2018-03-06T11:09:06Z
[INFO] Final Memory: 78M/1288M
[INFO] ------------------------------------------------------------------------
gsutil cat gs://temp-storage-for-release-validation-tests/quickstart/count* | grep Montague:
Montague: 47
Verified Montague: 47
gsutil rm gs://temp-storage-for-release-validation-tests/quickstart/count*
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00000-of-00003...
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00001-of-00003...
Removing gs://temp-storage-for-release-validation-tests/quickstart/counts-00002-of-00003...
/ [3 objects]
Operation completed over 3 objects.
[SUCCESS]

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:flink:runQuickstartJavaFlinkLocal'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:apex:runQuickstartJavaApex'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:spark:runQuickstartJavaSpark'.
> Process 'command '/usr/local/asfpackages/java/jdk1.8.0_152/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

BUILD FAILED in 7m 14s
6 actionable tasks: 6 executed
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Not sending mail to unregistered user daniel.o.programmer@gmail.com
Not sending mail to unregistered user samuel.waggoner@healthsparq.com
Not sending mail to unregistered user github@alasdairhodge.co.uk
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user mariagh@mariagh.svl.corp.google.com
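A note on the recurring java.lang.NoSuchMethodError above: this error is the JVM reporting, at link time, that a class (here the Spark runner's MultiDoFnFunction) was compiled against a DoFnRunners.simpleRunner overload whose binary descriptor does not exist in the beam-runners-core jar actually present on the runtime classpath, which typically points to mismatched artifact versions in the quickstart's resolved dependencies. The same kind of method lookup can be probed reflectively; the sketch below is stdlib-only illustration, not Beam code, and the class name DescriptorCheck and helper exists are invented for the example (reflection throws the checked NoSuchMethodException, which is analogous to, but distinct from, the NoSuchMethodError the JVM threw here).

```java
import java.lang.reflect.Method;

public class DescriptorCheck {
    // Returns true if clazz declares a public no-arg method with the given name,
    // i.e. the same kind of member lookup the JVM performs when linking a call site.
    static boolean exists(Class<?> clazz, String name) {
        try {
            Method m = clazz.getMethod(name);
            return m != null;
        } catch (NoSuchMethodException e) {
            // Reflection reports a missing member with a checked exception;
            // a linked call site would instead fail with NoSuchMethodError.
            return false;
        }
    }

    public static void main(String[] args) {
        // A method that is present in this JDK's String class...
        System.out.println(exists(String.class, "isEmpty"));      // prints "true"
        // ...and one that is not, mirroring the missing simpleRunner overload.
        System.out.println(exists(String.class, "simpleRunner")); // prints "false"
    }
}
```

In the build above, the fix would lie in aligning the snapshot versions of beam-runners-core-java and beam-runners-spark that the quickstart resolves, since the error reproduces identically in every Spark task ([duplicate 1] through [duplicate 3]).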