From builds-return-46193-archive-asf-public=cust-asf.ponee.io@beam.apache.org Fri Apr 17 18:55:32 2020
Mailing-List: contact builds-help@beam.apache.org; run by ezmlm
Reply-To: builds@beam.apache.org
Delivered-To: mailing list builds@beam.apache.org
Date: Fri, 17 Apr 2020 18:55:30 +0000 (UTC)
From: Apache Jenkins Server
To: builds@beam.apache.org
Message-ID: <1565391144.6429.1587149730286.JavaMail.jenkins@jenkins02>
In-Reply-To: <1123625626.6366.1587138211986.JavaMail.jenkins@jenkins02>
References: <1123625626.6366.1587138211986.JavaMail.jenkins@jenkins02>
Subject: Build failed in Jenkins: beam_PostCommit_Python2 #2264
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 7bit
X-Jenkins-Job: beam_PostCommit_Python2
X-Jenkins-Result: FAILURE
Auto-submitted: auto-generated

See Changes:

[ehudm] [BEAM-9737] Don't use docker create option -u

------------------------------------------
[...truncated 11.47 MB...]
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "output_name": "out",
        "user_name": "GroupByKey.out"
      }
    ],
    "parallel_input": {
      "@type": "OutputReference",
      "output_name": "None",
      "step_name": "s3"
    },
    "serialized_fn": "%0AB%22%40%0A%1Dref_Coder_GlobalWindowCoder_1%12%1F%0A%1D%0A%1Bbeam%3Acoder%3Aglobal_window%3Av1jQ%0A%22%0A%20beam%3Awindow_fn%3Aglobal_windows%3Av1%10%01%1A%1Dref_Coder_GlobalWindowCoder_1%22%02%3A%00%28%010%018%01H%01",
    "user_name": "GroupByKey"
  }
},
{
  "kind": "ParallelDo",
  "name": "s5",
  "properties": {
    "display_data": [
      {
        "key": "fn",
        "label": "Transform Function",
        "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
        "type": "STRING",
        "value": ""
      },
      {
        "key": "fn",
        "label": "Transform Function",
        "namespace": "apache_beam.transforms.core.ParDo",
        "shortValue": "CallableWrapperDoFn",
        "type": "STRING",
        "value": "apache_beam.transforms.core.CallableWrapperDoFn"
      }
    ],
    "non_parallel_inputs": {},
    "output_info": [
      {
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
                  "component_encodings": [],
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                },
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
                  "component_encodings": [],
                  "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
                }
              ],
              "is_pair_like": true,
              "pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_4"
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "output_name": "None",
        "user_name": "m_out.out"
      }
    ],
    "parallel_input": {
      "@type": "OutputReference",
      "output_name": "out",
      "step_name": "s4"
    },
    "serialized_fn": "",
    "user_name": "m_out"
  }
}
],
"type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job:
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-04-17_11_47_17-14639304656263172165]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_47_17-14639304656263172165?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-04-17_11_47_17-14639304656263172165 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:17.192Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-04-17_11_47_17-14639304656263172165. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:17.192Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-04-17_11_47_17-14639304656263172165.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:20.827Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:21.610Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.259Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.303Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.338Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.360Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.436Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.490Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.533Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.575Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.615Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.663Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.711Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.754Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
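The fused stages named in the log above (Create/Read, metrics, map_to_common_key, GroupByKey/Reify, GroupByKey/Write, m_out) describe a pipeline that tags every element with a common key and groups them. A minimal plain-Python sketch of what that shape computes, with hypothetical element values and stand-in functions rather than real Beam transforms or Dataflow workers:

```python
from collections import defaultdict

def metrics(element):
    # Stand-in for the "metrics" stage; in the real test this step
    # also increments user-defined metric counters.
    return element

def map_to_common_key(element):
    # Key every element identically so GroupByKey collects them all.
    return ('key', element)

def group_by_key(pairs):
    # Conceptual equivalent of GroupByKey/Reify -> Write -> Read -> GroupByWindow.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return dict(grouped)

source = [0, 1, 2, 3]                                    # Create/Read
keyed = [map_to_common_key(metrics(e)) for e in source]  # metrics + keying
result = group_by_key(keyed)                             # GroupByKey -> m_out
print(result)                                            # {'key': [0, 1, 2, 3]}
```

Because every element shares one key, the grouped output is a single key mapping to all element values, which is what the downstream m_out step consumes.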
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.788Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.826Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.851Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:22.876Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:23.090Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:23.176Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:23.221Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:23.256Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:23.301Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:23.375Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:23.444Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:39.417Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor.
To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:47:47.234Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:48:20.888Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+ExternalTransform(simple)/Map()+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:48:24.081Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:48:24.146Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:48:24.211Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:48:24.292Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:48:33.567Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:48:33.640Z: JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:48:33.768Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:48:33.829Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:48:33.859Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:49:29.745Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:49:29.769Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:49:46.621Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:49:46.678Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:49:46.726Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-04-17_11_42_33-4049586930458669108 is in state JOB_STATE_DONE
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:52:53.373Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:52:53.425Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:52:53.469Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:52:53.618Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:53:02.701Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:53:02.749Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:53:02.885Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:53:02.930Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:53:02.960Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:55:13.499Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:55:13.544Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-17T18:55:13.584Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-04-17_11_47_17-14639304656263172165 is in state JOB_STATE_DONE
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ...
ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML:
----------------------------------------------------------------------
Ran 58 tests in 3881.635s

OK (SKIP=7)

Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_10_51_08-4181800531997097431?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_10_59_46-13137872200820678706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_09_24-2370993092642836997?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_17_13-801469999082347692?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_24_53-6422054882688396490?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_32_03-3299810571809113384?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_39_35-7079649891261113862?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_47_17-14639304656263172165?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_10_51_11-335615213193920916?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_05_22-3279258722789364518?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_13_11-6480639412769776571?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_21_20-2977533773736970261?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_28_54-6590451723591248781?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_36_37-9010381111086266496?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_10_51_07-10560407452367864775?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_12_43-11386950552732843172?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_20_38-6575090723158996083?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_37_29-15140862721956985530?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_10_51_09-12171313678869038360?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_03_35-12908749712007379028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_11_17-17659526824813322788?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_19_06-14476842708311116915?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_26_49-5795521256795963840?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_34_25-6101393009109478126?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_42_33-4049586930458669108?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_10_51_08-2470889202209321594?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_11_14-17674235976374938416?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_19_31-13102717923897021149?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_27_22-17244523343931978100?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_35_41-8629638014719362400?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_10_51_06-5949610154471012192?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_10_58_59-3243494491189088009?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_07_26-180564267555867515?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_16_50-18193428556840401468?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_25_05-7878992334927146928?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_32_29-7827618560912493897?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_40_00-4525395906390709947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_10_51_07-14117950898862876308?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_00_06-17256355456408300084?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_08_39-2105726312066434042?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_16_24-2640586193456311474?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_24_05-1978625709376592558?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_31_44-16539885074832080709?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_39_39-9770908133093374162?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_10_51_09-4590983529888297156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_00_56-5876937695815956378?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_11_40-5854776101866001503?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_19_57-2014181354346641834?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_28_07-3307489891696671438?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-17_11_35_21-7297962525172717350?project=apache-beam-testing

FAILURE: Build failed with an exception.

* Where:
Build file ' line: 81

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 255

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 6m 16s

129 actionable tasks: 102 executed, 24 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/fu2t6uuhqt3zc

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
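The Gradle failure above names the task that broke (':sdks:python:test-suites:direct:py2:hdfsIntegrationTest') and suggests flags for a more verbose re-run. A hypothetical helper, assuming a local Beam checkout with the Gradle wrapper, that assembles the corresponding command line from those pieces:

```python
import shlex

# Task name and flags taken from the failure report's "What went wrong"
# and "Try:" sections; the checkout path and wrapper are assumptions.
task = ":sdks:python:test-suites:direct:py2:hdfsIntegrationTest"
cmd = ["./gradlew", task, "--stacktrace", "--info"]

# Render a copy-pasteable shell command (shlex.quote leaves these
# already-safe tokens unquoted).
print(" ".join(shlex.quote(part) for part in cmd))
```

Running the rendered command from the repository root would reproduce the failing integration test with a full stack trace and info-level logs, per Gradle's own suggestions.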