From builds-return-50195-archive-asf-public=cust-asf.ponee.io@beam.apache.org Wed May 20 07:21:15 2020
Return-Path:
X-Original-To: archive-asf-public@cust-asf.ponee.io
Delivered-To: archive-asf-public@cust-asf.ponee.io
Received: from mail.apache.org (hermes.apache.org [207.244.88.153]) by mx-eu-01.ponee.io (Postfix) with SMTP id 4179E180637 for ; Wed, 20 May 2020 09:21:15 +0200 (CEST)
Received: (qmail 30956 invoked by uid 500); 20 May 2020 07:21:14 -0000
Mailing-List: contact builds-help@beam.apache.org; run by ezmlm
Precedence: bulk
List-Help:
List-Unsubscribe:
List-Post:
List-Id:
Reply-To: builds@beam.apache.org
Delivered-To: mailing list builds@beam.apache.org
Received: (qmail 30944 invoked by uid 99); 20 May 2020 07:21:14 -0000
Received: from Unknown (HELO mailrelay1-lw-us.apache.org) (10.10.3.42) by apache.org (qpsmtpd/0.29) with ESMTP; Wed, 20 May 2020 07:21:14 +0000
Received: from jenkins02.apache.org (jenkins02.apache.org [195.201.213.130]) by mailrelay1-lw-us.apache.org (ASF Mail Server at mailrelay1-lw-us.apache.org) with ESMTP id 092464FBC for ; Wed, 20 May 2020 07:21:14 +0000 (UTC)
Received: from jenkins02.apache.org (localhost.localdomain [127.0.0.1]) by jenkins02.apache.org (ASF Mail Server at jenkins02.apache.org) with ESMTP id 7679D33E001C for ; Wed, 20 May 2020 07:21:13 +0000 (UTC)
Date: Wed, 20 May 2020 07:21:13 +0000 (UTC)
From: Apache Jenkins Server
To: builds@beam.apache.org
Message-ID: <1420316524.17739.1589959273278.JavaMail.jenkins@jenkins02>
Subject: Build failed in Jenkins: beam_PostCommit_Py_VR_Dataflow_V2 #575
MIME-Version: 1.0
Content-Type: text/plain; charset=UTF-8
Content-Transfer-Encoding: 7bit
X-Instance-Identity: MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAkqVKZPv7YyHBB3FvWfV7XQehwe/Ga3aadzSNknt8g382X3uN8A3SOQ+Ixq9HxS+ZlN6XR4TECySmSRy2JN5Rx8svxAD0TjtSF9LuU98dD+LniNDP7Lq6gvRFuJhbMHoS0nuTizDZLsK4X8TW5MyV9w+jFbdoZfRE5O/Mse0fkOeL5uoIS/3Vvu/W+x9QSjDkB7CaU56bPFlQjqqJBl3Cn9r34CkXQZYnLb/NjW4vcpw0+TgMUAPTIVEr5BTPZRshz19g7huwg3zANT5HBIZnzV4hsVY9w4JHkceFdKi/ibNnjPjsFs9pm0HSGJ/RDxjIvSTYT02eH4+m1RAYaj2E9QIDAQAB
X-Jenkins-Job: beam_PostCommit_Py_VR_Dataflow_V2
X-Jenkins-Result: FAILURE
Auto-submitted: auto-generated

See Changes:

------------------------------------------
[...truncated 5.39 MB...]
          ],
          "parallel_input": {
            "@type": "OutputReference",
            "output_name": "None",
            "step_name": "s20"
          },
          "serialized_fn": "ref_AppliedPTransform_assert_that/Match_30",
          "user_name": "assert_that/Match"
        }
      }
    ],
    "type": "JOB_TYPE_STREAMING"
  }
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job:
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-05-20_00_12_48-9408421992978003567]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-05-20_00_12_48-9408421992978003567
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-20_00_12_48-9408421992978003567?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:47.493Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:47.535Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:47.573Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-05-20_00_03_24-5668333139697070852 is in state JOB_STATE_DONE
test_reshuffle_preserves_timestamps (apache_beam.transforms.util_test.ReshuffleTest) ... ok
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-05-20_00_12_48-9408421992978003567 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:48.165Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:48.165Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-05-20_00_12_48-9408421992978003567. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:48.166Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-05-20_00_12_48-9408421992978003567.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:55.884Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:56.740Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:57.516Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:57.561Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:57.652Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:57.686Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:57.730Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:57.773Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:57.820Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:57.961Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.101Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.182Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.231Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.274Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.321Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.360Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.402Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.449Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.482Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.538Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.580Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.617Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.662Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.696Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.731Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.776Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.808Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.868Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap() into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.915Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap() into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:58.962Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap()
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:59.023Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:59.053Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap()
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:59.098Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:59.136Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:59.191Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:59.229Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:59.270Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:12:59.312Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:13:03.044Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:13:03.250Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:13:03.304Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:13:15.118Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:13:30.479Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:14:10.517Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:14:10.551Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:19:00.547Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:19:19.311Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:19:19.378Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:19:19.420Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:19:19.464Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:19:19.498Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:21:00.411Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:21:00.462Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-05-20T07:21:00.503Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-05-20_00_12_48-9408421992978003567 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok
======================================================================
ERROR: test_default_value_singleton_side_input (apache_beam.transforms.sideinputs_test.SideInputsTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File " line 186, in test_default_value_singleton_side_input
    pipeline.run()
  File " line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File " line 512, in run
    self._options).run(False)
  File " line 525, in run
    return self.runner.run_pipeline(self, self._options)
  File " line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File " line 582, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File " line 236, in wrapper
    return fun(*args, **kwargs)
  File " line 649, in create_job
    self.create_job_description(job)
  File " line 705, in create_job_description
    resources = self._stage_resources(job.proto_pipeline, job.options)
  File " line 602, in _stage_resources
    resources, staging_location=google_cloud_options.staging_location)
  File " line 305, in stage_job_resources
    file_path, FileSystems.join(staging_location, staged_path))
  File " line 968, in stage_artifact
    local_path_to_artifact, artifact_name)
  File " line 236, in wrapper
    return fun(*args, **kwargs)
  File " line 576, in _gcs_file_copy
    self.stage_file(to_folder, to_name, f, total_size=total_size)
  File " line 627, in stage_file
    response = self._storage_client.objects.Insert(request, upload=upload)
  File " line 1156, in Insert
    upload=upload, upload_config=upload_config)
  File " line 715, in _RunMethod
    http_request, client=self.client)
  File " line 908, in InitializeUpload
    return self.StreamInChunks()
  File " line 1020, in StreamInChunks
    additional_headers=additional_headers)
  File " line 971, in __StreamMedia
    self.RefreshResumableUploadState()
  File " line 875, in RefreshResumableUploadState
    raise exceptions.HttpError.FromResponse(refresh_response)
HttpError: HttpError accessing : response: <{'status': '410', 'content-length': '205', 'expires': 'Mon, 01 Jan 1990 00:00:00 GMT', 'vary': 'Origin, X-Origin', 'x-guploader-uploadid': 'AAANsUkDY88nFdg_wRwpFRkK_WYfk7bpZ_8s7lCb6fiBPeRF-4QWWrkHTZsitLNXzZucRO6_8m8fg3SKclPPMDZT0A', 'pragma': 'no-cache', 'cache-control': 'no-cache, no-store, max-age=0, must-revalidate', 'date': 'Wed, 20 May 2020 07:01:52 GMT', 'server': 'UploadServer', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 503,
    "message": "Backend Error",
    "errors": [
      {
        "message": "Backend Error",
        "domain": "global",
        "reason": "backendError"
      }
    ]
  }
}
>
-------------------- >> begin captured logging << --------------------
apache_beam.runners.portability.stager: INFO: Executing command: [' '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
apache_beam.runners.portability.stager: INFO: Copying Beam SDK " to staging location.
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apache/beam_python2.7_sdk:2.22.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/pipeline.pb...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/pipeline.pb in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/requirements.txt...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/requirements.txt in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/parameterized-0.7.4.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/parameterized-0.7.4.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/six-1.14.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/six-1.14.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/parameterized-0.7.3.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/parameterized-0.7.3.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/parameterized-0.7.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/parameterized-0.7.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/mock-2.0.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/mock-2.0.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/pbr-5.4.4.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/pbr-5.4.4.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/funcsigs-1.0.2.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/funcsigs-1.0.2.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/pbr-5.4.5.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/pbr-5.4.5.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/PyHamcrest-1.10.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/PyHamcrest-1.10.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/dataflow_python_sdk.tar...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/dataflow_python_sdk.tar in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0520070129-672462.1589958089.672603/dataflow-worker.jar...
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML:
----------------------------------------------------------------------
Ran 27 tests in 2213.470s

FAILED (errors=1)

Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_44_40-15134190620612974219?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_53_46-3790887394032838729?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-20_00_03_23-13222052460576518605?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-20_00_12_48-9408421992978003567?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_44_41-14061114565530325156?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_53_22-4522895235902894439?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-20_00_02_17-8562848251823430491?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_44_42-16124275479429994647?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_54_11-3242101826658950242?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_44_42-5508438491587229331?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_53_13-11623438820370786362?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-20_00_02_13-12614242288173946127?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_44_39-5770902116313106615?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_53_48-13197550552999150066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-20_00_03_20-724213970303539014?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_44_42-2771286059515192570?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_54_15-4381908426378907612?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-20_00_03_24-5668333139697070852?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_44_43-14213330987178177712?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_54_11-16489073577281169859?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_44_40-2084340884805348874?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-19_23_53_06-13659418678085845208?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-20_00_01_41-8746846760191334697?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Build file ' line: 173

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 19m 57s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/mlen6fschfy3u

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
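[Editorial note, appended for triage.] The single failing test above did not fail on pipeline logic: artifact staging hit a transient Google Cloud Storage error (an HTTP 410 raised while refreshing a resumable upload whose backend had returned a 503 "backendError"). Errors of this class are conventionally handled by retrying with jittered exponential backoff. The sketch below is a generic, hypothetical illustration of that pattern, not the Beam or apitools implementation; `flaky_upload`, `TransientHttpError`, and all parameter names are invented for the example.

```python
import random
import time


class TransientHttpError(Exception):
    """Stand-in for an HTTP 410/503-style transient failure."""


def retry_with_backoff(operation, max_attempts=5, base_delay=0.01, sleep=time.sleep):
    """Call `operation`, retrying transient errors with jittered exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            return operation()
        except TransientHttpError:
            if attempt == max_attempts:
                raise  # exhausted the retry budget; surface the error
            # Backoff grows as base * 2^(attempt-1), randomized to avoid thundering herds.
            delay = base_delay * (2 ** (attempt - 1)) * random.uniform(0.5, 1.5)
            sleep(delay)


# Example: an upload that fails twice with a backend error, then succeeds.
attempts = {"n": 0}


def flaky_upload():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise TransientHttpError("410: backendError")
    return "uploaded"


# Pass a no-op sleep so the example runs instantly.
result = retry_with_backoff(flaky_upload, sleep=lambda _: None)
```

After two simulated backend errors, the third attempt succeeds and `result` is `"uploaded"`; in a real staging path the `operation` would re-issue the GCS upload from a known offset rather than restart it blindly.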