Date: Thu, 6 Jul 2017 01:08:22 +0000 (UTC)
From: Apache Jenkins Server
To: commits@beam.apache.org, altay@google.com, tgroh@google.com
Subject: Build failed in Jenkins: beam_PostCommit_Python_Verify #2661
X-Jenkins-Job: beam_PostCommit_Python_Verify
X-Jenkins-Result: FAILURE

See Changes:

[tgroh] Disallow Combiner Lifting for multi-window WindowFns

------------------------------------------
[...truncated 587.38 KB...]
} }, { "kind": "CombineValues", "name": "s4", "properties": { "display_data": [], "encoding": { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [ { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [] }, { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [] } ], "is_pair_like": true }, "output_info": [ { "encoding": { "@type": "kind:windowed_value", "component_encodings": [ { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [ { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [] }, { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [] } ], "is_pair_like": true }, { "@type": "kind:global_window" } ], "is_wrapper": true }, "output_name": "out", "user_name": "monthly count/Combine.out" } ], "parallel_input": { "@type": "OutputReference", "output_name": "out", "step_name": "s3" }, "serialized_fn": "", "user_name": "monthly count/Combine" } }, { "kind": "ParallelDo", "name": "s5", "properties": { "display_data": [ { "key": "fn", "label": "Transform Function", "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", "type": "STRING", "value": "" }, { "key": "fn", "label": "Transform Function", "namespace": "apache_beam.transforms.core.ParDo", "shortValue": "CallableWrapperDoFn", "type": "STRING", "value": "apache_beam.transforms.core.CallableWrapperDoFn" } ], "non_parallel_inputs": {}, "output_info": [ { "encoding": { "@type": "kind:windowed_value", "component_encodings": [ { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [ { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [] }, { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [] } ], "is_pair_like": true }, { "@type": "kind:global_window" } ], "is_wrapper": true }, "output_name": "out", "user_name": "format.out" } ], "parallel_input": { "@type": "OutputReference", "output_name": "out", "step_name": "s4" }, "serialized_fn": "", "user_name": "format" } }, { "kind": "ParallelWrite", "name": "s6", "properties": { "create_disposition": "CREATE_IF_NEEDED", "dataset": "BigQueryTornadoesIT", "display_data": [], "encoding": { "@type": "kind:windowed_value", "component_encodings": [ { "@type": "RowAsDictJsonCoder$eNprYEpOLEhMzkiNT0pNzNXLzNdLTy7QS8pMLyxNLarkCsovdyx2yUwu8SrOz3POT0kt4ipk0GwsZKwtZErSAwCu1BVY", "component_encodings": [] }, { "@type": "kind:global_window" } ], "is_wrapper": true }, "format": "bigquery", "parallel_input": { "@type": 
"OutputReference", "output_name": "out", "step_name": "s5" }, "schema": "{\"fields\": [{\"type\": \"INTEGER\", \"name\": \"month\", \"mode\": \"NULLABLE\"}, {\"type\": \"INTEGER\", \"name\": \"tornado_count\", \"mode\": \"NULLABLE\"}]}", "table": "monthly_tornadoes_1499302932637", "user_name": "Write/WriteToBigQuery/NativeWrite", "write_disposition": "WRITE_TRUNCATE" } } ], "type": "JOB_TYPE_BATCH" } root: INFO: Create job: root: INFO: Created job with id: [2017-07-05_18_02_13-14321492511956794927] root: INFO: To access the Dataflow monitoring console, please navigate to https://console.developers.google.com/project/apache-beam-testing/dataflow/job/2017-07-05_18_02_13-14321492511956794927 root: INFO: Job 2017-07-05_18_02_13-14321492511956794927 is in state JOB_STATE_RUNNING root: INFO: 2017-07-06T01:02:13.231Z: JOB_MESSAGE_WARNING: (c6c0298fa499f244): Setting the number of workers (1) disables autoscaling for this job. If you are trying to cap autoscaling, consider only setting max_num_workers. If you want to disable autoscaling altogether, the documented way is to explicitly use autoscalingAlgorithm=NONE. root: INFO: 2017-07-06T01:02:15.482Z: JOB_MESSAGE_DETAILED: (9c657285e1b8f7bb): Checking required Cloud APIs are enabled. root: INFO: 2017-07-06T01:02:16.437Z: JOB_MESSAGE_DETAILED: (9c657285e1b8f355): Expanding GroupByKey operations into optimizable parts. root: INFO: 2017-07-06T01:02:16.442Z: JOB_MESSAGE_DETAILED: (9c657285e1b8f06b): Lifting ValueCombiningMappingFns into MergeBucketsMappingFns root: INFO: 2017-07-06T01:02:16.450Z: JOB_MESSAGE_DEBUG: (9c657285e1b8fa97): Annotating graph with Autotuner information. root: INFO: 2017-07-06T01:02:16.601Z: JOB_MESSAGE_DETAILED: (9c657285e1b8f34e): Fusing adjacent ParDo, Read, Write, and Flatten operations root: INFO: 2017-07-06T01:02:16.605Z: JOB_MESSAGE_DETAILED: (9c657285e1b8f064): Fusing consumer months with tornadoes into read root: INFO: 2017-07-06T01:02:16.608Z: JOB_MESSAGE_DETAILED: (9c657285e1b8fd7a): Fusing consumer monthly count/GroupByKey/Reify into monthly count/GroupByKey+monthly count/Combine/Partial root: INFO: 2017-07-06T01:02:16.613Z: JOB_MESSAGE_DETAILED: (9c657285e1b8fa90): Fusing consumer format into monthly count/Combine/Extract root: INFO: 2017-07-06T01:02:16.617Z: JOB_MESSAGE_DETAILED: (9c657285e1b8f7a6): Fusing consumer monthly count/Combine/Extract into monthly count/Combine root: INFO: 2017-07-06T01:02:16.621Z: JOB_MESSAGE_DETAILED: (9c657285e1b8f4bc): Fusing consumer Write/WriteToBigQuery/NativeWrite into format root: INFO: 2017-07-06T01:02:16.625Z: JOB_MESSAGE_DETAILED: (9c657285e1b8f1d2): Fusing consumer monthly count/Combine into monthly count/GroupByKey/Read root: INFO: 2017-07-06T01:02:16.629Z: JOB_MESSAGE_DETAILED: (9c657285e1b8fee8): Fusing consumer monthly count/GroupByKey+monthly count/Combine/Partial into months with tornadoes root: INFO: 2017-07-06T01:02:16.632Z: JOB_MESSAGE_DETAILED: (9c657285e1b8fbfe): Fusing consumer monthly count/GroupByKey/Write into monthly count/GroupByKey/Reify root: INFO: 2017-07-06T01:02:16.700Z: JOB_MESSAGE_DEBUG: (9c657285e1b8f61c): Workflow config is missing a default resource spec. root: INFO: 2017-07-06T01:02:16.705Z: JOB_MESSAGE_DETAILED: (9c657285e1b8f332): Adding StepResource setup and teardown to workflow graph. root: INFO: 2017-07-06T01:02:16.710Z: JOB_MESSAGE_DEBUG: (9c657285e1b8f048): Adding workflow start and stop steps. root: INFO: 2017-07-06T01:02:16.714Z: JOB_MESSAGE_DEBUG: (9c657285e1b8fd5e): Assigning stage ids. 
root: INFO: 2017-07-06T01:02:16.831Z: JOB_MESSAGE_DEBUG: (45a098b4f77d39c4): Executing wait step start22
root: INFO: 2017-07-06T01:02:16.856Z: JOB_MESSAGE_BASIC: (e9ccafcd3dddd169): Executing operation monthly count/GroupByKey/Create
root: INFO: 2017-07-06T01:02:17.062Z: JOB_MESSAGE_DEBUG: (88c5839e5ca42729): Starting worker pool setup.
root: INFO: 2017-07-06T01:02:17.071Z: JOB_MESSAGE_BASIC: (88c5839e5ca42317): Starting 1 workers in us-central1-f...
root: INFO: 2017-07-06T01:02:17.111Z: JOB_MESSAGE_DEBUG: (45a098b4f77d3a03): Value "monthly count/GroupByKey/Session" materialized.
root: INFO: 2017-07-06T01:02:17.142Z: JOB_MESSAGE_BASIC: (e9ccafcd3dddd436): Executing operation read+months with tornadoes+monthly count/GroupByKey+monthly count/Combine/Partial+monthly count/GroupByKey/Reify+monthly count/GroupByKey/Write
root: INFO: 2017-07-06T01:02:18.033Z: JOB_MESSAGE_BASIC: (55dd343bfd764d28): BigQuery export job "dataflow_job_6187158895337029898" started. You can check its status with the bq tool: "bq show -j --project_id=clouddataflow-readonly dataflow_job_6187158895337029898".
root: INFO: 2017-07-06T01:02:48.443Z: JOB_MESSAGE_DETAILED: (4794b79e0a20f9c5): BigQuery export job progress: "dataflow_job_6187158895337029898" observed total of 1 exported files thus far.
root: INFO: 2017-07-06T01:02:48.446Z: JOB_MESSAGE_BASIC: (4794b79e0a20f76f): BigQuery export job finished: "dataflow_job_6187158895337029898"
root: INFO: 2017-07-06T01:03:00.224Z: JOB_MESSAGE_DETAILED: (8c342424a950c11e): Workers have started successfully.
root: INFO: 2017-07-06T01:05:18.529Z: JOB_MESSAGE_BASIC: (2117febbd414065b): Executing operation monthly count/GroupByKey/Close
root: INFO: 2017-07-06T01:05:18.556Z: JOB_MESSAGE_BASIC: (e9ccafcd3dddd7f2): Executing operation monthly count/GroupByKey/Read+monthly count/Combine+monthly count/Combine/Extract+format+Write/WriteToBigQuery/NativeWrite
root: INFO: 2017-07-06T01:05:31.426Z: JOB_MESSAGE_BASIC: (2117febbd4140b82): Executing BigQuery import job "dataflow_job_6172818816812679983". You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_6172818816812679983".
root: INFO: 2017-07-06T01:05:42.617Z: JOB_MESSAGE_BASIC: (2117febbd4140ce7): BigQuery import job "dataflow_job_6172818816812679983" done.
root: INFO: 2017-07-06T01:05:43.521Z: JOB_MESSAGE_DEBUG: (2117febbd4140bef): Executing success step success20
root: INFO: 2017-07-06T01:05:43.619Z: JOB_MESSAGE_DETAILED: (9c657285e1b8fecc): Cleaning up.
root: INFO: 2017-07-06T01:05:43.622Z: JOB_MESSAGE_DEBUG: (9c657285e1b8fbe2): Starting worker pool teardown.
root: INFO: 2017-07-06T01:05:43.626Z: JOB_MESSAGE_BASIC: (9c657285e1b8f8f8): Stopping worker pool...
root: INFO: 2017-07-06T01:07:03.634Z: JOB_MESSAGE_BASIC: (9c657285e1b8f499): Worker pool stopped.
root: INFO: 2017-07-06T01:07:03.716Z: JOB_MESSAGE_DEBUG: (9c657285e1b8fa66): Tearing down pending resources...
root: INFO: Job 2017-07-05_18_02_13-14321492511956794927 is in state JOB_STATE_DONE
root: INFO: Start verify Bigquery data.
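Step s6 above (user_name "Write/WriteToBigQuery/NativeWrite") is the BigQuery sink, configured with CREATE_IF_NEEDED, WRITE_TRUNCATE, and the month/tornado_count schema. A hedged sketch of what such a write looks like in the Python SDK; the table name, project, and input data below are placeholders, not the IT's exact code (the IT appends a timestamp suffix to the table, e.g. monthly_tornadoes_1499302932637):

import apache_beam as beam

with beam.Pipeline() as p:
    # Stand-in for the output of the 'format' step above.
    formatted = p | beam.Create([{'month': 1, 'tornado_count': 5}])
    _ = formatted | 'Write' >> beam.io.WriteToBigQuery(
        table='monthly_tornadoes',  # placeholder; the IT adds a timestamp
        dataset='BigQueryTornadoesIT',
        project='apache-beam-testing',
        # Short-form schema; expands to the NULLABLE INTEGER fields in s6.
        schema='month:INTEGER, tornado_count:INTEGER',
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE)

WRITE_TRUNCATE replaces the table contents on each run rather than appending, which keeps reruns idempotent and lets the verification step below check the table against expected contents.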
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
google_auth_httplib2: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
google_auth_httplib2: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-compute@developer.gserviceaccount.com/token
root: WARNING: Retry with exponential backoff: waiting for 2.65581644987 seconds before retrying _query_with_retry because we caught exception: ValueError: too many values to unpack
 Traceback for above exception (most recent call last):
  File " line 168, in wrapper
    return fun(*args, **kwargs)
  File " line 95, in _query_with_retry
    rows, _, page_token = query.fetch_data(page_token=page_token)
root: WARNING: Retry with exponential backoff: waiting for 6.48699485617 seconds before retrying _query_with_retry because we caught exception: ValueError: too many values to unpack
 Traceback for above exception (most recent call last):
  File " line 168, in wrapper
    return fun(*args, **kwargs)
  File " line 95, in _query_with_retry
    rows, _, page_token = query.fetch_data(page_token=page_token)
root: WARNING: Retry with exponential backoff: waiting for 16.1888349205 seconds before retrying _query_with_retry because we caught exception: ValueError: too many values to unpack
 Traceback for above exception (most recent call last):
  File " line 168, in wrapper
    return fun(*args, **kwargs)
  File " line 95, in _query_with_retry
    rows, _, page_token = query.fetch_data(page_token=page_token)
root: WARNING: Retry with exponential backoff: waiting for 38.5682178232 seconds before retrying _query_with_retry because we caught exception: ValueError: too many values to unpack
 Traceback for above exception (most recent call last):
  File " line 168, in wrapper
    return fun(*args, **kwargs)
  File " line 95, in _query_with_retry
    rows, _, page_token = query.fetch_data(page_token=page_token)
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
Ran 2 tests in 368.564s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/job/2017-07-05_18_02_13-14321492511956794927?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-07-05_18_02_13-10668284294056289845?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
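Note that the Dataflow job itself reached JOB_STATE_DONE; what failed is the post-run BigQuery verification. The helper unpacks query.fetch_data(page_token=page_token) into a (rows, total_rows, page_token) 3-tuple, which is what older google-cloud-bigquery clients returned; the client installed on this worker appears to return an iterator instead, so the unpacking raises ValueError: too many values to unpack. The exponential backoff (waits of roughly 2.7 s, 6.5 s, 16.2 s, and 38.6 s above) cannot help here, because the error is deterministic and every retry fails identically. A hedged sketch of a version-tolerant fetch; fetch_all_rows is an illustrative name, not the IT's actual helper:

def fetch_all_rows(query):
    """Drain query results across old and new google-cloud-bigquery APIs."""
    result = query.fetch_data()
    if isinstance(result, tuple):
        # Old-style client: (rows, total_rows, page_token); keep paging
        # until the token comes back empty.
        rows, _, page_token = result
        while page_token:
            more, _, page_token = query.fetch_data(page_token=page_token)
            rows.extend(more)
        return rows
    # New-style client: fetch_data() returns an iterator that pages itself.
    return list(result)

Pinning the google-cloud-bigquery dependency to the version the verifier was written against would be the other straightforward fix.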