From: Apache Jenkins Server
To: commits@beam.apache.org, robertwb@google.com
Reply-To: dev@beam.apache.org
Date: Thu, 25 May 2017 09:57:38 +0000 (UTC)
Subject: Build failed in Jenkins: beam_PostCommit_Python_Verify #2314
X-Jenkins-Job: beam_PostCommit_Python_Verify
X-Jenkins-Result: FAILURE
archived-at: Thu, 25 May 2017 09:57:40 -0000

See

------------------------------------------
[...truncated 576.54 KB...]
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            }
          ]
        },
        "output_name": "out",
        "user_name": "write/Write/WriteImpl/FinalizeWrite/SideInput-s16.output"
      }
    ],
    "parallel_input": {
      "@type": "OutputReference",
      "output_name": "out",
      "step_name": "s14"
    },
    "user_name": "write/Write/WriteImpl/FinalizeWrite/SideInput-s16"
  }
},
{
  "kind": "ParallelDo",
  "name": "s17",
  "properties": {
    "display_data": [
      {
        "key": "fn",
        "label": "Transform Function",
        "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
        "type": "STRING",
        "value": "_finalize_write"
      },
      {
        "key": "fn",
        "label": "Transform Function",
        "namespace": "apache_beam.transforms.core.ParDo",
        "shortValue": "CallableWrapperDoFn",
        "type": "STRING",
        "value": "apache_beam.transforms.core.CallableWrapperDoFn"
      }
    ],
    "non_parallel_inputs": {
      "SideInput-s15": {
        "@type": "OutputReference",
        "output_name": "out",
        "step_name": "SideInput-s15"
      },
      "SideInput-s16": {
        "@type": "OutputReference",
        "output_name": "out",
        "step_name": "SideInput-s16"
      }
    },
    "output_info": [
      {
        "encoding": {
          "@type": "kind:windowed_value",
          "component_encodings": [
            {
              "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
                  "component_encodings": []
                },
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
                  "component_encodings": []
                }
              ],
              "is_pair_like": true
            },
            {
              "@type": "kind:global_window"
            }
          ],
          "is_wrapper": true
        },
        "output_name": "out",
        "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
      }
    ],
    "parallel_input": {
      "@type": "OutputReference",
      "output_name": "out",
      "step_name": "s7"
    },
    "serialized_fn": "",
    "user_name": "write/Write/WriteImpl/FinalizeWrite/Do"
  }
}
],
"type": "JOB_TYPE_BATCH"
}
root: INFO: Create job:
root: INFO: Created job with id: [2017-05-25_02_52_01-14859652357523077495]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.developers.google.com/project/apache-beam-testing/dataflow/job/2017-05-25_02_52_01-14859652357523077495
root: INFO: Job 2017-05-25_02_52_01-14859652357523077495 is in state JOB_STATE_RUNNING
root: INFO: 2017-05-25T09:52:01.428Z: JOB_MESSAGE_WARNING: (ce381716dff57f13): Setting the number of workers (1) disables autoscaling for this job. If you are trying to cap autoscaling, consider only setting max_num_workers. If you want to disable autoscaling altogether, the documented way is to explicitly use autoscalingAlgorithm=NONE.
root: INFO: 2017-05-25T09:52:03.547Z: JOB_MESSAGE_DETAILED: (b35fa43191d6bd58): Checking required Cloud APIs are enabled.
root: INFO: 2017-05-25T09:52:04.407Z: JOB_MESSAGE_DEBUG: (b35fa43191d6b8e3): Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2017-05-25T09:52:04.409Z: JOB_MESSAGE_DEBUG: (b35fa43191d6ba6d): Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
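The JOB_MESSAGE_WARNING above explains the interaction between a fixed worker count and autoscaling. As a hedged sketch, the corresponding settings would typically be passed to a Beam Python pipeline as command-line flags; the module path, project, and worker counts below are illustrative assumptions, only the flag semantics come from the warning itself:

```shell
# Cap autoscaling instead of pinning the pool to one worker:
python -m apache_beam.examples.wordcount \
  --runner=DataflowRunner \
  --project=apache-beam-testing \
  --max_num_workers=5

# Or disable autoscaling altogether, as the warning suggests
# (the service-side setting autoscalingAlgorithm=NONE):
python -m apache_beam.examples.wordcount \
  --runner=DataflowRunner \
  --project=apache-beam-testing \
  --num_workers=1 \
  --autoscaling_algorithm=NONE
```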
root: INFO: 2017-05-25T09:52:04.412Z: JOB_MESSAGE_DETAILED: (b35fa43191d6bbf7): Expanding GroupByKey operations into optimizable parts.
root: INFO: 2017-05-25T09:52:04.415Z: JOB_MESSAGE_DETAILED: (b35fa43191d6bd81): Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2017-05-25T09:52:04.421Z: JOB_MESSAGE_DEBUG: (b35fa43191d6b21f): Annotating graph with Autotuner information.
root: INFO: 2017-05-25T09:52:04.432Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b533): Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2017-05-25T09:52:04.435Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b6bd): Fusing consumer split into read/Read
root: INFO: 2017-05-25T09:52:04.436Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b847): Fusing consumer group/Write into group/Reify
root: INFO: 2017-05-25T09:52:04.438Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b9d1): Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2017-05-25T09:52:04.440Z: JOB_MESSAGE_DETAILED: (b35fa43191d6bb5b): Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2017-05-25T09:52:04.443Z: JOB_MESSAGE_DETAILED: (b35fa43191d6bce5): Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2017-05-25T09:52:04.447Z: JOB_MESSAGE_DETAILED: (b35fa43191d6bff9): Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2017-05-25T09:52:04.449Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b183): Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2017-05-25T09:52:04.452Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b30d): Fusing consumer pair_with_one into split
root: INFO: 2017-05-25T09:52:04.454Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b497): Fusing consumer group/Reify into pair_with_one
root: INFO: 2017-05-25T09:52:04.456Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b621): Fusing consumer write/Write/WriteImpl/WriteBundles/Do into format
root: INFO: 2017-05-25T09:52:04.458Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b7ab): Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/Do
root: INFO: 2017-05-25T09:52:04.460Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b935): Fusing consumer format into count
root: INFO: 2017-05-25T09:52:04.462Z: JOB_MESSAGE_DETAILED: (b35fa43191d6babf): Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2017-05-25T09:52:04.464Z: JOB_MESSAGE_DETAILED: (b35fa43191d6bc49): Fusing consumer count into group/GroupByWindow
root: INFO: 2017-05-25T09:52:04.472Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b271): Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2017-05-25T09:52:04.539Z: JOB_MESSAGE_DEBUG: (b35fa43191d6b09d): Workflow config is missing a default resource spec.
root: INFO: 2017-05-25T09:52:04.541Z: JOB_MESSAGE_DETAILED: (b35fa43191d6b227): Adding StepResource setup and teardown to workflow graph.
root: INFO: 2017-05-25T09:52:04.543Z: JOB_MESSAGE_DEBUG: (b35fa43191d6b3b1): Adding workflow start and stop steps.
root: INFO: 2017-05-25T09:52:04.546Z: JOB_MESSAGE_DEBUG: (b35fa43191d6b53b): Assigning stage ids.
root: INFO: 2017-05-25T09:52:04.583Z: JOB_MESSAGE_DEBUG: (cd2465b8e9e61b84): Executing wait step start25
root: INFO: 2017-05-25T09:52:04.591Z: JOB_MESSAGE_BASIC: (cd2465b8e9e616a6): Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2017-05-25T09:52:04.593Z: JOB_MESSAGE_BASIC: (58b7d9fa60cbf683): Executing operation group/Create
root: INFO: 2017-05-25T09:52:04.794Z: JOB_MESSAGE_DEBUG: (26891a5f5242adf1): Starting worker pool setup.
root: INFO: 2017-05-25T09:52:04.796Z: JOB_MESSAGE_BASIC: (26891a5f5242af53): Starting 1 workers...
root: INFO: 2017-05-25T09:52:04.810Z: JOB_MESSAGE_DEBUG: (58b7d9fa60cbf88a): Value "group/Session" materialized.
root: INFO: 2017-05-25T09:52:04.820Z: JOB_MESSAGE_BASIC: (58b7d9fa60cbf502): Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2017-05-25T09:54:16.997Z: JOB_MESSAGE_DETAILED: (6c4a229f32137869): Workers have started successfully.
root: INFO: 2017-05-25T09:55:41.961Z: JOB_MESSAGE_ERROR: (b8553047d4aa30f5): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 706, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 446, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
    module = unpickler.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 766, in _import_module
    return __import__(import_name)
ImportError: No module named gen_protos
root: INFO: 2017-05-25T09:55:44.040Z: JOB_MESSAGE_ERROR: (b8553047d4aa306d): ImportError: No module named gen_protos (identical traceback to the one above)
root: INFO: 2017-05-25T09:55:46.109Z: JOB_MESSAGE_ERROR: (b8553047d4aa3fe5): ImportError: No module named gen_protos (identical traceback to the one above)
root: INFO: 2017-05-25T09:55:48.226Z: JOB_MESSAGE_ERROR: (b8553047d4aa3f5d): ImportError: No module named gen_protos (identical traceback to the one above)
root: INFO: 2017-05-25T09:55:50.290Z: JOB_MESSAGE_ERROR: (b8553047d4aa3ed5): ImportError: No module named gen_protos (identical traceback to the one above)
root: INFO: 2017-05-25T09:55:50.943Z: JOB_MESSAGE_DEBUG: (cd2465b8e9e61991): Executing failure step failure24
root: INFO: 2017-05-25T09:55:50.945Z: JOB_MESSAGE_ERROR: (cd2465b8e9e61d47): Workflow failed. Causes: (cd2465b8e9e61225): S01:write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite failed., (6d710753e7f6f036): Failed to split source.
root: INFO: 2017-05-25T09:55:51.007Z: JOB_MESSAGE_DETAILED: (b35fa43191d6bdb2): Cleaning up.
root: INFO: 2017-05-25T09:55:51.009Z: JOB_MESSAGE_DEBUG: (b35fa43191d6bf3c): Starting worker pool teardown.
root: INFO: 2017-05-25T09:55:51.012Z: JOB_MESSAGE_BASIC: (b35fa43191d6b0c6): Stopping worker pool...
root: INFO: 2017-05-25T09:57:01.022Z: JOB_MESSAGE_BASIC: (b35fa43191d6b315): Worker pool stopped.
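The repeated error above is the worker failing, while unpickling the staged main session, to import a module (gen_protos) that was importable on the machine that pickled the session but was never staged on the worker. This happens because functions and classes are pickled by reference (module name plus attribute name), not by value. A minimal stdlib-only sketch of that failure mode, which is not Beam's actual code (the temp module and its contents are invented for illustration; only the module name is taken from the log):

```python
# Sketch of the mechanism behind "ImportError: No module named gen_protos".
# Pickling a module-level function stores only a reference to it, so the
# unpickling side must be able to import the same module by name.
import importlib
import os
import pickle
import sys
import tempfile

# "Submission side": create and import a throwaway module named gen_protos.
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, "gen_protos.py"), "w") as f:
    f.write("def touched():\n    return 'generated'\n")
sys.path.insert(0, workdir)
gen_protos = importlib.import_module("gen_protos")

# The pickle payload contains only the reference 'gen_protos.touched'.
payload = pickle.dumps(gen_protos.touched)

# "Worker side": the module was never staged, so re-importing it fails.
sys.path.remove(workdir)
del sys.modules["gen_protos"]
os.remove(os.path.join(workdir, "gen_protos.py"))

try:
    pickle.loads(payload)  # analogous to pickler.load_session on the worker
except ImportError as err:
    print("failed as expected:", err)
```

In the Dataflow context the analogous remedy is to make every module referenced by the pickled main session importable on the workers (staged or installed), though the specific fix for this build is not shown in the log.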
root: INFO: 2017-05-25T09:57:01.078Z: JOB_MESSAGE_DEBUG: (b35fa43191d6b93d): Tearing down pending resources...
root: INFO: Job 2017-05-25_02_52_01-14859652357523077495 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 2 tests in 338.268s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/job/2017-05-25_02_52_00-149841084067825722?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-25_02_52_01-14859652357523077495?project=apache-beam-testing
Build step 'Execute shell' marked build as failure