beam-commits mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: beam_PostCommit_PythonVerify #134
Date Fri, 22 Jul 2016 00:43:14 GMT
See <https://builds.apache.org/job/beam_PostCommit_PythonVerify/134/changes>

Changes:

[robertwb] Remove expensive per-element-step logging context.

[robertwb] Cache dofn.process method.

[robertwb] Restore (faster) logging context.

------------------------------------------
[...truncated 2482 lines...]
        "serialized_fn": "<string of 1340 bytes>", 
        "user_name": "write/WriteImpl/write_bundles"
      }
    }, 
    {
      "kind": "CollectionToSingleton", 
      "name": "s11", 
      "properties": {
        "output_info": [
          {
            "encoding": {
              "@type": "WindowedValueCoder$<string of 736 bytes>", 
              "component_encodings": [
                {
                  "@type": "WindowedValueCoder$<string of 736 bytes>", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$<string of 172 bytes>", 
                      "component_encodings": [
                        {
                          "@type": "FastPrimitivesCoder$<string of 172 bytes>", 
                          "component_encodings": []
                        }, 
                        {
                          "@type": "FastPrimitivesCoder$<string of 172 bytes>", 
                          "component_encodings": []
                        }
                      ], 
                      "is_pair_like": true
                    }, 
                    {
                      "@type": "TimestampCoder$gAJjYXBhY2hlX2JlYW0uY29kZXJzLmNvZGVycwpUaW1lc3RhbXBDb2RlcgpxACmBcQF9cQJiLg==", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "SingletonCoder$<string of 344 bytes>", 
                      "component_encodings": []
                    }
                  ], 
                  "is_wrapper": true
                }
              ]
            }, 
            "output_name": "out", 
            "user_name": "write/WriteImpl/ViewAsIterable(write|WriteImpl|write_bundles.None)/CreatePCollectionView.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s10"
        }, 
        "user_name": "write/WriteImpl/ViewAsIterable(write|WriteImpl|write_bundles.None)/CreatePCollectionView"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s12", 
      "properties": {
        "non_parallel_inputs": {
          "s11": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "s11"
          }, 
          "s9": {
            "@type": "OutputReference", 
            "output_name": "out", 
            "step_name": "s9"
          }
        }, 
        "output_info": [
          {
            "encoding": {
              "@type": "WindowedValueCoder$<string of 736 bytes>", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$<string of 172 bytes>", 
                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$<string of 172 bytes>", 
                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$<string of 172 bytes>", 
                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "TimestampCoder$gAJjYXBhY2hlX2JlYW0uY29kZXJzLmNvZGVycwpUaW1lc3RhbXBDb2RlcgpxACmBcQF9cQJiLg==", 
                  "component_encodings": []
                }, 
                {
                  "@type": "SingletonCoder$<string of 344 bytes>", 
                  "component_encodings": []
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "write/WriteImpl/finalize_write.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s7"
        }, 
        "serialized_fn": "<string of 1496 bytes>", 
        "user_name": "write/WriteImpl/finalize_write"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
INFO:root:Create job: <Job
 id: u'2016-07-21_17_39_34-11919126913628582743'
 projectId: u'apache-beam-testing'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:root:Created job with id: [2016-07-21_17_39_34-11919126913628582743]
INFO:root:To access the Dataflow monitoring console, please navigate to https://console.developers.google.com/project/apache-beam-testing/dataflow/job/2016-07-21_17_39_34-11919126913628582743
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b0e27: 2016-07-22T00:39:35.207Z:
JOB_MESSAGE_DETAILED: (66e966f9c1475bc3): Checking required Cloud APIs are enabled.
INFO:root:Job 2016-07-21_17_39_34-11919126913628582743 is in state JOB_STATE_RUNNING
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b0fcc: 2016-07-22T00:39:35.628Z:
JOB_MESSAGE_DEBUG: (66e966f9c1475d0a): Combiner lifting skipped for step group: GroupByKey
not followed by a combiner.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b0fcf: 2016-07-22T00:39:35.631Z:
JOB_MESSAGE_DETAILED: (66e966f9c14752a8): Expanding GroupByKey operations into optimizable
parts.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b0fd2: 2016-07-22T00:39:35.634Z:
JOB_MESSAGE_DETAILED: (66e966f9c1475846): Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b0fd8: 2016-07-22T00:39:35.640Z:
JOB_MESSAGE_DETAILED: (66e966f9c1475382): Annotating graph with Autotuner information.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1005: 2016-07-22T00:39:35.685Z:
JOB_MESSAGE_DETAILED: (66e966f9c14759fa): Fusing adjacent ParDo, Read, Write, and Flatten
operations
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1008: 2016-07-22T00:39:35.688Z:
JOB_MESSAGE_DETAILED: (66e966f9c1475f98): Fusing consumer split into read
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1009: 2016-07-22T00:39:35.689Z:
JOB_MESSAGE_DETAILED: (66e966f9c1475536): Fusing consumer group/Reify into pair_with_one
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b100c: 2016-07-22T00:39:35.692Z:
JOB_MESSAGE_DETAILED: (66e966f9c1475ad4): Fusing consumer format into count
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b100e: 2016-07-22T00:39:35.694Z:
JOB_MESSAGE_DETAILED: (66e966f9c1475072): Fusing consumer count into group/GroupByWindow
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1014: 2016-07-22T00:39:35.700Z:
JOB_MESSAGE_DETAILED: (66e966f9c1475bae): Fusing consumer pair_with_one into split
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1017: 2016-07-22T00:39:35.703Z:
JOB_MESSAGE_DETAILED: (66e966f9c147514c): Fusing consumer group/GroupByWindow into group/Read
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1019: 2016-07-22T00:39:35.705Z:
JOB_MESSAGE_DETAILED: (66e966f9c14756ea): Fusing consumer write/WriteImpl/write_bundles into
format
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b101c: 2016-07-22T00:39:35.708Z:
JOB_MESSAGE_DETAILED: (66e966f9c1475c88): Fusing consumer group/Write into group/Reify
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1052: 2016-07-22T00:39:35.762Z:
JOB_MESSAGE_DEBUG: (66e966f9c147581c): Workflow config is missing a default resource spec.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1055: 2016-07-22T00:39:35.765Z:
JOB_MESSAGE_DETAILED: (66e966f9c1475dba): Adding StepResource setup and teardown to workflow
graph.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1086: 2016-07-22T00:39:35.814Z:
JOB_MESSAGE_DEBUG: (1607a536476d0625): Adding workflow start and stop steps.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1092: 2016-07-22T00:39:35.826Z:
JOB_MESSAGE_DEBUG: (964bb1306ba3ade2): Assigning stage ids.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b10f7: 2016-07-22T00:39:35.927Z:
JOB_MESSAGE_DEBUG: (191004cfedb4f4be): Executing wait step start2
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1103: 2016-07-22T00:39:35.939Z:
JOB_MESSAGE_DEBUG: (f9b92bf05eb3131b): Executing operation write/WriteImpl/DoOnce
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1111: 2016-07-22T00:39:35.953Z:
JOB_MESSAGE_DEBUG: (66e966f9c1475e94): Value "write/WriteImpl/DoOnce.out" materialized.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b111d: 2016-07-22T00:39:35.965Z:
JOB_MESSAGE_BASIC: S01: (ca37e07305f6e896): Executing operation write/WriteImpl/initialize_write
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1120: 2016-07-22T00:39:35.968Z:
JOB_MESSAGE_BASIC: S02: (946f41bda0d8987f): Executing operation group/Create
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b11ea: 2016-07-22T00:39:36.170Z:
JOB_MESSAGE_DEBUG: (a80265b0f93b1617): Starting worker pool setup.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b11ec: 2016-07-22T00:39:36.172Z:
JOB_MESSAGE_BASIC: (a80265b0f93b1289): Starting 1 workers...
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b1200: 2016-07-22T00:39:36.192Z:
JOB_MESSAGE_DEBUG: (a14a60ddd9585b02): Value "group/Session" materialized.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100b120d: 2016-07-22T00:39:36.205Z:
JOB_MESSAGE_BASIC: S03: (e85bfb9a3ce83b61): Executing operation read+split+pair_with_one+group/Reify+group/Write
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100c6cfa: 2016-07-22T00:41:05.018Z:
JOB_MESSAGE_DETAILED: (77d5029a39d1af44): Workers have started successfully.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100d12b0: 2016-07-22T00:41:47.440Z:
JOB_MESSAGE_ERROR: (a562237edf71b227): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 462,
in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 891, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:24041)
    op.start()
  File "dataflow_worker/executor.py", line 477, in dataflow_worker.executor.DoOperation.start
(dataflow_worker/executor.c:14398)
    def start(self):
  File "dataflow_worker/executor.py", line 508, in dataflow_worker.executor.DoOperation.start
(dataflow_worker/executor.c:14333)
    self.dofn_runner.start()
  File "apache_beam/runners/common.py", line 92, in apache_beam.runners.common.DoFnRunner.start
(apache_beam/runners/common.c:3261)
    self.logging_context.exit()
AttributeError: 'PerThreadLoggingContext' object has no attribute 'exit'

INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100d1cbb: 2016-07-22T00:41:50.011Z:
JOB_MESSAGE_ERROR: (69e9bbf70c56e239): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 462,
in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 891, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:24041)
    op.start()
  File "dataflow_worker/executor.py", line 477, in dataflow_worker.executor.DoOperation.start
(dataflow_worker/executor.c:14398)
    def start(self):
  File "dataflow_worker/executor.py", line 508, in dataflow_worker.executor.DoOperation.start
(dataflow_worker/executor.c:14333)
    self.dofn_runner.start()
  File "apache_beam/runners/common.py", line 92, in apache_beam.runners.common.DoFnRunner.start
(apache_beam/runners/common.c:3261)
    self.logging_context.exit()
AttributeError: 'PerThreadLoggingContext' object has no attribute 'exit'

INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100d20e2: 2016-07-22T00:41:51.074Z:
JOB_MESSAGE_ERROR: (37d7ddc04502f29d): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 462,
in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 891, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:24041)
    op.start()
  File "dataflow_worker/executor.py", line 477, in dataflow_worker.executor.DoOperation.start
(dataflow_worker/executor.c:14398)
    def start(self):
  File "dataflow_worker/executor.py", line 508, in dataflow_worker.executor.DoOperation.start
(dataflow_worker/executor.c:14333)
    self.dofn_runner.start()
  File "apache_beam/runners/common.py", line 92, in apache_beam.runners.common.DoFnRunner.start
(apache_beam/runners/common.c:3261)
    self.logging_context.exit()
AttributeError: 'PerThreadLoggingContext' object has no attribute 'exit'

INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100d250c: 2016-07-22T00:41:52.140Z:
JOB_MESSAGE_ERROR: (6a6c8d9ff1dfb2ba): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 462,
in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 891, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:24041)
    op.start()
  File "dataflow_worker/executor.py", line 477, in dataflow_worker.executor.DoOperation.start
(dataflow_worker/executor.c:14398)
    def start(self):
  File "dataflow_worker/executor.py", line 508, in dataflow_worker.executor.DoOperation.start
(dataflow_worker/executor.c:14333)
    self.dofn_runner.start()
  File "apache_beam/runners/common.py", line 92, in apache_beam.runners.common.DoFnRunner.start
(apache_beam/runners/common.c:3261)
    self.logging_context.exit()
AttributeError: 'PerThreadLoggingContext' object has no attribute 'exit'

INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100d252b: 2016-07-22T00:41:52.171Z:
JOB_MESSAGE_WARNING: S01: (3a4ddfa9bb933898): Unable to delete temp files: "gs://temp-storage-for-end-to-end-tests/temp-wordcount/py-wordcount-1469147972.1469147973.379150/dax-tmp-2016-07-21_17_39_34-11919126913628582743-S01-1-ca37e07305f6e388/tmp-ca37e07305f6e9b5@DAX.avro."
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100d253c: 2016-07-22T00:41:52.188Z:
JOB_MESSAGE_DEBUG: (a14a60ddd9585d64): Executing failure step failure1
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100d253e: 2016-07-22T00:41:52.190Z:
JOB_MESSAGE_ERROR: (a14a60ddd95858da): Workflow failed. Causes: (ca37e07305f6e415): S01:write/WriteImpl/initialize_write
failed.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100d2579: 2016-07-22T00:41:52.249Z:
JOB_MESSAGE_DETAILED: (f9b92bf05eb313be): Cleaning up.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100d25f6: 2016-07-22T00:41:52.374Z:
JOB_MESSAGE_DEBUG: (f9b92bf05eb3164d): Starting worker pool teardown.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100d25f8: 2016-07-22T00:41:52.376Z:
JOB_MESSAGE_BASIC: (f9b92bf05eb31d57): Stopping worker pool...
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100d294c: 2016-07-22T00:41:53.228Z:
JOB_MESSAGE_DETAILED: (75c8f82f1135ba1c): Failed to publish the result of the work update.
Causes: (75c8f82f1135b137): Failed to update work status. Causes: (9380ca2d8863cec8): Work
"1405345380650175154" not leased (or the lease was lost).
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100e4af8: 2016-07-22T00:43:07.384Z:
JOB_MESSAGE_BASIC: (f9b92bf05eb31fe6): Worker pool stopped.
INFO:root:2016-07-21_17_39_34-11919126913628582743_00000156100e5068: 2016-07-22T00:43:08.776Z:
JOB_MESSAGE_DEBUG: (f9b92bf05eb313bb): Tearing down pending resources...
INFO:root:Job 2016-07-21_17_39_34-11919126913628582743 is in state JOB_STATE_FAILED
Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "https://builds.apache.org/job/beam_PostCommit_PythonVerify/ws/sdks/python/apache_beam/examples/wordcount.py", line 102, in <module>
    run()
  File "https://builds.apache.org/job/beam_PostCommit_PythonVerify/ws/sdks/python/apache_beam/examples/wordcount.py", line 93, in run
    result = p.run()
  File "apache_beam/pipeline.py", line 159, in run
    return self.runner.run(self)
  File "apache_beam/runners/dataflow_runner.py", line 188, in run
    % getattr(self, 'last_error_msg', None), self.result)
apache_beam.runners.dataflow_runner.DataflowRuntimeException: Dataflow pipeline failed:
(6a6c8d9ff1dfb2ba): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 462,
in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 891, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:24041)
    op.start()
  File "dataflow_worker/executor.py", line 477, in dataflow_worker.executor.DoOperation.start
(dataflow_worker/executor.c:14398)
    def start(self):
  File "dataflow_worker/executor.py", line 508, in dataflow_worker.executor.DoOperation.start
(dataflow_worker/executor.c:14333)
    self.dofn_runner.start()
  File "apache_beam/runners/common.py", line 92, in apache_beam.runners.common.DoFnRunner.start
(apache_beam/runners/common.c:3261)
    self.logging_context.exit()
AttributeError: 'PerThreadLoggingContext' object has no attribute 'exit'
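All four worker tracebacks fail at the same call site: `DoFnRunner.start()` invokes `self.logging_context.exit()` on an object that no longer defines that method, presumably fallout from the logging-context changes listed at the top of this build. A minimal, hypothetical reproduction of this failure mode (class and method names mirror the traceback; the real code in `apache_beam/runners/common.py` and `dataflow_worker` differs in detail):

```python
# Hypothetical sketch: the refactored context kept enter() but lost exit(),
# while the caller still expects the old enter()/exit() interface.

class PerThreadLoggingContext(object):
    """Stand-in context object; exit() was dropped in the refactor."""

    def enter(self):
        pass  # real implementation would push per-thread logging state


class DoFnRunner(object):
    def __init__(self):
        self.logging_context = PerThreadLoggingContext()

    def start(self):
        self.logging_context.enter()
        # Caller still uses the old interface:
        self.logging_context.exit()  # AttributeError: no attribute 'exit'


try:
    DoFnRunner().start()
except AttributeError as e:
    print(e)
```

Because the error fires in `start()` before any element is processed, every bundle fails the same way, which is why the identical traceback repeats for each work item above.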


# Grep will exit with status 1 if success message was not found.
echo ">>> CHECKING JOB SUCCESS"
>>> CHECKING JOB SUCCESS
grep JOB_STATE_DONE job_output
Build step 'Execute shell' marked build as failure
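The final shell step turns grep's exit status into the build verdict: grep returns 0 when the pattern is found and 1 when it is not, so a job log without `JOB_STATE_DONE` fails the build. A pure-Python equivalent of that check (the log contents here are illustrative):

```python
# Mimics `grep JOB_STATE_DONE job_output`: exit status 0 if any line
# contains the pattern, 1 otherwise, as the Jenkins step relies on.

def grep_status(pattern, text):
    """Return 0 if any line contains the pattern, else 1 (like grep)."""
    return 0 if any(pattern in line for line in text.splitlines()) else 1

job_output = "INFO:root:Job ... is in state JOB_STATE_FAILED"
status = grep_status("JOB_STATE_DONE", job_output)
print("build failure" if status == 1 else "build success")  # → build failure
```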
