beam-commits mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: beam_PostCommit_Python_Verify #1075
Date Mon, 23 Jan 2017 23:35:36 GMT
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/1075/changes>

Changes:

[chamikara] Increments major version used by Dataflow runner to 5

[robertwb] Remove dataflow_test.py

[robertwb] Code cleanup now that all runners support windowed side inputs.

------------------------------------------
[...truncated 8425 lines...]
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert:even/UnKey.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s15"
        }, 
        "serialized_fn": "<string of 1052 bytes>", 
        "user_name": "assert:even/UnKey"
      }
    }, 
    {
      "kind": "ParallelDo", 
      "name": "s17", 
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "_equal"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",

                  "component_encodings": [
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",

                      "component_encodings": []
                    }, 
                    {
                      "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",

                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "assert:even/Match.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s16"
        }, 
        "serialized_fn": "<string of 1212 bytes>", 
        "user_name": "assert:even/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 id: u'2017-01-23_15_30_54-12011182360156094633'
 projectId: u'apache-beam-testing'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2017-01-23_15_30_54-12011182360156094633]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.developers.google.com/project/apache-beam-testing/dataflow/job/2017-01-23_15_30_54-12011182360156094633
root: INFO: Job 2017-01-23_15_30_54-12011182360156094633 is in state JOB_STATE_RUNNING
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab092b: 2017-01-23T23:30:55.403Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312046): Checking required Cloud APIs are enabled.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b0b: 2017-01-23T23:30:55.883Z:
JOB_MESSAGE_DEBUG: (3fadbfc3db312195): Combiner lifting skipped for step assert_that/Group: GroupByKey not followed by a combiner.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b0d: 2017-01-23T23:30:55.885Z:
JOB_MESSAGE_DEBUG: (3fadbfc3db312ac3): Combiner lifting skipped for step assert:even/Group: GroupByKey not followed by a combiner.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b0f: 2017-01-23T23:30:55.887Z:
JOB_MESSAGE_DEBUG: (3fadbfc3db3123f1): Combiner lifting skipped for step assert:odd/Group: GroupByKey not followed by a combiner.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b12: 2017-01-23T23:30:55.890Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312d1f): Expanding GroupByKey operations into optimizable parts.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b14: 2017-01-23T23:30:55.892Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db31264d): Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b1e: 2017-01-23T23:30:55.902Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312b05): Annotating graph with Autotuner information.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b29: 2017-01-23T23:30:55.913Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db3129f8): Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b2f: 2017-01-23T23:30:55.919Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312c54): Fusing consumer assert:odd/ToVoidKey into assert:odd/WindowInto
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b32: 2017-01-23T23:30:55.922Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312582): Fusing consumer assert:odd/UnKey into assert:odd/Group/GroupByWindow
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b34: 2017-01-23T23:30:55.924Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312eb0): Fusing consumer assert:even/UnKey into assert:even/Group/GroupByWindow
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b37: 2017-01-23T23:30:55.927Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db3127de): Fusing consumer assert:even/Group/GroupByWindow into assert:even/Group/Read
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b39: 2017-01-23T23:30:55.929Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db31210c): Fusing consumer assert_that/Match into assert_that/UnKey
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b3b: 2017-01-23T23:30:55.931Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312a3a): Fusing consumer assert_that/UnKey into assert_that/Group/GroupByWindow
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b3d: 2017-01-23T23:30:55.933Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312368): Fusing consumer assert_that/Group/GroupByWindow into assert_that/Group/Read
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b3f: 2017-01-23T23:30:55.935Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312c96): Fusing consumer assert_that/Group/Write into assert_that/Group/Reify
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b42: 2017-01-23T23:30:55.938Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db3125c4): Fusing consumer assert_that/Group/Reify into assert_that/ToVoidKey
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b44: 2017-01-23T23:30:55.940Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312ef2): Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b46: 2017-01-23T23:30:55.942Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312820): Fusing consumer assert:odd/Group/GroupByWindow into assert:odd/Group/Read
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b48: 2017-01-23T23:30:55.944Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db31214e): Fusing consumer assert:even/Group/Write into assert:even/Group/Reify
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b4a: 2017-01-23T23:30:55.946Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312a7c): Fusing consumer assert:even/Match into assert:even/UnKey
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b4c: 2017-01-23T23:30:55.948Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db3123aa): Fusing consumer assert:even/Group/Reify into assert:even/ToVoidKey
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b4f: 2017-01-23T23:30:55.951Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312cd8): Fusing consumer assert:odd/WindowInto into ClassifyNumbers/ClassifyNumbers
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b51: 2017-01-23T23:30:55.953Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312606): Fusing consumer assert:odd/Group/Write into assert:odd/Group/Reify
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b53: 2017-01-23T23:30:55.955Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312f34): Fusing consumer assert:even/WindowInto into ClassifyNumbers/ClassifyNumbers
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b55: 2017-01-23T23:30:55.957Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312862): Fusing consumer assert:even/ToVoidKey into assert:even/WindowInto
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b57: 2017-01-23T23:30:55.959Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312190): Fusing consumer assert_that/WindowInto into ClassifyNumbers/ClassifyNumbers
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b59: 2017-01-23T23:30:55.961Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db312abe): Fusing consumer assert:odd/Match into assert:odd/UnKey
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0b5b: 2017-01-23T23:30:55.963Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db3123ec): Fusing consumer assert:odd/Group/Reify into assert:odd/ToVoidKey
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0bb1: 2017-01-23T23:30:56.049Z:
JOB_MESSAGE_DEBUG: (3fadbfc3db3120c0): Workflow config is missing a default resource spec.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0bb3: 2017-01-23T23:30:56.051Z:
JOB_MESSAGE_DETAILED: (3fadbfc3db3129ee): Adding StepResource setup and teardown to workflow graph.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0bf6: 2017-01-23T23:30:56.118Z:
JOB_MESSAGE_DEBUG: (e31afff0dc916c6a): Adding workflow start and stop steps.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0c28: 2017-01-23T23:30:56.168Z:
JOB_MESSAGE_DEBUG: (dd9172f79e564817): Assigning stage ids.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0ccc: 2017-01-23T23:30:56.332Z:
JOB_MESSAGE_DEBUG: (aaac8d07a592185a): Executing wait step start2
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0cd8: 2017-01-23T23:30:56.344Z:
JOB_MESSAGE_DEBUG: (e5b879d3fc421531): Executing operation Some Numbers
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0ce6: 2017-01-23T23:30:56.358Z:
JOB_MESSAGE_DEBUG: (e31afff0dc916fc1): Value "Some Numbers.out" materialized.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0cf1: 2017-01-23T23:30:56.369Z:
JOB_MESSAGE_BASIC: S01: (aaac8d07a5921fae): Executing operation assert:odd/Group/Create
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0cf4: 2017-01-23T23:30:56.372Z:
JOB_MESSAGE_BASIC: S02: (3fadbfc3db3128e1): Executing operation assert:even/Group/Create
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0cf5: 2017-01-23T23:30:56.373Z:
JOB_MESSAGE_BASIC: S03: (6e37547a5e2c658a): Executing operation assert_that/Group/Create
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0dbe: 2017-01-23T23:30:56.574Z:
JOB_MESSAGE_DEBUG: (e9a91083e1434023): Starting worker pool setup.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0dc0: 2017-01-23T23:30:56.576Z:
JOB_MESSAGE_BASIC: (e9a91083e14348bd): Starting 1 workers...
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0dd2: 2017-01-23T23:30:56.594Z:
JOB_MESSAGE_DEBUG: (708a4aa4ce81981a): Value "assert:odd/Group/Session" materialized.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0dd4: 2017-01-23T23:30:56.596Z:
JOB_MESSAGE_DEBUG: (dd9172f79e564c8e): Value "assert_that/Group/Session" materialized.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0dea: 2017-01-23T23:30:56.618Z:
JOB_MESSAGE_DEBUG: (3bd4ccbdb25e4eac): Value "assert:even/Group/Session" materialized.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdab0df5: 2017-01-23T23:30:56.629Z:
JOB_MESSAGE_BASIC: S04: (e5b879d3fc4213da): Executing operation ClassifyNumbers/ClassifyNumbers+assert:odd/WindowInto+assert:odd/ToVoidKey+assert:even/WindowInto+assert:even/ToVoidKey+assert:even/Group/Reify+assert:even/Group/Write+assert_that/WindowInto+assert_that/ToVoidKey+assert_that/Group/Reify+assert_that/Group/Write+assert:odd/Group/Reify+assert:odd/Group/Write
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdabed2f: 2017-01-23T23:31:53.775Z:
JOB_MESSAGE_DETAILED: (d3212f834d43fd2a): Workers have started successfully.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdadd6e4: 2017-01-23T23:33:59.140Z:
JOB_MESSAGE_ERROR: (d3f0897736e1c274): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 514, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 899, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:26452)
    op.start()
  File "dataflow_worker/executor.py", line 464, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:15269)
    def start(self):
  File "dataflow_worker/executor.py", line 469, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14434)
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 212, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 260, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 250, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 726, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/ptransform_test.py", line 26, in <module>
    import hamcrest as hc
ImportError: No module named hamcrest

root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdadd7cb: 2017-01-23T23:33:59.371Z:
JOB_MESSAGE_ERROR: (d3f0897736e1c8b9): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 514, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 899, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:26452)
    op.start()
  File "dataflow_worker/executor.py", line 464, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:15269)
    def start(self):
  File "dataflow_worker/executor.py", line 469, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14434)
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 212, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 260, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 250, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 726, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/ptransform_test.py", line 26, in <module>
    import hamcrest as hc
ImportError: No module named hamcrest

root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdadd882: 2017-01-23T23:33:59.554Z:
JOB_MESSAGE_ERROR: (d3f0897736e1c3b3): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 514, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 899, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:26452)
    op.start()
  File "dataflow_worker/executor.py", line 464, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:15269)
    def start(self):
  File "dataflow_worker/executor.py", line 469, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14434)
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 212, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 260, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 250, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 726, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/ptransform_test.py", line 26, in <module>
    import hamcrest as hc
ImportError: No module named hamcrest

root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdadd965: 2017-01-23T23:33:59.781Z:
JOB_MESSAGE_ERROR: (d3f0897736e1c9f8): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 514, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 899, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:26452)
    op.start()
  File "dataflow_worker/executor.py", line 464, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:15269)
    def start(self):
  File "dataflow_worker/executor.py", line 469, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14434)
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 212, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 260, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 250, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 726, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/ptransform_test.py", line 26, in <module>
    import hamcrest as hc
ImportError: No module named hamcrest

root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdadd98f: 2017-01-23T23:33:59.823Z:
JOB_MESSAGE_DEBUG: (3fadbfc3db312ff5): Executing failure step failure1
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdadd991: 2017-01-23T23:33:59.825Z:
JOB_MESSAGE_ERROR: (3fadbfc3db312923): Workflow failed. Causes: (e5b879d3fc42115d): S04:ClassifyNumbers/ClassifyNumbers+assert:odd/WindowInto+assert:odd/ToVoidKey+assert:even/WindowInto+assert:even/ToVoidKey+assert:even/Group/Reify+assert:even/Group/Write+assert_that/WindowInto+assert_that/ToVoidKey+assert_that/Group/Reify+assert_that/Group/Write+assert:odd/Group/Reify+assert:odd/Group/Write failed.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdadd9cf: 2017-01-23T23:33:59.887Z:
JOB_MESSAGE_DETAILED: (580466590c7e466a): Cleaning up.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdadda4c: 2017-01-23T23:34:00.012Z:
JOB_MESSAGE_DEBUG: (580466590c7e4304): Starting worker pool teardown.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdadda4e: 2017-01-23T23:34:00.014Z:
JOB_MESSAGE_BASIC: (580466590c7e4f9e): Stopping worker pool...
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdaf12d6: 2017-01-23T23:35:20.022Z:
JOB_MESSAGE_BASIC: (580466590c7e4285): Worker pool stopped.
root: INFO: 2017-01-23_15_30_54-12011182360156094633_00000159cdaf16d8: 2017-01-23T23:35:21.048Z:
JOB_MESSAGE_DEBUG: (580466590c7e4853): Tearing down pending resources...
root: INFO: Job 2017-01-23_15_30_54-12011182360156094633 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 14 tests in 1186.872s

FAILED (errors=4)
Build step 'Execute shell' marked build as failure
