beam-commits mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: beam_PostCommit_Python_Verify #1077
Date Tue, 24 Jan 2017 09:22:59 GMT
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/1077/>

------------------------------------------
[...truncated 8519 lines...]
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fa8b: 2017-01-24T09:18:30.027Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e273e): Fusing consumer assert:even/Group/GroupByWindow
into assert:even/Group/Read
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fa8d: 2017-01-24T09:18:30.029Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e24e0): Fusing consumer assert_that/Match into assert_that/UnKey
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fa8f: 2017-01-24T09:18:30.031Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e2282): Fusing consumer assert_that/UnKey into assert_that/Group/GroupByWindow
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fa91: 2017-01-24T09:18:30.033Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e2024): Fusing consumer assert_that/Group/GroupByWindow
into assert_that/Group/Read
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fa93: 2017-01-24T09:18:30.035Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e2dc6): Fusing consumer assert_that/Group/Write into assert_that/Group/Reify
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fa95: 2017-01-24T09:18:30.037Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e2b68): Fusing consumer assert_that/Group/Reify into assert_that/ToVoidKey
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fa97: 2017-01-24T09:18:30.039Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e290a): Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fa99: 2017-01-24T09:18:30.041Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e26ac): Fusing consumer assert:odd/Group/GroupByWindow into
assert:odd/Group/Read
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fa9b: 2017-01-24T09:18:30.043Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e244e): Fusing consumer assert:even/Group/Write into assert:even/Group/Reify
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fa9d: 2017-01-24T09:18:30.045Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e21f0): Fusing consumer assert:even/Match into assert:even/UnKey
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fa9f: 2017-01-24T09:18:30.047Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e2f92): Fusing consumer assert:even/Group/Reify into assert:even/ToVoidKey
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4faa1: 2017-01-24T09:18:30.049Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e2d34): Fusing consumer assert:odd/WindowInto into ClassifyNumbers/ClassifyNumbers
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4faa3: 2017-01-24T09:18:30.051Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e2ad6): Fusing consumer assert:odd/Group/Write into assert:odd/Group/Reify
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4faa5: 2017-01-24T09:18:30.053Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e2878): Fusing consumer assert:even/WindowInto into ClassifyNumbers/ClassifyNumbers
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4faa7: 2017-01-24T09:18:30.055Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e261a): Fusing consumer assert:even/ToVoidKey into assert:even/WindowInto
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4faa9: 2017-01-24T09:18:30.057Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e23bc): Fusing consumer assert_that/WindowInto into ClassifyNumbers/ClassifyNumbers
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4faab: 2017-01-24T09:18:30.059Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e215e): Fusing consumer assert:odd/Match into assert:odd/UnKey
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4faad: 2017-01-24T09:18:30.061Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e2f00): Fusing consumer assert:odd/Group/Reify into assert:odd/ToVoidKey
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fb00: 2017-01-24T09:18:30.144Z:
JOB_MESSAGE_DEBUG: (87856bc85f4e250c): Workflow config is missing a default resource spec.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fb03: 2017-01-24T09:18:30.147Z:
JOB_MESSAGE_DETAILED: (87856bc85f4e22ae): Adding StepResource setup and teardown to workflow
graph.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fb21: 2017-01-24T09:18:30.177Z:
JOB_MESSAGE_DEBUG: (31cd4de23232ef81): Adding workflow start and stop steps.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fb53: 2017-01-24T09:18:30.227Z:
JOB_MESSAGE_DEBUG: (f93568740c05d1b1): Assigning stage ids.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fba9: 2017-01-24T09:18:30.313Z:
JOB_MESSAGE_DEBUG: (f93568740c05d098): Executing wait step start2
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fbb9: 2017-01-24T09:18:30.329Z:
JOB_MESSAGE_DEBUG: (ac2138444fc23d7d): Executing operation Some Numbers
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fbee: 2017-01-24T09:18:30.382Z:
JOB_MESSAGE_DEBUG: (a5eb5b5ec15467df): Value "Some Numbers.out" materialized.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fbf7: 2017-01-24T09:18:30.391Z:
JOB_MESSAGE_BASIC: S02: (f93568740c05d91c): Executing operation assert:even/Group/Create
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fbfa: 2017-01-24T09:18:30.394Z:
JOB_MESSAGE_BASIC: S03: (d90ef3f76c48fa2a): Executing operation assert_that/Group/Create
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fc1b: 2017-01-24T09:18:30.427Z:
JOB_MESSAGE_BASIC: S01: (af4267f01a05b233): Executing operation assert:odd/Group/Create
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fcc3: 2017-01-24T09:18:30.595Z:
JOB_MESSAGE_DEBUG: (b9d847bf036a3ee3): Starting worker pool setup.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fcc5: 2017-01-24T09:18:30.597Z:
JOB_MESSAGE_BASIC: (b9d847bf036a36e5): Starting 1 workers...
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fcd6: 2017-01-24T09:18:30.614Z:
JOB_MESSAGE_DEBUG: (f1cc619595f671fd): Value "assert_that/Group/Session" materialized.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fcd8: 2017-01-24T09:18:30.616Z:
JOB_MESSAGE_DEBUG: (f93568740c05d2a8): Value "assert:even/Group/Session" materialized.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc4fce3: 2017-01-24T09:18:30.627Z:
JOB_MESSAGE_BASIC: S04: (87856bc85f4e2cc3): Executing operation ClassifyNumbers/ClassifyNumbers+assert:odd/WindowInto+assert:odd/ToVoidKey+assert:even/WindowInto+assert:even/ToVoidKey+assert:even/Group/Reify+assert:even/Group/Write+assert_that/WindowInto+assert_that/ToVoidKey+assert_that/Group/Reify+assert_that/Group/Write+assert:odd/Group/Reify+assert:odd/Group/Write
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc6b696: 2017-01-24T09:20:23.702Z:
JOB_MESSAGE_DETAILED: (43b1b18f7b1c6b33): Workers have started successfully.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc7ca91: 2017-01-24T09:21:34.353Z:
JOB_MESSAGE_ERROR: (72f067c206330db1): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 514, in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 899, in dataflow_worker.executor.MapTaskExecutor.execute (dataflow_worker/executor.c:26452)
    op.start()
  File "dataflow_worker/executor.py", line 464, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:15269)
    def start(self):
  File "dataflow_worker/executor.py", line 469, in dataflow_worker.executor.DoOperation.start (dataflow_worker/executor.c:14434)
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 212, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 260, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 250, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 726, in _import_module
    return getattr(__import__(module, None, None, [obj]), obj)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/ptransform_test.py", line 26, in <module>
    import hamcrest as hc
ImportError: No module named hamcrest

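The traceback above shows the failure point: the worker calls `pickler.loads` → `dill.loads` to deserialize the submitted DoFn, which re-imports the defining module (`ptransform_test.py`), and that module's `import hamcrest as hc` fails because `hamcrest` is not installed in the worker environment. A minimal pre-flight check on the submitting side can catch this class of problem before a job is launched. This is an illustrative sketch, not Beam API; the helper name `missing_modules` is invented here:

```python
import importlib


def missing_modules(names):
    """Return the subset of module names that cannot be imported.

    This mirrors the check a remote worker effectively performs at
    unpickling time: deserializing a function re-imports its defining
    module, so any missing transitive import raises ImportError there.
    """
    missing = []
    for name in names:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing


# Example: verify test-only dependencies before submitting a job.
# (In this build, "hamcrest" would have appeared in the result.)
print(missing_modules(["json", "hamcrest"]))
```

Running the same check inside the worker's environment (rather than the submitter's) is what actually matters, since the two environments can differ, as this build demonstrates.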
[... the same traceback (ImportError: No module named hamcrest) repeated in 7 further JOB_MESSAGE_ERROR entries, 2017-01-24T09:21:34.578Z through 2017-01-24T09:21:35.661Z ...]
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc7cfd6: 2017-01-24T09:21:35.702Z:
JOB_MESSAGE_DEBUG: (5129b9404837122d): Executing failure step failure1
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc7cfd9: 2017-01-24T09:21:35.705Z:
JOB_MESSAGE_ERROR: (5129b94048371343): Workflow failed. Causes: (87856bc85f4e2d60): S04:ClassifyNumbers/ClassifyNumbers+assert:odd/WindowInto+assert:odd/ToVoidKey+assert:even/WindowInto+assert:even/ToVoidKey+assert:even/Group/Reify+assert:even/Group/Write+assert_that/WindowInto+assert_that/ToVoidKey+assert_that/Group/Reify+assert_that/Group/Write+assert:odd/Group/Reify+assert:odd/Group/Write
failed.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc7d016: 2017-01-24T09:21:35.766Z:
JOB_MESSAGE_DETAILED: (90c2002d0fc71cca): Cleaning up.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc7d094: 2017-01-24T09:21:35.892Z:
JOB_MESSAGE_DEBUG: (90c2002d0fc71e84): Starting worker pool teardown.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc7d096: 2017-01-24T09:21:35.894Z:
JOB_MESSAGE_BASIC: (90c2002d0fc7103e): Stopping worker pool...
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc8f596: 2017-01-24T09:22:50.902Z:
JOB_MESSAGE_BASIC: (90c2002d0fc71ad5): Worker pool stopped.
root: INFO: 2017-01-24_01_18_28-12399986343756263108_00000159cfc8f99a: 2017-01-24T09:22:51.930Z:
JOB_MESSAGE_DEBUG: (90c2002d0fc71003): Tearing down pending resources...
root: INFO: Job 2017-01-24_01_18_28-12399986343756263108 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 14 tests in 1040.088s

FAILED (errors=4)
Build step 'Execute shell' marked build as failure
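The pattern here (a module importable on the machine that submitted the job but absent on the Dataflow workers) is typically fixed by shipping the dependency with the job. A hypothetical invocation sketch: the pipeline module name, project, and bucket are placeholders; `--requirements_file` is a real Beam Python pipeline option, and `PyHamcrest` is the PyPI package providing the `hamcrest` module:

```shell
# Sketch: stage the missing test dependency so workers can import it.
echo "PyHamcrest" > requirements.txt
python -m my_pipeline_module \
  --runner DataflowRunner \
  --requirements_file requirements.txt \
  --project my-project \
  --temp_location gs://my-bucket/tmp
```

Alternatively, the dependency can be baked into the worker environment or bundled via a setup.py passed with `--setup_file`; either way the fix belongs in the test-job configuration rather than the pipeline code.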
