beam-commits mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: beam_PostCommit_Python_Verify #1384
Date Tue, 28 Feb 2017 10:39:40 GMT
See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/1384/display/redirect>

------------------------------------------
[...truncated 1.33 MB...]
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844976b6: 2017-02-28T10:34:51.446Z:
JOB_MESSAGE_DEBUG: (e0fa07d87901aa70): Assigning stage ids.
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844976d8: 2017-02-28T10:34:51.480Z:
JOB_MESSAGE_DEBUG: (ec59e9167e70d453): Executing wait step start13
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844976df: 2017-02-28T10:34:51.487Z:
JOB_MESSAGE_DEBUG: (ec59e9167e70d9b9): Executing operation start
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844976e1: 2017-02-28T10:34:51.489Z:
JOB_MESSAGE_DEBUG: (d164a77bf9a9140a): Executing operation side
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844976e9: 2017-02-28T10:34:51.497Z:
JOB_MESSAGE_DEBUG: (d164a77bf9a9178b): Value "start.out" materialized.
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844976ec: 2017-02-28T10:34:51.500Z:
JOB_MESSAGE_DEBUG: (ec59e9167e70d2b0): Value "side.out" materialized.
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844976f4: 2017-02-28T10:34:51.508Z:
JOB_MESSAGE_BASIC: S01: (ec59e9167e70dba7): Executing operation ViewAsIterable(side.None)/CreatePCollectionView
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844976ff: 2017-02-28T10:34:51.519Z:
JOB_MESSAGE_DEBUG: (ec59e9167e70dbc0): Value "ViewAsIterable(side.None)/CreatePCollectionView.out"
materialized.
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a84497705: 2017-02-28T10:34:51.525Z:
JOB_MESSAGE_BASIC: S02: (ec59e9167e70d848): Executing operation assert_that/Group/Create
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844977d0: 2017-02-28T10:34:51.728Z:
JOB_MESSAGE_DEBUG: (d341fe4cef57e41a): Starting worker pool setup.
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844977d2: 2017-02-28T10:34:51.730Z:
JOB_MESSAGE_BASIC: (d341fe4cef57e6b0): Starting 1 workers...
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844977df: 2017-02-28T10:34:51.743Z:
JOB_MESSAGE_DEBUG: (ec59e9167e70d314): Value "assert_that/Group/Session" materialized.
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844977ec: 2017-02-28T10:34:51.756Z:
JOB_MESSAGE_BASIC: S03: (ec59e9167e70dc0b): Executing operation compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/Reify+assert_that/Group/Write
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844b19f6: 2017-02-28T10:36:38.774Z:
JOB_MESSAGE_DETAILED: (7030960f9ace01e1): Workers have started successfully.
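The operation names in the messages above (start, side, ViewAsIterable(side.None)/CreatePCollectionView, compute, assert_that/Group/...) point to a small side-input test pipeline. The sketch below is only a reconstruction from those names, not the actual test source in sdks/python; the labels match the log, but the element values and the lambda are assumptions chosen for illustration.

import apache_beam as beam
from apache_beam.testing.util import assert_that, equal_to

with beam.Pipeline() as p:
    # 'start' and 'side' correspond to the operations named in the log.
    main = p | 'start' >> beam.Create([1, 2, 3])
    side = p | 'side' >> beam.Create([10])

    # Passing the side collection as AsIter is what yields the
    # ViewAsIterable(side.None)/CreatePCollectionView stage (S01).
    result = main | 'compute' >> beam.FlatMap(
        lambda x, s: [x + v for v in s], beam.pvalue.AsIter(side))

    # assert_that expands into the assert_that/Group/* operations in S02/S03.
    assert_that(result, equal_to([11, 12, 13]))

A pipeline of roughly this shape is what fails below once the 'compute' step starts emitting output.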
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844cae45: 2017-02-28T10:38:22.277Z:
JOB_MESSAGE_ERROR: (b5102054638bb3e4): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 544,
in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 971, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:30533)
    with op.scoped_metrics_container:
  File "dataflow_worker/executor.py", line 972, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:30481)
    op.start()
  File "dataflow_worker/executor.py", line 207, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8758)
    def start(self):
  File "dataflow_worker/executor.py", line 208, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8663)
    with self.scoped_start_state:
  File "dataflow_worker/executor.py", line 213, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8579)
    with self.spec.source.reader() as reader:
  File "dataflow_worker/executor.py", line 223, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8524)
    self.output(windowed_value)
  File "dataflow_worker/executor.py", line 151, in dataflow_worker.executor.Operation.output
(dataflow_worker/executor.c:6317)
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "dataflow_worker/executor.py", line 84, in dataflow_worker.executor.ConsumerSet.receive
(dataflow_worker/executor.c:4021)
    cython.cast(Operation, consumer).process(windowed_value)
  File "dataflow_worker/executor.py", line 544, in dataflow_worker.executor.DoOperation.process
(dataflow_worker/executor.c:18474)
    with self.scoped_process_state:
  File "dataflow_worker/executor.py", line 545, in dataflow_worker.executor.DoOperation.process
(dataflow_worker/executor.c:18428)
    self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 195, in apache_beam.runners.common.DoFnRunner.receive
(apache_beam/runners/common.c:5142)
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 267, in apache_beam.runners.common.DoFnRunner.process
(apache_beam/runners/common.c:7201)
    self.reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 279, in apache_beam.runners.common.DoFnRunner.reraise_augmented
(apache_beam/runners/common.c:7590)
    raise type(exn), args, sys.exc_info()[2]
  File "apache_beam/runners/common.py", line 265, in apache_beam.runners.common.DoFnRunner.process
(apache_beam/runners/common.c:7112)
    self._dofn_invoker(element)
  File "apache_beam/runners/common.py", line 232, in apache_beam.runners.common.DoFnRunner._dofn_invoker
(apache_beam/runners/common.c:6131)
    self._dofn_per_window_invoker(element)
  File "apache_beam/runners/common.py", line 218, in apache_beam.runners.common.DoFnRunner._dofn_per_window_invoker
(apache_beam/runners/common.c:5877)
    self._process_outputs(element, self.dofn_process(*args))
  File "apache_beam/runners/common.py", line 326, in apache_beam.runners.common.DoFnRunner._process_outputs
(apache_beam/runners/common.c:8563)
    self.main_receivers.receive(windowed_value)
  File "dataflow_worker/executor.py", line 82, in dataflow_worker.executor.ConsumerSet.receive
(dataflow_worker/executor.c:3987)
    self.update_counters_start(windowed_value)
  File "dataflow_worker/executor.py", line 88, in dataflow_worker.executor.ConsumerSet.update_counters_start
(dataflow_worker/executor.c:4207)
    self.opcounter.update_from(windowed_value)
  File "dataflow_worker/opcounters.py", line 57, in dataflow_worker.opcounters.OperationCounters.update_from
(dataflow_worker/opcounters.c:2396)
    self.do_sample(windowed_value)
  File "dataflow_worker/opcounters.py", line 75, in dataflow_worker.opcounters.OperationCounters.do_sample
(dataflow_worker/opcounters.c:3017)
    self.coder_impl.get_estimated_size_and_observables(windowed_value))
  File "apache_beam/coders/coder_impl.py", line 695, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:22894)
    def get_estimated_size_and_observables(self, value, nested=False):
  File "apache_beam/coders/coder_impl.py", line 704, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:22613)
    self._value_coder.get_estimated_size_and_observables(
  File "apache_beam/coders/coder_impl.py", line 247, in apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:9564)
    out = ByteCountingOutputStream()
  File "apache_beam/coders/stream.pyx", line 28, in apache_beam.coders.stream.OutputStream.__cinit__
(apache_beam/coders/stream.c:1241)
    self.buffer_size = 1024
AttributeError: 'apache_beam.coders.stream.ByteCountingOutputStream' object has no attribute
'buffer_size' [while running 'compute']

root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844cba90: 2017-02-28T10:38:25.424Z:
JOB_MESSAGE_ERROR: (b5102054638bb6d1): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 544,
in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 971, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:30533)
    with op.scoped_metrics_container:
  File "dataflow_worker/executor.py", line 972, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:30481)
    op.start()
  File "dataflow_worker/executor.py", line 207, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8758)
    def start(self):
  File "dataflow_worker/executor.py", line 208, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8663)
    with self.scoped_start_state:
  File "dataflow_worker/executor.py", line 213, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8579)
    with self.spec.source.reader() as reader:
  File "dataflow_worker/executor.py", line 223, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8524)
    self.output(windowed_value)
  File "dataflow_worker/executor.py", line 151, in dataflow_worker.executor.Operation.output
(dataflow_worker/executor.c:6317)
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "dataflow_worker/executor.py", line 84, in dataflow_worker.executor.ConsumerSet.receive
(dataflow_worker/executor.c:4021)
    cython.cast(Operation, consumer).process(windowed_value)
  File "dataflow_worker/executor.py", line 544, in dataflow_worker.executor.DoOperation.process
(dataflow_worker/executor.c:18474)
    with self.scoped_process_state:
  File "dataflow_worker/executor.py", line 545, in dataflow_worker.executor.DoOperation.process
(dataflow_worker/executor.c:18428)
    self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 195, in apache_beam.runners.common.DoFnRunner.receive
(apache_beam/runners/common.c:5142)
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 267, in apache_beam.runners.common.DoFnRunner.process
(apache_beam/runners/common.c:7201)
    self.reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 279, in apache_beam.runners.common.DoFnRunner.reraise_augmented
(apache_beam/runners/common.c:7590)
    raise type(exn), args, sys.exc_info()[2]
  File "apache_beam/runners/common.py", line 265, in apache_beam.runners.common.DoFnRunner.process
(apache_beam/runners/common.c:7112)
    self._dofn_invoker(element)
  File "apache_beam/runners/common.py", line 232, in apache_beam.runners.common.DoFnRunner._dofn_invoker
(apache_beam/runners/common.c:6131)
    self._dofn_per_window_invoker(element)
  File "apache_beam/runners/common.py", line 218, in apache_beam.runners.common.DoFnRunner._dofn_per_window_invoker
(apache_beam/runners/common.c:5877)
    self._process_outputs(element, self.dofn_process(*args))
  File "apache_beam/runners/common.py", line 326, in apache_beam.runners.common.DoFnRunner._process_outputs
(apache_beam/runners/common.c:8563)
    self.main_receivers.receive(windowed_value)
  File "dataflow_worker/executor.py", line 82, in dataflow_worker.executor.ConsumerSet.receive
(dataflow_worker/executor.c:3987)
    self.update_counters_start(windowed_value)
  File "dataflow_worker/executor.py", line 88, in dataflow_worker.executor.ConsumerSet.update_counters_start
(dataflow_worker/executor.c:4207)
    self.opcounter.update_from(windowed_value)
  File "dataflow_worker/opcounters.py", line 57, in dataflow_worker.opcounters.OperationCounters.update_from
(dataflow_worker/opcounters.c:2396)
    self.do_sample(windowed_value)
  File "dataflow_worker/opcounters.py", line 75, in dataflow_worker.opcounters.OperationCounters.do_sample
(dataflow_worker/opcounters.c:3017)
    self.coder_impl.get_estimated_size_and_observables(windowed_value))
  File "apache_beam/coders/coder_impl.py", line 695, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:22894)
    def get_estimated_size_and_observables(self, value, nested=False):
  File "apache_beam/coders/coder_impl.py", line 704, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:22613)
    self._value_coder.get_estimated_size_and_observables(
  File "apache_beam/coders/coder_impl.py", line 247, in apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:9564)
    out = ByteCountingOutputStream()
  File "apache_beam/coders/stream.pyx", line 28, in apache_beam.coders.stream.OutputStream.__cinit__
(apache_beam/coders/stream.c:1241)
    self.buffer_size = 1024
AttributeError: 'apache_beam.coders.stream.ByteCountingOutputStream' object has no attribute
'buffer_size' [while running 'compute']

root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844cc6ba: 2017-02-28T10:38:28.538Z:
JOB_MESSAGE_ERROR: (b5102054638bb9be): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 544,
in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 971, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:30533)
    with op.scoped_metrics_container:
  File "dataflow_worker/executor.py", line 972, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:30481)
    op.start()
  File "dataflow_worker/executor.py", line 207, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8758)
    def start(self):
  File "dataflow_worker/executor.py", line 208, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8663)
    with self.scoped_start_state:
  File "dataflow_worker/executor.py", line 213, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8579)
    with self.spec.source.reader() as reader:
  File "dataflow_worker/executor.py", line 223, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8524)
    self.output(windowed_value)
  File "dataflow_worker/executor.py", line 151, in dataflow_worker.executor.Operation.output
(dataflow_worker/executor.c:6317)
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "dataflow_worker/executor.py", line 84, in dataflow_worker.executor.ConsumerSet.receive
(dataflow_worker/executor.c:4021)
    cython.cast(Operation, consumer).process(windowed_value)
  File "dataflow_worker/executor.py", line 544, in dataflow_worker.executor.DoOperation.process
(dataflow_worker/executor.c:18474)
    with self.scoped_process_state:
  File "dataflow_worker/executor.py", line 545, in dataflow_worker.executor.DoOperation.process
(dataflow_worker/executor.c:18428)
    self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 195, in apache_beam.runners.common.DoFnRunner.receive
(apache_beam/runners/common.c:5142)
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 267, in apache_beam.runners.common.DoFnRunner.process
(apache_beam/runners/common.c:7201)
    self.reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 279, in apache_beam.runners.common.DoFnRunner.reraise_augmented
(apache_beam/runners/common.c:7590)
    raise type(exn), args, sys.exc_info()[2]
  File "apache_beam/runners/common.py", line 265, in apache_beam.runners.common.DoFnRunner.process
(apache_beam/runners/common.c:7112)
    self._dofn_invoker(element)
  File "apache_beam/runners/common.py", line 232, in apache_beam.runners.common.DoFnRunner._dofn_invoker
(apache_beam/runners/common.c:6131)
    self._dofn_per_window_invoker(element)
  File "apache_beam/runners/common.py", line 218, in apache_beam.runners.common.DoFnRunner._dofn_per_window_invoker
(apache_beam/runners/common.c:5877)
    self._process_outputs(element, self.dofn_process(*args))
  File "apache_beam/runners/common.py", line 326, in apache_beam.runners.common.DoFnRunner._process_outputs
(apache_beam/runners/common.c:8563)
    self.main_receivers.receive(windowed_value)
  File "dataflow_worker/executor.py", line 82, in dataflow_worker.executor.ConsumerSet.receive
(dataflow_worker/executor.c:3987)
    self.update_counters_start(windowed_value)
  File "dataflow_worker/executor.py", line 88, in dataflow_worker.executor.ConsumerSet.update_counters_start
(dataflow_worker/executor.c:4207)
    self.opcounter.update_from(windowed_value)
  File "dataflow_worker/opcounters.py", line 57, in dataflow_worker.opcounters.OperationCounters.update_from
(dataflow_worker/opcounters.c:2396)
    self.do_sample(windowed_value)
  File "dataflow_worker/opcounters.py", line 75, in dataflow_worker.opcounters.OperationCounters.do_sample
(dataflow_worker/opcounters.c:3017)
    self.coder_impl.get_estimated_size_and_observables(windowed_value))
  File "apache_beam/coders/coder_impl.py", line 695, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:22894)
    def get_estimated_size_and_observables(self, value, nested=False):
  File "apache_beam/coders/coder_impl.py", line 704, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:22613)
    self._value_coder.get_estimated_size_and_observables(
  File "apache_beam/coders/coder_impl.py", line 247, in apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:9564)
    out = ByteCountingOutputStream()
  File "apache_beam/coders/stream.pyx", line 28, in apache_beam.coders.stream.OutputStream.__cinit__
(apache_beam/coders/stream.c:1241)
    self.buffer_size = 1024
AttributeError: 'apache_beam.coders.stream.ByteCountingOutputStream' object has no attribute
'buffer_size' [while running 'compute']

root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844cd2e9: 2017-02-28T10:38:31.657Z:
JOB_MESSAGE_ERROR: (b5102054638bbcab): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 544,
in do_work
    work_executor.execute()
  File "dataflow_worker/executor.py", line 971, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:30533)
    with op.scoped_metrics_container:
  File "dataflow_worker/executor.py", line 972, in dataflow_worker.executor.MapTaskExecutor.execute
(dataflow_worker/executor.c:30481)
    op.start()
  File "dataflow_worker/executor.py", line 207, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8758)
    def start(self):
  File "dataflow_worker/executor.py", line 208, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8663)
    with self.scoped_start_state:
  File "dataflow_worker/executor.py", line 213, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8579)
    with self.spec.source.reader() as reader:
  File "dataflow_worker/executor.py", line 223, in dataflow_worker.executor.ReadOperation.start
(dataflow_worker/executor.c:8524)
    self.output(windowed_value)
  File "dataflow_worker/executor.py", line 151, in dataflow_worker.executor.Operation.output
(dataflow_worker/executor.c:6317)
    cython.cast(Receiver, self.receivers[output_index]).receive(windowed_value)
  File "dataflow_worker/executor.py", line 84, in dataflow_worker.executor.ConsumerSet.receive
(dataflow_worker/executor.c:4021)
    cython.cast(Operation, consumer).process(windowed_value)
  File "dataflow_worker/executor.py", line 544, in dataflow_worker.executor.DoOperation.process
(dataflow_worker/executor.c:18474)
    with self.scoped_process_state:
  File "dataflow_worker/executor.py", line 545, in dataflow_worker.executor.DoOperation.process
(dataflow_worker/executor.c:18428)
    self.dofn_receiver.receive(o)
  File "apache_beam/runners/common.py", line 195, in apache_beam.runners.common.DoFnRunner.receive
(apache_beam/runners/common.c:5142)
    self.process(windowed_value)
  File "apache_beam/runners/common.py", line 267, in apache_beam.runners.common.DoFnRunner.process
(apache_beam/runners/common.c:7201)
    self.reraise_augmented(exn)
  File "apache_beam/runners/common.py", line 279, in apache_beam.runners.common.DoFnRunner.reraise_augmented
(apache_beam/runners/common.c:7590)
    raise type(exn), args, sys.exc_info()[2]
  File "apache_beam/runners/common.py", line 265, in apache_beam.runners.common.DoFnRunner.process
(apache_beam/runners/common.c:7112)
    self._dofn_invoker(element)
  File "apache_beam/runners/common.py", line 232, in apache_beam.runners.common.DoFnRunner._dofn_invoker
(apache_beam/runners/common.c:6131)
    self._dofn_per_window_invoker(element)
  File "apache_beam/runners/common.py", line 218, in apache_beam.runners.common.DoFnRunner._dofn_per_window_invoker
(apache_beam/runners/common.c:5877)
    self._process_outputs(element, self.dofn_process(*args))
  File "apache_beam/runners/common.py", line 326, in apache_beam.runners.common.DoFnRunner._process_outputs
(apache_beam/runners/common.c:8563)
    self.main_receivers.receive(windowed_value)
  File "dataflow_worker/executor.py", line 82, in dataflow_worker.executor.ConsumerSet.receive
(dataflow_worker/executor.c:3987)
    self.update_counters_start(windowed_value)
  File "dataflow_worker/executor.py", line 88, in dataflow_worker.executor.ConsumerSet.update_counters_start
(dataflow_worker/executor.c:4207)
    self.opcounter.update_from(windowed_value)
  File "dataflow_worker/opcounters.py", line 57, in dataflow_worker.opcounters.OperationCounters.update_from
(dataflow_worker/opcounters.c:2396)
    self.do_sample(windowed_value)
  File "dataflow_worker/opcounters.py", line 75, in dataflow_worker.opcounters.OperationCounters.do_sample
(dataflow_worker/opcounters.c:3017)
    self.coder_impl.get_estimated_size_and_observables(windowed_value))
  File "apache_beam/coders/coder_impl.py", line 695, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:22894)
    def get_estimated_size_and_observables(self, value, nested=False):
  File "apache_beam/coders/coder_impl.py", line 704, in apache_beam.coders.coder_impl.WindowedValueCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:22613)
    self._value_coder.get_estimated_size_and_observables(
  File "apache_beam/coders/coder_impl.py", line 247, in apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.get_estimated_size_and_observables
(apache_beam/coders/coder_impl.c:9564)
    out = ByteCountingOutputStream()
  File "apache_beam/coders/stream.pyx", line 28, in apache_beam.coders.stream.OutputStream.__cinit__
(apache_beam/coders/stream.c:1241)
    self.buffer_size = 1024
AttributeError: 'apache_beam.coders.stream.ByteCountingOutputStream' object has no attribute
'buffer_size' [while running 'compute']

root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844cd371: 2017-02-28T10:38:31.793Z:
JOB_MESSAGE_DEBUG: (ec59e9167e70d8ac): Executing failure step failure12
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844cd373: 2017-02-28T10:38:31.795Z:
JOB_MESSAGE_ERROR: (ec59e9167e70dfce): Workflow failed. Causes: (ec59e9167e70d346): S03:compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/Reify+assert_that/Group/Write
failed.
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844cd3ac: 2017-02-28T10:38:31.852Z:
JOB_MESSAGE_DETAILED: (e0fa07d87901a205): Cleaning up.
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844cd42f: 2017-02-28T10:38:31.983Z:
JOB_MESSAGE_DEBUG: (e0fa07d87901a7c3): Starting worker pool teardown.
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844cd431: 2017-02-28T10:38:31.985Z:
JOB_MESSAGE_BASIC: (e0fa07d87901ad81): Stopping worker pool...
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844dbe97: 2017-02-28T10:39:31.991Z:
JOB_MESSAGE_BASIC: (e0fa07d87901ae1e): Worker pool stopped.
root: INFO: 2017-02-28_02_34_49-17108550646169921832_0000015a844dbeae: 2017-02-28T10:39:32.014Z:
JOB_MESSAGE_DEBUG: (e0fa07d87901af58): Tearing down pending resources...
root: INFO: Job 2017-02-28_02_34_49-17108550646169921832 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 14 tests in 4302.954s

FAILED (errors=13)
Build step 'Execute shell' marked build as failure
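Every JOB_MESSAGE_ERROR above is the same failure: apache_beam.coders.stream.OutputStream.__cinit__ assigns self.buffer_size = 1024, and that assignment raises AttributeError on a ByteCountingOutputStream instance. One plausible cause (an assumption, not something this log proves) is a mismatch between the stream.pyx source and a stale compiled apache_beam.coders.stream extension on the workers, so the attribute is declared in one version but not the other. The snippet below is a rough pure-Python analogue of that failure mode; it uses __slots__ in place of Cython attribute declarations and reuses the class names from the traceback only for illustration.

# Illustrative analogue only -- not Beam's actual code.
class OutputStream(object):
    # A stale definition: the buffer_size declaration is missing here,
    # like a compiled base class built before the attribute was added.
    __slots__ = ()

    def __init__(self):
        # Stands in for OutputStream.__cinit__ in apache_beam/coders/stream.pyx.
        self.buffer_size = 1024


class ByteCountingOutputStream(OutputStream):
    __slots__ = ('count',)

    def __init__(self):
        super(ByteCountingOutputStream, self).__init__()
        self.count = 0


try:
    ByteCountingOutputStream()
except AttributeError as exn:
    # Same class of error as in the worker log:
    # 'ByteCountingOutputStream' object has no attribute 'buffer_size'
    print(exn)

If a stale compiled module is indeed the cause, rebuilding the SDK that is staged to the workers (so stream.pyx and the compiled extension agree) should clear the error.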
