beam-commits mailing list archives

From Apache Jenkins Server <jenk...@builds.apache.org>
Subject Build failed in Jenkins: beam_PostCommit_Python_ValidatesRunner_Dataflow #1166
Date Fri, 23 Mar 2018 02:13:32 GMT
See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1166/display/redirect?page=changes>

Changes:

[andreas.ehrencrona] [BEAM-2264] Credentials were not being reused between GCS calls

[ccy] Replace side inputs when applying PTransformOverrides

[Pablo] Updating dataflow API protocol buffers.

[boyuanz] Add cython annotation to make DistributionAccumulator faster

[ehudm] Reduce precommit test timeouts for Java and Go.

[altay] More graceful fallback when grpc is not present.

------------------------------------------
[...truncated 1.24 MB...]
      "properties": {
        "display_data": [
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "<lambda>"
          }, 
          {
            "key": "fn", 
            "label": "Transform Function", 
            "namespace": "apache_beam.transforms.core.ParDo", 
            "shortValue": "CallableWrapperDoFn", 
            "type": "STRING", 
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ], 
        "non_parallel_inputs": {}, 
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value", 
              "component_encodings": [
                {
                  "@type": "kind:pair", 
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    }, 
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==",

                      "component_encodings": []
                    }
                  ], 
                  "is_pair_like": true
                }, 
                {
                  "@type": "kind:global_window"
                }
              ], 
              "is_wrapper": true
            }, 
            "output_name": "out", 
            "user_name": "compute/MapToVoidKey0.out"
          }
        ], 
        "parallel_input": {
          "@type": "OutputReference", 
          "output_name": "out", 
          "step_name": "s2"
        }, 
        "serialized_fn": "<string of 968 bytes>", 
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: u'2018-03-23T02:05:12.259479Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2018-03-22_19_05_11-7449992449654972149'
 location: u'us-central1'
 name: u'beamapp-jenkins-0323020459-556685'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-22_19_05_11-7449992449654972149]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_19_05_11-7449992449654972149?project=apache-beam-testing
root: INFO: Job 2018-03-22_19_05_11-7449992449654972149 is in state JOB_STATE_PENDING
root: INFO: 2018-03-23T02:05:11.364Z: JOB_MESSAGE_WARNING: Job 2018-03-22_19_05_11-7449992449654972149
might autoscale up to 250 workers.
root: INFO: 2018-03-23T02:05:11.379Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2018-03-22_19_05_11-7449992449654972149. The number of workers will be between 1 and 250.
root: INFO: 2018-03-23T02:05:11.400Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically
enabled for job 2018-03-22_19_05_11-7449992449654972149.
root: INFO: 2018-03-23T02:05:13.893Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are
enabled.
root: INFO: 2018-03-23T02:05:14.049Z: JOB_MESSAGE_DETAILED: Checking permissions granted to
controller Service Account.
root: INFO: 2018-03-23T02:05:14.839Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations
into optimizable parts.
root: INFO: 2018-03-23T02:05:14.864Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-23T02:05:14.891Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations
into optimizable parts.
root: INFO: 2018-03-23T02:05:14.913Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns
into MergeBucketsMappingFns
root: INFO: 2018-03-23T02:05:14.938Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-23T02:05:14.974Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write,
and Flatten operations
root: INFO: 2018-03-23T02:05:14.998Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input
s10.out
root: INFO: 2018-03-23T02:05:15.028Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify,
through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-23T02:05:15.061Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow
into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-23T02:05:15.094Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey
into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-23T02:05:15.124Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match
into assert_that/Unkey
root: INFO: 2018-03-23T02:05:15.156Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key)
into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-23T02:05:15.186Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for
input s12-reify-value0-c11
root: INFO: 2018-03-23T02:05:15.218Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write,
through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-23T02:05:15.249Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0
into side/Read
root: INFO: 2018-03-23T02:05:15.292Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0
into side/Read
root: INFO: 2018-03-23T02:05:15.323Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write
into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-23T02:05:15.345Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify
into assert_that/Group/pair_with_0
root: INFO: 2018-03-23T02:05:15.371Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1
into assert_that/ToVoidKey
root: INFO: 2018-03-23T02:05:15.398Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute
into start/Read
root: INFO: 2018-03-23T02:05:15.431Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey
into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-23T02:05:15.451Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn)
into compute/compute
root: INFO: 2018-03-23T02:05:15.478Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0
into assert_that/Create/Read
root: INFO: 2018-03-23T02:05:15.505Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default
resource spec.
root: INFO: 2018-03-23T02:05:15.528Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown
to workflow graph.
root: INFO: 2018-03-23T02:05:15.560Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-23T02:05:15.595Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-23T02:05:15.746Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: 2018-03-23T02:05:15.803Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-23T02:05:15.837Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-23T02:05:15.850Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-23T02:05:15.871Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-23T02:05:15.955Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session"
materialized.
root: INFO: 2018-03-23T02:05:16.003Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: Job 2018-03-22_19_05_11-7449992449654972149 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-23T02:05:25.730Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-23T02:05:41.449Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number
of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-23T02:05:57.311Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-23T02:10:44.965Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out"
materialized.
root: INFO: 2018-03-23T02:10:45.025Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-23T02:10:45.133Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output"
materialized.
root: INFO: 2018-03-23T02:10:45.202Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-23T02:10:53.750Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609,
in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in
execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line
62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-23T02:10:57.140Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609,
in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in
execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line
62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-23T02:11:00.502Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609,
in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in
execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line
62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-23T02:11:03.907Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609,
in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in
execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line
62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'

root: INFO: 2018-03-23T02:11:03.950Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-23T02:11:03.980Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
failed., A work item was attempted 4 times without success. Each time the worker eventually
lost contact with the service. The work item was attempted on: 
  beamapp-jenkins-032302045-03221905-d288-harness-hffv,
  beamapp-jenkins-032302045-03221905-d288-harness-hffv,
  beamapp-jenkins-032302045-03221905-d288-harness-hffv,
  beamapp-jenkins-032302045-03221905-d288-harness-hffv
root: INFO: 2018-03-23T02:11:04.080Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-23T02:11:04.127Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-23T02:11:04.149Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-23T02:12:27.611Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number
of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-23T02:12:27.667Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-22_19_05_11-7449992449654972149 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
Ran 16 tests in 1657.284s

FAILED (errors=13)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_45_49-13313620363163990844?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_47_26-1680749264064027695?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_54_51-6250365814551906199?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_19_02_01-8638643449849694966?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_45_50-14022589790249837194?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_47_20-13687073865483655168?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_54_17-6578937141719779286?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_19_05_11-7449992449654972149?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_45_50-18007576993199576905?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_47_36-4779221549457932954?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_58_06-11857559477839872159?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_19_06_13-5787767063285973531?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_45_50-4697351751722046060?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_47_40-3995040887985047600?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_18_54_59-16032330438633847742?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-22_19_04_10-7698288622735887743?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user ccy@google.com
Not sending mail to unregistered user ehudm@google.com
Not sending mail to unregistered user boyuanz@google.com
Not sending mail to unregistered user markliu@google.com
Not sending mail to unregistered user XuMingmin@users.noreply.github.com
Not sending mail to unregistered user wcn@google.com
Not sending mail to unregistered user herohde@google.com
Not sending mail to unregistered user aaltay@gmail.com
Not sending mail to unregistered user andreas.ehrencrona@velik.it
Not sending mail to unregistered user ankurgoenka@gmail.com
