beam-commits mailing list archives

From "Reuven Lax (JIRA)" <>
Subject [jira] [Commented] (BEAM-2768) Fix bigquery.WriteTables generating non-unique job identifiers
Date Tue, 15 Aug 2017 21:50:00 GMT


Reuven Lax commented on BEAM-2768:

The load job id for a specific table is the UUID (which is generated when the pipeline first
starts) followed by a hash of the table name. This means that if the worker fails, it _will_
use the same job id to load that table. This is by design so we don't double load data into
the table.
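The deterministic scheme described above can be sketched roughly as follows. This is an illustrative sketch only, not Beam's actual internals; the class and method names (`LoadJobIds`, `loadJobId`) and the `beam_load_` prefix are assumptions for the example.

```java
import java.util.UUID;

// Hypothetical sketch of a deterministic load-job id: a per-pipeline UUID
// plus a hash of the destination table. A retried worker recomputes the
// identical id for the same table, so BigQuery rejects the duplicate load
// (409) instead of ingesting the data twice.
public class LoadJobIds {
    // Generated once when the pipeline first starts, then shared by all workers.
    static final String PIPELINE_UUID = UUID.randomUUID().toString();

    static String loadJobId(String tableSpec) {
        // Mask the sign bit so the hash is always non-negative.
        int tableHash = tableSpec.hashCode() & Integer.MAX_VALUE;
        return "beam_load_" + PIPELINE_UUID + "_" + tableHash;
    }

    public static void main(String[] args) {
        String first = loadJobId("project:dataset.table");
        String retry = loadJobId("project:dataset.table");
        // A retry for the same table yields the same job id.
        System.out.println(first.equals(retry)); // prints "true"
    }
}
```

Under this scheme a 409 on retry is the intended idempotency signal, which is why simply appending a fresh UUID per request would reintroduce double loads.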

> Fix bigquery.WriteTables generating non-unique job identifiers
> --------------------------------------------------------------
>                 Key: BEAM-2768
>                 URL:
>             Project: Beam
>          Issue Type: Bug
>          Components: beam-model
>    Affects Versions: 2.0.0
>            Reporter: Matti Remes
>            Assignee: Reuven Lax
> This is a result of BigQueryIO not creating unique job ids for batch inserts, so the
BigQuery API responds with a 409 conflict error:
> {code:java}
> Request failed with code 409, will NOT retry:<project_id>/jobs
> {code}
> The jobs are initiated in the step BatchLoads/SinglePartitionWriteTables, by the step's
WriteTables ParDo.
> It would probably be a good idea to append a UUID to the job id.
> Edit: This is a major bug that blocks using BigQuery as a sink for bounded input.

This message was sent by Atlassian JIRA
