beam-commits mailing list archives

From "Ahmet Altay (JIRA)" <>
Subject [jira] [Assigned] (BEAM-2595) WriteToBigQuery does not work with nested json schema
Date Wed, 12 Jul 2017 01:39:00 GMT


Ahmet Altay reassigned BEAM-2595:

    Assignee: Sourabh Bajaj  (was: Ahmet Altay)

> WriteToBigQuery does not work with nested json schema
> -----------------------------------------------------
>                 Key: BEAM-2595
>                 URL:
>             Project: Beam
>          Issue Type: Bug
>          Components: sdk-py
>    Affects Versions: 2.1.0
>         Environment: mac os local runner, Python
>            Reporter: Andrea Pierleoni
>            Assignee: Sourabh Bajaj
>            Priority: Minor
>              Labels: gcp
>             Fix For: 2.1.0
> I am trying to use the new `WriteToBigQuery` PTransform added to `` in version 2.1.0-RC1.
> I need to write to a BigQuery table with nested fields.
> The only way to specify nested schemas in BigQuery is with the JSON schema.
> None of the classes in `` are able to parse the JSON schema, but they accept a schema as an instance of the class ``.
> I am composing the `TableFieldSchema` as suggested here [], and it looks fine when passed to the `WriteToBigQuery` PTransform.
> The problem is that the base class `PTransformWithSideInputs` tries to pickle and unpickle the function [] (which includes the `TableFieldSchema` instance), and for some reason, when the class is unpickled, some `FieldList` instances are converted to plain lists, so the pickling validation fails.
> Would it be possible to extend the test coverage to nested JSON objects for BigQuery?
> They are also relatively easy to parse into a `TableFieldSchema`.
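The last point in the report — that a nested JSON schema is straightforward to parse recursively — can be sketched as follows. This is a minimal illustration, not the Beam SDK's actual implementation: the `TableFieldSchema`/`TableSchema` classes below are hypothetical stand-ins for the BigQuery client classes the reporter mentions, and `parse_table_schema_from_json` is an assumed helper name.

```python
import json
from dataclasses import dataclass, field

# Hypothetical stand-ins for the BigQuery client classes
# (TableSchema / TableFieldSchema); attribute names are illustrative.

@dataclass
class TableFieldSchema:
    name: str
    type: str
    mode: str = "NULLABLE"
    fields: list = field(default_factory=list)

@dataclass
class TableSchema:
    fields: list = field(default_factory=list)

def parse_field(spec):
    """Recursively convert one JSON field spec into a TableFieldSchema.

    RECORD fields carry their children in spec["fields"], so nesting
    is handled by plain recursion.
    """
    return TableFieldSchema(
        name=spec["name"],
        type=spec["type"],
        mode=spec.get("mode", "NULLABLE"),
        fields=[parse_field(f) for f in spec.get("fields", [])],
    )

def parse_table_schema_from_json(schema_json):
    """Parse a BigQuery-style JSON schema string into a TableSchema."""
    spec = json.loads(schema_json)
    return TableSchema(fields=[parse_field(f) for f in spec["fields"]])

# Example: a schema with one nested RECORD field.
schema = parse_table_schema_from_json(json.dumps({
    "fields": [
        {"name": "user", "type": "RECORD", "mode": "REQUIRED",
         "fields": [{"name": "id", "type": "INTEGER"},
                    {"name": "email", "type": "STRING"}]},
        {"name": "ts", "type": "TIMESTAMP"},
    ]
}))
```

Because the result is plain nested dataclasses, it round-trips through `pickle` without the `FieldList`-to-list conversion described in the report; the actual client classes use a special list type, which is where the pickling validation reportedly breaks.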

This message was sent by Atlassian JIRA
