nifi-dev mailing list archives

From Koji Kawamura <>
Subject Re: Ingest data into SQL database using NiFi
Date Thu, 28 Sep 2017 02:31:27 GMT
Hi Tina,

Glad to hear you were able to get schema.

The read size in ExecuteSQL is smaller because the data is serialized as Avro, which encodes rows compactly; it grows after ConvertJSONToSQL because each FlowFile then carries a full SQL statement.
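The size difference can be sketched with a stdlib-only comparison. This is not real Avro encoding; the binary row layout below is just an assumption for illustration of why a schema-aware binary format is smaller than one INSERT statement per row:

```python
# Rough illustration (not actual Avro): a schema-aware binary encoding
# writes only the values, while SQL text repeats keywords and column
# names for every row.
import struct

rows = [(i, f"user{i}", i * 1.5) for i in range(1000)]

# Binary encoding: the schema is known up front, so only values are written.
binary = b"".join(
    struct.pack("<i", r[0]) + r[1].encode() + b"\x00" + struct.pack("<d", r[2])
    for r in rows
)

# Text encoding: every row repeats the INSERT boilerplate.
sql = "\n".join(
    f"INSERT INTO users (id, name, score) VALUES ({r[0]}, '{r[1]}', {r[2]})"
    for r in rows
)

print(len(binary), len(sql))  # the binary form is several times smaller
```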

Which version of Apache NiFi are you using? If you can use 1.3.0, I'd recommend using QueryRecord to transform the data, and PutDatabaseRecord to store the rows in the destination table.

If you're not familiar with the Record data model, Mark's blog would be helpful:

Once you know how Records work, you can do interesting things such as transforming data by running SQL against a FlowFile:
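For example, a QueryRecord query selects from the special FLOWFILE table, which refers to the records in the incoming FlowFile. The column names below are hypothetical, just to show the shape of such a query:

```sql
-- Hypothetical QueryRecord query; FLOWFILE is the incoming FlowFile's
-- record set, and the column names are illustrative only.
SELECT id, name, amount
FROM FLOWFILE
WHERE amount > 100
```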

With the record data model, you don't have to split each row or convert rows to SQL statements one by one. Instead, a single FlowFile containing multiple records (rows) can be passed between processors and processed more efficiently.
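The batching idea behind PutDatabaseRecord can be sketched with plain Python and sqlite3. The table and column names are hypothetical; the point is that one FlowFile's records become a single prepared, batched insert rather than one statement per row:

```python
# Sketch of record-style batching (hypothetical table/columns): the
# statement is prepared once and each record is bound to it, instead of
# generating and executing a separate INSERT per row.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT, score REAL)")

# One "FlowFile" worth of records.
records = [(i, f"user{i}", i * 1.5) for i in range(1000)]

# Batched insert: executemany binds each row to the prepared statement.
conn.executemany("INSERT INTO users VALUES (?, ?, ?)", records)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]
print(count)  # 1000
```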


On Thu, Sep 28, 2017 at 4:48 AM, tzhu <> wrote:
> Hi Koji,
> Thank you so much for your help! I didn't specify the 'Catalog Name' and
> 'Schema Name' before, and now the error is fixed.
> Now I have another question: after being converted to a different
> data type, the data gets very large. The read size in ExecuteSQL is
> about 200 MB, but the size after ConvertJSONToSQL becomes 1 GB. Is there any
> way to reduce the size? I'm thinking about two solutions: one is to use
> other, more efficient processors; the other is to split the input into small
> pieces, maybe taking 1000 rows at a time to do the transformation.
> Hope this makes sense to you.
> Thank you,
> Tina
