flink-dev mailing list archives

From Timo Walther <twal...@apache.org>
Subject Re: Move Row, RowInputFormat to core package
Date Fri, 25 Nov 2016 13:30:10 GMT
Hi Anton,

I would also support the idea of moving Row and RowTypeInfo to Flink 
core. I think there are many real-world use cases where a 
variable-length record that supports null values is required. However, I 
think those classes need to be reworked first. They should not 
depend on Scala-related things.
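
A minimal sketch of what such a variable-length, null-tolerant record could 
look like once it lives in flink-core (the class shape and method names below 
are illustrative assumptions, not the actual API):

// Sketch only: a Row-like record of fixed arity whose fields are plain
// Objects, so any position may legitimately hold null.
public final class Row implements java.io.Serializable {

    private final Object[] fields;

    public Row(int arity) {
        this.fields = new Object[arity];
    }

    public int getArity() {
        return fields.length;
    }

    public Object getField(int pos) {
        return fields[pos];
    }

    public void setField(int pos, Object value) {
        fields[pos] = value;  // null values are allowed by design
    }
}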

RowTypeInfo should not inherit from CaseClassTypeInfo; the current 
workaround with dummy field names is hacky anyway. Row should not 
inherit from Scala classes.
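
One possible shape for a RowTypeInfo that no longer extends CaseClassTypeInfo 
is a plain Java type that keeps only the field types and optional field names, 
generating "f0", "f1", ... as defaults instead of requiring dummy names. This 
is a sketch under that assumption, not the real Flink class:

import org.apache.flink.api.common.typeinfo.TypeInformation;

// Sketch only: the essential state is the field types plus optional names.
public class RowTypeInfoSketch {

    private final TypeInformation<?>[] fieldTypes;
    private final String[] fieldNames;

    public RowTypeInfoSketch(TypeInformation<?>... fieldTypes) {
        this(fieldTypes, defaultNames(fieldTypes.length));
    }

    public RowTypeInfoSketch(TypeInformation<?>[] fieldTypes, String[] fieldNames) {
        if (fieldTypes.length != fieldNames.length) {
            throw new IllegalArgumentException("number of field types and names must match");
        }
        this.fieldTypes = fieldTypes;
        this.fieldNames = fieldNames;
    }

    // Generated defaults replace the dummy names currently forced by CaseClassTypeInfo.
    private static String[] defaultNames(int arity) {
        String[] names = new String[arity];
        for (int i = 0; i < arity; i++) {
            names[i] = "f" + i;
        }
        return names;
    }

    public int getArity() { return fieldTypes.length; }

    public TypeInformation<?> getTypeAt(int pos) { return fieldTypes[pos]; }

    public String[] getFieldNames() { return fieldNames.clone(); }
}

In a full implementation this class would still have to extend CompositeType 
and provide a serializer, but nothing in it would need to come from the Scala API.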

Regards,
Timo

On 24/11/16 at 16:46, Anton Solovev wrote:
> Hello,
>
>
>
> In Scala, case classes can store a huge number of fields, which is really helpful for reading
> wide CSV files, but they are only used in the Table API.
>
> What about this issue (https://issues.apache.org/jira/browse/FLINK-2186)? Should we use
> the Table API in the machine learning library?
>
> To solve the issue, #readCsvFile could generate a RowInputFormat.
>
> For convenience, I added another constructor to RowTypeInfo (https://github.com/apache/flink/compare/master...tonycox:FLINK-2186-x).
>
> What do you think about adding some Scala and moving Row to Flink core?
>
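
A rough sketch of the idea in the quoted message: let the CSV read path produce 
Row records instead of tuples or case classes, using explicit field types (and, 
with the extra constructor, field names taken from a CSV header). The 
RowCsvInputFormat wiring in the comments is an assumption for illustration, not 
the API the linked branch actually adds:

import org.apache.flink.api.common.typeinfo.BasicTypeInfo;
import org.apache.flink.api.common.typeinfo.TypeInformation;

public class RowCsvSketch {

    public static void main(String[] args) {
        // Field types of a wide CSV file; with Row the arity is not limited
        // by a fixed tuple arity and missing values can simply stay null.
        TypeInformation<?>[] fieldTypes = new TypeInformation<?>[] {
            BasicTypeInfo.STRING_TYPE_INFO,
            BasicTypeInfo.INT_TYPE_INFO,
            BasicTypeInfo.DOUBLE_TYPE_INFO
        };

        // With the proposed extra constructor, names could come from the
        // CSV header instead of generated "f0", "f1", ... defaults.
        String[] fieldNames = {"name", "age", "score"};

        // In the proposal, env.readCsvFile(path) would internally build a
        // Row-based input format from this type information, for example:
        //
        //   RowCsvInputFormat format =                          // hypothetical
        //       new RowCsvInputFormat(new Path(path), fieldTypes);
        //   DataSet<Row> rows =
        //       env.createInput(format, new RowTypeInfo(fieldTypes, fieldNames));
        //
        // so downstream code (e.g. the machine learning use case behind
        // FLINK-2186) works on Rows rather than large case classes.
        System.out.println("Configured " + fieldTypes.length + " CSV fields: "
            + String.join(", ", fieldNames));
    }
}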

