flink-user mailing list archives

From Luqman Ghani <lgsa...@gmail.com>
Subject Specifying Schema dynamically
Date Sun, 12 Feb 2017 08:30:08 GMT

I hope everyone is doing well.

I have a use case where we infer the schema from file headers and other
information. In Flink, we can specify the schema of a stream with case
classes and tuples. With tuples, we cannot give names to fields, and with
case classes we would have to generate them on the fly. Is there any way
of specifying the schema to Flink as a Map[String, Any], so that it can
infer the schema from this map?

Like if a file has a header: id, first_name, last_name, last_login
and we infer schema as: Int, String, String, Long

Can we specify it as Map[String, Any]("id" -> Int, "first_name" -> String,
"last_name" -> String, "last_login" -> Long)?
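(For illustration only, this is not a Flink API: a minimal plain-Scala sketch of the idea, where the inferred schema is kept as field-name/parser pairs and each line is turned into a Map[String, Any] row. The object and method names here are hypothetical.)

```scala
object SchemaSketch {
  // Inferred schema: field name paired with a parser for the raw string value.
  // Corresponds to the header id, first_name, last_name, last_login and the
  // inferred types Int, String, String, Long.
  val schema: Seq[(String, String => Any)] = Seq(
    ("id",         (s: String) => s.toInt),
    ("first_name", (s: String) => s),
    ("last_name",  (s: String) => s),
    ("last_login", (s: String) => s.toLong)
  )

  // Parse one comma-separated line into a Map[String, Any] row
  // according to the schema above.
  def parse(line: String): Map[String, Any] =
    schema.zip(line.split(",").toSeq).map {
      case ((name, conv), raw) => name -> conv(raw.trim)
    }.toMap
}
```

A row parsed this way carries its field names, but Flink's type system sees it only as a generic Map, so the field types are not visible to Flink the way they would be for a case class or tuple.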

We want to use keyBy with field names instead of their indices. I hope
there is a way :)
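(As far as I know, keyBy with field names works for tuples, case classes, and POJOs; for Map-based rows one would pass a key-extractor function (a KeySelector) instead. A plain-Scala sketch of that idea, with no Flink dependency and hypothetical names, grouping standing in for keyBy:)

```scala
object KeyByName {
  type Row = Map[String, Any]

  // Key extractor for the named field. Flink's keyBy can take a function
  // like this one in place of a field name or index.
  def keyOf(field: String): Row => Any = row => row(field)

  // Stand-in for stream.keyBy(...): group rows by the named field.
  def groupByField(rows: Seq[Row], field: String): Map[Any, Seq[Row]] =
    rows.groupBy(keyOf(field))
}
```

The downside is that the key type is Any, so the name-to-type mapping from the inferred schema is not checked at compile time.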

I was looking into dynamically creating case classes in Scala using
scala-reflect, but I'm facing problems getting that generated class and
forwarding it to the Flink program.

