flink-dev mailing list archives

From 时某人 <shijinkui...@163.com>
Subject Re:Re: [DISCUSS] how choose Scala and Java
Date Thu, 08 Sep 2016 00:54:13 GMT


Hi Till, thanks for your clear reply.


In fact, the Java and Scala APIs are not well coordinated. Users are easily confused
by classes that share the same name across the two APIs, such as `StreamExecutionEnvironment`.
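
To make the confusion concrete, here is a minimal sketch (the two package names are
the real Flink ones as far as I can tell; the rest of the code is only illustrative):

    // Scala API: the wildcard import also brings in the implicit TypeInformation
    // that the Scala DataStream operations need.
    import org.apache.flink.streaming.api.scala._
    // Java API class with the same simple name, renamed locally to avoid ambiguity.
    import org.apache.flink.streaming.api.environment.{StreamExecutionEnvironment => JavaStreamEnv}

    object NameClashSketch {
      def main(args: Array[String]): Unit = {
        // Without the rename above, the bare name StreamExecutionEnvironment is
        // ambiguous whenever both packages are imported with wildcards.
        val env = StreamExecutionEnvironment.getExecutionEnvironment // the Scala variant
        env.fromElements(1, 2, 3).map(_ * 2).print()
        env.execute("name-clash-sketch")
      }
    }
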
Since Scala can do almost everything Java can, why not use Scala only? Kafka's API looks good: https://github.com/apache/kafka/blob/trunk/core/src/main/scala/kafka/javaapi/consumer/SimpleConsumer.scala
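
Their `kafka.javaapi` classes are thin Scala wrappers around the core Scala
implementation, roughly this pattern (the names below are made up for illustration;
only the layout mirrors SimpleConsumer.scala):

    // Core implementation written once, in Scala, with Scala collections.
    package example.core {
      class WordCounter {
        def count(lines: Seq[String]): Map[String, Int] =
          lines.flatMap(_.split("\\s+")).groupBy(identity).map { case (w, ws) => (w, ws.size) }
      }
    }

    // Thin Java-facing wrapper: the same functionality, but java.util types in
    // the signatures so Java callers never see Scala collections.
    package example.javaapi {
      import scala.collection.JavaConverters._

      class WordCounter {
        private val underlying = new example.core.WordCounter
        def count(lines: java.util.List[String]): java.util.Map[String, Integer] =
          underlying.count(lines.asScala.toSeq)
            .map { case (w, n) => (w, Integer.valueOf(n)) }
            .asJava
      }
    }
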
It is rather hard to maintain the Java and Scala APIs at the same time: two separate
source packages mean two separate implementations to keep in sync.
Can we reconsider this basic design?

At 2016-09-07 15:26:55, "Till Rohrmann" <trohrmann@apache.org> wrote:
>I think you're referring to the implementation of some of Flink's modules,
>right?
>
>If that is the case, then the rule of thumb is that we want to use Java for
>the low-level runtime implementations. For the API implementations it is a
>case-by-case decision. The Scala API, for example, is of course implemented
>in Scala. For other APIs we tend to use Scala only if it gives a clear
>advantage over a Java implementation.
>
>If your question is more like which Flink API to use (either Java or Scala
>API), then it's completely up to you and your preferences.
>
>Cheers,
>Till
>
>On Wed, Sep 7, 2016 at 8:21 AM, 时某人 <shijinkui666@163.com> wrote:
>
>> Scala and Java are mixed within the modules, and some Flink APIs are indeed
>> confusing.
>> What was the rule for choosing between Scala and Java when the current APIs
>> were first implemented?
>>
>>
>> Thanks