flink-dev mailing list archives

From Robert Metzger <rmetz...@apache.org>
Subject Drop support for CDH4 / Hadoop 2.0.0-alpha
Date Thu, 26 Feb 2015 16:57:52 GMT

I'm currently working on https://issues.apache.org/jira/browse/FLINK-1605
and it's a hell of a mess.

I got almost everything working, except for the Hadoop 2.0.0-alpha profile.
The profile exists because Google protobuf has a different version in that
Hadoop release.
Since Maven sets the protobuf version for the entire project to the older
version, we have to use an older Akka version, which is causing test failures.
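For illustration, the profile would look roughly like this sketch. The property name and version number here are assumptions for illustration, not the actual Flink pom:

```xml
<!-- Illustrative sketch only; property name and version are assumptions. -->
<profile>
  <id>hadoop-2.0.0-alpha</id>
  <properties>
    <!-- Hadoop 2.0.0-alpha ships an older protobuf line than later 2.x
         releases, so the profile pins the project-wide protobuf version. -->
    <protobuf.version>2.4.1</protobuf.version>
  </properties>
</profile>
```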

The logical conclusion from that would be shading Hadoop's protobuf version
into the Hadoop jars. That by itself is working, however it's not working
for the "flink-yarn-tests".
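Shading here means relocating protobuf's packages to a private namespace inside the Hadoop dependency jars, so Hadoop's copy can no longer clash with the version Akka needs. A minimal maven-shade-plugin sketch of that relocation; the shaded package prefix is an assumption, not the actual Flink setup:

```xml
<!-- Sketch of relocating Hadoop's protobuf; shadedPattern is illustrative. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <artifactSet>
          <includes>
            <include>com.google.protobuf:protobuf-java</include>
          </includes>
        </artifactSet>
        <relocations>
          <relocation>
            <pattern>com.google.protobuf</pattern>
            <shadedPattern>org.apache.flink.hadoop.shaded.com.google.protobuf</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```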

I think I can also solve the issue with the flink-yarn-tests, but it would
be a very dirty hack (either injecting shaded code into the Failsafe test
classpath or putting test code into src/main).

But the general question remains: Are we willing to continue spending a lot
of time on maintaining the profile?
Till has spent a lot of time recently fixing failing test cases for that old
Akka version, I've spent almost two days now on getting the
shading/dependencies right, and I'm sure we'll keep having trouble with
the profile.

Therefore, I was wondering if this is the right time to drop support for
CDH4 / Hadoop 2.0.0-alpha.

