flink-dev mailing list archives

From Robert Metzger <rmetz...@apache.org>
Subject Re: Extend Travis-CI build matrix
Date Thu, 28 Aug 2014 12:38:06 GMT
Oh yes, you are right. I had not thought about this, but it's indeed a nice
way to catch wrong API usage.
I'm going to use JDK 6 to run the hadoop2 profile, because some modules
(hbase and yarn) are disabled in the hadoop1 profile. This ensures that we
have at least one build with all modules on an actual JDK 6.
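
For illustration (a hypothetical class, not from the Flink sources): code
like the following compiles fine with "-source 1.6 -target 1.6" on a newer
JDK, but fails at runtime on an actual JDK 6, which is exactly the kind of
wrong API usage Sean's bootclasspath remark below is about.

    // javac resolves classes against the running JDK's class library unless
    // -bootclasspath points at a JDK 6 rt.jar, so setting the language level
    // to 6 alone does not prevent the use of newer APIs.
    import java.nio.file.Files;   // java.nio.file exists only since Java 7
    import java.nio.file.Paths;

    public class CrossCompileTrap {
        public static void main(String[] args) {
            // Runs on JDK 7+, but throws NoClassDefFoundError on a JDK 6 JVM,
            // even though the bytecode is class-file version 50 (Java 6).
            System.out.println(Files.exists(Paths.get("/tmp")));
        }
    }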


I'm going to work on this issue at a later point. I have the impression
that I've been working a bit too much on the build / infrastructure side
recently (I haven't seen much Java code in the last three days :( )


On Wed, Aug 27, 2014 at 7:58 PM, Sean Owen <srowen@gmail.com> wrote:

> Sure, if you can see a good reason to add 1-2 more, go for it. Yes, you
> certainly want to cover all of these Hadoop and JDK versions at least
> once for the reasons you give.
>
> PS you won't catch, for example, code that uses Java 7+ classes or
> methods just because the language level is set to 6, unless you set
> the bootclasspath with Maven. Cross-compiling is a little bit dangerous
> that way. But you're covered, since you have at least one Java 6-only
> build that is actually run with JDK 6.
>
> On Wed, Aug 27, 2014 at 6:20 PM, Robert Metzger <rmetzger@apache.org>
> wrote:
> > @Sean: We are able to build Hadoop 1.2.1 with Java 8 because we always
> > set the compilation level to Java 6.
> >
> > I can remember two instances where we found Java-specific issues:
> > 1) The javadoc tool of Java 8 checks the correctness of the HTML in
> > source comments, so our Java 8 builds failed.
> > 2) Oracle's Java 6 has a compiler bug that leads to null pointer
> > exceptions during compilation.
> > Both issues were discovered independently of the Hadoop version, so we
> > should be fine.
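> >
> > For illustration (a hypothetical snippet, not from the Flink sources),
> > issue 1) is triggered by javadoc comments that the JDK 6/7 javadoc
> > accepted but that JDK 8's doclint rejects as malformed HTML:
> >
> >     public class DoclintExample {
> >         /**
> >          * Returns a & b if a < b, otherwise 0.
> >          * (The bare '&' and '<' in the line above make JDK 8's javadoc
> >          * fail with errors like "bad HTML entity" and "malformed HTML";
> >          * writing &amp; and &lt; instead fixes the build.)
> >          */
> >         public int combine(int a, int b) {
> >             return a < b ? (a & b) : 0;
> >         }
> >     }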
> >
> > I'm probably going for 5 parallel builds just because it will not add any
> > additional waiting time.
> >
> > Thank you for the feedback. I'll try and see how I can configure
> > Travis to reflect these build settings:
> > https://issues.apache.org/jira/browse/FLINK-1072
> >
> > -- Robert
> >
> >
> > On Wed, Aug 27, 2014 at 4:31 PM, Stephan Ewen <sewen@apache.org> wrote:
> >
> >> Sounds reasonable.
> >>
> >> Since Travis runs 5 concurrent builds, we can add one more without
> >> adding extra time.
> >>
> >> I would suggest adding (1.2.1 - Java 7) -> I suspect that combination
> >> is still used quite a bit.
> >>
> >> Stephan
> >>
> >>
> >>
> >> On Wed, Aug 27, 2014 at 12:06 PM, Sean Owen <srowen@gmail.com> wrote:
> >>
> >> > I think the most important thing is building at least once against the
> >> > 4 Hadoop versions, and building at least once against the 3 JDK
> >> > versions. It's very unlikely that a particular JDK + Hadoop version
> >> > fails to compile, while the same JDK with another Hadoop, or the same
> >> > Hadoop with another JDK, does.
> >> >
> >> > I think you could get away with 4:
> >> >
> >> > 1.2.1 - 6
> >> > 2.0.0-alpha - 6
> >> > 2.2.0 - 7
> >> > 2.5.0 - 8
> >> >
> >> > This at least pairs old JDKs with old Hadoop. I am not sure Hadoop <
> >> > 2.2 even works reliably with Java 7, for example, and testing Java 8 +
> >> > Hadoop 1.2.1 is probably pointless.
> >> >
> >> > You can add back a few more pairs here and there if this feels too
> >> > sparse.
> >> >
> >> >
> >> > On Wed, Aug 27, 2014 at 10:40 AM, Robert Metzger <rmetzger@apache.org>
> >> > wrote:
> >> > > Hi guys,
> >> > >
> >> > > While creating the 0.6-incubating release, I noticed that build
> >> > > issues are often triggered by changing dependencies.
> >> > > In particular, we allow users to set the version of the Hadoop
> >> > > dependency.
> >> > >
> >> > > Right now, we test the following variants:
> >> > >
> >> > > (oraclejdk8, oraclejdk7, openjdk6) x (hadoop 1.2.1, hadoop 2.2.0)
> >> > >
> >> > > By accident, I found out that the recently merged streaming
> >> > > component does not build with hadoop 2.4.0 as a dependency
> >> > > (https://issues.apache.org/jira/browse/FLINK-1065).
> >> > >
> >> > > I'm suggesting to add the following versions into the pool of Hadoop
> >> > > versions we test against:
> >> > > 1) "hadoop 2.0.0-alpha"
> >> > > 2) "hadoop 2.5.0"
> >> > >
> >> > > 1) is going to be the replacement for the "cdh4" package, and I
> >> > > think we should test versions we are going to ship with releases
> >> > > (https://issues.apache.org/jira/browse/FLINK-1068).
> >> > > 2) is the current stable Hadoop version. I think we should test
> >> > > against hadoop 2.2.0 and the latest stable Hadoop version.
> >> > >
> >> > > Adding these two versions would result in 3x4 = 12 builds per
> >> > > push / pull request, which is a lot, given that we can only run 5
> >> > > builds in parallel.
> >> > > Therefore, I'm suggesting to add just 2 builds with "oraclejdk8"
> >> > > and the two new hadoop versions.
> >> > >
> >> > > Opinions?
> >> > >
> >> > >
> >> > > -- Robert
> >> >
> >>
>
