Date: Fri, 15 May 2015 09:32:02 +0200
From: Aljoscha Krettek
To: dev@flink.apache.org
Subject: Re: Hello Everyone

We'll look at why it failed and then decide whether it's good. So for those
KafkaITCase failures we know that it doesn't matter right now.

On Fri, May 15, 2015 at 12:56 AM, Lokesh Rajaram wrote:
> This almost worked. Of the 5 build jobs, four passed and one failed.
> What's the acceptance criteria for a pull request? Do I need to build again
> to get all 5 build jobs passing?
>
> Thanks,
> Lokesh
>
> On Thu, May 14, 2015 at 8:50 AM, Robert Metzger wrote:
>
>> No, you don't have to wait.
>> The KafkaITCase is not always failing. If you're lucky, it will pass with
>> the next run.
>>
>> On Thu, May 14, 2015 at 5:48 PM, Lokesh Rajaram wrote:
>>
>>> If I understand it correctly, I have to wait for your pull request to be
>>> merged, then I can rebase and trigger the build again. Is that right?
>>>
>>> Thanks Robert, Aljoscha for the super fast reply/help.
>>>
>>> Thanks,
>>> Lokesh
>>>
>>> On Thu, May 14, 2015 at 8:39 AM, Robert Metzger wrote:
>>>
>>>> However, you can only restart runs in your Travis account, not on the
>>>> Apache account (which is also used for validating pull requests).
>>>>
>>>> I opened a pull request a few minutes ago which will reduce the number
>>>> of KafkaITCase failures (there is still one other unresolved issue).
>>>>
>>>> On Thu, May 14, 2015 at 5:37 PM, Aljoscha Krettek wrote:
>>>>
>>>>> Hi,
>>>>> don't worry, there are very few stupid questions. :D
>>>>>
>>>>> The KafkaITCase sometimes fails on Travis; this is currently a known
>>>>> problem. On Travis you can restart the individual runs for a commit
>>>>> in the view of the failed run.
>>>>>
>>>>> Hope that helps.
>>>>>
>>>>> Cheers,
>>>>> Aljoscha
>>>>>
>>>>> On Thu, May 14, 2015 at 5:35 PM, Lokesh Rajaram wrote:
>>>>>> Thanks Aljoscha, Robert. After adding the guava dependency for
>>>>>> flink-spargel I was able to progress further, but now it's failing in
>>>>>> flink-streaming-connectors for the following test case:
>>>>>>
>>>>>> KafkaITCase.brokerFailureTest:936->tryExecute:352 Test failed with:
>>>>>> Job execution failed.
>>>>>>
>>>>>> Any pointers would help me proceed further.
>>>>>> Sorry for a lot of trivial questions; I am just getting started and am
>>>>>> not familiar with the code base.
>>>>>> I tried running it locally and it succeeds; I don't know why it's only
>>>>>> failing in the Travis build. Not sure if I am missing something in my
>>>>>> local environment.
>>>>>>
>>>>>> Thanks,
>>>>>> Lokesh
>>>>>>
>>>>>> On Thu, May 14, 2015 at 1:39 AM, Robert Metzger wrote:
>>>>>>
>>>>>>> I think flink-spargel is missing the guava dependency.
>>>>>>>
>>>>>>> On Thu, May 14, 2015 at 8:18 AM, Aljoscha Krettek wrote:
>>>>>>>
>>>>>>>> @Robert, this seems like a problem with the shading?
>>>>>>>>
>>>>>>>> On Thu, May 14, 2015 at 5:41 AM, Lokesh Rajaram wrote:
>>>>>>>>> Thanks Aljoscha. I was able to make the recommended change and run
>>>>>>>>> the entire test suite locally successfully.
>>>>>>>>> However, the Travis build is failing for pull request
>>>>>>>>> https://github.com/apache/flink/pull/673.
>>>>>>>>>
>>>>>>>>> It's a compilation failure:
>>>>>>>>>
>>>>>>>>> [ERROR] Failed to execute goal
>>>>>>>>> org.apache.maven.plugins:maven-compiler-plugin:3.1:compile
>>>>>>>>> (default-compile) on project flink-spargel: Compilation failure:
>>>>>>>>> [ERROR]
>>>>>>>>> /home/travis/build/apache/flink/flink-staging/flink-spargel/src/main/java/org/apache/flink/spargel/java/VertexCentricIteration.java:[42,30]
>>>>>>>>> package com.google.common.base does not exist
>>>>>>>>>
>>>>>>>>> I can definitely see the package imported in the class, and it
>>>>>>>>> compiles and passes all tests locally.
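
[Editor's note] Robert's diagnosis above (flink-spargel missing the guava
dependency) would typically be addressed with an entry along these lines in
flink-spargel's pom.xml. This is only a sketch; the exact version and scoping
used in Flink's build may differ:

```xml
<!-- Sketch: declare Guava explicitly in flink-spargel/pom.xml.
     The version shown is illustrative, not necessarily what Flink uses. -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>18.0</version>
</dependency>
```

Relying on Guava arriving transitively (or via a shaded parent) can break
exactly as seen here, so declaring the dependency directly in the module that
imports com.google.common.base is the usual fix.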
>>>>>>>>> Anything I am missing here?
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> Lokesh
>>>>>>>>>
>>>>>>>>> On Mon, May 11, 2015 at 1:25 AM, Aljoscha Krettek wrote:
>>>>>>>>>
>>>>>>>>>> I think you can replace Validate.notNull(p) with
>>>>>>>>>> require(p != null, "p is null") (or something like this).
>>>>>>>>>>
>>>>>>>>>> On Mon, May 11, 2015 at 12:27 AM, Lokesh Rajaram wrote:
>>>>>>>>>>> 1. I think I can use require to replace Validate.isTrue.
>>>>>>>>>>> 2. What about Validate.notNull? If require is used, it would throw
>>>>>>>>>>> IllegalArgumentException; if assume or assert is used, it would
>>>>>>>>>>> throw AssertionError, which is not compatible with the current
>>>>>>>>>>> implementation.
>>>>>>>>>>>
>>>>>>>>>>> Please let me know if my understanding is correct. Also, let me
>>>>>>>>>>> know your thoughts.
>>>>>>>>>>>
>>>>>>>>>>> Thanks,
>>>>>>>>>>> Lokesh
>>>>>>>>>>>
>>>>>>>>>>> On Sun, May 10, 2015 at 1:04 AM, Aljoscha Krettek wrote:
>>>>>>>>>>>
>>>>>>>>>>>> I would propose using the methods as Chiwan suggested. If
>>>>>>>>>>>> everyone agrees, I can update the Jira issue.
>>>>>>>>>>>>
>>>>>>>>>>>> On Sun, May 10, 2015 at 6:47 AM, Lokesh Rajaram wrote:
>>>>>>>>>>>>> Thank you for the reference links. Which approach should I take:
>>>>>>>>>>>>> casting, or using the Scala methods?
>>>>>>>>>>>>> If it's the latter, will the JIRA ticket FLINK-1711 be updated
>>>>>>>>>>>>> to reflect it?
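
[Editor's note] The exception-type distinction discussed above can be
sketched in plain Scala. This is a minimal standalone illustration, not Flink
code; `CheckSemantics` and its parameter names are invented for the example:

```scala
// require throws IllegalArgumentException on failure, while assert and
// assume throw AssertionError -- which is why require is the drop-in
// replacement when callers already expect IllegalArgumentException.
object CheckSemantics {
  def withRequire(p: AnyRef): AnyRef = {
    require(p != null, "p must not be null") // IllegalArgumentException
    p
  }

  def withAssert(p: AnyRef): AnyRef = {
    assert(p != null, "p must not be null") // AssertionError
    p
  }
}
```

Note that `assert` calls can also be elided entirely by the compiler flag
`-Xdisable-assertions`, another reason they are a poor fit for argument
validation.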
>>>>>>>>>>>>>
>>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>> Lokesh
>>>>>>>>>>>>>
>>>>>>>>>>>>> On Sat, May 9, 2015 at 8:16 PM, Chiwan Park wrote:
>>>>>>>>>>>>>
>>>>>>>>>>>>>> Hi. There are some problems with using Guava's check methods in
>>>>>>>>>>>>>> Scala. (
>>>>>>>>>>>>>> https://groups.google.com/forum/#!topic/guava-discuss/juwovq26R3k
>>>>>>>>>>>>>> ) You can solve this error simply by casting the last argument
>>>>>>>>>>>>>> to java.lang.Object.
>>>>>>>>>>>>>> But I think we'd better use the `require`, `assume`, and
>>>>>>>>>>>>>> `assert` methods provided by Scala. (
>>>>>>>>>>>>>> http://daily-scala.blogspot.kr/2010/03/assert-require-assume.html
>>>>>>>>>>>>>> ) Because this change affects a lot of other code, we should
>>>>>>>>>>>>>> discuss switching from Guava's methods to Scala's methods.
>>>>>>>>>>>>>>
>>>>>>>>>>>>>> Regards.
>>>>>>>>>>>>>> Chiwan Park (Sent with iPhone)
>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> On May 10, 2015, at 11:49 AM, Lokesh Rajaram
>>>>>>>>>>>>>>> <rajaram.lokesh@gmail.com> wrote:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Hello All,
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I am new to the Flink community and am very excited about the
>>>>>>>>>>>>>>> project and the work you all have been doing. Kudos!!
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I was looking to pick up a starter task. Robert recommended
>>>>>>>>>>>>>>> picking up https://issues.apache.org/jira/browse/FLINK-1711.
>>>>>>>>>>>>>>> Thanks, Robert, for your guidance.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Sorry for a dumb question.
>>>>>>>>>>>>>>> I am done with the code changes, but "mvn verify" is failing
>>>>>>>>>>>>>>> only for the Scala module, as follows:
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> flink/flink-scala/src/main/scala/org/apache/flink/api/scala/joinDataSet.scala:77:
>>>>>>>>>>>>>>> error: ambiguous reference to overloaded definition,
>>>>>>>>>>>>>>> [ERROR] both method checkNotNull in object Preconditions of
>>>>>>>>>>>>>>> type [T](x$1: T, x$2: String, x$3: <repeated...>[Object])T
>>>>>>>>>>>>>>> [ERROR] and method checkNotNull in object Preconditions of
>>>>>>>>>>>>>>> type [T](x$1: T, x$2: Any)T
>>>>>>>>>>>>>>> [ERROR] match argument types ((L, R) => O, String)
>>>>>>>>>>>>>>> [ERROR]     Preconditions.checkNotNull(fun, "Join function
>>>>>>>>>>>>>>> must not be null.")
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> I see the same error for all of the Scala classes I changed.
>>>>>>>>>>>>>>> Any pointers here will be very helpful for me to proceed
>>>>>>>>>>>>>>> further. Please let me know if you need more information.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Thanks in advance for your help and support.
>>>>>>>>>>>>>>>
>>>>>>>>>>>>>>> Thanks,
>>>>>>>>>>>>>>> Lokesh
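
[Editor's note] The two remedies discussed in this thread (type ascription vs.
Scala's require) can be sketched without a Guava dependency by mirroring the
two checkNotNull overloads in a hypothetical `GuavaLike` object. All names
here are invented for illustration; the real overloads live in
com.google.common.base.Preconditions:

```scala
// Hypothetical mirror of Guava's two checkNotNull overloads, illustrating
// the ambiguity scalac reported above and the two fixes discussed.
object GuavaLike {
  def checkNotNull[T](ref: T, messageTemplate: String, args: AnyRef*): T =
    if (ref == null) throw new NullPointerException(messageTemplate) else ref

  def checkNotNull[T](ref: T, message: Any): T =
    if (ref == null) throw new NullPointerException("" + message) else ref
}

object AmbiguityDemo {
  val fun: (Int, Int) => Int = _ + _

  // Fix 1 (casting): ascribing the message to Any (or java.lang.Object)
  // leaves only the (T, Any) overload applicable, resolving the ambiguity.
  val checked = GuavaLike.checkNotNull(fun, "Join function must not be null.": Any)

  // Fix 2 (Scala built-in): require throws IllegalArgumentException.
  require(fun != null, "Join function must not be null.")
}
```

Fix 2 is the direction Chiwan proposed and Aljoscha endorsed for FLINK-1711,
since it removes the Guava call site from Scala code altogether.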