Date: Tue, 14 Oct 2014 16:35:36 +0200
Subject: Re: Forced to use Solution Set in Step Function
From: Fabian Hueske
To: user@flink.incubator.apache.org

Hi,

I'm not super familiar with the iterations, but my guess would be that the
filter is not evaluated as part of the iteration. Since it is not connected
to the workset, the filter is not part of the loop and is evaluated once,
outside the loop, where no superstep number is available.
I guess moving the filter outside of the loop would give the same error.

Cheers,
Fabian

2014-10-14 16:18 GMT+02:00 Maximilian Alber <alber.maximilian@gmail.com>:
> Hmm, or maybe not. With this code I get some strange error:
>
> def createPlan_find_center(env: ExecutionEnvironment) = {
>   val X = env readTextFile config.xFile map
>     {Vector.parseFromString(config.dimensions, _)}
>   val residual = env readTextFile config.yFile map
>     {Vector.parseFromString(_)}
>   val randoms = env readTextFile config.randomFile map
>     {Vector.parseFromString(_)}
>
>   val residual_2 = residual * residual
>   val ys = (residual_2 sumV) * (randoms filter {_.id == 0})
>
>   val emptyDataSet = env.fromCollection[Vector](Seq())
>   val sumVector = env.fromCollection(Seq(Vector.zeros(config.dimensions)))
>   val cumSum = emptyDataSet.iterateDelta(sumVector, config.N, Array("id")) {
>     (solutionset, old_sum) =>
>       val current = residual_2 filter (new RichFilterFunction[Vector] {
>         def filter(x: Vector) =
>           x.id == getIterationRuntimeContext.getSuperstepNumber
>       })
>       val sum = VectorDataSet.add(old_sum, current)
>
>       (sum map (new RichMapFunction[Vector, Vector] {
>         def map(x: Vector) =
>           new Vector(getIterationRuntimeContext.getSuperstepNumber, x.values)
>       }),
>        sum)
>   }
>
> Error:
> 10/14/2014 15:57:35: Job execution switched to status RUNNING
> 10/14/2014 15:57:35: DataSource ([-1 0.0]) (1/1) switched to SCHEDULED
> 10/14/2014 15:57:35: DataSource ([-1 0.0]) (1/1) switched to DEPLOYING
> 10/14/2014 15:57:35: DataSource ([]) (1/1) switched to SCHEDULED
> 10/14/2014 15:57:35: DataSource ([]) (1/1) switched to DEPLOYING
> 10/14/2014 15:57:35: CHAIN DataSource (TextInputFormat (/tmp/tmpBhOsLd) - UTF-8) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to SCHEDULED
> 10/14/2014 15:57:35: CHAIN DataSource (TextInputFormat (/tmp/tmpBhOsLd) - UTF-8) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to DEPLOYING
> 10/14/2014 15:57:35: DataSource ([-1 0.0]) (1/1) switched to RUNNING
> 10/14/2014 15:57:35: IterationHead(WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to SCHEDULED
> 10/14/2014 15:57:35: IterationHead(WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to DEPLOYING
> 10/14/2014 15:57:35: CHAIN DataSource (TextInputFormat (/tmp/tmpBhOsLd) - UTF-8) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to RUNNING
> 10/14/2014 15:57:35: DataSource ([]) (1/1) switched to RUNNING
> 10/14/2014 15:57:35: IterationHead(WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to RUNNING
> 10/14/2014 15:57:35: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) -> Filter (bumpboost.BumpBoost$$anonfun$8$$anon$1) (1/1) switched to SCHEDULED
> 10/14/2014 15:57:35: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) -> Filter (bumpboost.BumpBoost$$anonfun$8$$anon$1) (1/1) switched to DEPLOYING
> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to SCHEDULED
> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to DEPLOYING
> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) -> Filter (bumpboost.BumpBoost$$anonfun$8$$anon$1) (1/1) switched to RUNNING
> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to RUNNING
> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) -> Filter (bumpboost.BumpBoost$$anonfun$8$$anon$1) (1/1) switched to FAILED
> java.lang.IllegalStateException: This stub is not part of an iteration step function.
>     at org.apache.flink.api.common.functions.AbstractRichFunction.getIterationRuntimeContext(AbstractRichFunction.java:59)
>     at bumpboost.BumpBoost$$anonfun$8$$anon$1.filter(BumpBoost.scala:40)
>     at bumpboost.BumpBoost$$anonfun$8$$anon$1.filter(BumpBoost.scala:39)
>     at org.apache.flink.api.java.operators.translation.PlanFilterOperator$FlatMapFilter.flatMap(PlanFilterOperator.java:47)
>     at org.apache.flink.runtime.operators.chaining.ChainedFlatMapDriver.collect(ChainedFlatMapDriver.java:79)
>     at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:78)
>     at org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5.join(joinDataSet.scala:184)
>     at org.apache.flink.runtime.operators.hash.BuildFirstHashMatchIterator.callWithNextKey(BuildFirstHashMatchIterator.java:140)
>     at org.apache.flink.runtime.operators.MatchDriver.run(MatchDriver.java:148)
>     at org.apache.flink.runtime.operators.RegularPactTask.run(RegularPactTask.java:484)
>     at org.apache.flink.runtime.operators.RegularPactTask.invoke(RegularPactTask.java:359)
>     at org.apache.flink.runtime.execution.RuntimeEnvironment.run(RuntimeEnvironment.java:235)
>     at java.lang.Thread.run(Thread.java:745)
>
> 10/14/2014 15:57:36: Job execution switched to status FAILING
> 10/14/2014 15:57:36: DataSource ([]) (1/1) switched to CANCELING
> 10/14/2014 15:57:36: CHAIN DataSource (TextInputFormat (/tmp/tmpBhOsLd) - UTF-8) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to CANCELING
> 10/14/2014 15:57:36: DataSource ([-1 0.0]) (1/1) switched to CANCELING
> 10/14/2014 15:57:36: IterationHead(WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to CANCELING
> 10/14/2014 15:57:36: Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to CANCELED
> 10/14/2014 15:57:36: DataSink(TextOutputFormat (/tmp/tmplSYJ7S) - UTF-8) (1/1) switched to CANCELED
> 10/14/2014 15:57:36: Sync (WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to CANCELED
> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to CANCELING
> 10/14/2014 15:57:36: Map (bumpboost.BumpBoost$$anonfun$8$$anon$2) (1/1) switched to CANCELED
> 10/14/2014 15:57:36: SolutionSet Delta (1/1) switched to CANCELED
> 10/14/2014 15:57:36: DataSource ([]) (1/1) switched to CANCELED
> 10/14/2014 15:57:36: CHAIN DataSource (TextInputFormat (/tmp/tmpBhOsLd) - UTF-8) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to CANCELED
> 10/14/2014 15:57:36: DataSource ([-1 0.0]) (1/1) switched to CANCELED
> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to CANCELED
> 10/14/2014 15:57:36: IterationHead(WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to CANCELED
> 10/14/2014 15:57:36: Job execution switched to status FAILED
> Error: The program execution failed: java.lang.IllegalStateException: This stub is not part of an iteration step function.
>     at org.apache.flink.api.common.functions.AbstractRichFunction.getIterationRuntimeContext(AbstractRichFunction.java:59)
>     at bumpboost.BumpBoost$$anonfun$8$$anon$1.filter(BumpBoost.scala:40)
>     at bumpboost.BumpBoost$$anonfun$8$$anon$1.filter(BumpBoost.scala:39)
>     at org.apache.flink.api.java.operators.translation.PlanFilterOperator$FlatMapFilter.flatMap(PlanFilterOperator.java:47)
>     at org.apache.flink.runtime.operators.chaining.ChainedFlatMapDriver.collect(ChainedFlatMapDriver.java:79)
>     at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:78)
>     at org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5.join(joinDataSet.scala:184)
>     at org.apache.flink.runtime.operators.hash.BuildFirstHashMatchIterator.callWithNextKey(BuildFirstHashMatchIterator.java:140)
>     at org.apache.flink.runtime.operators.MatchDriver.run(MatchDriver.java:148)
>     at org.apache.flink.runtime.operators.RegularPactTask.run(RegularPactTask.java:484)
>     at org.apache.flink.runtime.operators.RegularPactTask.invoke(RegularPactTask.java:359)
>     at org.apache.flink.runtime.execution.RuntimeEnvironment.run(RuntimeEnvironment.java:235)
>     at java.lang.Thread.run(Thread.java:745)
>
> On Tue, Oct 14, 2014 at 3:32 PM, Maximilian Alber <alber.maximilian@gmail.com> wrote:
>> Should work now.
>> Cheers
>>
>> On Fri, Oct 10, 2014 at 3:38 PM, Maximilian Alber <alber.maximilian@gmail.com> wrote:
>>> Ok, thanks.
>>> Please let me know when it is fixed.
>>>
>>> Cheers
>>> Max
>>>
>>> On Fri, Oct 10, 2014 at 1:34 PM, Stephan Ewen wrote:
>>>> Thank you, I will look into that...
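[Editor's note: a minimal sketch of the rework Fabian's diagnosis suggests. This is untested and not from the thread; it reuses the Vector, VectorDataSet, config, emptyDataSet and sumVector definitions from the code above. The idea is to put the filter's input on a dataflow path that starts at the workset (here via cross with old_sum, which holds a single element), so the operator runs inside the iteration and getIterationRuntimeContext is defined.]

```scala
val cumSum = emptyDataSet.iterateDelta(sumVector, config.N, Array("id")) {
  (solutionset, old_sum) =>
    // Derive the filter input from the workset: old_sum has one element,
    // so crossing it with residual_2 just re-emits residual_2, but now
    // inside the loop, where the superstep number is available.
    val current = (old_sum cross residual_2)
      .map { pair => pair._2 }
      .filter(new RichFilterFunction[Vector] {
        def filter(x: Vector) =
          x.id == getIterationRuntimeContext.getSuperstepNumber
      })
    val sum = VectorDataSet.add(old_sum, current)
    (sum map (new RichMapFunction[Vector, Vector] {
      def map(x: Vector) =
        new Vector(getIterationRuntimeContext.getSuperstepNumber, x.values)
    }),
     sum)
}
```

An alternative, if the cross is unwanted, would be to drop the rich function entirely and carry the current index in the workset elements themselves, filtering residual_2 with a plain join on that index.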