Subject: Re: Forced to use Solution Set in Step Function
From: Fabian Hueske <fhueske@apache.org>
To: user@flink.incubator.apache.org
Date: Tue, 14 Oct 2014 16:57:59 +0200

Yep, I see your point.
Conceptually, all data that changes and affects the result of an iteration
should be part of the workset. Hence, the model implicitly assumes that the
superstep number is part of the workset as well.

I am not familiar with your application, but would it make sense to add the
number as an additional attribute to the workset data set and increase it
manually?
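For example, along these lines (an untested sketch that reuses the names from
your program; the counter bookkeeping, the key-selector join, and what
VectorDataSet.add does with the id field are assumptions on my part):

val cumSum = emptyDataSet.iterateDelta(sumVector, config.N, Array("id")) {
  (solutionset, old_sum) =>
    // Advance the counter by hand: it travels in the workset's id field,
    // so no call to getIterationRuntimeContext is needed. (Depending on
    // the id your sumVector starts with, this may be off by one relative
    // to getSuperstepNumber.)
    val stepped = old_sum map { x => new Vector(x.id + 1, x.values) }

    // Selecting the current element by joining against the workset makes
    // this operator part of the loop's data flow, so it is re-evaluated
    // in every superstep.
    val current = residual_2
      .join(stepped)
      .where(r => r.id)
      .equalTo(ws => ws.id)
      .apply((r, _) => r)

    val sum = VectorDataSet.add(stepped, current)
    (sum, sum)
}

The important part is that everything depending on the step number is derived
from the workset; an operator that hangs only off residual_2 is constant from
the optimizer's point of view and is evaluated once, outside the loop.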
2014-10-14 16:45 GMT+02:00 Maximilian Alber <alber.maximilian@gmail.com>:

> Ok, sounds true, but somehow I would like to execute it inside the
> iteration. So I probably need to do some nonsense work to make it part of
> the workset?
>
> On Tue, Oct 14, 2014 at 4:36 PM, Aljoscha Krettek <aljoscha@apache.org>
> wrote:
>
>> Dammit, you beat me to it. But yes, this is exactly what I was just
>> writing.
>>
>> On Tue, Oct 14, 2014 at 4:35 PM, Fabian Hueske <fhueske@apache.org>
>> wrote:
>>
>> > Hi,
>> >
>> > I'm not super familiar with the iterations, but my guess would be that
>> > the filter is not evaluated as part of the iteration.
>> > Since it is not connected to the workset, the filter is not part of the
>> > loop and is evaluated once, outside of it, where no superstep number is
>> > available.
>> > I guess moving the filter outside of the loop would give the same error.
>> >
>> > Cheers, Fabian
>> >
>> > 2014-10-14 16:18 GMT+02:00 Maximilian Alber <alber.maximilian@gmail.com>:
>> >>
>> >> Hmm, or maybe not. With this code I get some strange error:
>> >>
>> >> def createPlan_find_center(env: ExecutionEnvironment) = {
>> >>   val X = env readTextFile config.xFile map
>> >>     { Vector.parseFromString(config.dimensions, _) }
>> >>   val residual = env readTextFile config.yFile map
>> >>     { Vector.parseFromString(_) }
>> >>   val randoms = env readTextFile config.randomFile map
>> >>     { Vector.parseFromString(_) }
>> >>
>> >>   val residual_2 = residual * residual
>> >>   val ys = (residual_2 sumV) * (randoms filter { _.id == 0 })
>> >>
>> >>   val emptyDataSet = env.fromCollection[Vector](Seq())
>> >>   val sumVector = env.fromCollection(Seq(Vector.zeros(config.dimensions)))
>> >>   val cumSum = emptyDataSet.iterateDelta(sumVector, config.N, Array("id")) {
>> >>     (solutionset, old_sum) =>
>> >>       // fails at runtime: this filter is not part of the iteration's
>> >>       // data flow, so no iteration runtime context is available
>> >>       val current = residual_2 filter (new RichFilterFunction[Vector] {
>> >>         def filter(x: Vector) =
>> >>           x.id == getIterationRuntimeContext.getSuperstepNumber
>> >>       })
>> >>       val sum = VectorDataSet.add(old_sum, current)
>> >>
>> >>       (sum map (new RichMapFunction[Vector, Vector] {
>> >>         def map(x: Vector) =
>> >>           new Vector(getIterationRuntimeContext.getSuperstepNumber, x.values)
>> >>       }),
>> >>       sum)
>> >>   }
>> >>
>> >> Error:
>> >> 10/14/2014 15:57:35: Job execution switched to status RUNNING
>> >> 10/14/2014 15:57:35: DataSource ([-1 0.0]) (1/1) switched to SCHEDULED
>> >> 10/14/2014 15:57:35: DataSource ([-1 0.0]) (1/1) switched to DEPLOYING
>> >> 10/14/2014 15:57:35: DataSource ([]) (1/1) switched to SCHEDULED
>> >> 10/14/2014 15:57:35: DataSource ([]) (1/1) switched to DEPLOYING
>> >> 10/14/2014 15:57:35: CHAIN DataSource (TextInputFormat (/tmp/tmpBhOsLd) - UTF-8) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to SCHEDULED
>> >> 10/14/2014 15:57:35: CHAIN DataSource (TextInputFormat (/tmp/tmpBhOsLd) - UTF-8) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to DEPLOYING
>> >> 10/14/2014 15:57:35: DataSource ([-1 0.0]) (1/1) switched to RUNNING
>> >> 10/14/2014 15:57:35: IterationHead(WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to SCHEDULED
>> >> 10/14/2014 15:57:35: IterationHead(WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to DEPLOYING
>> >> 10/14/2014 15:57:35: CHAIN DataSource (TextInputFormat (/tmp/tmpBhOsLd) - UTF-8) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to RUNNING
>> >> 10/14/2014 15:57:35: DataSource ([]) (1/1) switched to RUNNING
>> >> 10/14/2014 15:57:35: IterationHead(WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to RUNNING
>> >> 10/14/2014 15:57:35: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) -> Filter (bumpboost.BumpBoost$$anonfun$8$$anon$1) (1/1) switched to SCHEDULED
>> >> 10/14/2014 15:57:35: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) -> Filter (bumpboost.BumpBoost$$anonfun$8$$anon$1) (1/1) switched to DEPLOYING
>> >> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to SCHEDULED
>> >> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to DEPLOYING
>> >> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) -> Filter (bumpboost.BumpBoost$$anonfun$8$$anon$1) (1/1) switched to RUNNING
>> >> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to RUNNING
>> >> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) -> Filter (bumpboost.BumpBoost$$anonfun$8$$anon$1) (1/1) switched to FAILED
>> >> java.lang.IllegalStateException: This stub is not part of an iteration step function.
>> >>   at org.apache.flink.api.common.functions.AbstractRichFunction.getIterationRuntimeContext(AbstractRichFunction.java:59)
>> >>   at bumpboost.BumpBoost$$anonfun$8$$anon$1.filter(BumpBoost.scala:40)
>> >>   at bumpboost.BumpBoost$$anonfun$8$$anon$1.filter(BumpBoost.scala:39)
>> >>   at org.apache.flink.api.java.operators.translation.PlanFilterOperator$FlatMapFilter.flatMap(PlanFilterOperator.java:47)
>> >>   at org.apache.flink.runtime.operators.chaining.ChainedFlatMapDriver.collect(ChainedFlatMapDriver.java:79)
>> >>   at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:78)
>> >>   at org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5.join(joinDataSet.scala:184)
>> >>   at org.apache.flink.runtime.operators.hash.BuildFirstHashMatchIterator.callWithNextKey(BuildFirstHashMatchIterator.java:140)
>> >>   at org.apache.flink.runtime.operators.MatchDriver.run(MatchDriver.java:148)
>> >>   at org.apache.flink.runtime.operators.RegularPactTask.run(RegularPactTask.java:484)
>> >>   at org.apache.flink.runtime.operators.RegularPactTask.invoke(RegularPactTask.java:359)
>> >>   at org.apache.flink.runtime.execution.RuntimeEnvironment.run(RuntimeEnvironment.java:235)
>> >>   at java.lang.Thread.run(Thread.java:745)
>> >>
>> >> 10/14/2014 15:57:36: Job execution switched to status FAILING
>> >> 10/14/2014 15:57:36: DataSource ([]) (1/1) switched to CANCELING
>> >> 10/14/2014 15:57:36: CHAIN DataSource (TextInputFormat (/tmp/tmpBhOsLd) - UTF-8) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to CANCELING
>> >> 10/14/2014 15:57:36: DataSource ([-1 0.0]) (1/1) switched to CANCELING
>> >> 10/14/2014 15:57:36: IterationHead(WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to CANCELING
>> >> 10/14/2014 15:57:36: Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to CANCELED
>> >> 10/14/2014 15:57:36: DataSink(TextOutputFormat (/tmp/tmplSYJ7S) - UTF-8) (1/1) switched to CANCELED
>> >> 10/14/2014 15:57:36: Sync (WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to CANCELED
>> >> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to CANCELING
>> >> 10/14/2014 15:57:36: Map (bumpboost.BumpBoost$$anonfun$8$$anon$2) (1/1) switched to CANCELED
>> >> 10/14/2014 15:57:36: SolutionSet Delta (1/1) switched to CANCELED
>> >> 10/14/2014 15:57:36: DataSource ([]) (1/1) switched to CANCELED
>> >> 10/14/2014 15:57:36: CHAIN DataSource (TextInputFormat (/tmp/tmpBhOsLd) - UTF-8) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to CANCELED
>> >> 10/14/2014 15:57:36: DataSource ([-1 0.0]) (1/1) switched to CANCELED
>> >> 10/14/2014 15:57:36: CHAIN Join(org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5) -> Map (org.apache.flink.api.scala.DataSet$$anon$1) (1/1) switched to CANCELED
>> >> 10/14/2014 15:57:36: IterationHead(WorksetIteration (Unnamed Delta Iteration)) (1/1) switched to CANCELED
>> >> 10/14/2014 15:57:36: Job execution switched to status FAILED
>> >> Error: The program execution failed: java.lang.IllegalStateException: This stub is not part of an iteration step function.
>> >>   at org.apache.flink.api.common.functions.AbstractRichFunction.getIterationRuntimeContext(AbstractRichFunction.java:59)
>> >>   at bumpboost.BumpBoost$$anonfun$8$$anon$1.filter(BumpBoost.scala:40)
>> >>   at bumpboost.BumpBoost$$anonfun$8$$anon$1.filter(BumpBoost.scala:39)
>> >>   at org.apache.flink.api.java.operators.translation.PlanFilterOperator$FlatMapFilter.flatMap(PlanFilterOperator.java:47)
>> >>   at org.apache.flink.runtime.operators.chaining.ChainedFlatMapDriver.collect(ChainedFlatMapDriver.java:79)
>> >>   at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:78)
>> >>   at org.apache.flink.api.scala.UnfinishedJoinOperation$$anon$5.join(joinDataSet.scala:184)
>> >>   at org.apache.flink.runtime.operators.hash.BuildFirstHashMatchIterator.callWithNextKey(BuildFirstHashMatchIterator.java:140)
>> >>   at org.apache.flink.runtime.operators.MatchDriver.run(MatchDriver.java:148)
>> >>   at org.apache.flink.runtime.operators.RegularPactTask.run(RegularPactTask.java:484)
>> >>   at org.apache.flink.runtime.operators.RegularPactTask.invoke(RegularPactTask.java:359)
>> >>   at org.apache.flink.runtime.execution.RuntimeEnvironment.run(RuntimeEnvironment.java:235)
>> >>   at java.lang.Thread.run(Thread.java:745)
>> >>
>> >> On Tue, Oct 14, 2014 at 3:32 PM, Maximilian Alber
>> >> <alber.maximilian@gmail.com> wrote:
>> >>>
>> >>> Should work now.
>> >>> Cheers
>> >>>
>> >>> On Fri, Oct 10, 2014 at 3:38 PM, Maximilian Alber
>> >>> <alber.maximilian@gmail.com> wrote:
>> >>>>
>> >>>> Ok, thanks.
>> >>>> Please let me know when it is fixed.
>> >>>>
>> >>>> Cheers
>> >>>> Max
>> >>>>
>> >>>> On Fri, Oct 10, 2014 at 1:34 PM, Stephan Ewen <sewen@apache.org> wrote:
>> >>>>>
>> >>>>> Thank you, I will look into that...