flink-user mailing list archives

From Piotr Nowojski <pi...@data-artisans.com>
Subject Re: state of parallel jobs when one task fails
Date Fri, 29 Sep 2017 15:21:56 GMT
Hi,

Yes, by default Flink will restart all of the tasks. I think that since Flink 1.3, you can
configure a FailoverStrategy <https://ci.apache.org/projects/flink/flink-docs-release-1.3/api/java/org/apache/flink/runtime/executiongraph/failover/FailoverStrategy.html>
to change this behavior.
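
As a minimal sketch (the exact config key and accepted values should be checked against the docs for your Flink version), the failover strategy can be selected in flink-conf.yaml:

    # flink-conf.yaml -- decides which tasks are restarted after a task failure
    # "full" (the default) cancels and restarts all tasks of the job.
    # "individual" restarts only the failed task; it is only safe for jobs
    # whose tasks do not exchange any data with each other.
    jobmanager.execution.failover-strategy: individual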

Thanks, Piotrek

> On Sep 29, 2017, at 5:10 PM, r. r. <robert@abv.bg> wrote:
> 
> Hello
> I have a simple job with a single map() operation that I want to run over many documents
> in parallel in Flink.
> What will happen if one of the 'instances' of the job fails?
>  
> This statement in Flink docs confuses me:
> "In case of failures, a job switches first to failing where it cancels all running tasks".
> So if I have 10 documents processed in parallel in the job's map() (each in a different
> task slot, I presume) and one of them fails, does it mean that all the rest will be
> failed/cancelled as well?
> 
> Thanks!
> 

