spark-user mailing list archives

From Jacek Laskowski <ja...@japila.pl>
Subject Re: How to cause a stage to fail (using spark-shell)?
Date Sun, 19 Jun 2016 10:22:56 GMT
Hi,

Thanks Burak for the idea, but it *only* fails the tasks, which
eventually fails the entire job; it doesn't fail a particular stage
(just once or twice) before the job completes. The idea is to see the
stage attempts in the web UI, as there's special handling for cases
where a stage fails once or twice before finishing up properly.
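For reference, Burak's suggestion boils down to roughly this in
spark-shell (a minimal sketch; the exception message and the range are
arbitrary):

```scala
// Every attempt of every task in this stage throws, so after
// spark.task.maxFailures failed attempts (4 by default) Spark
// aborts the stage and fails the whole job.
sc.parallelize(1 to 100, 2)
  .mapPartitions[Int] { _ => throw new RuntimeException("boom") }
  .count()
```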

Any ideas? I've got one, but it requires quite an extensive cluster
setup, which I'd like to avoid if possible. I'm after something I
could use during workshops or demos, and that others could easily
reproduce to learn Spark's internals.
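For what it's worth, one variant I've been playing with (an untested
sketch relying on TaskContext.attemptNumber, available since Spark
1.3) fails only the first attempt of each task, so the retries show up
in the web UI, though if I read the scheduler code right the stage
itself is only re-attempted on a FetchFailedException:

```scala
import org.apache.spark.TaskContext

// Fail only the first attempt of every task; the retried attempts
// succeed, so the job completes and the failed task attempts are
// visible in the web UI under the (single, successful) stage.
sc.parallelize(1 to 100, 2).mapPartitions { it =>
  if (TaskContext.get.attemptNumber == 0)
    throw new RuntimeException("failing attempt 0")
  it
}.count()
```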

Pozdrawiam,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Jun 19, 2016 at 5:25 AM, Burak Yavuz <brkyvz@gmail.com> wrote:
> Hi Jacek,
>
> Can't you simply have a mapPartitions task throw an exception or something?
> Are you trying to do something more esoteric?
>
> Best,
> Burak
>
> On Sat, Jun 18, 2016 at 5:35 AM, Jacek Laskowski <jacek@japila.pl> wrote:
>>
>> Hi,
>>
>> Following up on this question, is a stage considered failed only when
>> there is a FetchFailed exception? Can I have a failed stage with only
>> a single-stage job?
>>
>> Appreciate any help on this...(as my family doesn't like me spending
>> the weekend with Spark :))
>>
>> Pozdrawiam,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sat, Jun 18, 2016 at 11:53 AM, Jacek Laskowski <jacek@japila.pl> wrote:
>> > Hi,
>> >
> I'm trying to see some stats about failing stages in the web UI and want
> to "create" a few failed stages. Is this possible using spark-shell at
> all? Which setup of Spark/spark-shell would allow for such a scenario?
>> >
> I could write some Scala code if that's the only way to have failing
> stages.
>> >
>> > Please guide. Thanks.
>> >
>> > /me on to reviewing the Spark code...
>> >
>> > Pozdrawiam,
>> > Jacek Laskowski
>> > ----
>> > https://medium.com/@jaceklaskowski/
>> > Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> > Follow me at https://twitter.com/jaceklaskowski
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>


