airflow-dev mailing list archives

From David Capwell <dcapw...@gmail.com>
Subject Re: Rerunning task without cleaning DB?
Date Thu, 08 Feb 2018 03:52:17 GMT
Ananth, I am not familiar with that and couldn't find any reference in the
code. Can you say more?

On Feb 7, 2018 3:02 PM, "Trent Robbins" <robbintt@gmail.com> wrote:

> If you want to keep the rest of your history you can:
>
> 1. turn the DAG off
> 2. delete its bad tasks, delete the bad DAG run
> 3. turn the DAG on
> 4. let it backfill or hit the play button manually depending on your needs
>
> Unfortunately this does not preserve the history of the task you are
> working with, but it's far better than dropping the whole database.
>
>
> Best,
>
> Trent Robbins
> Strategic Consultant for Open Source Software
> Tau Informatics LLC
> desk: 415-404-9452
> cell: 513-233-5651
> trent@tauinformatics.com
> https://www.linkedin.com/in/trentrobbins
>
> On Wed, Feb 7, 2018 at 2:57 PM, Ananth Durai <vananth22@gmail.com> wrote:
>
> > We can't do that, unfortunately. Airflow schedules the task based on the
> > current state in the DB. If you would like to preserve the history, one
> > option would be to add instrumentation in airflow_local_settings.py.
> >
> > Regards,
> > Ananth.P,
> >
> >
> > On 5 February 2018 at 13:09, David Capwell <dcapwell@gmail.com> wrote:
> >
> > > When a production issue happens, it's common for us to clear the
> > > history to get Airflow to run the task again.  This is problematic
> > > since it throws away the history, making it harder to find out what
> > > really happened.
> > >
> > > Is there any way to rerun a task without deleting from the DB?
> > >
> >
>
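The cleanup Trent outlines above boils down to resetting rows in Airflow's
metadata database so the scheduler sees the task (and its DAG run) as not
yet complete. A minimal sketch of that state change using an in-memory
sqlite stand-in (the table and column names mirror Airflow's real
`task_instance` and `dag_run` tables, but this is an illustration of the
idea, not Airflow's actual code):

```python
import sqlite3

# In-memory stand-in for Airflow's metadata DB (the real schema is richer).
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE task_instance (
    dag_id TEXT, task_id TEXT, execution_date TEXT, state TEXT)""")
conn.execute("""CREATE TABLE dag_run (
    dag_id TEXT, execution_date TEXT, state TEXT)""")

# A failed run we want the scheduler to pick up again.
conn.execute("INSERT INTO task_instance VALUES "
             "('my_dag', 'load', '2018-02-01', 'failed')")
conn.execute("INSERT INTO dag_run VALUES ('my_dag', '2018-02-01', 'failed')")

def clear_task(conn, dag_id, task_id, execution_date):
    """Mimic what 'clearing' a task does: a NULL state reads as
    'not yet run', so the scheduler will schedule the task again."""
    conn.execute(
        "UPDATE task_instance SET state = NULL "
        "WHERE dag_id = ? AND task_id = ? AND execution_date = ?",
        (dag_id, task_id, execution_date))
    # Re-open the surrounding DAG run so the scheduler revisits it.
    conn.execute(
        "UPDATE dag_run SET state = 'running' "
        "WHERE dag_id = ? AND execution_date = ?",
        (dag_id, execution_date))

clear_task(conn, "my_dag", "load", "2018-02-01")
state, = conn.execute(
    "SELECT state FROM task_instance WHERE task_id = 'load'").fetchone()
print(state)  # prints None: the task is now eligible for rescheduling
```

Note this resets the state in place rather than deleting the row, which is
the distinction David is after: the row (and thus the history of the
attempt) survives, while deleting the task instance and DAG run, as in the
UI steps above, discards it.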
