airflow-dev mailing list archives

From siddharth anand <san...@apache.org>
Subject Re: Running a task from the Airflow UI
Date Wed, 03 Aug 2016 06:10:36 GMT
A REST API is long overdue. I suggest anyone in the community who has the
cycles start implementing it; your PRs would be welcome. Currently, we
have a very powerful CLI that should ideally have similar functionality
exposed via the API. The CLI's trigger_dag command is one of the first
I'd like to see. I've seen requests to have AWS Lambdas call trigger_dag
in response to messages coming over SQS or Kinesis.

Seems like a good first REST API candidate to me.
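
Until such an endpoint exists, an external process (say, an AWS Lambda
handler fed by SQS) could shell out to the CLI. A rough sketch; the helper
name, the DAG id, and the event shape here are my assumptions, not anything
Airflow ships:

```python
import json
import subprocess


def build_trigger_command(dag_id, conf):
    # Assemble the CLI invocation; --conf takes a JSON string.
    # (Hypothetical helper, name is mine.)
    return ["airflow", "trigger_dag", dag_id, "--conf", json.dumps(conf)]


def lambda_handler(event, context):
    # Rough Lambda entry point: trigger one run per SQS record, passing
    # the message body through as the run's conf. (Event shape assumed.)
    for record in event.get("Records", []):
        cmd = build_trigger_command("my_dag", {"payload": record["body"]})
        subprocess.check_call(cmd)  # assumes airflow is on PATH
```

The JSON passed via --conf should then surface on the triggered DagRun as
dag_run.conf, which is what makes the dynamic-run use case work.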
-s

On Mon, Jul 11, 2016 at 11:17 AM, Paul Minton <pminton@change.org> wrote:

> Thanks Jeremiah. I just noticed this feature this weekend.
>
> I'm wondering if that functionality could be exposed via the REST API,
> like a POST to "airflow.run" with a JSON object in the payload that gets
> included with the pickled context object.
>
> That would 1) allow for dynamic runs and 2) (more importantly for our use
> case) expose dynamic runs via HTTP, making it super simple for external
> resources to trigger DAG runs.
>
> I'm still getting familiar with the source so I'm not sure how difficult
> that would be to implement.
>
> On Mon, Jul 11, 2016 at 6:44 AM, Jeremiah Lowin <jlowin@apache.org> wrote:
>
>> Paul,
>>
>> The trigger_dag CLI command accepts a JSON "conf" parameter. To be
>> honest, I'm not familiar with the feature (perhaps Sid can provide more
>> detail), but I think it might accomplish your goals.
>>
>> J
>>
>> On Fri, Jul 8, 2016 at 1:11 PM Paul Minton <pminton@change.org> wrote:
>>
>> > +1 to this feature as well.
>> >
>> > I wonder if it would be possible to pass along some context when
>> > triggering a job. This may be outside the scope of this thread, but it
>> > would allow for more dynamic runs of a job. As a simple example, I may
>> > want to kick off a job and pass along the key to a file on S3. Right
>> > now we would depend on an initial S3 sensor, but that would require
>> > that the filename be static across runs.
>> >
>> > On Thu, Jul 7, 2016 at 9:55 AM, Chris Riccomini <criccomini@apache.org>
>> > wrote:
>> >
>> > > +1
>> > >
>> > > On Thu, Jul 7, 2016 at 5:18 AM, Bolke de Bruin <bdbruin@gmail.com>
>> > > wrote:
>> > >
>> > > > Ideally the CLI and WebUI should both access an API that handles
>> > > > authentication and authorization. This would resolve both issues.
>> > > > However, the UI already allows for authentication and, to a lesser
>> > > > extent, authorization. Thus allowing this from the UI (which we
>> > > > already do for Celery) is not a big change.
>> > > >
>> > > > - Bolke
>> > > >
>> > > >
>> > > > > On 7 Jul 2016, at 11:01, Alexander Alten-Lorenz
>> > > > > <wget.null@gmail.com> wrote:
>> > > > >
>> > > > > Sounds good, but on the other hand I'm with Maxime. Given that the
>> > > > > task can be triggered via the CLI, the functionality is available
>> > > > > but needs a local login. If the "run" button were available to
>> > > > > everyone who has access to the UI, I can imagine that would cause
>> > > > > some serious load issues in a production environment, especially
>> > > > > with SLA-based workflow setups.
>> > > > > On the other hand, if the "run" button with a local executor queued
>> > > > > the task in a control queue (like "externally triggered"), an admin
>> > > > > could finally mark them as "approved".
>> > > > >
>> > > > > --alex
>> > > > >
>> > > > >> On Jul 7, 2016, at 12:12 AM, Jeremiah Lowin <jlowin@apache.org>
>> > > > >> wrote:
>> > > > >>
>> > > > >> Perhaps it's a good chance to revisit the functionality. Right
>> > > > >> now the UI "run" button actually runs the task via CeleryExecutor.
>> > > > >> Perhaps instead (or just when using a non-Celery executor) it
>> > > > >> should queue the task and let the Scheduler pick it up. I guess
>> > > > >> in that case it would just be sugar for marking a TI as QUEUED.
>> > > > >> Just a thought.
>> > > > >>
>> > > > >> On Wed, Jul 6, 2016 at 2:54 AM, Maxime Beauchemin
>> > > > >> <maximebeauchemin@gmail.com> wrote:
>> > > > >>
>> > > > >>> Hi,
>> > > > >>>
>> > > > >>> The problem is that a web server isn't the right place to run an
>> > > > >>> airflow task. From the context of the web request scope, we have
>> > > > >>> to somehow pass a message to an external executor to run the
>> > > > >>> task. For LocalExecutor to work, the web server would have to
>> > > > >>> start a LocalExecutor as a subprocess, and that doesn't sound
>> > > > >>> like a great idea...
>> > > > >>>
>> > > > >>> Max
>> > > > >>>
>> > > > >>> On Tue, Jul 5, 2016 at 11:22 AM, Jason Chen
>> > > > >>> <chingchien.chen@gmail.com> wrote:
>> > > > >>>
>> > > > >>>> Hi Airflow team,
>> > > > >>>> I am using the "LocalExecutor" and it works very well to run the
>> > > > >>>> workflows I set up.
>> > > > >>>>
>> > > > >>>> I noticed that, from the UI, it can trigger a task to run.
>> > > > >>>> However, I got the error "Only works with the CeleryExecutor,
>> > > > >>>> sorry".
>> > > > >>>> I can ssh into the airflow node and run the command line from
>> > > > >>>> there. However, it would be nice to just run it from the airflow
>> > > > >>>> UI. Is it possible to do that (with "LocalExecutor"), or is it a
>> > > > >>>> future feature to consider?
>> > > > >>>>
>> > > > >>>> Thanks.
>> > > > >>>> Jason
>> > > > >>>
>> > > > >
>> > > >
>> > > >
>> > >
>> >
>>
>
>
