airflow-commits mailing list archives

From "Kamil Szkoda (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (AIRFLOW-2189) Scheduler under systemd doesn't work in parallel
Date Wed, 07 Mar 2018 16:29:00 GMT

     [ https://issues.apache.org/jira/browse/AIRFLOW-2189?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Kamil Szkoda updated AIRFLOW-2189:
----------------------------------
    Description: 
I'm running the scheduler under System V init, and in that setup all DAGs can run in parallel.

I just migrated to systemd, and even though I have the same configuration, I can't run DAGs
in parallel.

Two examples:
 # When one job runs for 30 minutes (for example a Spark execution), the rest of the jobs queue up.
Execution of the remaining jobs is blocked until the Spark job finishes.
 # When a DAG's code fails, the scheduler doesn't execute the remaining DAGs.

I have no Celery configuration. Would Celery with systemd resolve these problems?
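A common cause of this symptom (a sketch, not a confirmed diagnosis): the systemd unit starts the scheduler with a different environment than the System V script did, so `AIRFLOW_HOME` resolves elsewhere and Airflow falls back to its default `airflow.cfg`, which uses `SequentialExecutor` on SQLite and runs exactly one task at a time. The settings below are from the Airflow 1.9 configuration reference; the connection string is a placeholder to adapt:

```ini
# airflow.cfg — settings that control task parallelism (Airflow 1.9).
# SequentialExecutor (the default with SQLite) runs one task at a time;
# LocalExecutor requires a real database backend such as MySQL or Postgres.
[core]
executor = LocalExecutor
# max number of task instances running across the whole installation
parallelism = 32
# max number of running task instances per DAG
dag_concurrency = 16
# placeholder — point this at your actual metadata database
sql_alchemy_conn = mysql://airflow:airflow@localhost:3306/airflow
```

It is also worth checking that the systemd unit exports `AIRFLOW_HOME` (e.g. via an `EnvironmentFile`) so the scheduler reads this file rather than a freshly generated default config.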

 

 

  was:
I'm running the scheduler under System V init, and in that setup all DAGs can run in parallel.

I just migrated to systemd, and even though I have the same configuration, I can't run DAGs
in parallel.

Two examples:
 # When one job runs for 30 minutes (for example a Spark execution), the rest of the jobs queue up.
This blocks execution of the other, lighter jobs.
 # When a DAG's code fails, the scheduler doesn't execute the remaining DAGs.

I have no Celery configuration. Would Celery with systemd resolve these problems?

 

 


> Scheduler under systemd doesn't work in parallel
> ------------------------------------------------
>
>                 Key: AIRFLOW-2189
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-2189
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: scheduler
>    Affects Versions: 1.9.0
>            Reporter: Kamil Szkoda
>            Priority: Major
>
> I'm running the scheduler under System V init, and in that setup all DAGs can run in parallel.
> I just migrated to systemd, and even though I have the same configuration, I can't run
DAGs in parallel.
> Two examples:
>  # When one job runs for 30 minutes (for example a Spark execution), the rest of the jobs
queue up. Execution of the remaining jobs is blocked until the Spark job finishes.
>  # When a DAG's code fails, the scheduler doesn't execute the remaining DAGs.
> I have no Celery configuration. Would Celery with systemd resolve these problems?
>  
>  



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
