From: "Mubin Khalid (JIRA)"
To: commits@airflow.incubator.apache.org
Date: Wed, 26 Apr 2017 19:45:04 +0000 (UTC)
Subject: [jira] [Commented] (AIRFLOW-1147) airflow scheduler not working

[ https://issues.apache.org/jira/browse/AIRFLOW-1147?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15985453#comment-15985453 ]

Mubin Khalid commented on AIRFLOW-1147:
---------------------------------------

Yes, I tested it both by toggling it on from the UI and from the CLI:
{code}
airflow unpause DAGID
{code}

Here is a 5-second log window:

{code}
[2017-04-26 05:51:30,876] {jobs.py:343} DagFileProcessor2 INFO - Started process (PID=9434) to work on /airflow/dags/etl_elastic/StandardizeDataDag.py
[2017-04-26 05:51:30,880] {jobs.py:1521} DagFileProcessor2 INFO - Processing file /airflow/dags/etl_elastic/StandardizeDataDag.py for tasks to queue
[2017-04-26 05:51:30,880] {models.py:167} DagFileProcessor2 INFO - Filling up the DagBag from /airflow/dags/etl_elastic/StandardizeDataDag.py
[2017-04-26 05:51:31,067] {jobs.py:1535} DagFileProcessor2 INFO - DAG(s) dict_keys(['StandardizeDataDag']) retrieved from /airflow/dags/etl_elastic/StandardizeDataDag.py
[2017-04-26 05:51:31,094] {jobs.py:1169} DagFileProcessor2 INFO - Processing StandardizeDataDag
[2017-04-26 05:51:31,104] {jobs.py:566} DagFileProcessor2 INFO - Skipping SLA check for because no tasks in DAG have SLAs
/anaconda3/lib/python3.5/site-packages/sqlalchemy/sql/default_comparator.py:161: SAWarning: The IN-predicate on "dag_run.dag_id" was invoked with an empty sequence. This results in a contradiction, which nonetheless can be expensive to evaluate. Consider alternative strategies for improved performance.
  'strategies for improved performance.' % expr)
[2017-04-26 05:51:31,112] {models.py:322} DagFileProcessor2 INFO - Finding 'running' jobs without a recent heartbeat
[2017-04-26 05:51:31,113] {models.py:328} DagFileProcessor2 INFO - Failing jobs without heartbeat after 2017-04-26 05:46:31.113179
[2017-04-26 05:51:31,118] {jobs.py:351} DagFileProcessor2 INFO - Processing /airflow/dags/etl_elastic/StandardizeDataDag.py took 0.243 seconds
[2017-04-26 05:51:32,925] {jobs.py:343} DagFileProcessor5 INFO - Started process (PID=9441) to work on /airflow/dags/etl_elastic/StandardizeDataDag.py
[2017-04-26 05:51:32,929] {jobs.py:1521} DagFileProcessor5 INFO - Processing file /airflow/dags/etl_elastic/StandardizeDataDag.py for tasks to queue
[2017-04-26 05:51:32,930] {models.py:167} DagFileProcessor5 INFO - Filling up the DagBag from /airflow/dags/etl_elastic/StandardizeDataDag.py
[2017-04-26 05:51:33,119] {jobs.py:1535} DagFileProcessor5 INFO - DAG(s) dict_keys(['StandardizeDataDag']) retrieved from /airflow/dags/etl_elastic/StandardizeDataDag.py
[2017-04-26 05:51:33,145] {jobs.py:1169} DagFileProcessor5 INFO - Processing StandardizeDataDag
[2017-04-26 05:51:33,155] {jobs.py:566} DagFileProcessor5 INFO - Skipping SLA check for because no tasks in DAG have SLAs
/anaconda3/lib/python3.5/site-packages/sqlalchemy/sql/default_comparator.py:161: SAWarning: The IN-predicate on "dag_run.dag_id" was invoked with an empty sequence. This results in a contradiction, which nonetheless can be expensive to evaluate. Consider alternative strategies for improved performance.
  'strategies for improved performance.' % expr)
[2017-04-26 05:51:33,164] {models.py:322} DagFileProcessor5 INFO - Finding 'running' jobs without a recent heartbeat
[2017-04-26 05:51:33,164] {models.py:328} DagFileProcessor5 INFO - Failing jobs without heartbeat after 2017-04-26 05:46:33.164884
[2017-04-26 05:51:33,170] {jobs.py:351} DagFileProcessor5 INFO - Processing /airflow/dags/etl_elastic/StandardizeDataDag.py took 0.245 seconds
[2017-04-26 05:51:34,971] {jobs.py:343} DagFileProcessor8 INFO - Started process (PID=9447) to work on /airflow/dags/etl_elastic/StandardizeDataDag.py
[2017-04-26 05:51:34,975] {jobs.py:1521} DagFileProcessor8 INFO - Processing file /airflow/dags/etl_elastic/StandardizeDataDag.py for tasks to queue
[2017-04-26 05:51:34,975] {models.py:167} DagFileProcessor8 INFO - Filling up the DagBag from /airflow/dags/etl_elastic/StandardizeDataDag.py
[2017-04-26 05:51:35,150] {jobs.py:1535} DagFileProcessor8 INFO - DAG(s) dict_keys(['StandardizeDataDag']) retrieved from /airflow/dags/etl_elastic/StandardizeDataDag.py
[2017-04-26 05:51:35,178] {jobs.py:1169} DagFileProcessor8 INFO - Processing StandardizeDataDag
[2017-04-26 05:51:35,187] {jobs.py:566} DagFileProcessor8 INFO - Skipping SLA check for because no tasks in DAG have SLAs
/anaconda3/lib/python3.5/site-packages/sqlalchemy/sql/default_comparator.py:161: SAWarning: The IN-predicate on "dag_run.dag_id" was invoked with an empty sequence. This results in a contradiction, which nonetheless can be expensive to evaluate. Consider alternative strategies for improved performance.
  'strategies for improved performance.' % expr)
[2017-04-26 05:51:35,196] {models.py:322} DagFileProcessor8 INFO - Finding 'running' jobs without a recent heartbeat
[2017-04-26 05:51:35,197] {models.py:328} DagFileProcessor8 INFO - Failing jobs without heartbeat after 2017-04-26 05:46:35.197177
[2017-04-26 05:51:35,201] {jobs.py:351} DagFileProcessor8 INFO - Processing /airflow/dags/etl_elastic/StandardizeDataDag.py took 0.231 seconds
[2017-04-26 05:51:37,033] {jobs.py:343} DagFileProcessor11 INFO - Started process (PID=9453) to work on /airflow/dags/etl_elastic/StandardizeDataDag.py
[2017-04-26 05:51:37,036] {jobs.py:1521} DagFileProcessor11 INFO - Processing file /airflow/dags/etl_elastic/StandardizeDataDag.py for tasks to queue
[2017-04-26 05:51:37,037] {models.py:167} DagFileProcessor11 INFO - Filling up the DagBag from /airflow/dags/etl_elastic/StandardizeDataDag.py
[2017-04-26 05:51:37,195] {jobs.py:1535} DagFileProcessor11 INFO - DAG(s) dict_keys(['StandardizeDataDag']) retrieved from /airflow/dags/etl_elastic/StandardizeDataDag.py
{code}

> airflow scheduler not working
> -----------------------------
>
>                 Key: AIRFLOW-1147
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1147
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: scheduler
>    Affects Versions: Airflow 1.8
>         Environment: CentOS running on 128 GB RAM
>            Reporter: Mubin Khalid
>            Priority: Critical
>              Labels: documentation, newbie
>   Original Estimate: 24h
>  Remaining Estimate: 24h
>
> I've created some `DAG`s and tried to put them on the scheduler. I want to run all the tasks in the DAG after exactly 24 hours.
> I tried to do something like this.
> {code}
> DEFAULT_ARGS = {
>     'owner'           : 'mubin',
>     'depends_on_past' : False,
>     'start_date'      : datetime(2017, 4, 24, 14, 30),
>     'retries'         : 5,
>     'retry_delay'     : timedetla(1),
> }
> SCHEDULE_INTERVAL = timedelta(minutes=1440)
> # SCHEDULE_INTERVAL = timedelta(hours=24)
> # SCHEDULE_INTERVAL = timedelta(days=1)
> dag = DAG('StandardizeDataDag',
>     default_args      = DEFAULT_ARGS,
>     schedule_interval = SCHEDULE_INTERVAL
> )
> {code}
> I tried different intervals, but none of them worked. However, if I reset the db with {code}airflow resetdb -y{code} and then run {code}airflow initdb{code}, it works once; after that the scheduler isn't able to run it.
> PS. {code}airflow scheduler{code} is executed as {code}root{code}.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
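Two details in the quoted snippet may matter. First, `retry_delay` is spelled `timedetla(1)`, which would raise a `NameError` when the DAG file is parsed; the scheduler log does show the DAG being retrieved, so this may simply be a transcription error in the ticket rather than the actual bug. Second, all three interval choices are the same duration, and Airflow only creates the first DAG run after one full `schedule_interval` has elapsed past `start_date`, so switching between them cannot change when the DAG first fires. A small stdlib-only sketch of both points (the dates come from the report; the trigger-time calculation illustrates Airflow's interval semantics, not an Airflow API call):

```python
from datetime import datetime, timedelta

# The three candidate intervals from the report are identical durations,
# so swapping one for another cannot change the scheduler's behaviour.
assert timedelta(minutes=1440) == timedelta(hours=24) == timedelta(days=1)

# Airflow triggers the run for a schedule period only once that period has
# fully elapsed: with start_date 2017-04-24 14:30 and a daily interval, the
# first run (execution_date 2017-04-24 14:30) is created a day later.
start_date = datetime(2017, 4, 24, 14, 30)
first_trigger = start_date + timedelta(days=1)
print(first_trigger)  # 2017-04-25 14:30:00
```

If the real DAG file also contains the `timedetla` misspelling, fixing that spelling (and importing `datetime` and `timedelta` from the `datetime` module) would be the first thing to try before digging into scheduler state.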