airflow-commits mailing list archives

From "Sathyaprakash Govindasamy (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AIRFLOW-2342) DAG in running state but tasks not running
Date Fri, 27 Apr 2018 06:25:00 GMT

    [ https://issues.apache.org/jira/browse/AIRFLOW-2342?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16455980#comment-16455980 ]

Sathyaprakash Govindasamy commented on AIRFLOW-2342:
----------------------------------------------------

I see that the error occurs when _SqlSensor_ is imported in the _daily_building_blocks_main_ DAG:

File "/home/airflow_sa/dags/daily_building_blocks_main.py", line 14, in <module>
    from airflow.operators.sensors import SqlSensor

You can open an interactive Python shell and run the line below to see whether it throws an error:
{code:python}
from airflow.operators.sensors import SqlSensor
{code}
This error is caused by the _snakebite_ import in _hdfs_hook_ (which is imported internally by the _sensors_ module):

File "/usr/local/lib/python2.7/site-packages/airflow/hooks/hdfs_hook.py", line 20, in <module>
from snakebite.client import Client, HAClient, Namenode, AutoConfigClient
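To narrow this down, a small helper (my own diagnostic sketch, not part of the ticket) can walk the dependency chain bottom-up and report which module actually fails to import:

```python
import importlib

def can_import(module_name):
    """Return (True, None) if the module imports cleanly, else (False, error message)."""
    try:
        importlib.import_module(module_name)
        return True, None
    except ImportError as exc:
        return False, str(exc)

# Check the chain from the deepest dependency upward; the first FAILED entry
# is the real culprit behind the SqlSensor import error.
for mod in ("snakebite.client", "airflow.hooks.hdfs_hook", "airflow.operators.sensors"):
    ok, err = can_import(mod)
    print("%-30s %s" % (mod, "OK" if ok else "FAILED: %s" % err))
```

Running this on the affected machine should show whether _snakebite_ itself fails to import, or whether the failure first appears higher in the chain.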

But I see that in 1.8 this import is inside a try block, so I am not sure why it is failing.

[https://github.com/apache/incubator-airflow/blob/4e370ffc9ab579aa75de0fa8704c96683b9b963e/airflow/hooks/hdfs_hook.py#L20]

You can inspect the file */usr/local/lib/python2.7/site-packages/airflow/hooks/hdfs_hook.py* on your system to see whether this import is really inside a try block.
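A quick way to do that check (again, my own sketch rather than anything from the ticket) is to print the lines surrounding the snakebite import in the installed file and confirm they sit inside a try/except:

```python
def show_import_context(path, needle="snakebite", context=3):
    """Print and return the lines around the first line containing `needle`."""
    with open(path) as f:
        lines = f.readlines()
    for i, line in enumerate(lines):
        if needle in line:
            start = max(0, i - context)
            end = min(len(lines), i + context + 1)
            window = [(j + 1, lines[j].rstrip()) for j in range(start, end)]
            for lineno, text in window:
                print("%4d: %s" % (lineno, text))
            return window
    return []

# Usage, with the path taken from the traceback above:
# show_import_context("/usr/local/lib/python2.7/site-packages/airflow/hooks/hdfs_hook.py")
```

If the printed context shows the `from snakebite.client import ...` line without an enclosing `try:`, the installed copy differs from the 1.8 source on GitHub.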

> DAG in running state but tasks not running
> ------------------------------------------
>
>                 Key: AIRFLOW-2342
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-2342
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: DagRun
>    Affects Versions: Airflow 1.8
>         Environment: Redhat
>            Reporter: chidrup jhanjhari
>            Priority: Major
>         Attachments: job1.py.log
>
>   Original Estimate: 1h
>  Remaining Estimate: 1h
>
> Hi, we are on Airflow 1.8.0. Our production environment has been running well for 8 months, with no configuration changes. The issue is that for the last 2 days, DAGs show as running but their tasks are not getting triggered: after the default start task, the DAG run does not move on to the next task. Attached is the scheduler log, which throws the following error:
> 2018-04-19 01:32:22,586] \{jobs.py:354} DagFileProcessor17 ERROR - Got an exception!
Propagating...
> Traceback (most recent call last):
>   Any help will be greatly appreciated.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
