airflow-commits mailing list archives

From "ASF subversion and git services (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AIRFLOW-1706) Scheduler is failed on startup with MS SQL Server as backend
Date Mon, 30 Oct 2017 19:08:00 GMT

    [ https://issues.apache.org/jira/browse/AIRFLOW-1706?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16225567#comment-16225567 ]

ASF subversion and git services commented on AIRFLOW-1706:
----------------------------------------------------------

Commit c800632bb40882d06344be1ebe0022e0a50ff121 in incubator-airflow's branch refs/heads/master
from k.privezentsev
[ https://git-wip-us.apache.org/repos/asf?p=incubator-airflow.git;h=c800632 ]

[AIRFLOW-1706] Fix query error for MSSQL backend

MSSQL doesn't support key word 'is' as synonym for '='

Closes #2733 from patsak/fix/illegal_query_for_mssql

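To illustrate the problem the commit fixes: the generated query (quoted in full in the traceback below) compares a boolean column with `IS`, which T-SQL only permits in `IS NULL` / `IS NOT NULL` tests. A simplified sketch of the offending predicate and the shape of a portable rewrite, not the literal patch:

```sql
-- Rejected by SQL Server: IS may only be paired with NULL
WHERE dag_run.state = ? AND dag_run.external_trigger IS 0

-- Portable: use an equality comparison instead
WHERE dag_run.state = ? AND dag_run.external_trigger = 0
```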

> Scheduler is failed on startup with MS SQL Server as backend
> ------------------------------------------------------------
>
>                 Key: AIRFLOW-1706
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-1706
>             Project: Apache Airflow
>          Issue Type: Bug
>          Components: scheduler
>    Affects Versions: 1.9.0
>            Reporter: Konstantin Privezentsev
>            Priority: Minor
>              Labels: newbie, patch
>             Fix For: 1.9.0
>
>         Attachments: 0001-fix-query-error-for-mssql-backend.patch
>
>
> Reproduced at commit 21e94c7d1594c5e0.
> Scheduler log:
> {noformat}
> airflow_scheduler_1  | [2017-10-12 08:45:49,554] {{jobs.py:1497}} INFO - Starting the scheduler
> airflow_scheduler_1  | [2017-10-12 08:45:49,554] {{jobs.py:1510}} INFO - Processing files using up to 2 processes at a time
> airflow_scheduler_1  | [2017-10-12 08:45:49,554] {{jobs.py:1511}} INFO - Running execute loop for -1 seconds
> airflow_scheduler_1  | [2017-10-12 08:45:49,554] {{jobs.py:1512}} INFO - Processing each file at most -1 times
> airflow_scheduler_1  | [2017-10-12 08:45:49,555] {{jobs.py:1513}} INFO - Process each file at most once every 0 seconds
> airflow_scheduler_1  | [2017-10-12 08:45:49,555] {{jobs.py:1514}} INFO - Checking for new files in /opt/airflow/dags every 300 seconds
> airflow_scheduler_1  | [2017-10-12 08:45:49,555] {{jobs.py:1517}} INFO - Searching for files in /opt/airflow/dags
> airflow_scheduler_1  | [2017-10-12 08:45:49,555] {{jobs.py:1519}} INFO - There are 1 files in /opt/airflow/dags
> airflow_scheduler_1  | [2017-10-12 08:45:49,628] {{jobs.py:1580}} INFO - Resetting orphaned tasks for active dag runs
> airflow_scheduler_1  | [2017-10-12 08:45:49,634] {{jobs.py:1538}} INFO - Exited execute loop
> airflow_scheduler_1  | Traceback (most recent call last):
> airflow_scheduler_1  |   File "/usr/local/bin/airflow", line 27, in <module>
> airflow_scheduler_1  |     args.func(args)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/airflow/bin/cli.py", line 828, in scheduler
> airflow_scheduler_1  |     job.run()
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 197, in run
> airflow_scheduler_1  |     self._execute()
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 1536, in _execute
> airflow_scheduler_1  |     self._execute_helper(processor_manager)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 1581, in _execute_helper
> airflow_scheduler_1  |     self.reset_state_for_orphaned_tasks(session=session)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/airflow/utils/db.py", line 50, in wrapper
> airflow_scheduler_1  |     result = func(*args, **kwargs)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/airflow/jobs.py", line 246, in reset_state_for_orphaned_tasks
> airflow_scheduler_1  |     TI.state.in_(resettable_states))).all()
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2703, in all
> airflow_scheduler_1  |     return list(self)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2855, in __iter__
> airflow_scheduler_1  |     return self._execute_and_instances(context)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/query.py", line 2878, in _execute_and_instances
> airflow_scheduler_1  |     result = conn.execute(querycontext.statement, self._params)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 945, in execute
> airflow_scheduler_1  |     return meth(self, multiparams, params)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/sql/elements.py", line 263, in _execute_on_connection
> airflow_scheduler_1  |     return connection._execute_clauseelement(self, multiparams, params)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1053, in _execute_clauseelement
> airflow_scheduler_1  |     compiled_sql, distilled_params
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1189, in _execute_context
> airflow_scheduler_1  |     context)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1402, in _handle_dbapi_exception
> airflow_scheduler_1  |     exc_info
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/util/compat.py", line 203, in raise_from_cause
> airflow_scheduler_1  |     reraise(type(exception), exception, tb=exc_tb, cause=cause)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/base.py", line 1182, in _execute_context
> airflow_scheduler_1  |     context)
> airflow_scheduler_1  |   File "/usr/local/lib/python2.7/dist-packages/sqlalchemy/engine/default.py", line 470, in do_execute
> airflow_scheduler_1  |     cursor.execute(statement, parameters)
> airflow_scheduler_1  | sqlalchemy.exc.ProgrammingError: (pyodbc.ProgrammingError) ('42000', "[42000] [Microsoft][ODBC Driver 13 for SQL Server][SQL Server]Incorrect syntax near '0'. (102) (SQLExecDirectW)") [SQL: u'SELECT task_instance.task_id AS task_instance_task_id, task_instance.dag_id AS task_instance_dag_id, task_instance.execution_date AS task_instance_execution_date, task_instance.start_date AS task_instance_start_date, task_instance.end_date AS task_instance_end_date, task_instance.duration AS task_instance_duration, task_instance.state AS task_instance_state, task_instance.try_number AS task_instance_try_number, task_instance.max_tries AS task_instance_max_tries, task_instance.hostname AS task_instance_hostname, task_instance.unixname AS task_instance_unixname, task_instance.job_id AS task_instance_job_id, task_instance.pool AS task_instance_pool, task_instance.queue AS task_instance_queue, task_instance.priority_weight AS task_instance_priority_weight, task_instance.operator AS task_instance_operator, task_instance.queued_dttm AS task_instance_queued_dttm, task_instance.pid AS task_instance_pid \nFROM task_instance JOIN dag_run ON task_instance.dag_id = dag_run.dag_id AND task_instance.execution_date = dag_run.execution_date \nWHERE dag_run.state = ? AND dag_run.external_trigger IS 0 AND dag_run.run_id NOT LIKE ? AND task_instance.state IN (?, ?)'] [parameters: (u'running', u'backfill_%', u'scheduled', u'queued')]
> {noformat}
> A patch is attached.
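
The `IS 0` in the failing query is what SQLAlchemy emits for a boolean `is_()` comparison on a backend without a native boolean type. A minimal sketch of the difference between the two predicate styles, assuming SQLAlchemy is installed; the `dag_run` table below is a hypothetical stand-in, not Airflow's real model:

```python
from sqlalchemy import Boolean, Column, MetaData, Table

# Hypothetical stand-in for Airflow's dag_run table, just enough to
# compile the two predicate styles.
metadata = MetaData()
dag_run = Table("dag_run", metadata, Column("external_trigger", Boolean))

# is_() always renders the SQL keyword IS, e.g. "external_trigger IS 0"
# on dialects that store booleans as integers. SQL Server only accepts
# IS in NULL tests, so this form fails there.
is_predicate = dag_run.c.external_trigger.is_(False)

# A plain equality comparison renders as "external_trigger = ...", which
# every backend, including MSSQL, accepts.
eq_predicate = dag_run.c.external_trigger == False  # noqa: E712

print(str(is_predicate))
print(str(eq_predicate))
```

Reserving `is_()` for NULL tests and using `==` for boolean values keeps the generated SQL portable across MySQL, PostgreSQL, and MSSQL backends.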



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
