From: "wei.he (JIRA)"
To: commits@airflow.incubator.apache.org
Date: Tue, 26 Jul 2016 03:21:20 +0000 (UTC)
Subject: [jira] [Comment Edited] (AIRFLOW-357) how should I use the right owner task in airflow?

[ https://issues.apache.org/jira/browse/AIRFLOW-357?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15393130#comment-15393130 ]

wei.he edited comment on AIRFLOW-357 at 7/26/16 3:21 AM:
---------------------------------------------------------

I haven't been able to fix this problem. My DAG code is the following.
{code:title=test3.py|borderStyle=solid}
from airflow.models import DAG
from airflow.operators import BashOperator
from datetime import datetime, timedelta

rootdir = "/tmp/airflow"

default_args = {
    'owner': 'max',
    'depends_on_past': False,
    'start_date': datetime.now(),
    'email': ['max@test.com'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}

dag = DAG('test3', default_args=default_args, schedule_interval='*/2 * * * *')

t1 = BashOperator(
    task_id='test3-task1',
    bash_command='date >> {rootdir}/test3-task1.out'.format(rootdir=rootdir),
    owner='max',
    dag=dag)

t2 = BashOperator(
    task_id='test3-task2',
    bash_command='whoami',
    retries=3,
    owner='max',
    dag=dag)
{code}

Then I ran the command "airflow test test3 test3-task2 2016-07-25" as the linux user 'airflow'. The output of "whoami" is "airflow", but I expected it to print the "owner" of the task.

[2016-07-25 11:22:37,716] {bash_operator.py:64} INFO - Temporary script location :/tmp/airflowtmpoYNJE8//tmp/airflowtmpoYNJE8/test3-task2U1lpom
[2016-07-25 11:22:37,716] {bash_operator.py:65} INFO - Running command: whoami
[2016-07-25 11:22:37,722] {bash_operator.py:73} INFO - Output:
[2016-07-25 11:22:37,725] {bash_operator.py:77} INFO - **airflow**
[2016-07-25 11:22:37,725] {bash_operator.py:80} INFO - Command exited with return code 0

What am I doing wrong? Thanks
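The behavior above can be sketched outside Airflow (a minimal illustration of my own, not Airflow's code; the names `declared_owner` and `process_user` are hypothetical): the task's `owner` is plain metadata on the task object, while the BashOperator subprocess runs as whatever unix account launched the Airflow process, which is why `whoami` prints `airflow`.

```python
import os
import pwd
import subprocess

# Hypothetical sketch: a task's declared "owner" is only metadata;
# it never changes which unix account executes the command.
declared_owner = "max"                              # what default_args declares
process_user = pwd.getpwuid(os.geteuid()).pw_name   # who actually runs this process

# BashOperator ultimately spawns a shell as the current process user,
# so `whoami` reports that account, not the declared owner.
whoami = subprocess.check_output(["whoami"]).decode().strip()
assert whoami == process_user
```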
> how should I use the right owner task in airflow?
> -------------------------------------------------
>
>                 Key: AIRFLOW-357
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-357
>             Project: Apache Airflow
>          Issue Type: Bug
>    Affects Versions: Airflow 1.7.1
>            Reporter: wei.he
>
> I don't understand the "owner" in airflow. The comment for owner says "the owner of the task, using the unix username is recommended". I wrote the following code.
> default_args = {
>     'owner': 'max',
>     'depends_on_past': False,
>     'start_date': datetime(2016, 7, 14),
>     'email': ['max@test.com'],
>     'email_on_failure': False,
>     'email_on_retry': False,
>     'retries': 1,
>     'retry_delay': timedelta(minutes=5),
> }
>
> dag = DAG('dmp-annalect', default_args=default_args,
>           schedule_interval='30 0 * * *')
>
> pigjob_basedata_impclk = """
> {local_dir}/src/basedata/basedata.sh > {local_dir}/log/basedata/run_log &
> """.format(local_dir=WORKSPACE)
>
> task1_pigjob_basedata = BashOperator(
>     task_id='task1_pigjob_basedata_impclk',
>     owner='max',
>     bash_command=pigjob_basedata_impclk,
>     dag=dag)
>
> I used the command "airflow test dagid taskid 2016-07-20", but I got an error:
>
> ... {bash_operator.py:77} INFO - put: Permission denied: user=airflow, ....
>
> I thought that my job ran as the "max" user, but apparently the test ran as the 'airflow' user.
> I want to run my task as the 'max' user. How should I do that?
> Thanks

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)