From: ziqr@vo.yoo.ro
To: dev@airflow.incubator.apache.org
Subject: Help reusing filepath from previous task
Date: Fri, 08 Jul 2016 17:58:39 +0000

Hello,

I'd like to reuse some information produced by one task to feed subsequent S3KeySensor checks, but after trying several ideas I haven't been able to make this work. Basically, my current DAG is as follows:

    copy_data = BashOperator(
        task_id='copy_data',
        # this script copies data to S3
        bash_command='do_stuff.sh',
        dag=dag)

    s3_path = xxxx

    for data_type in ("fileA", "fileB", "fileC"):
        S3Sensor_task = S3KeySensor(
            task_id='check_' + data_type,
            poke_interval=20,
            timeout=60,
            retry_delay=timedelta(seconds=30),
            bucket_key=s3_path + data_type,
            bucket_name='xxxx',
            s3_conn_id='s3_conn_id',
            dag=dag)
        S3Sensor_task.set_upstream(copy_data)

This works fine, but I'd rather avoid duplicating the S3 path in both the DAG file and the do_stuff.sh script. So I thought about pushing the S3 path from do_stuff.sh into an XCom variable, but then I need access to those items inside my for loop. I've also tried a SubDAG, but once again I did not manage to read the XCom. Could someone give me some hints?

Regards
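P.S. In case it helps clarify what I'm after, here is a minimal sketch of the XCom variant I had in mind. It assumes that do_stuff.sh prints the S3 path as its last line (so a BashOperator with xcom_push=True would push it), and that bucket_key is rendered as a Jinja template at run time; I'm not certain bucket_key is a templated field in my version, and the helper name templated_bucket_key is just something I made up for illustration. The helper only builds the template strings, so it runs without Airflow installed:

```python
# Hypothetical sketch: build a templated bucket_key per data type, so each
# sensor pulls the S3 path that copy_data pushed to XCom instead of
# hard-coding s3_path in the DAG file.
def templated_bucket_key(data_type):
    # Airflow would render the {{ ... }} part at run time to whatever
    # do_stuff.sh printed on its last line (its pushed XCom value).
    return "{{ ti.xcom_pull(task_ids='copy_data') }}" + data_type

keys = [templated_bucket_key(d) for d in ("fileA", "fileB", "fileC")]
print(keys)
```

Each sensor in the loop would then get bucket_key=templated_bucket_key(data_type), and the literal path would live only in do_stuff.sh. Does that sound like a workable direction, or is there a better way to hand a value from one task to sensors defined in a loop?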