From: George Leslie-Waksman
Date: Wed, 09 Aug 2017 17:10:39 +0000
Subject: Re: Task partitioning using Airflow
To: dev@airflow.incubator.apache.org

Airflow is best for situations where you want to run different tasks that depend on each other or process data that arrives over time.
If your goal is to take a large dataset, split it up, and process chunks of it, there are probably other tools better suited to your purpose. Off the top of my head, you might consider Dask (https://dask.pydata.org/en/latest/) or directly using Celery (http://www.celeryproject.org/).

--George

On Wed, Aug 9, 2017 at 9:52 AM Ashish Rawat wrote:
> Hi - Can anyone please provide some pointers for this use case over Airflow?
>
> --
> Regards,
> Ashish
>
> > On 03-Aug-2017, at 9:13 PM, Ashish Rawat wrote:
> >
> > Hi,
> >
> > We have a use case where we are running some R/Python-based data science models, which execute on a single node. The execution time of the models is constantly increasing, and we are now planning to split the model training by a partition key and distribute the workload over multiple machines.
> >
> > Does Airflow provide a simple way to split a task into multiple tasks, each of which will work on a specific value of the key?
> >
> > --
> > Regards,
> > Ashish
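The split-by-key pattern discussed in this thread can be sketched in plain Python, independent of any particular scheduler: group the rows by the partition key, then map one training call over each partition. In Airflow, the same loop over partitions would instead create one task per key at DAG-definition time; Dask or Celery would submit one job per key to the cluster. This is a minimal sketch, not code from the thread; the names `partition_by_key`, `train_model`, and the `region` key are all illustrative assumptions.

```python
from collections import defaultdict
from multiprocessing.dummy import Pool  # thread pool; illustrative stand-in for a cluster


def partition_by_key(rows, key):
    """Group rows into per-key chunks, one chunk per distinct partition value."""
    chunks = defaultdict(list)
    for row in rows:
        chunks[row[key]].append(row)
    return dict(chunks)


def train_model(item):
    """Stand-in for the per-partition R/Python model training step."""
    key, rows = item
    # A real implementation would fit a model on `rows`; here we just
    # return the partition size so the fan-out is visible.
    return key, len(rows)


rows = [
    {"region": "north", "value": 1},
    {"region": "south", "value": 2},
    {"region": "north", "value": 3},
]

chunks = partition_by_key(rows, "region")

# Run one training call per partition in parallel; with Airflow this loop
# would create one PythonOperator per key instead of a pool worker.
with Pool(4) as pool:
    results = dict(pool.map(train_model, chunks.items()))

print(results)  # {'north': 2, 'south': 1}
```

The important design point is that the number of tasks is derived from the partition keys, so adding a new key value fans out a new unit of work without changing the processing code.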