Date: Sun, 6 Jan 2019 02:41:36 -0700 (MST)
From: mehdi sey
To: user@ignite.apache.org
Message-ID: <1546767696045-0.post@n6.nabble.com>
Subject: Distributed Training in TensorFlow

Distributed training allows the computational resources of the whole cluster to be used, and thus speeds up the training of deep learning models. TensorFlow is a machine learning framework that natively supports distributed neural network training, inference, and other computations. Using this ability, we can calculate gradients on the nodes where the data is stored, reduce them, and finally update the model parameters.

In the case of TensorFlow on Apache Ignite, must we run a TensorFlow worker on each server in the cluster so that it can process the data stored on that server?

--
Sent from: http://apache-ignite-users.70518.x6.nabble.com/
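The gradient flow described in the question (compute gradients on the nodes that hold the data, reduce them, then update the shared parameters) can be sketched with a toy data-parallel loop. This is an illustration only, not the actual Ignite/TensorFlow integration API; the shards stand in for data stored on different cluster nodes, and all function names here are hypothetical:

```python
import numpy as np

def local_gradient(w, X, y):
    """Gradient of mean squared error for a linear model on one data shard."""
    residual = X @ w - y
    return 2.0 * X.T @ residual / len(y)

def distributed_step(w, shards, lr=0.1):
    """One data-parallel step: per-shard gradients, reduce (average), update."""
    grads = [local_gradient(w, X, y) for X, y in shards]  # computed "on each node"
    g = np.mean(grads, axis=0)                            # reduce step (all-reduce mean)
    return w - lr * g                                     # single shared parameter update

rng = np.random.default_rng(0)
true_w = np.array([2.0, -3.0])

# Three shards, standing in for data stored on three cluster nodes.
shards = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    shards.append((X, X @ true_w))

w = np.zeros(2)
for _ in range(200):
    w = distributed_step(w, shards)

print(np.round(w, 2))  # converges toward true_w
```

The key point the sketch shows is that only gradients cross node boundaries; the raw training data never leaves the shard it is stored on, which is what makes co-locating a worker with each data partition attractive.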