Subject: Re: Need some tutorials for Mapreduce written in Python
From: Sebastiano Di Paola <sebastiano.dipaola@gmail.com>
To: user@hadoop.apache.org
Date: Wed, 27 Aug 2014 20:28:01 +0200

Hi there,
To write MapReduce jobs in Python you need to use the Hadoop Streaming API, so I would suggest starting your search there. (Here is a link, although it is for Hadoop 1.x: http://hadoop.apache.org/docs/r1.2.1/streaming.html ; it is still a good starting point.)
With the Streaming API you can write map/reduce jobs in whatever language you like, provided they read their input from stdin and write their output to stdout. The Streaming API does the rest of the magic for you ;-) A minimal sketch of what that looks like is below.
Hope it helps.
Seba
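Just to make the stdin/stdout idea concrete, here is a rough word-count sketch (my own illustration, not from the streaming docs; the file names mapper.py and reducer.py and the word-count task are only assumptions):

mapper.py:

    #!/usr/bin/env python
    # Read raw text lines from stdin and emit one "word<TAB>1" pair per token.
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            print("%s\t1" % word)

reducer.py:

    #!/usr/bin/env python
    # Streaming sorts the mapper output by key before it reaches the reducer,
    # so identical words arrive as consecutive lines and can be summed in one pass.
    import sys

    current_word = None
    current_count = 0

    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print("%s\t%d" % (current_word, current_count))
            current_word = word
            current_count = int(count)

    if current_word is not None:
        print("%s\t%d" % (current_word, current_count))

You would then launch it with something like the following (the jar path is the usual Hadoop 1.x location and may differ on your install; the input/output paths are placeholders, and both scripts must be executable or passed as "python mapper.py"):

    hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-streaming-1.2.1.jar \
        -input /user/me/wordcount/in \
        -output /user/me/wordcount/out \
        -mapper mapper.py \
        -reducer reducer.py \
        -file mapper.py \
        -file reducer.py

A nice property of this model is that you can test the whole pipeline locally without Hadoop at all:

    cat input.txt | ./mapper.py | sort | ./reducer.py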
On Wed, Aug 27, 2014 at 8:13 PM, Amar Singh <amarsingh125@gmail.com> wrote:
> Hi Users,
> I am new to the big data world and was in the process of reading some material on writing MapReduce using Python.
>
> Any links or pointers in that direction will be really helpful.
