Subject: Re: Scala examples for Spark do not work as written in documentation
From: Andy Konwinski
To: dev@spark.apache.org
Date: Tue, 20 May 2014 14:06:33 -0700

I fixed the bug, but I kept the parameter "i" instead of "_" since that
(1) keeps it more parallel to the Python and Java versions, which also use
functions with a named variable, and (2) doesn't require readers to know
this particular use of the "_" syntax in Scala.

Thanks for catching this, Glenn.

Andy

On Fri, May 16, 2014 at 12:38 PM, Mark Hamstra wrote:

> Sorry, looks like an extra line got inserted in there.
> One more try:
>
>     val count = spark.parallelize(1 to NUM_SAMPLES).map { _ =>
>       val x = Math.random()
>       val y = Math.random()
>       if (x*x + y*y < 1) 1 else 0
>     }.reduce(_ + _)
>
> On Fri, May 16, 2014 at 12:36 PM, Mark Hamstra wrote:
>
>> Actually, the better way to write the multi-line closure would be:
>>
>>     val count = spark.parallelize(1 to NUM_SAMPLES).map { _ =>
>>
>>       val x = Math.random()
>>       val y = Math.random()
>>       if (x*x + y*y < 1) 1 else 0
>>     }.reduce(_ + _)
>>
>> On Fri, May 16, 2014 at 9:41 AM, GlennStrycker wrote:
>>
>>> On the webpage http://spark.apache.org/examples.html, there is an example
>>> written as
>>>
>>>     val count = spark.parallelize(1 to NUM_SAMPLES).map(i =>
>>>       val x = Math.random()
>>>       val y = Math.random()
>>>       if (x*x + y*y < 1) 1 else 0
>>>     ).reduce(_ + _)
>>>     println("Pi is roughly " + 4.0 * count / NUM_SAMPLES)
>>>
>>> This does not execute in Spark, which gives me an error:
>>>
>>>     <console>:2: error: illegal start of simple expression
>>>            val x = Math.random()
>>>            ^
>>>
>>> If I rewrite the query slightly, adding in {}, it works:
>>>
>>>     val count = spark.parallelize(1 to 10000).map(i =>
>>>       {
>>>         val x = Math.random()
>>>         val y = Math.random()
>>>         if (x*x + y*y < 1) 1 else 0
>>>       }
>>>     ).reduce(_ + _)
>>>     println("Pi is roughly " + 4.0 * count / 10000.0)
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-developers-list.1001551.n3.nabble.com/Scala-examples-for-Spark-do-not-work-as-written-in-documentation-tp6593.html
>>> Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
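[Editor's note] The fix discussed above can be checked without a cluster: a minimal, self-contained sketch using a plain Scala range in place of `spark.parallelize` (the `PiSketch` object name and the 100000 sample count are our own choices, not from the thread):

```scala
// Pi estimation with the corrected closure syntax. A plain Scala Range
// stands in for an RDD here, so this compiles and runs without Spark.
object PiSketch {
  val NUM_SAMPLES = 100000

  def main(args: Array[String]): Unit = {
    // Braces give the closure a multi-statement body; with parentheses,
    // `map(i => val x = ...)` fails with "illegal start of simple expression".
    val count = (1 to NUM_SAMPLES).map { _ =>
      val x = Math.random()
      val y = Math.random()
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / NUM_SAMPLES)
  }
}
```

The same braces-vs-parentheses distinction applies whether the receiver is a Range or an RDD, which is why the documentation fix carries over directly.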