mahout-dev mailing list archives

From "ASF GitHub Bot (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (MAHOUT-1653) Spark 1.3
Date Mon, 06 Jul 2015 20:56:15 GMT

    [ https://issues.apache.org/jira/browse/MAHOUT-1653?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14615642#comment-14615642 ]

ASF GitHub Bot commented on MAHOUT-1653:
----------------------------------------

Github user andrewpalumbo commented on a diff in the pull request:

    https://github.com/apache/mahout/pull/146#discussion_r33981278
  
    --- Diff: spark-shell/src/main/scala/org/apache/mahout/sparkbindings/shell/MahoutSparkILoop.scala ---
    @@ -48,55 +77,63 @@ class MahoutSparkILoop extends SparkILoop {
     
         conf.set("spark.executor.memory", "1g")
     
    -    sparkContext = mahoutSparkContext(
    +    _interp.sparkContext = mahoutSparkContext(
           masterUrl = master,
           appName = "Mahout Spark Shell",
           customJars = jars,
           sparkConf = conf
         )
     
    -    echo("Created spark context..")
    +    echoToShell("Created spark context..")
         sparkContext
       }
     
    +  // We need to change our SparkDistributedContext name to 'sc' since we cannot override
    +  // the private sparkCleanUp() method.
    +  // This is technically not part of Spark's explicitly defined Developer API, though
    +  // nothing in the SparkILoopInit.scala file is marked as such.
       override def initializeSpark() {
    -    intp.beQuietDuring {
    -      command("""
    +    _interp.beQuietDuring {
    +      _interp.interpret("""
     
    -         @transient implicit val sdc: org.apache.mahout.math.drm.DistributedContext =
    +         @transient implicit val sc: org.apache.mahout.math.drm.DistributedContext =
    --- End diff --
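
    The rename to `sc` in the hunk above matters because SparkILoop's private clean-up hook stops
    the context by that literal REPL name. A paraphrased sketch of what that private method does in
    the Spark 1.3 repl, not a verbatim quote:

    ```
    // Paraphrase of the private SparkILoop.sparkCleanUp() in the Spark 1.3 repl:
    // it interprets "sc.stop()", so the shell's context binding must be named 'sc'.
    private def sparkCleanUp() {
      echo("Stopping spark context.")
      intp.beQuietDuring {
        command("sc.stop()")
      }
    }
    ```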
    
    Actually, now that I'm looking at it again, without worrying about backwards compatibility with Spark < 1.3 we should also be creating a SQLContext in that `initializeSpark()` call:
    
    
    From SparkILoop.initializeSpark():
    ```
          command("""
             @transient val sqlContext = {
               val _sqlContext = org.apache.spark.repl.Main.interp.createSQLContext()
               println("SQL context available as sqlContext.")
               _sqlContext
             }
            """)
          command("import org.apache.spark.SparkContext._")
          command("import sqlContext.implicits._")
          command("import sqlContext.sql")
          command("import org.apache.spark.sql.functions._")
    ```
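
    A minimal sketch of how the overridden `initializeSpark()` could add that, reusing the
    `_interp.interpret(...)` pattern from the diff above; whether `org.apache.spark.repl.Main.interp`
    points at the Mahout loop when `createSQLContext()` is called is an assumption here, not
    something the patch confirms:

    ```
    // Sketch only: these calls would go inside MahoutSparkILoop.initializeSpark(),
    // after the sc binding is interpreted, mirroring SparkILoop.initializeSpark().
    _interp.interpret("""
       @transient val sqlContext = {
         val _sqlContext = org.apache.spark.repl.Main.interp.createSQLContext()
         println("SQL context available as sqlContext.")
         _sqlContext
       }
      """)
    _interp.interpret("import org.apache.spark.SparkContext._")
    _interp.interpret("import sqlContext.implicits._")
    _interp.interpret("import sqlContext.sql")
    _interp.interpret("import org.apache.spark.sql.functions._")
    ```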


> Spark 1.3
> ---------
>
>                 Key: MAHOUT-1653
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1653
>             Project: Mahout
>          Issue Type: Dependency upgrade
>            Reporter: Andrew Musselman
>            Assignee: Andrew Palumbo
>             Fix For: 0.11.0
>
>
> Support Spark 1.3



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
