mahout-dev mailing list archives

From "Dmitriy Lyubimov (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (MAHOUT-1546) building spark context fails due to incorrect classpath query
Date Mon, 05 May 2014 18:02:16 GMT

    [ https://issues.apache.org/jira/browse/MAHOUT-1546?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13989777#comment-13989777 ]

Dmitriy Lyubimov commented on MAHOUT-1546:
------------------------------------------

I'd say if "mahout -spark classpath" returns nothing, something is wrong with your setup.
Define MAHOUT_HOME and SCALA_HOME. Also, Spark programs do not need all Mahout artifacts and
their dependencies (e.g. core). Moreover, pulling those in may cause (and is known to cause)
dependency interference, especially with Hadoop and its transitive dependencies, so using the
standard "mahout classpath" should be avoided.
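
For illustration, here is a minimal sketch of that probe (the sparkClasspath helper, the
bin/mahout path, and the error messages are hypothetical; only the "-spark classpath"
invocation comes from mahoutSparkContext, quoted below):

{code}
import java.io.File
import scala.io.Source

// Hypothetical helper: ask the Mahout launcher script for the Spark-specific
// classpath and fail fast if the probe returns nothing.
def sparkClasspath(): Array[String] = {
  val mahoutHome = sys.env.getOrElse("MAHOUT_HOME",
    sys.error("MAHOUT_HOME is not set"))
  val exec = new File(mahoutHome, "bin/mahout")

  // Same invocation as in mahoutSparkContext(): "-spark classpath", not the
  // full "mahout classpath", to avoid dragging in core and its transitives.
  val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "-spark", "classpath"))
  val out = Source.fromInputStream(p.getInputStream).mkString.trim
  p.waitFor()

  // An empty result means the setup is broken (see above), so report it
  // instead of silently building a Spark context with no jars.
  if (out.isEmpty)
    sys.error("'mahout -spark classpath' returned nothing; check MAHOUT_HOME and SCALA_HOME")
  out.split(File.pathSeparator).filter(_.nonEmpty)
}
{code}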

> building spark context fails due to incorrect classpath query
> -------------------------------------------------------------
>
>                 Key: MAHOUT-1546
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1546
>             Project: Mahout
>          Issue Type: Bug
>         Environment: Spark running locally
>            Reporter: Pat Ferrel
>            Assignee: Dmitriy Lyubimov
>            Priority: Critical
>
> The classpath retrieval is using a "-spark" flag that returns nothing; using the default
> "mahout classpath" seems to get all needed jar paths, so commenting out the "-spark" makes
> it work for me. Not sure this is the best fix, though.
> This is in def mahoutSparkContext(...)
> {code}
>         //val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "-spark", "classpath"))
>         val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "classpath"))
> {code}



