mahout-dev mailing list archives

From Dmitriy Lyubimov <dlie...@gmail.com>
Subject Re: [jira] [Commented] (MAHOUT-1546) building spark context fails due to incorrect classpath query
Date Mon, 05 May 2014 19:09:08 GMT
It does seem, however, that not setting SPARK_HOME should be caught and
reported as an error, and currently it is not. If you can fix that, it
would be a viable fix.
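A minimal sketch of such a guard, assuming a POSIX shell launcher like bin/mahout (the function name and error text below are illustrative assumptions, not Mahout's actual script):

```shell
# Hypothetical guard, modeled on what a launcher script could do; the
# function name and error message are assumptions, not Mahout's real code.
require_spark_home() {
  if [ -z "${SPARK_HOME:-}" ]; then
    echo "ERROR: SPARK_HOME is not set; cannot locate Spark's jars" >&2
    return 1
  fi
  echo "$SPARK_HOME"
}

# Demo: with SPARK_HOME unset, the guard reports an error instead of
# silently producing an empty classpath.
unset SPARK_HOME
require_spark_home || echo "guard fired"

# With SPARK_HOME set, the guard passes the path through.
SPARK_HOME=/opt/spark-0.9.1
require_spark_home
```

The point is to fail fast with a readable message at the script boundary, rather than let an empty classpath surface later as an obscure context-creation failure.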


On Mon, May 5, 2014 at 12:08 PM, Dmitriy Lyubimov <dlieu.7@gmail.com> wrote:

> At this point the requirement is just that MAHOUT_HOME and SPARK_HOME are
> set up, and that Spark is a specific version (0.9.1).
>
> If we knew how to determine Spark's classpath without knowing SPARK_HOME,
> we could drop the requirement for having SPARK_HOME, but so far I have not
> been able to see how. For one, a person may have 3 or 4 versions of Spark
> set up at the same time (wink-wink, yours truly is the case here), so I
> can't see how asking for SPARK_HOME can be waived.
>
>
> On Mon, May 5, 2014 at 11:37 AM, Pat Ferrel (JIRA) <jira@apache.org> wrote:
>
>>
>>     [
>> https://issues.apache.org/jira/browse/MAHOUT-1546?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13989819#comment-13989819]
>>
>> Pat Ferrel commented on MAHOUT-1546:
>> ------------------------------------
>>
>> Scala works fine without SCALA_HOME, so that doesn't apply here.
>> MAHOUT_HOME is set; otherwise the fix wouldn't have worked.
>>
>> Running "mahout -spark classpath" from bash returns nothing. In the bash
>> script it looks like SPARK_HOME is checked, but I haven't set it and Spark
>> itself seems to run fine without it. In fact the Spark install doesn't ask
>> you to set it. In any case, setting it seems to fix the problem.
>>
>> Do we have a wiki page that describes setup for Spark? Maybe I'm a good
>> guinea pig to write or edit it.
>>
>> > building spark context fails due to incorrect classpath query
>> > -------------------------------------------------------------
>> >
>> >                 Key: MAHOUT-1546
>> >                 URL: https://issues.apache.org/jira/browse/MAHOUT-1546
>> >             Project: Mahout
>> >          Issue Type: Bug
>> >         Environment: Spark running locally
>> >            Reporter: Pat Ferrel
>> >            Assignee: Dmitriy Lyubimov
>> >            Priority: Critical
>> >
>> > The classpath retrieval uses a "-spark" flag that returns nothing;
>> > using the default "mahout classpath" seems to get all the needed jar
>> > paths, so commenting out the "-spark" call makes it work for me. Not
>> > sure this is the best fix, though.
>> > This is in def mahoutSparkContext(...)
>> > {code}
>> >   //val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "-spark", "classpath"))
>> >   val p = Runtime.getRuntime.exec(Array(exec.getAbsolutePath, "classpath"))
>> > {code}
>>
>>
>>
>> --
>> This message was sent by Atlassian JIRA
>> (v6.2#6252)
>>
>
>
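The workaround quoted in the issue (fall back to the plain classpath query when the "-spark" variant returns nothing) can be sketched in shell. Note that `mahout_cmd` below merely simulates the reported behavior and is not the real `bin/mahout`; the jar paths are made up for illustration:

```shell
# Hypothetical stand-in for $MAHOUT_HOME/bin/mahout: it simulates the bug,
# where "-spark classpath" prints nothing but "classpath" works.
mahout_cmd() {
  if [ "$1" = "-spark" ]; then
    return 0  # simulate the empty output of "mahout -spark classpath"
  fi
  echo "/opt/mahout/mahout-core.jar:/opt/mahout/mahout-math.jar"
}

# Try the Spark-specific query first, then fall back to the default one
# when it yields an empty string.
cp=$(mahout_cmd -spark classpath)
if [ -z "$cp" ]; then
  cp=$(mahout_cmd classpath)
fi
echo "$cp"
```

This keeps the "-spark" query as the preferred path (so a fixed launcher still wins) while avoiding the hard failure when it returns nothing.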
