mahout-dev mailing list archives

From Andrew Palumbo <>
Subject Re: Zeppelin Integration PR
Date Wed, 01 Jun 2016 00:55:15 GMT
Which dependencies need to be removed?

I saw that the Kryo version on the PR was conflicting as well; that may be a Flink thing.  I think
the version was at one point being enforced in the flink module, at least.
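One way to enforce a single Kryo version across modules is a `dependencyManagement` block in the parent pom. This is only a sketch; the coordinates and version number below are illustrative, not the ones the flink module actually pins.

```xml
<!-- Sketch: pin one Kryo version for all child modules from the parent pom.
     Coordinates/version here are illustrative assumptions, not the project's. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.esotericsoftware.kryo</groupId>
      <artifactId>kryo</artifactId>
      <version>2.21</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Child modules that declare `kryo` without a version then inherit the managed one, which avoids the conflict surfacing on the PR.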

From: Trevor Grant <>
Sent: Tuesday, May 31, 2016 8:41:15 PM
Subject: Re: Zeppelin Integration PR

As a follow-up to this: it would be nice to remove the dependencies from
the pom.xml...

All we REALLY need to do is make sure we can get to the required jars and
load them.  By including them in the pom we are ensuring they are
available, but there is surely some other way to get ahold of them.  Since
we have assumed that Mahout is installed on the system and MAHOUT_HOME=...
we can probably leverage that...
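Leveraging MAHOUT_HOME could look something like the following. This is a hypothetical sketch, not the interpreter's actual code: the class name, method names, and the assumption that the jars sit directly under the install directory are all mine; only the `MAHOUT_HOME` env var comes from the thread.

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: gather Mahout's jars from an installed MAHOUT_HOME
// at runtime instead of declaring them as dependencies in the pom.xml.
public class MahoutJarLocator {

    /** Collect every *.jar directly under the given Mahout install directory. */
    public static List<URL> findMahoutJars(String mahoutHome) throws Exception {
        File[] files = new File(mahoutHome).listFiles();
        if (files == null) {
            throw new IllegalStateException("Not a readable directory: " + mahoutHome);
        }
        List<URL> jars = new ArrayList<>();
        for (File f : files) {
            if (f.getName().endsWith(".jar")) {
                jars.add(f.toURI().toURL());
            }
        }
        return jars;
    }

    /** Make the jars loadable at runtime without a compile-time dependency. */
    public static ClassLoader classLoaderFor(List<URL> jars) {
        return new URLClassLoader(jars.toArray(new URL[0]),
                Thread.currentThread().getContextClassLoader());
    }

    public static void main(String[] args) throws Exception {
        // The thread assumes Mahout is installed and MAHOUT_HOME is set.
        String home = System.getenv("MAHOUT_HOME");
        if (home != null) {
            System.out.println(findMahoutJars(home));
        }
    }
}
```

The same jar URLs could also be handed to Spark (e.g. via `SparkContext.addJar`) so they ship to the executors, which is what including them in the pom was doing implicitly.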

Trevor Grant
Data Scientist

*"Fortunate is he, who is able to know the causes of things."  -Virgil*

On Tue, May 31, 2016 at 7:31 PM, Trevor Grant <>

> For what it is worth, simply removing the dependencies from pom.xml breaks
> the Mahout interpreter.
> Upon a little further testing in cluster mode, so long as the dependencies
> are included in pom.xml, the appropriate Mahout jars are shipped off to the
> cluster and everything works swimmingly (in Zeppelin there is a local Spark
> Interpreter internal to Zeppelin and then the 'real' cluster that
> everything gets shipped off to. Sometimes you can make things work in local
> mode that won't work in cluster mode)
> The moral of this story is that the patch DOES in fact work in local and
> cluster mode, so we just need to work out the dependencies and the
> licensing (and a couple of fail safes to make sure the user is running
> Spark version > 1.5.2) and we should be good to go.
> Trevor Grant
> Data Scientist
> *"Fortunate is he, who is able to know the causes of things."  -Virgil*
> On Tue, May 31, 2016 at 4:22 PM, Trevor Grant <>
> wrote:
>> Hey folks,
>> looks like we're making progress on the Mahout-Zeppelin integration.
>> Anyone who is interested, check out:
>> Regarding Moon's last comments:
>> Does anyone know off hand if anything will break if we roll back the
>> conflicting packages to the Spark 1.6 version?
>> Also regarding the pom.xml and:
>> "Packaging
>> If mahout requires to be loaded in spark executor's classpath, then
>> adding mahout dependency in pom.xml will not be enough to work with Spark
>> cluster. Could you clarify if mahout need to be loaded in spark executor?"
>> All we need to do is load the appropriate Mahout jars. I'm not
>> familiar enough with the Spark interpreter, or with Spark or Java, to know
>> exactly what would happen; any thoughts on this?
>> Tonight I might just try removing the Mahout dependencies from pom.xml and
>> seeing what happens; that would solve all of these problems, I think.  As
>> long as the user has 'mvn install'ed Mahout, we should be good to go?
>> Trevor Grant
>> Data Scientist
>> *"Fortunate is he, who is able to know the causes of things."  -Virgil*
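The fail-safe mentioned above (requiring the user to run a Spark version newer than 1.5.2) could be sketched as a small version comparison. This is an assumption of mine about how such a guard might look; the class and method names are hypothetical, and in the actual interpreter the version string would presumably come from the SparkContext.

```java
// Hypothetical guard: reject Spark versions at or below a minimum.
// Assumes plain dotted numeric versions ("1.6.0"), not "-SNAPSHOT" suffixes.
public class SparkVersionCheck {

    /** True iff version >= minimum, comparing dotted components numerically. */
    public static boolean atLeast(String version, String minimum) {
        String[] v = version.split("\\.");
        String[] m = minimum.split("\\.");
        for (int i = 0; i < Math.max(v.length, m.length); i++) {
            int vi = i < v.length ? Integer.parseInt(v[i]) : 0;
            int mi = i < m.length ? Integer.parseInt(m[i]) : 0;
            if (vi != mi) {
                return vi > mi;
            }
        }
        return true;
    }
}
```

The interpreter could then fail fast with a clear message when `atLeast(sc.version(), "1.5.2")` is false, instead of surfacing an obscure runtime error later.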
