mahout-dev mailing list archives

From "Chomon (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (MAHOUT-1758) mahout spark-shell - get illegal access error at startup
Date Wed, 09 Nov 2016 05:45:58 GMT

    [ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15649888#comment-15649888 ]

Chomon commented on MAHOUT-1758:
--------------------------------

When I type bin/mahout spark-shell, I get the following warnings:
{noformat}
chomon@ubuntu:~/mahout$ bin/mahout spark-shell
MAHOUT_LOCAL is set, so we don't add HADOOP_CONF_DIR to classpath.
16/11/08 21:38:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

                         _                 _
         _ __ ___   __ _| |__   ___  _   _| |_
        | '_ ` _ \ / _` | '_ \ / _ \| | | | __|
        | | | | | | (_| | | | | (_) | |_| | |_
        |_| |_| |_|\__,_|_| |_|\___/ \__,_|\__|  version 0.12.2

      
Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
16/11/08 21:38:39 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
Created spark context..
Spark context is available as "val sc".
Mahout distributed context is available as "implicit val sdc".
16/11/08 21:39:00 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/11/08 21:39:00 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/11/08 21:39:00 WARN : Your hostname, ubuntu resolves to a loopback/non-reachable address: 127.0.0.1, but we couldn't find any external IP address!
16/11/08 21:39:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/11/08 21:39:10 WARN : Your hostname, ubuntu resolves to a loopback/non-reachable address: 127.0.0.1, but we couldn't find any external IP address!
SQL context available as "val sqlContext".
mahout> 
{noformat}
How can I solve it?
I am using these versions: Spark 1.5.2, Hadoop 2.6, Java 7u79 JDK, and Scala 2.10.4.
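For what it's worth, the session above does reach the mahout> prompt, so the shell itself started; everything before it is a warning rather than an error. The loopback-hostname warning can usually be cleared by binding Spark to a reachable address. A minimal sketch, assuming a hypothetical LAN address of 192.168.1.10 (substitute the host's real IP):
{noformat}
# Hypothetical address: replace 192.168.1.10 with this host's real LAN IP.
export SPARK_LOCAL_IP=192.168.1.10

# Alternative: map the hostname to a non-loopback address in /etc/hosts,
# e.g. by adding a line such as:
#   192.168.1.10   ubuntu
{noformat}
The NativeCodeLoader warning is harmless in itself: Hadoop simply falls back to its built-in Java classes when no native library is found for the platform.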

> mahout spark-shell - get illegal access error at startup
> --------------------------------------------------------
>
>                 Key: MAHOUT-1758
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1758
>             Project: Mahout
>          Issue Type: Bug
>    Affects Versions: 0.10.1
>         Environment: Linux Ubuntu 14.04, cluster of 1 master PC and 2 slave PCs, 16 GB RAM per node.
> Hadoop 2.6
> Spark 1.4.1
> Mahout 0.10.1
> R 3.0.2/Rhadoop
> scala 2.10
>            Reporter: JP Bordenave
>            Assignee: Suneel Marthi
>            Priority: Critical
>             Fix For: 0.11.0
>
>
> Hello,
> I installed Hadoop 2.6, Spark 1.4, SparkR, and PySpark (Scala 2.10); everything works fine, no issues.
> Now I try to configure Mahout with my Spark/Hadoop cluster, but when I start Mahout I get an IllegalAccessError. I also tried to start in local mode and get the same error; it looks to be an incompatibility between Spark 1.4.x and Mahout 0.10.1.
> Can you confirm? Is there a patch?
> Edit: I saw in the Mahout 0.10.1 release notes that it is compatible with Spark 1.2.2 or less.
> Thanks for your info,
> JP
> I set my variables for my Spark cluster:
> export SPARK_HOME=/usr/local/spark
> export MASTER=spark://stargate:7077
> {noformat}
> hduser@stargate:~$ mahout spark-shell
> MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/mahout-examples-0.10.1-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/mahout-mr-0.10.1-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 15/07/17 23:17:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>                          _                 _
>          _ __ ___   __ _| |__   ___  _   _| |_
>         | '_ ` _ \ / _` | '_ \ / _ \| | | | __|
>         | | | | | | (_| | | | | (_) | |_| | |_
>         |_| |_| |_|\__,_|_| |_|\___/ \__,_|\__|  version 0.10.0
> Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_79)
> Type in expressions to have them evaluated.
> Type :help for more information.
> java.lang.IllegalAccessError: tried to access method org.apache.spark.repl.SparkIMain.classServer()Lorg/apache/spark/HttpServer; from class org.apache.mahout.sparkbindings.shell.MahoutSparkILoop
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.createSparkContext(MahoutSparkILoop.scala:42)
>         at $iwC$$iwC.<init>(<console>:11)
>         at $iwC.<init>(<console>:18)
>         at <init>(<console>:20)
>         at .<init>(<console>:24)
>         at .<clinit>(<console>)
>         at .<init>(<console>:7)
>         at .<clinit>(<console>)
>         at $print(<console>)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:606)
>         at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>         at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
>         at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>         at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>         at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>         at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>         at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(MahoutSparkILoop.scala:63)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply(MahoutSparkILoop.scala:62)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply(MahoutSparkILoop.scala:62)
>         at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.initializeSpark(MahoutSparkILoop.scala:62)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>         at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
>         at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>         at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
>         at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.postInitialization(MahoutSparkILoop.scala:24)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>         at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>         at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>         at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>         at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:39)
>         at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)
> Mahout distributed context is available as "implicit val sdc".
> {noformat}
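The trace points at a link-time binary incompatibility rather than a logic bug: MahoutSparkILoop in Mahout 0.10.1 was compiled against a Spark release in which org.apache.spark.repl.SparkIMain.classServer() was accessible, while the Spark 1.4.1 jars on the runtime classpath no longer expose that method, so the JVM throws IllegalAccessError as soon as the call site is linked. A minimal Scala sketch of the failure mode, using hypothetical class names rather than the real Spark internals:
{noformat}
// Compiled against "library v1", where the accessor is public:
class ReplBackend {
  def classServer: String = "http://127.0.0.1:12345" // public in v1
}

class CustomLoop {
  private val backend = new ReplBackend
  // This call is resolved against v1's public signature at link time.
  def createContext(): Unit = println(backend.classServer)
}

// If "library v2" is on the runtime classpath instead, declaring
//   class ReplBackend { private def classServer: String = ... }
// then linking CustomLoop.createContext() fails with
// java.lang.IllegalAccessError, exactly as in the trace above.
{noformat}
Consistent with the reporter's note about the release notes, Mahout 0.10.1 targets Spark 1.2.2 or earlier, and per the Fix For field above the incompatibility is addressed in Mahout 0.11.0. The practical options are therefore to drop back to a Spark 1.2.x build or to move up to a Mahout release built against the newer Spark.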



