Date: Fri, 17 Jul 2015 21:29:05 +0000 (UTC)
From: "JP Bordenave (JIRA)"
To: dev@mahout.apache.org
Subject: [jira] [Updated] (MAHOUT-1758) mahout spark-shell - get illegal acces error at startup

     [ https://issues.apache.org/jira/browse/MAHOUT-1758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

JP Bordenave updated MAHOUT-1758:
---------------------------------
    Summary: mahout spark-shell - get illegal acces error at startup  (was: mahout spark-shell - get illegal acces eror at startup)

> mahout spark-shell - get illegal acces error at startup
> --------------------------------------------------------
>
>                 Key: MAHOUT-1758
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1758
>             Project: Mahout
>          Issue Type: Bug
>         Environment: Linux Ubuntu 14.04, cluster of 1 master PC and 2 slave PCs, 16 GB RAM per node.
>                      Hadoop 2.6
>                      Spark 1.4.1
>                      Mahout 0.10.1
>                      R 3.0.2/RHadoop
>                      Scala 2.10
>            Reporter: JP Bordenave
>            Priority: Critical
>
> Hello,
> I installed Hadoop 2.6 and Spark 1.4 (Scala 2.10); SparkR and PySpark are working fine, no issues.
> Now I am trying to configure Mahout against my Spark/Hadoop cluster, but when I start the Mahout shell I get an IllegalAccessError. If I try to start in local mode instead, I get the same error. Mahout 0.10.1 looks to be incompatible with Spark 1.4.x. Can you confirm? Is there a patch?
> Edit: I saw in the Mahout 0.10.1 release notes that compatibility is with Spark 1.2.2 or lower.
> Thanks for your info,
> JP
>
> I set my variables for my Spark cluster:
> export SPARK_HOME=/usr/local/spark
> export MASTER=spark://stargate:7077
> {noformat}
> hduser@stargate:~$ mahout spark-shell
> MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/mahout-examples-0.10.1-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/mahout-mr-0.10.1-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/usr/local/apache-mahout-distribution-0.10.1/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
> 15/07/17 23:17:54 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
>
>                  _                 _
>  _ __ ___   __ _| |__   ___  _   _| |_
> | '_ ` _ \ / _` | '_ \ / _ \| | | | __|
> | | | | | | (_| | | | | (_) | |_| | |_
> |_| |_| |_|\__,_|_| |_|\___/ \__,_|\__|  version 0.10.0
>
> Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_79)
> Type in expressions to have them evaluated.
> Type :help for more information.
> java.lang.IllegalAccessError: tried to access method org.apache.spark.repl.SparkIMain.classServer()Lorg/apache/spark/HttpServer; from class org.apache.mahout.sparkbindings.shell.MahoutSparkILoop
> 	at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.createSparkContext(MahoutSparkILoop.scala:42)
> 	at $iwC$$iwC.<init>(<console>:11)
> 	at $iwC.<init>(<console>:18)
> 	at <init>(<console>:20)
> 	at .<init>(<console>:24)
> 	at .<clinit>()
> 	at .<init>(<console>:7)
> 	at .<clinit>()
> 	at $print()
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
> 	at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
> 	at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
> 	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> 	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> 	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
> 	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
> 	at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
> 	at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply$mcV$sp(MahoutSparkILoop.scala:63)
> 	at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply(MahoutSparkILoop.scala:62)
> 	at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop$$anonfun$initializeSpark$1.apply(MahoutSparkILoop.scala:62)
> 	at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
> 	at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.initializeSpark(MahoutSparkILoop.scala:62)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
> 	at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:157)
> 	at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
> 	at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
> 	at org.apache.mahout.sparkbindings.shell.MahoutSparkILoop.postInitialization(MahoutSparkILoop.scala:24)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> 	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
> 	at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
> 	at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
> 	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
> 	at org.apache.mahout.sparkbindings.shell.Main$.main(Main.scala:39)
> 	at org.apache.mahout.sparkbindings.shell.Main.main(Main.scala)
> Mahout distributed context is available as "implicit val sdc".
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
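
The trace pins the failure to a single linkage point: MahoutSparkILoop.createSparkContext (MahoutSparkILoop.scala:42) calls the accessor SparkIMain.classServer(), which returns an org.apache.spark.HttpServer. Mahout 0.10.1 was evidently compiled against a Spark release where that accessor was public, and in the Spark 1.4.1 assembly it apparently no longer is, so the JVM rejects the call at link time with java.lang.IllegalAccessError. One way to check this against a given Spark install is to dump the member's access flags with javap; a sketch, reusing the assembly-jar path from the SLF4J lines above (the changed visibility of classServer in 1.4.1 is the assumption being tested):

{noformat}
# Dump all members of SparkIMain, including private ones (-p), straight from
# the Spark 1.4.1 assembly jar, and inspect the flags on classServer.
javap -p -classpath /usr/local/spark-1.4.1-bin-hadoop2.6/lib/spark-assembly-1.4.1-hadoop2.6.0.jar \
    org.apache.spark.repl.SparkIMain | grep -i classserver
# If the classServer accessor prints as private here, a caller compiled
# against an older Spark where it was public fails at link time with
# exactly the IllegalAccessError shown in the trace.
{noformat}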
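Given that the 0.10.1 release notes cap Spark support at 1.2.2, the interim workaround implied by them is to run the shell against an older Spark build instead of 1.4.1. A sketch, with a hypothetical install path (the path below is illustrative, not an actual install on this cluster):

{noformat}
# Point the Mahout shell at a Spark <= 1.2.2 install; substitute whatever
# 1.2.x build is actually unpacked on the node.
export SPARK_HOME=/usr/local/spark-1.2.2-bin-hadoop2.4
export MASTER=spark://stargate:7077
mahout spark-shell
{noformat}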