Date: Wed, 9 Dec 2015 21:00:13 +0000 (UTC)
From: "Jonathan Cox (JIRA)"
To: dev@phoenix.incubator.apache.org
Subject: [jira] [Created] (PHOENIX-2503) Multiple Java NoClass/Method Errors with Spark and Phoenix

Jonathan Cox created PHOENIX-2503:
-------------------------------------

             Summary: Multiple Java NoClass/Method Errors with Spark and Phoenix
                 Key: PHOENIX-2503
                 URL: https://issues.apache.org/jira/browse/PHOENIX-2503
             Project: Phoenix
          Issue Type: Bug
    Affects Versions: 4.6.0
         Environment: Debian 8 (Jessie) x64
                      hadoop-2.6.2
                      hbase-1.1.2
                      phoenix-4.6.0-HBase-1.1
                      spark-1.5.2-bin-without-hadoop
            Reporter: Jonathan Cox
            Priority: Blocker

I have encountered a variety of Java errors while trying to get Apache Phoenix
working with Spark. In particular, I encounter these errors when submitting
Python jobs to Spark, or when running interactively in the Scala Spark shell.

------- Issue 1 -------

The first issue I encountered was that Phoenix would not work with the binary
Spark release that includes Hadoop 2.6 (spark-1.5.2-bin-hadoop2.6.tgz). I tried
adding phoenix-4.6.0-HBase-1.1-client.jar to both spark-env.sh and
spark-defaults.conf, but encountered the same error when launching spark-shell:

15/12/08 18:38:05 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
15/12/08 18:38:05 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
15/12/08 18:38:05 WARN Hive: Failed to access metastore. This class should not accessed in runtime.
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.metadata.Hive.getAllDatabases(Hive.java:1236)

----- Issue 2 -----

Alright, having given up on getting Phoenix to work with the Spark package
that includes Hadoop, I decided to download hadoop-2.6.2.tar.gz and
spark-1.5.2-bin-without-hadoop.tgz. I installed these, and again added
phoenix-4.6.0-HBase-1.1-client.jar to spark-defaults.conf. In addition, I
added the following lines to spark-env.sh:

SPARK_DIST_CLASSPATH=$(/usr/local/hadoop/bin/hadoop classpath)
export SPARK_DIST_CLASSPATH="$SPARK_DIST_CLASSPATH:/usr/local/hadoop/share/hadoop/tools/lib/*"
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop

This solved "Issue 1" described above, and now spark-shell launches without
generating an error. Nevertheless, other Spark functionality is now broken:

15/12/09 13:55:46 INFO repl.SparkILoop: Created spark context..
Spark context available as sc.
15/12/09 13:55:46 INFO repl.SparkILoop: Created sql context..
SQL context available as sqlContext.

scala> val textFile = sc.textFile("README.md")
java.lang.NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$.handledType()Ljava/lang/Class;
        at com.fasterxml.jackson.module.scala.deser.NumberDeserializers$.<init>(ScalaNumberDeserializersModule.scala:49)
        at com.fasterxml.jackson.module.scala.deser.NumberDeserializers$.<clinit>(ScalaNumberDeserializersModule.scala)

Note, this error goes away if I omit phoenix-4.6.0-HBase-1.1-client.jar (but
then I have no Phoenix support, obviously). This makes me believe that
phoenix-4.6.0-HBase-1.1-client.jar bundles a conflicting version of the
Jackson (FasterXML) classes, which overrides Spark's Jackson classes with an
earlier version that lacks this particular method. In other words, Spark
needs one version of the Jackson JARs, but Phoenix is pulling in another
that breaks Spark.
Does this make any sense?

Sincerely,
Jonathan

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)