From: Ted Yu
To: Divya Gehlot
Cc: "user@hbase.apache.org"
Date: Thu, 18 Feb 2016 18:25:09 -0800 (PST)
Subject: Re: Error : starting spark-shell with phoenix client jar

If you cannot wait for the next HDP release with the fix for PHOENIX-2608,
please consider rebuilding Phoenix with the patch from PHOENIX-2608 applied.
A rough outline of the rebuild is sketched just below, and two quick checks
to confirm the jackson conflict follow the quoted thread at the end of this
mail.
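This is an untested sketch: the source tag and the patch file name are
assumptions, so match them to your HDP 2.3.4 bits and to whichever
attachment you download from the PHOENIX-2608 JIRA page.

    # Check out the Phoenix source closest to your HDP 2.3.4 build
    # (v4.4.0-HBase-1.1 is a guess; pick the tag matching your cluster)
    git clone https://github.com/apache/phoenix.git
    cd phoenix
    git checkout v4.4.0-HBase-1.1

    # Apply the patch downloaded from the PHOENIX-2608 JIRA attachments
    # (the file name depends on which attachment you grab)
    git apply PHOENIX-2608.patch

    # Rebuild; the client jar should land under phoenix-assembly/target/
    mvn clean package -DskipTests

Then swap the rebuilt client jar in for the stock
/usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar on the
node where you launch spark-shell.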
Cheers

On Thu, Feb 18, 2016 at 6:19 PM, Divya Gehlot wrote:

> Thanks Ted for getting me to the root cause of the issue.
> I am using the Hortonworks distribution HDP 2.3.4. How can I upgrade?
> Could you provide me the steps?
>
> On 18 February 2016 at 20:39, Ted Yu wrote:
>
>> This was likely caused by a version conflict of the jackson dependencies
>> between Spark and Phoenix: Phoenix uses 1.8.8 while Spark uses 1.9.13.
>>
>> One solution is to upgrade the jackson version in Phoenix.
>> See PHOENIX-2608.
>>
>> On Thu, Feb 18, 2016 at 12:31 AM, Divya Gehlot wrote:
>>
>>> Hi,
>>> I am getting the following error while starting spark-shell with the
>>> Phoenix client jar:
>>>
>>>   spark-shell \
>>>     --jars /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar \
>>>     --driver-class-path /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar \
>>>     --master yarn-client
>>>
>>> Stack trace:
>>>
>>>   INFO TimelineClientImpl: Timeline service address:
>>>   http://ip-xxx-xx-xx-xxx.ap-southeast-1.compute.internal:8188/ws/v1/timeline/
>>>   java.lang.NoSuchMethodError: org.codehaus.jackson.map.ObjectMapper.setSerializationInclusion(Lorg/codehaus/jackson/map/annotate/JsonSerialize$Inclusion;)Lorg/codehaus/jackson/map/ObjectMapper;
>>>     at org.apache.hadoop.yarn.webapp.YarnJacksonJaxbJsonProvider.configObjectMapper(YarnJacksonJaxbJsonProvider.java:59)
>>>     at org.apache.hadoop.yarn.util.timeline.TimelineUtils.<clinit>(TimelineUtils.java:50)
>>>     at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:172)
>>>     at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>>>     at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:108)
>>>     at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:57)
>>>     at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
>>>     at org.apache.spark.SparkContext.<init>(SparkContext.scala:523)
>>>     at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:1017)
>>>     at $iwC$$iwC.<init>(<console>:9)
>>>     at $iwC.<init>(<console>:18)
>>>     at <init>(<console>:20)
>>>     at .<init>(<console>:24)
>>>     at .<clinit>(<console>)
>>>     at .<init>(<console>:7)
>>>     at .<clinit>(<console>)
>>>     at $print(<console>)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>>>     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
>>>     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>>>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>>>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>>>     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>>>     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>>>     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>>>     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:125)
>>>     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
>>>     at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>>>     at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
>>>     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>>>     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>>>     at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
>>>     at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>>>     at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
>>>     at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>>>     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>>>     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>>     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>>>     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>>>     at org.apache.spark.repl.Main$.main(Main.scala:31)
>>>     at org.apache.spark.repl.Main.main(Main.scala)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:685)
>>>     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>>>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>>>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>>>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>
>>>   java.lang.NullPointerException
>>>     at org.apache.spark.sql.execution.ui.SQLListener.<init>(SQLListener.scala:34)
>>>     at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:77)
>>>     at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:74)
>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>>>     at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>>>     at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>>>     at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>>>     at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
>>>     at $iwC$$iwC.<init>(<console>:9)
>>>     at $iwC.<init>(<console>:18)
>>>     at <init>(<console>:20)
>>>     at .<init>(<console>:24)
>>>     at .<clinit>(<console>)
>>>     at .<init>(<console>:7)
>>>     at .<clinit>(<console>)
>>>     at $print(<console>)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>     at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
>>>     at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
>>>     at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
>>>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
>>>     at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
>>>     at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
>>>     at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
>>>     at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
>>>     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
>>>     at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
>>>     at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
>>>     at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
>>>     at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
>>>     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
>>>     at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
>>>     at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
>>>     at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
>>>     at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
>>>     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
>>>     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>     at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
>>>     at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
>>>     at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
>>>     at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
>>>     at org.apache.spark.repl.Main$.main(Main.scala:31)
>>>     at org.apache.spark.repl.Main.main(Main.scala)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>>     at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:685)
>>>     at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>>>     at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>>>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>>>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>>   <console>:10: error: not found: value sqlContext
>>>          import sqlContext.implicits._
>>>                 ^
>>>   <console>:10: error: not found: value sqlContext
>>>          import sqlContext.sql
>>>
>>> I googled and found that a compatible Jackson dependency is not
>>> available for the Hadoop 2.x line (SPARK-5108
>>> <https://issues.apache.org/jira/browse/SPARK-5108?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel>).
>>> Are the above errors related to that issue?
>>>
>>> Thanks
>>> Divya
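To confirm the version conflict described in the quoted thread on your own
cluster, you can look at which jackson each side actually bundles. A minimal
sketch, assuming the HDP paths from the spark-shell command above and that
the shaded jars kept their Maven metadata (both are assumptions):

    PHOENIX_JAR=/usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar

    # List the jackson classes bundled inside the shaded Phoenix client jar
    unzip -l "$PHOENIX_JAR" | grep -i jackson | head

    # If the shaded jar kept its Maven metadata, this prints the exact version
    unzip -p "$PHOENIX_JAR" META-INF/maven/org.codehaus.jackson/jackson-mapper-asl/pom.properties

    # Spark's own jackson lives inside the assembly jar (path is an HDP guess)
    unzip -l /usr/hdp/current/spark-client/lib/spark-assembly-*.jar | grep -i jackson | head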
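The NoSuchMethodError in the first trace can also be checked directly. If I
remember right, jackson 1.8.x declares ObjectMapper.setSerializationInclusion
with a void return, while 1.9.x returns ObjectMapper for chaining; since the
return type is part of the descriptor the JVM resolves, the 1.8 copy in the
Phoenix client jar cannot satisfy YARN code compiled against 1.9. A quick
javap check (same assumed jar path as above):

    PHOENIX_JAR=/usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar

    # Print the signature bundled in the jar; a "void" return here (rather
    # than ObjectMapper) means the old 1.8 API is winning on the classpath
    javap -classpath "$PHOENIX_JAR" org.codehaus.jackson.map.ObjectMapper \
        | grep setSerializationInclusion

If that prints a void signature, rebuilding Phoenix with PHOENIX-2608 applied
(sketched at the top of this mail) is the clean fix.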