From: Stana <stana@is-land.com.tw>
Date: Mon, 21 Mar 2016 14:18:05 +0800
Subject: Re: Error in Hive on Spark
To: dev@hive.apache.org
Cc: user@hive.apache.org

Does anyone have a suggestion for setting the hive-exec-2.0.0.jar path as a property in the application?
Something like 'hiveConf.set("hive.remote.driver.jar", "hdfs://storm0:9000/tmp/spark-assembly-1.4.1-hadoop2.6.0.jar")'.
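To make the question concrete, the flow I have in mind is roughly the sketch below. The upload half uses only the ordinary HDFS client API; the property in the last comment is hypothetical and is exactly the knob I am asking whether Hive could provide. (A second sketch at the bottom of this mail shows how I currently point the client at a Spark installation.)

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Sketch only: push the local hive-exec jar to HDFS once, so the YARN
// cluster could read it, then tell Hive where it is.
public class UploadHiveExecJar {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://storm0:9000");

        FileSystem fs = FileSystem.get(conf);
        // Copy the jar from the client machine into HDFS.
        fs.copyFromLocalFile(
            new Path("/Users/stana/.m2/repository/org/apache/hive/hive-exec/2.0.0/hive-exec-2.0.0.jar"),
            new Path("/tmp/hive-exec-2.0.0.jar"));
        fs.close();

        // Hypothetical property (does not exist in Hive today) -- the knob I am asking for:
        // hiveConf.set("hive.remote.driver.jar", "hdfs://storm0:9000/tmp/hive-exec-2.0.0.jar");
    }
}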
2016-03-11 10:53 GMT+08:00 Stana <stana@is-land.com.tw>:
> Thanks for the reply.
>
> I have set the property spark.home in my application; otherwise the
> application threw a 'SPARK_HOME not found' exception.
>
> I found this in the Hive source code, in SparkClientImpl.java:
>
> private Thread startDriver(final RpcServer rpcServer, final String clientId, final String secret)
>     throws IOException {
>   ...
>   List<String> argv = Lists.newArrayList();
>   ...
>   argv.add("--class");
>   argv.add(RemoteDriver.class.getName());
>
>   String jar = "spark-internal";
>   if (SparkContext.jarOfClass(this.getClass()).isDefined()) {
>     jar = SparkContext.jarOfClass(this.getClass()).get();
>   }
>   argv.add(jar);
>   ...
> }
>
> When Hive executes spark-submit, it generates the shell command with
> --class org.apache.hive.spark.client.RemoteDriver and sets the jar path from
> SparkContext.jarOfClass(this.getClass()).get(), which resolves to the local
> path of hive-exec-2.0.0.jar.
>
> In my situation, the application and the YARN cluster are in different
> clusters. When the application runs spark-submit against the YARN cluster
> with the local path of hive-exec-2.0.0.jar, there is no hive-exec-2.0.0.jar
> on the YARN cluster, so the application throws the exception
> "hive-exec-2.0.0.jar does not exist ...".
>
> Can the hive-exec-2.0.0.jar path be set as a property in the application?
> Something like 'hiveConf.set("hive.remote.driver.jar",
> "hdfs://storm0:9000/tmp/spark-assembly-1.4.1-hadoop2.6.0.jar")'.
> If not, is it possible to support this in a future version?
>
>
> 2016-03-10 23:51 GMT+08:00 Xuefu Zhang <xuefu@uber.com>:
>> You can probably avoid the problem by setting the environment variable
>> SPARK_HOME or the JVM property spark.home to point to your Spark
>> installation.
>>
>> --Xuefu
>>
>> On Thu, Mar 10, 2016 at 3:11 AM, Stana <stana@is-land.com.tw> wrote:
>>
>> > I am trying out Hive on Spark with Hive 2.0.0 and Spark 1.4.1,
>> > executing org.apache.hadoop.hive.ql.Driver from a Java application.
>> >
>> > My setup is as follows:
>> > 1. Build the Spark 1.4.1 assembly jar without Hive.
>> > 2. Upload the Spark assembly jar to the Hadoop cluster.
>> > 3. Run the Java application from the Eclipse IDE on my client computer.
>> >
>> > The application went well and submitted an MR job to the YARN cluster
>> > successfully when using hiveConf.set("hive.execution.engine", "mr"),
>> > but it threw exceptions with the Spark engine.
>> >
>> > Finally, I traced the Hive source code and came to this conclusion:
>> >
>> > In my situation, the SparkClientImpl class generates the spark-submit
>> > shell command and executes it. The command sets --class to
>> > RemoteDriver.class.getName() and the jar to
>> > SparkContext.jarOfClass(this.getClass()).get(), which is why my
>> > application threw the exception.
>> >
>> > Is that right? And what can I do to run the application with the Spark
>> > engine successfully from my client computer? Thanks a lot!
>> >
>> >
>> > Java application code:
>> >
>> > import org.apache.hadoop.hive.cli.CliSessionState;
>> > import org.apache.hadoop.hive.conf.HiveConf;
>> > import org.apache.hadoop.hive.ql.CommandNeedRetryException;
>> > import org.apache.hadoop.hive.ql.Driver;
>> > import org.apache.hadoop.hive.ql.processors.CommandProcessorResponse;
>> > import org.apache.hadoop.hive.ql.session.SessionState;
>> >
>> > public class TestHiveDriver {
>> >
>> >     private static HiveConf hiveConf;
>> >     private static Driver driver;
>> >     private static CliSessionState ss;
>> >
>> >     public static void main(String[] args) {
>> >
>> >         String sql = "select * from hadoop0263_0 as a join hadoop0263_0 as b on (a.key = b.key)";
>> >         ss = new CliSessionState(new HiveConf(SessionState.class));
>> >         hiveConf = new HiveConf(Driver.class);
>> >         hiveConf.set("fs.default.name", "hdfs://storm0:9000");
>> >         hiveConf.set("yarn.resourcemanager.address", "storm0:8032");
>> >         hiveConf.set("yarn.resourcemanager.scheduler.address", "storm0:8030");
>> >         hiveConf.set("yarn.resourcemanager.resource-tracker.address", "storm0:8031");
>> >         hiveConf.set("yarn.resourcemanager.admin.address", "storm0:8033");
>> >         hiveConf.set("mapreduce.framework.name", "yarn");
>> >         hiveConf.set("mapreduce.jobhistory.address", "storm0:10020");
>> >         hiveConf.set("javax.jdo.option.ConnectionURL", "jdbc:mysql://storm0:3306/stana_metastore");
>> >         hiveConf.set("javax.jdo.option.ConnectionDriverName", "com.mysql.jdbc.Driver");
>> >         hiveConf.set("javax.jdo.option.ConnectionUserName", "root");
>> >         hiveConf.set("javax.jdo.option.ConnectionPassword", "123456");
>> >         hiveConf.setBoolean("hive.auto.convert.join", false);
>> >         hiveConf.set("spark.yarn.jar", "hdfs://storm0:9000/tmp/spark-assembly-1.4.1-hadoop2.6.0.jar");
>> >         hiveConf.set("spark.home", "target/spark");
>> >         hiveConf.set("hive.execution.engine", "spark");
>> >         hiveConf.set("hive.dbname", "default");
>> >
>> >         driver = new Driver(hiveConf);
>> >         SessionState.start(hiveConf);
>> >
>> >         CommandProcessorResponse res = null;
>> >         try {
>> >             res = driver.run(sql);
>> >         } catch (CommandNeedRetryException e) {
>> >             e.printStackTrace();
>> >         }
>> >
>> >         System.out.println("Response Code:" + res.getResponseCode());
>> >         System.out.println("Error Message:" + res.getErrorMessage());
>> >         System.out.println("SQL State:" + res.getSQLState());
>> >     }
>> > }
>> >
>> >
>> > Exception of spark-engine:
>> >
>> > 16/03/10 18:32:58 INFO SparkClientImpl: Running client driver with argv:
>> > /Volumes/Sdhd/Documents/project/island/java/apache/hive-200-test/hive-release-2.0.0/itests/hive-unit/target/spark/bin/spark-submit
>> > --properties-file /var/folders/vt/cjcdhms903x7brn1kbh558s40000gn/T/spark-submit.7697089826296920539.properties
>> > --class org.apache.hive.spark.client.RemoteDriver
>> > /Users/stana/.m2/repository/org/apache/hive/hive-exec/2.0.0/hive-exec-2.0.0.jar
>> > --remote-host MacBook-Pro.local --remote-port 51331
>> > --conf hive.spark.client.connect.timeout=1000
>> > --conf hive.spark.client.server.connect.timeout=90000
>> > --conf hive.spark.client.channel.log.level=null
>> > --conf hive.spark.client.rpc.max.size=52428800
>> > --conf hive.spark.client.rpc.threads=8
>> > --conf hive.spark.client.secret.bits=256
>> > 16/03/10 18:33:09 INFO SparkClientImpl: 16/03/10 18:33:09 INFO Client:
>> > 16/03/10 18:33:09 INFO SparkClientImpl:      client token: N/A
>> > 16/03/10 18:33:09 INFO SparkClientImpl:      diagnostics: N/A
>> > 16/03/10 18:33:09 INFO SparkClientImpl:      ApplicationMaster host: N/A
>> > 16/03/10 18:33:09 INFO SparkClientImpl:      ApplicationMaster RPC port: -1
>> > 16/03/10 18:33:09 INFO SparkClientImpl:      queue: default
>> > 16/03/10 18:33:09 INFO SparkClientImpl:      start time: 1457180833494
>> > 16/03/10 18:33:09 INFO SparkClientImpl:      final status: UNDEFINED
>> > 16/03/10 18:33:09 INFO SparkClientImpl:      tracking URL: http://storm0:8088/proxy/application_1457002628102_0043/
>> > 16/03/10 18:33:09 INFO SparkClientImpl:      user: stana
>> > 16/03/10 18:33:10 INFO SparkClientImpl: 16/03/10 18:33:10 INFO Client: Application report for application_1457002628102_0043 (state: FAILED)
>> > 16/03/10 18:33:10 INFO SparkClientImpl: 16/03/10 18:33:10 INFO Client:
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      client token: N/A
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      diagnostics: Application application_1457002628102_0043 failed 1 times due to AM Container for appattempt_1457002628102_0043_000001 exited with exitCode: -1000
>> > 16/03/10 18:33:10 INFO SparkClientImpl: For more detailed output, check application tracking page: http://storm0:8088/proxy/application_1457002628102_0043/ Then, click on links to logs of each attempt.
>> > 16/03/10 18:33:10 INFO SparkClientImpl: Diagnostics: java.io.FileNotFoundException: File file:/Users/stana/.m2/repository/org/apache/hive/hive-exec/2.0.0/hive-exec-2.0.0.jar does not exist
>> > 16/03/10 18:33:10 INFO SparkClientImpl: Failing this attempt. Failing the application.
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      ApplicationMaster host: N/A
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      ApplicationMaster RPC port: -1
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      queue: default
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      start time: 1457180833494
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      final status: FAILED
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      tracking URL: http://storm0:8088/cluster/app/application_1457002628102_0043
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      user: stana
>> > 16/03/10 18:33:10 INFO SparkClientImpl: Exception in thread "main" org.apache.spark.SparkException: Application application_1457002628102_0043 finished with failed status
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at org.apache.spark.deploy.yarn.Client.run(Client.scala:920)
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at org.apache.spark.deploy.yarn.Client$.main(Client.scala:966)
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at org.apache.spark.deploy.yarn.Client.main(Client.scala)
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at java.lang.reflect.Method.invoke(Method.java:606)
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
>> > 16/03/10 18:33:10 INFO SparkClientImpl:      at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>> > 16/03/10 18:33:10 INFO SparkClientImpl: 16/03/10 18:33:10 INFO ShutdownHookManager: Shutdown hook called
>> > 16/03/10 18:33:10 INFO SparkClientImpl: 16/03/10 18:33:10 INFO ShutdownHookManager: Deleting directory /private/var/folders/vt/cjcdhms903x7brn1kbh558s40000gn/T/spark-5b92ce20-b6f8-4832-8b15-5e98bd0e0705
>> > 16/03/10 18:33:10 WARN SparkClientImpl: Error while waiting for client to connect.
>> > java.util.concurrent.ExecutionException: java.lang.RuntimeException: Cancel client '5bda93c0-865b-48a8-b368-c2fcc30e81e8'. Error: Child process exited before connecting back
>> >     at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37) ~[netty-all-4.0.23.Final.jar:4.0.23.Final]
>> >     at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:101) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:98) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:94) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:63) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:131) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.optimizer.spark.SetSparkReducerParallelism.process(SetSparkReducerParallelism.java:117) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:105) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:89) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:158) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:120) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.parse.spark.SparkCompiler.runJoinOptimizations(SparkCompiler.java:181) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.parse.spark.SparkCompiler.optimizeOperatorPlan(SparkCompiler.java:119) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.parse.TaskCompiler.compile(TaskCompiler.java:102) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10195) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:229) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:239) [hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:479) [hive-exec-2.0.0.jar:?]
>> >     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:319) [hive-exec-2.0.0.jar:?]
>> >     at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1255) [hive-exec-2.0.0.jar:?]
>> >     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1301) [hive-exec-2.0.0.jar:?]
>> >     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1184) [hive-exec-2.0.0.jar:?]
>> >     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1172) [hive-exec-2.0.0.jar:?]
>> >     at org.apache.hadoop.hive.ql.TestHiveDriver.main(TestHiveDriver.java:41) [test-classes/:?]
>> > Caused by: java.lang.RuntimeException: Cancel client '5bda93c0-865b-48a8-b368-c2fcc30e81e8'. Error: Child process exited before connecting back
>> >     at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179) ~[hive-exec-2.0.0.jar:2.0.0]
>> >     at org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:450) ~[hive-exec-2.0.0.jar:2.0.0]
>> >     at java.lang.Thread.run(Thread.java:745) ~[?:1.7.0_67]
>> > 16/03/10 18:33:10 WARN SparkClientImpl: Child process exited with code 1.
>> > FAILED: SemanticException Failed to get a spark session: org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
>> > 16/03/10 18:33:10 ERROR Driver: FAILED: SemanticException Failed to get a spark session: org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
>> > org.apache.hadoop.hive.ql.parse.SemanticException: Failed to get a spark session: org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
>> >     at org.apache.hadoop.hive.ql.optimizer.spark.SetSparkReducerParallelism.process(SetSparkReducerParallelism.java:121)
>> >     at org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
>> >     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:105)
>> >     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:89)
>> >     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.walk(DefaultGraphWalker.java:158)
>> >     at org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:120)
>> >     at org.apache.hadoop.hive.ql.parse.spark.SparkCompiler.runJoinOptimizations(SparkCompiler.java:181)
>> >     at org.apache.hadoop.hive.ql.parse.spark.SparkCompiler.optimizeOperatorPlan(SparkCompiler.java:119)
>> >     at org.apache.hadoop.hive.ql.parse.TaskCompiler.compile(TaskCompiler.java:102)
>> >     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10195)
>> >     at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:229)
>> >     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:239)
>> >     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:479)
>> >     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:319)
>> >     at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1255)
>> >     at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1301)
>> >     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1184)
>> >     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1172)
>> >     at org.apache.hadoop.hive.ql.TestHiveDriver.main(TestHiveDriver.java:41)
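For reference, and as mentioned at the top of this mail, this is roughly how I currently point the client at a Spark installation before creating the Driver, following the earlier suggestion of setting SPARK_HOME or spark.home. The class name and installation path below are placeholders for my environment, and this does not solve the hive-exec jar problem above.

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.Driver;
import org.apache.hadoop.hive.ql.session.SessionState;

public class SparkHomeSketch {
    public static void main(String[] args) {
        // Either export SPARK_HOME before launching the JVM, or set the
        // spark.home system property from inside the application.
        System.setProperty("spark.home", "/opt/spark-1.4.1-bin-hadoop2.6");

        HiveConf hiveConf = new HiveConf(Driver.class);
        hiveConf.set("spark.home", "/opt/spark-1.4.1-bin-hadoop2.6");
        hiveConf.set("hive.execution.engine", "spark");

        // The Driver is created only after the Spark location is known.
        SessionState.start(hiveConf);
        Driver driver = new Driver(hiveConf);
    }
}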