hive-user mailing list archives

From Xuefu Zhang <xzh...@cloudera.com>
Subject Re: maven build failure for hive with spark
Date Fri, 28 Nov 2014 13:39:17 GMT
This seems related to a heartbeat problem we saw recently. It should be
fixed by HIVE-8836. Could you get the latest code and retry?

Thanks,
Xuefu
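
For anyone following along, refreshing the Hive source and rebuilding would look roughly
like the following. The branch name and build flags are assumptions, not exact values:

    # rough sketch: pull the latest Hive-on-Spark code and rebuild
    cd /path/to/hive
    git checkout spark              # Hive-on-Spark development branch (assumption)
    git pull origin spark
    mvn clean install -DskipTests -Phadoop-2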

On Fri, Nov 28, 2014 at 5:21 AM, Somnath Pandeya <
Somnath_Pandeya@infosys.com> wrote:

>  Hi  Xuefu,
>
>
>
> I am able to set up Hive with the Spark engine, but on running any query I am
> getting the following error.
>
>
>
> 14/11/28 06:31:10 INFO master.Master: Removing app app-20141128063025-0000
>
> 14/11/28 06:31:10 WARN remote.ReliableDeliverySupervisor: Association with
> remote system [akka.tcp://sparkDriver@ip-xxxx:35454] has failed, address
> is now gated for [5000] ms. Reason is: [Disassociated].
>
> 14/11/28 06:31:10 INFO master.Master: akka.tcp://sparkDriver@ip-xxxxx:35454
> got disassociated, removing it.
>
> 14/11/28 06:31:10 INFO actor.LocalActorRef: Message
> [akka.remote.transport.ActorTransportAdapter$DisassociateUnderlying] from
> Actor[akka://sparkMaster/deadLetters] to
> Actor[akka://sparkMaster/system/transports/akkaprotocolmanager.tcp0/akkaProtocol-tcp%3A%2F%2FsparkMaster%4010.188.73.13%3A59861-2#1720923396]
> was not delivered. [1] dead letters encountered. This logging can be turned
> off or adjusted with configuration settings 'akka.log-dead-letters' and
> 'akka.log-dead-letters-during-shutdown'.
>
>
>
> And the query is failing.
>
> Please let me know what the issue might be.
>
>
>
> Thanks
>
> Somnath
>
>
>
> *From:* Xuefu Zhang [mailto:xzhang@cloudera.com]
> *Sent:* Thursday, November 27, 2014 6:21 PM
> *To:* Somnath Pandeya
> *Cc:* user@hive.apache.org
>
> *Subject:* Re: maven build failure for hive with spark
>
>
>
> To simplify the solution, I'd do the following:
>
> 1. Get the Spark 1.2 source and build it.
>
> 2. Copy the spark-assembly jar to the hive/lib folder.
>
> 3. In the Hive CLI, execute "set spark.home=/path/to/spark/srcfolder;".
> spark.home should contain a bin folder, which has all the Spark scripts.
>
> This should resolve the class not found exception you saw.
>
> --Xuefu
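
For reference, the three steps quoted above would look roughly like this on the command
line. All paths, the Hadoop version, profile flags, and the assembly jar location are
placeholders/assumptions, not tested values:

    # 1. Build Spark 1.2 from source (profile and Hadoop version are assumptions)
    cd /path/to/spark
    mvn -Phadoop-2.3 -Dhadoop.version=2.3.0 -DskipTests clean package

    # 2. Copy the resulting spark-assembly jar into Hive's lib folder
    cp assembly/target/scala-2.10/spark-assembly-*.jar /path/to/hive/lib/

    # 3. In the Hive CLI, point spark.home at the Spark folder (not at the jar itself)
    hive> set spark.home=/path/to/spark;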
>
>
>
>
>
> On Thu, Nov 27, 2014 at 1:07 AM, Somnath Pandeya <
> Somnath_Pandeya@infosys.com> wrote:
>
>  Thanks Xuefu,
>
>
>
> I am able to build hive successfully now.
>
> But while running Hive with Spark as the execution engine I am getting the
> following error.
>
>
>
> I am setting
>
> set spark.home=<location of
> spark-assembly-1.2.0-SNAPSHOT-hadoop2.3.0-cdh5.1.2.jar>
>
> and I am using the same spark assembly for the Spark standalone cluster.
>
>
>
> Exception in thread "main" java.lang.NoClassDefFoundError: akka/util/Crypt
>
>         at
> org.apache.hive.spark.client.SparkClientFactory.initialize(SparkClientFactory.java:47)
>
>         at
> org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:79)
>
>         at
> org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:53)
>
>         at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
>
>         at
> org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:122)
>
>         at
> org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:87)
>
>         at
> org.apache.hadoop.hive.ql.optimizer.spark.SetSparkReducerParallelism.process(SetSparkReducerParallelism.java:99)
>
>         at
> org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher.dispatch(DefaultRuleDispatcher.java:90)
>
>         at
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatchAndReturn(DefaultGraphWalker.java:94)
>
>         at
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.dispatch(DefaultGraphWalker.java:78)
>
>         at
> org.apache.hadoop.hive.ql.lib.ForwardWalker.walk(ForwardWalker.java:79)
>
>         at
> org.apache.hadoop.hive.ql.lib.DefaultGraphWalker.startWalking(DefaultGraphWalker.java:109)
>
>         at
> org.apache.hadoop.hive.ql.parse.spark.SparkCompiler.optimizeOperatorPlan(SparkCompiler.java:132)
>
>         at
> org.apache.hadoop.hive.ql.parse.TaskCompiler.compile(TaskCompiler.java:99)
>
>         at
> org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10204)
>
>         at
> org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:221)
>
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:419)
>
>         at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:305)
>
>         at
> org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1107)
>
>         at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1169)
>
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1044)
>
>         at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1034)
>
>         at
> org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:201)
>
>         at
> org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:153)
>
>         at
> org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:364)
>
>         at
> org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:712)
>
>         at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:631)
>
>         at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:570)
>
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>
>         at
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>
>         at
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>
>         at java.lang.reflect.Method.invoke(Method.java:606)
>
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
>
>
>
>
>
> *From:* Xuefu Zhang [mailto:xzhang@cloudera.com]
> *Sent:* Wednesday, November 26, 2014 8:59 PM
> *To:* user@hive.apache.org
> *Subject:* Re: maven build failure for hive with spark
>
>
>
> Thanks for your interest. Please remove org/apache/spark from your local
> Maven repo and try again. This may fix your problem.
>
> --Xuefu
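
For reference, clearing the cached Spark artifacts as suggested above would be roughly
the following; the repository path assumes the default local Maven repo location and the
build flags are assumptions:

    # remove cached (possibly stale) Spark artifacts from the local Maven repository
    rm -rf ~/.m2/repository/org/apache/spark

    # rebuild Hive so the Spark dependency is resolved again
    cd /path/to/hive
    mvn clean install -DskipTests -Phadoop-2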
>
>
>
> On Wed, Nov 26, 2014 at 1:03 AM, Somnath Pandeya <
> Somnath_Pandeya@infosys.com> wrote:
>
>  Hi
>
> I am trying to build Hive with the Spark engine but am getting a Maven build
> failure.
>
> Please help.
>
>
>
> Following is the error.
>
>
>
>
>
>
>
> [INFO] Tests are skipped.
>
> [INFO]
>
> [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant ---
>
> [INFO] Building jar:
> /mnt/disk1/software/hive/ant/target/hive-ant-0.15.0-SNAPSHOT.jar
>
> [INFO]
>
> [INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @
> hive-ant ---
>
> [INFO]
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO] Building Spark Remote Client 0.15.0-SNAPSHOT
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO]
>
> [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-client ---
>
> [INFO] Deleting /mnt/disk1/software/hive/spark-client (includes =
> [datanucleus.log, derby.log], excludes = [])
>
> [INFO]
>
> [INFO] --- maven-remote-resources-plugin:1.5:process (default) @
> spark-client ---
>
> [INFO]
>
> [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @
> spark-client ---
>
> [INFO] Using 'UTF-8' encoding to copy filtered resources.
>
> [INFO] skip non existing resourceDirectory
> /mnt/disk1/software/hive/spark-client/src/main/resources
>
> [INFO] Copying 3 resources
>
> [INFO]
>
> [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ spark-client
> ---
>
> [INFO] Executing tasks
>
>
>
> main:
>
> [INFO] Executed tasks
>
> [INFO]
>
> [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @
> spark-client ---
>
> [INFO] Compiling 20 source files to
> /mnt/disk1/software/hive/spark-client/target/classes
>
> [INFO] -------------------------------------------------------------
>
> [WARNING] COMPILATION WARNING :
>
> [INFO] -------------------------------------------------------------
>
> [WARNING]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java:
> Some input files use unchecked or unsafe operations.
>
> [WARNING]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/SparkClientImpl.java:
> Recompile with -Xlint:unchecked for details.
>
> [INFO] 2 warnings
>
> [INFO] -------------------------------------------------------------
>
> [INFO] -------------------------------------------------------------
>
> [ERROR] COMPILATION ERROR :
>
> [INFO] -------------------------------------------------------------
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContext.java:[20,33]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: package org.apache.spark.api.java
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContext.java:[46,35]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: interface org.apache.hive.spark.client.JobContext
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContext.java:[46,7]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: interface org.apache.hive.spark.client.JobContext
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContext.java:[51,20]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: interface org.apache.hive.spark.client.JobContext
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/MonitorCallback.java:[20,33]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: package org.apache.spark.api.java
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/MonitorCallback.java:[24,13]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: interface org.apache.hive.spark.client.MonitorCallback
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[20,24]
> cannot find symbol
>
>   symbol:   class JobExecutionStatus
>
>   location: package org.apache.spark
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[21,24]
> cannot find symbol
>
>   symbol:   class SparkJobInfo
>
>   location: package org.apache.spark
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[28,42]
> cannot find symbol
>
>   symbol: class SparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[31,17]
> cannot find symbol
>
>   symbol:   class JobExecutionStatus
>
>   location: class org.apache.hive.spark.client.status.HiveSparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[33,27]
> cannot find symbol
>
>   symbol:   class SparkJobInfo
>
>   location: class org.apache.hive.spark.client.status.HiveSparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[39,54]
> cannot find symbol
>
>   symbol:   class JobExecutionStatus
>
>   location: class org.apache.hive.spark.client.status.HiveSparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[60,10]
> cannot find symbol
>
>   symbol:   class JobExecutionStatus
>
>   location: class org.apache.hive.spark.client.status.HiveSparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[45,33]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: package org.apache.spark.api.java
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[20,33]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: package org.apache.spark.api.java
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[197,24]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: class org.apache.hive.spark.client.RemoteDriver.JobWrapper<T>
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[251,29]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: class org.apache.hive.spark.client.RemoteDriver.JobWrapper<T>
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[31,34]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: class org.apache.hive.spark.client.JobContextImpl
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[46,42]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: class org.apache.hive.spark.client.JobContextImpl
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[46,14]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: class org.apache.hive.spark.client.JobContextImpl
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[52,27]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: class org.apache.hive.spark.client.JobContextImpl
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[20,24]
> cannot find symbol
>
>   symbol:   class SparkStageInfo
>
>   location: package org.apache.spark
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[27,44]
> cannot find symbol
>
>   symbol: class SparkStageInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[36,29]
> cannot find symbol
>
>   symbol:   class SparkStageInfo
>
>   location: class org.apache.hive.spark.client.status.HiveSparkStageInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[46,26]
> cannot find symbol
>
>   symbol:   variable JobExecutionStatus
>
>   location: class org.apache.hive.spark.client.status.HiveSparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[49,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[54,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[59,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[152,10]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: class org.apache.hive.spark.client.RemoteDriver
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[213,28]
> cannot find symbol
>
>   symbol: class JavaFutureAction
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[254,68]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: class org.apache.hive.spark.client.RemoteDriver.JobWrapper<T>
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[356,14]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: class org.apache.hive.spark.client.RemoteDriver.ClientListener
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[36,56]
> cannot find symbol
>
>   symbol:   class JavaFutureAction
>
>   location: class org.apache.hive.spark.client.JobContextImpl
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[61,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[66,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[71,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[76,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[81,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[86,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[91,3]
> method does not override or implement a method from a supertype
>
> [INFO] 40 errors
>
> [INFO] -------------------------------------------------------------
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO] Reactor Summary:
>
> [INFO]
>
> [INFO] Hive ............................................... SUCCESS [
> 1.948 s]
>
> [INFO] Hive Shims Common .................................. SUCCESS [
> 1.920 s]
>
> [INFO] Hive Shims 0.20 .................................... SUCCESS [
> 0.816 s]
>
> [INFO] Hive Shims Secure Common ........................... SUCCESS [
> 1.170 s]
>
> [INFO] Hive Shims 0.20S ................................... SUCCESS [
> 0.610 s]
>
> [INFO] Hive Shims 0.23 .................................... SUCCESS [
> 2.664 s]
>
> [INFO] Hive Shims Scheduler ............................... SUCCESS [
> 0.972 s]
>
> [INFO] Hive Shims ......................................... SUCCESS [
> 0.734 s]
>
> [INFO] Hive Common ........................................ SUCCESS [
> 2.987 s]
>
> [INFO] Hive Serde ......................................... SUCCESS [
> 3.569 s]
>
> [INFO] Hive Metastore ..................................... SUCCESS [
> 8.839 s]
>
> [INFO] Hive Ant Utilities ................................. SUCCESS [
> 0.366 s]
>
> [INFO] Spark Remote Client ................................ FAILURE [
> 4.046 s]
>
> [INFO] Hive Query Language ................................ SKIPPED
>
> [INFO] Hive Service ....................................... SKIPPED
>
> [INFO] Hive Accumulo Handler .............................. SKIPPED
>
> [INFO] Hive JDBC .......................................... SKIPPED
>
> [INFO] Hive Beeline ....................................... SKIPPED
>
> [INFO] Hive CLI ........................................... SKIPPED
>
> [INFO] Hive Contrib ....................................... SKIPPED
>
> [INFO] Hive HBase Handler ................................. SKIPPED
>
> [INFO] Hive HCatalog ...................................... SKIPPED
>
> [INFO] Hive HCatalog Core ................................. SKIPPED
>
> [INFO] Hive HCatalog Pig Adapter .......................... SKIPPED
>
> [INFO] Hive HCatalog Server Extensions .................... SKIPPED
>
> [INFO] Hive HCatalog Webhcat Java Client .................. SKIPPED
>
> [INFO] Hive HCatalog Webhcat .............................. SKIPPED
>
> [INFO] Hive HCatalog Streaming ............................ SKIPPED
>
> [INFO] Hive HWI ........................................... SKIPPED
>
> [INFO] Hive ODBC .......................................... SKIPPED
>
> [INFO] Hive Shims Aggregator .............................. SKIPPED
>
> [INFO] Hive TestUtils ..................................... SKIPPED
>
> [INFO] Hive Packaging ..................................... SKIPPED
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO] BUILD FAILURE
>
> [INFO]
> ------------------------------------------------------------------------
>
> [INFO] Total time: 31.640 s
>
> [INFO] Finished at: 2014-11-26T03:49:24-05:00
>
> [INFO] Final Memory: 115M/2184M
>
> [INFO]
> ------------------------------------------------------------------------
>
> [ERROR] Failed to execute goal
> org.apache.maven.plugins:maven-compiler-plugin:3.1:compile
> (default-compile) on project spark-client: Compilation failure: Compilation
> failure:
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContext.java:[20,33]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: package org.apache.spark.api.java
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContext.java:[46,35]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: interface org.apache.hive.spark.client.JobContext
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContext.java:[46,7]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: interface org.apache.hive.spark.client.JobContext
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContext.java:[51,20]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: interface org.apache.hive.spark.client.JobContext
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/MonitorCallback.java:[20,33]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: package org.apache.spark.api.java
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/MonitorCallback.java:[24,13]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: interface org.apache.hive.spark.client.MonitorCallback
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[20,24]
> cannot find symbol
>
> [ERROR] symbol:   class JobExecutionStatus
>
> [ERROR] location: package org.apache.spark
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[21,24]
> cannot find symbol
>
> [ERROR] symbol:   class SparkJobInfo
>
> [ERROR] location: package org.apache.spark
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[28,42]
> cannot find symbol
>
> [ERROR] symbol: class SparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[31,17]
> cannot find symbol
>
> [ERROR] symbol:   class JobExecutionStatus
>
> [ERROR] location: class
> org.apache.hive.spark.client.status.HiveSparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[33,27]
> cannot find symbol
>
> [ERROR] symbol:   class SparkJobInfo
>
> [ERROR] location: class
> org.apache.hive.spark.client.status.HiveSparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[39,54]
> cannot find symbol
>
> [ERROR] symbol:   class JobExecutionStatus
>
> [ERROR] location: class
> org.apache.hive.spark.client.status.HiveSparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[60,10]
> cannot find symbol
>
> [ERROR] symbol:   class JobExecutionStatus
>
> [ERROR] location: class
> org.apache.hive.spark.client.status.HiveSparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[45,33]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: package org.apache.spark.api.java
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[20,33]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: package org.apache.spark.api.java
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[197,24]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: class
> org.apache.hive.spark.client.RemoteDriver.JobWrapper<T>
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[251,29]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: class
> org.apache.hive.spark.client.RemoteDriver.JobWrapper<T>
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[31,34]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: class org.apache.hive.spark.client.JobContextImpl
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[46,42]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: class org.apache.hive.spark.client.JobContextImpl
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[46,14]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: class org.apache.hive.spark.client.JobContextImpl
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[52,27]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: class org.apache.hive.spark.client.JobContextImpl
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[20,24]
> cannot find symbol
>
> [ERROR] symbol:   class SparkStageInfo
>
> [ERROR] location: package org.apache.spark
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[27,44]
> cannot find symbol
>
> [ERROR] symbol: class SparkStageInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[36,29]
> cannot find symbol
>
> [ERROR] symbol:   class SparkStageInfo
>
> [ERROR] location: class
> org.apache.hive.spark.client.status.HiveSparkStageInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[46,26]
> cannot find symbol
>
> [ERROR] symbol:   variable JobExecutionStatus
>
> [ERROR] location: class
> org.apache.hive.spark.client.status.HiveSparkJobInfo
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[49,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[54,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkJobInfo.java:[59,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[152,10]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: class org.apache.hive.spark.client.RemoteDriver
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[213,28]
> cannot find symbol
>
> [ERROR] symbol: class JavaFutureAction
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[254,68]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: class
> org.apache.hive.spark.client.RemoteDriver.JobWrapper<T>
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/RemoteDriver.java:[356,14]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: class
> org.apache.hive.spark.client.RemoteDriver.ClientListener
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/JobContextImpl.java:[36,56]
> cannot find symbol
>
> [ERROR] symbol:   class JavaFutureAction
>
> [ERROR] location: class org.apache.hive.spark.client.JobContextImpl
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[61,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[66,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[71,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[76,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[81,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[86,3]
> method does not override or implement a method from a supertype
>
> [ERROR]
> /mnt/disk1/software/hive/spark-client/src/main/java/org/apache/hive/spark/client/status/HiveSparkStageInfo.java:[91,3]
> method does not override or implement a method from a supertype
>
> [ERROR] -> [Help 1]
>
> [ERROR]
>
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
>
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>
> [ERROR]
>
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
>
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
>
> [ERROR]
>
> [ERROR] After correcting the problems, you can resume the build with the
> command
>
> [ERROR]   mvn <goals> -rf :spark-client
>
>
>
>
>
>
