spark-issues mailing list archives

From "yinbinfeng0451 (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-21106) compile error
Date Thu, 15 Jun 2017 09:47:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-21106?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16050245#comment-16050245 ]

yinbinfeng0451 commented on SPARK-21106:
----------------------------------------

I built with Eclipse's Maven integration, running the clean and package goals.
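One possible cause, sketched below under assumptions not confirmed in this report: the "invalid LOC header (bad signature)" errors while loading Protocol and SpecificData (Avro classes) typically mean a corrupted jar in the local Maven repository, and since SparkFlumeProtocol, EventBatch, and SparkSinkEvent are generated from Avro IDL during the build, the missing-type errors would follow from that. The repository path below is the default ~/.m2 location and is an assumption about this machine's setup.

```shell
# Hypothetical recovery sketch, assuming a corrupted Avro jar in the default
# local repository. "invalid LOC header (bad signature)" usually indicates a
# truncated or corrupt archive; deleting it forces Maven to re-download.
rm -rf ~/.m2/repository/org/apache/avro

# Rebuild from the command line (rather than Eclipse), resuming at the
# failed module as the log itself suggests via -rf.
./build/mvn -DskipTests clean package -rf :spark-streaming-flume-sink_2.11
```

If this succeeds on the command line, the remaining Eclipse errors may just be the IDE not picking up the Avro-generated sources.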

> compile error
> -------------
>
>                 Key: SPARK-21106
>                 URL: https://issues.apache.org/jira/browse/SPARK-21106
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 2.1.1
>         Environment: win10 
>            Reporter: yinbinfeng0451
>
> [INFO] 
> [INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-network-shuffle_2.11 ---
> [INFO] Using 'UTF-8' encoding to copy filtered resources.
> [INFO] skip non existing resourceDirectory D:\bigdata\spark\common\network-shuffle\src\main\resources
> [INFO] Copying 3 resources
> [INFO] Copying 3 resources
> [INFO] 
> [INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-network-shuffle_2.11 ---
> [INFO] Downloaded: https://repo1.maven.org/maven2/javax/activation/activation/1.1/activation-1.1.jar (62 KB at 101.0 KB/sec)
> [INFO] Downloading: https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-jaxrs/1.9.13/jackson-jaxrs-1.9.13.jar
> [ERROR] D:\bigdata\spark\external\flume-sink\src\main\scala\org\apache\spark\streaming\flume\sink\SparkAvroCallbackHandler.scala:45: not found: type SparkFlumeProtocol
> [ERROR]   val transactionTimeout: Int, val backOffInterval: Int) extends SparkFlumeProtocol with Logging {
> [ERROR]                                                                  ^
> [ERROR] D:\bigdata\spark\external\flume-sink\src\main\scala\org\apache\spark\streaming\flume\sink\SparkAvroCallbackHandler.scala:70: not found: type EventBatch
> [ERROR]   override def getEventBatch(n: Int): EventBatch = {
> [ERROR]                                       ^
> [ERROR] D:\bigdata\spark\external\flume-sink\src\main\scala\org\apache\spark\streaming\flume\sink\TransactionProcessor.scala:80: not found: type EventBatch
> [ERROR]   def getEventBatch: EventBatch = {
> [ERROR]                      ^
> [ERROR] D:\bigdata\spark\external\flume-sink\src\main\scala\org\apache\spark\streaming\flume\sink\SparkSinkUtils.scala:25: not found: type EventBatch
> [ERROR]   def isErrorBatch(batch: EventBatch): Boolean = {
> [ERROR]                           ^
> [ERROR] D:\bigdata\spark\external\flume-sink\src\main\scala\org\apache\spark\streaming\flume\sink\SparkAvroCallbackHandler.scala:85: not found: type EventBatch
> [ERROR]         new EventBatch("Spark sink has been stopped!", "", java.util.Collections.emptyList())
> [ERROR]             ^
> [INFO] Downloaded: https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-jaxrs/1.9.13/jackson-jaxrs-1.9.13.jar (18 KB at 30.1 KB/sec)
> [INFO] Downloading: https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-xc/1.9.13/jackson-xc-1.9.13.jar
> [WARNING] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
> [WARNING] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
> [WARNING] Class org.jboss.netty.channel.ChannelPipelineFactory not found - continuing with a stub.
> [WARNING] Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
> [WARNING] Class org.jboss.netty.channel.ChannelFactory not found - continuing with a stub.
> [WARNING] Class org.jboss.netty.handler.execution.ExecutionHandler not found - continuing with a stub.
> [WARNING] Class org.jboss.netty.channel.group.ChannelGroup not found - continuing with a stub.
> [ERROR] error while loading Protocol, invalid LOC header (bad signature)
> [ERROR] error while loading SpecificData, invalid LOC header (bad signature)
> [ERROR] D:\bigdata\spark\external\flume-sink\src\main\scala\org\apache\spark\streaming\flume\sink\SparkSink.scala:86: not found: type SparkFlumeProtocol
> [ERROR]     val responder = new SpecificResponder(classOf[SparkFlumeProtocol], handler.get)
> [ERROR]                                                   ^
> [WARNING] Class com.google.common.collect.ImmutableMap not found - continuing with a stub.
> [WARNING] Class com.google.common.collect.ImmutableMap not found - continuing with a stub.
> [WARNING] Class com.google.common.collect.ImmutableMap not found - continuing with a stub.
> [WARNING] Class com.google.common.collect.ImmutableMap not found - continuing with a stub.
> [WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
> [INFO] Using incremental compilation
> [ERROR] D:\bigdata\spark\external\flume-sink\src\main\scala\org\apache\spark\streaming\flume\sink\TransactionProcessor.scala:48: not found: type EventBatch
> [ERROR]   @volatile private var eventBatch: EventBatch = new EventBatch("Unknown Error", "",
> [ERROR]                                     ^
> [ERROR] D:\bigdata\spark\external\flume-sink\src\main\scala\org\apache\spark\streaming\flume\sink\TransactionProcessor.scala:48: not found: type EventBatch
> [ERROR]   @volatile private var eventBatch: EventBatch = new EventBatch("Unknown Error", "",
> [ERROR]                                                      ^
> [INFO] Compiling 19 Java sources to D:\bigdata\spark\common\network-shuffle\target\scala-2.11\classes...
> [INFO] Downloaded: https://repo1.maven.org/maven2/org/codehaus/jackson/jackson-xc/1.9.13/jackson-xc-1.9.13.jar (27 KB at 43.9 KB/sec)
> [ERROR] D:\bigdata\spark\external\flume-sink\src\main\scala\org\apache\spark\streaming\flume\sink\TransactionProcessor.scala:115: not found: type SparkSinkEvent
> [ERROR]         val events = new util.ArrayList[SparkSinkEvent](maxBatchSize)
> [ERROR]                                         ^
> [ERROR] D:\bigdata\spark\external\flume-sink\src\main\scala\org\apache\spark\streaming\flume\sink\TransactionProcessor.scala:146: not found: type EventBatch
> [ERROR]           eventBatch = new EventBatch("", seqNum, events)
> [ERROR]                            ^
> [INFO] 
> [INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-versions) @ spark-mllib-local_2.11 ---
> [INFO] 
> [INFO] --- maven-clean-plugin:3.0.0:clean (default-clean) @ spark-launcher_2.11 ---
> [INFO] Deleting D:\bigdata\spark\launcher\target
> [WARNING] 11 warnings found
> [ERROR] 12 errors found
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO] 
> [INFO] Spark Project Parent POM ........................... SUCCESS [ 21.460 s]
> [INFO] Spark Project Tags ................................. SUCCESS [ 15.256 s]
> [INFO] Spark Project Sketch ............................... SUCCESS [ 14.712 s]
> [INFO] Spark Project Local DB ............................. SUCCESS [ 20.156 s]
> [INFO] Spark Project Networking ........................... SUCCESS [ 28.210 s]
> [INFO] Spark Project Shuffle Streaming Service ............ SKIPPED
> [INFO] Spark Project Unsafe ............................... SUCCESS [ 20.111 s]
> [INFO] Spark Project Launcher ............................. SKIPPED
> [INFO] Spark Project Core ................................. SKIPPED
> [INFO] Spark Project ML Local Library ..................... SKIPPED
> [INFO] Spark Project GraphX ............................... SKIPPED
> [INFO] Spark Project Streaming ............................ SKIPPED
> [INFO] Spark Project Catalyst ............................. SKIPPED
> [INFO] Spark Project SQL .................................. SKIPPED
> [INFO] Spark Project ML Library ........................... SKIPPED
> [INFO] Spark Project Tools ................................ SUCCESS [ 12.723 s]
> [INFO] Spark Project Hive ................................. SKIPPED
> [INFO] Spark Project REPL ................................. SKIPPED
> [INFO] Spark Project Assembly ............................. SKIPPED
> [INFO] Spark Project External Flume Sink .................. FAILURE [ 24.118 s]
> [INFO] Spark Project External Flume ....................... SKIPPED
> [INFO] Spark Project External Flume Assembly .............. SKIPPED
> [INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
> [INFO] Kafka 0.10 Source for Structured Streaming ......... SKIPPED
> [INFO] Spark Project Examples ............................. SKIPPED
> [INFO] Spark Project External Kafka Assembly .............. SKIPPED
> [INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
> [INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 01:21 min (Wall Clock)
> [INFO] Finished at: 2017-06-15T16:50:09+08:00
> [INFO] Final Memory: 69M/1276M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default-cli) on project spark-streaming-flume-sink_2.11: Execution default-cli of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default-cli) on project spark-streaming-flume-sink_2.11: Execution default-cli of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.
> 	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
> 	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
> 	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
> 	at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
> 	at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call(MultiThreadedBuilder.java:185)
> 	at org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call(MultiThreadedBuilder.java:181)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: org.apache.maven.plugin.PluginExecutionException: Execution default-cli of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed.
> 	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:145)
> 	at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
> 	... 11 more
> Caused by: Compilation failed
> 	at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:105)
> 	at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:48)
> 	at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:41)
> 	at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply$mcV$sp(AggressiveCompile.scala:99)
> 	at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
> 	at sbt.compiler.AggressiveCompile$$anonfun$3$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:99)
> 	at sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:166)
> 	at sbt.compiler.AggressiveCompile$$anonfun$3.compileScala$1(AggressiveCompile.scala:98)
> 	at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:143)
> 	at sbt.compiler.AggressiveCompile$$anonfun$3.apply(AggressiveCompile.scala:87)
> 	at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:39)
> 	at sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:37)
> 	at sbt.inc.IncrementalCommon.cycle(Incremental.scala:99)
> 	at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:38)
> 	at sbt.inc.Incremental$$anonfun$1.apply(Incremental.scala:37)
> 	at sbt.inc.Incremental$.manageClassfiles(Incremental.scala:65)
> 	at sbt.inc.Incremental$.compile(Incremental.scala:37)
> 	at sbt.inc.IncrementalCompile$.apply(Compile.scala:27)
> 	at sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:157)
> 	at sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:71)
> 	at com.typesafe.zinc.Compiler.compile(Compiler.scala:184)
> 	at com.typesafe.zinc.Compiler.compile(Compiler.scala:164)
> 	at sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:92)
> 	at scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:303)
> 	at scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:119)
> 	at scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:99)
> 	at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:482)
> 	at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
> 	... 12 more
> [ERROR] 
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR] 
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
> [ERROR] 
> [ERROR] After correcting the problems, you can resume the build with the command
> [ERROR]   mvn <goals> -rf :spark-streaming-flume-sink_2.11
> [INFO] 
> [INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-mllib-local_2.11 ---
> [INFO] 
> [INFO] --- scala-maven-plugin:3.2.2:compile (default-cli) @ spark-launcher_2.11 ---



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

