hadoop-common-issues mailing list archives

From "Tsuyoshi OZAWA (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HADOOP-9974) Trunk Build Failure at HDFS Sub-project
Date Wed, 18 Sep 2013 20:05:54 GMT

    [ https://issues.apache.org/jira/browse/HADOOP-9974?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13771173#comment-13771173 ]

Tsuyoshi OZAWA commented on HADOOP-9974:
----------------------------------------

Hi Arpit,
In my environment (OS X), I can compile HttpFS correctly against trunk. After upgrading
your protobuf, please run mvn clean before compiling. Otherwise, jar binaries compiled
with protobuf 2.4.1 and 2.5.0 get mixed together, which can cause regressions.
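
A minimal sketch of the suggested rebuild sequence (the module flags follow the usual trunk build; verify against BUILDING.txt in your checkout, and note the expected protoc version output is an assumption based on the upgrade described above):

```shell
# Check that the active protoc matches the protobuf version Hadoop trunk now expects.
protoc --version   # should report libprotoc 2.5.0

# Wipe all previously compiled artifacts so no jars built against
# protobuf 2.4.1 survive into the new build.
mvn clean

# Rebuild from the top of the source tree, skipping tests.
mvn install -DskipTests
```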
                
> Trunk Build Failure at HDFS Sub-project
> ---------------------------------------
>
>                 Key: HADOOP-9974
>                 URL: https://issues.apache.org/jira/browse/HADOOP-9974
>             Project: Hadoop Common
>          Issue Type: Bug
>         Environment: Mac OS X
>            Reporter: Zhijie Shen
>
> Recently Hadoop upgraded to Protobuf 2.5.0. To build the trunk, I updated my installed
Protobuf to 2.5.0. With this upgrade, I no longer hit the build failure due to protoc, but
the build failed in the HDFS sub-project. Below is the failure message. I'm using Mac OS X.
> {code}
> [INFO] Reactor Summary:
> [INFO] 
> [INFO] Apache Hadoop Main ................................ SUCCESS [1.075s]
> [INFO] Apache Hadoop Project POM ......................... SUCCESS [0.805s]
> [INFO] Apache Hadoop Annotations ......................... SUCCESS [2.283s]
> [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.343s]
> [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.913s]
> [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [2.390s]
> [INFO] Apache Hadoop Auth ................................ SUCCESS [2.597s]
> [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [1.868s]
> [INFO] Apache Hadoop Common .............................. SUCCESS [55.798s]
> [INFO] Apache Hadoop NFS ................................. SUCCESS [3.549s]
> [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [1.788s]
> [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.044s]
> [INFO] Apache Hadoop HDFS ................................ FAILURE [25.219s]
> [INFO] Apache Hadoop HttpFS .............................. SKIPPED
> [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
> [INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
> [INFO] Apache Hadoop HDFS Project ........................ SKIPPED
> [INFO] hadoop-yarn ....................................... SKIPPED
> [INFO] hadoop-yarn-api ................................... SKIPPED
> [INFO] hadoop-yarn-common ................................ SKIPPED
> [INFO] hadoop-yarn-server ................................ SKIPPED
> [INFO] hadoop-yarn-server-common ......................... SKIPPED
> [INFO] hadoop-yarn-server-nodemanager .................... SKIPPED
> [INFO] hadoop-yarn-server-web-proxy ...................... SKIPPED
> [INFO] hadoop-yarn-server-resourcemanager ................ SKIPPED
> [INFO] hadoop-yarn-server-tests .......................... SKIPPED
> [INFO] hadoop-yarn-client ................................ SKIPPED
> [INFO] hadoop-yarn-applications .......................... SKIPPED
> [INFO] hadoop-yarn-applications-distributedshell ......... SKIPPED
> [INFO] hadoop-mapreduce-client ........................... SKIPPED
> [INFO] hadoop-mapreduce-client-core ...................... SKIPPED
> [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SKIPPED
> [INFO] hadoop-yarn-site .................................. SKIPPED
> [INFO] hadoop-yarn-project ............................... SKIPPED
> [INFO] hadoop-mapreduce-client-common .................... SKIPPED
> [INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
> [INFO] hadoop-mapreduce-client-app ....................... SKIPPED
> [INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
> [INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
> [INFO] hadoop-mapreduce-client-hs-plugins ................ SKIPPED
> [INFO] Apache Hadoop MapReduce Examples .................. SKIPPED
> [INFO] hadoop-mapreduce .................................. SKIPPED
> [INFO] Apache Hadoop MapReduce Streaming ................. SKIPPED
> [INFO] Apache Hadoop Distributed Copy .................... SKIPPED
> [INFO] Apache Hadoop Archives ............................ SKIPPED
> [INFO] Apache Hadoop Rumen ............................... SKIPPED
> [INFO] Apache Hadoop Gridmix ............................. SKIPPED
> [INFO] Apache Hadoop Data Join ........................... SKIPPED
> [INFO] Apache Hadoop Extras .............................. SKIPPED
> [INFO] Apache Hadoop Pipes ............................... SKIPPED
> [INFO] Apache Hadoop Tools Dist .......................... SKIPPED
> [INFO] Apache Hadoop Tools ............................... SKIPPED
> [INFO] Apache Hadoop Distribution ........................ SKIPPED
> [INFO] Apache Hadoop Client .............................. SKIPPED
> [INFO] Apache Hadoop Mini-Cluster ........................ SKIPPED
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 1:40.880s
> [INFO] Finished at: Thu Aug 15 16:02:56 PDT 2013
> [INFO] Final Memory: 49M/123M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.5.1:compile (default-compile) on project hadoop-hdfs: Compilation failure
> [ERROR] Failure executing javac, but could not parse the error:
> [ERROR] 
> [ERROR] 
> [ERROR] The system is out of resources.
> [ERROR] Consult the following stack trace for details.
> [ERROR] java.lang.OutOfMemoryError: Java heap space
> [ERROR] at com.sun.tools.javac.code.Scope$ImportScope.makeEntry(Scope.java:385)
> [ERROR] at com.sun.tools.javac.code.Scope.enter(Scope.java:196)
> [ERROR] at com.sun.tools.javac.code.Scope.enter(Scope.java:183)
> [ERROR] at com.sun.tools.javac.comp.MemberEnter.importAll(MemberEnter.java:132)
> [ERROR] at com.sun.tools.javac.comp.MemberEnter.visitTopLevel(MemberEnter.java:509)
> [ERROR] at com.sun.tools.javac.tree.JCTree$JCCompilationUnit.accept(JCTree.java:446)
> [ERROR] at com.sun.tools.javac.comp.MemberEnter.memberEnter(MemberEnter.java:387)
> [ERROR] at com.sun.tools.javac.comp.MemberEnter.complete(MemberEnter.java:819)
> [ERROR] at com.sun.tools.javac.code.Symbol.complete(Symbol.java:384)
> [ERROR] at com.sun.tools.javac.code.Symbol$ClassSymbol.complete(Symbol.java:766)
> [ERROR] at com.sun.tools.javac.comp.Enter.complete(Enter.java:464)
> [ERROR] at com.sun.tools.javac.comp.Enter.main(Enter.java:442)
> [ERROR] at com.sun.tools.javac.main.JavaCompiler.enterTrees(JavaCompiler.java:822)
> [ERROR] at com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:727)
> [ERROR] at com.sun.tools.javac.main.Main.compile(Main.java:353)
> [ERROR] at com.sun.tools.javac.main.Main.compile(Main.java:279)
> [ERROR] at com.sun.tools.javac.main.Main.compile(Main.java:270)
> [ERROR] at com.sun.tools.javac.Main.compile(Main.java:87)
> [ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> [ERROR] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> [ERROR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> [ERROR] at java.lang.reflect.Method.invoke(Method.java:597)
> [ERROR] at org.codehaus.plexus.compiler.javac.JavacCompiler.compileInProcess0(JavacCompiler.java:551)
> [ERROR] at org.codehaus.plexus.compiler.javac.JavacCompiler.compileInProcess(JavacCompiler.java:526)
> [ERROR] at org.codehaus.plexus.compiler.javac.JavacCompiler.compile(JavacCompiler.java:167)
> [ERROR] at org.apache.maven.plugin.AbstractCompilerMojo.execute(AbstractCompilerMojo.java:678)
> [ERROR] at org.apache.maven.plugin.CompilerMojo.execute(CompilerMojo.java:128)
> [ERROR] at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
> [ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
> [ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
> [ERROR] at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
> [ERROR] at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
> [ERROR] -> [Help 1]
> [ERROR] 
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR] 
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
> [ERROR] 
> [ERROR] After correcting the problems, you can resume the build with the command
> [ERROR]   mvn <goals> -rf :hadoop-hdfs
> {code}
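
Independent of the mixed-protobuf issue, the java.lang.OutOfMemoryError above comes from javac itself running out of heap. One common workaround is to fork the compiler into its own JVM with a larger heap via maven-compiler-plugin. A hedged sketch of such a configuration (the 1024m value is an illustrative guess, not a tested setting for this build):

{code}
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <!-- Fork javac into a separate JVM so maxmem takes effect. -->
    <fork>true</fork>
    <!-- Illustrative heap ceiling; tune for your machine. -->
    <maxmem>1024m</maxmem>
  </configuration>
</plugin>
{code}

Since the stack trace shows in-process compilation (JavacCompiler.compileInProcess0), raising Maven's own heap, e.g. MAVEN_OPTS="-Xmx1024m", may also help without changing the POM.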

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators.
For more information on JIRA, see: http://www.atlassian.com/software/jira
