hadoop-mapreduce-dev mailing list archives

From Thomas Anderson <t.dt.aander...@gmail.com>
Subject Re: MR-279
Date Mon, 20 Jun 2011 02:05:10 GMT
I just found the avro maven plugin at

http://archive.apache.org/dist/avro/avro-1.5.1/java/avro-maven-plugin-1.5.1.jar

and the pom from another place.

Previously my problems were:

1st, I did not know the location of avro-maven-plugin.
2nd, I was unable to successfully compile/install mr-279/hdfs.

These are now solved by downloading the avro maven plugin from the Apache
repository and executing the commands pointed out in the INSTALL instructions.
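
For anyone else who hits this, a sketch of how a manually downloaded plugin
jar can be registered in the local m2 repository (assuming the pom was saved
next to the jar as avro-maven-plugin-1.5.1.pom; not necessarily the exact
commands from the INSTALL instructions):

   mvn install:install-file \
       -Dfile=avro-maven-plugin-1.5.1.jar \
       -DpomFile=avro-maven-plugin-1.5.1.pom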

Thank you for the help.


On Sun, Jun 19, 2011 at 5:22 AM, Luke Lu <llu@vicaya.com> wrote:
> The fact that you're mentioning the avro plugin suggests to me that you're
> working with obsolete patches. The current code uses avro-maven-plugin
> 1.5.1, which is available via maven central.
>
> Please follow the current instructions here:
> http://svn.apache.org/repos/asf/hadoop/common/branches/MR-279/mapreduce/INSTALL
>
> I updated the MAPREDUCE-279 jira's description to point to the
> above a couple of days ago.
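>
> In case you want to sanity-check what the build pulls down, the maven
> central coordinates are org.apache.avro:avro-maven-plugin:1.5.1; the
> poms reference it roughly like this (a sketch, not copied from the
> actual pom):
>
>   <plugin>
>     <groupId>org.apache.avro</groupId>
>     <artifactId>avro-maven-plugin</artifactId>
>     <version>1.5.1</version>
>   </plugin>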
>
> On Sat, Jun 18, 2011 at 1:20 AM, Thomas Anderson
> <t.dt.aanderson@gmail.com> wrote:
>> Thanks. Recompiling common and hdfs with the following commands
>>
>> 1st, compile common:
>>    ant veryclean mvn-install  (installs hadoop-common-0.22.0-SNAPSHOT.jar
>> to the m2 repository)
>>
>> 2nd, compile hdfs:
>>    ant veryclean mvn-install -Dresolvers=internal  (installs
>> hadoop-hdfs-0.22.0-SNAPSHOT.jar to the m2 repository)
>>
>> seems to solve the problem below:
>>
>> compile-hdfs-classes:
>>   [javac] mr-279/hdfs/build.xml:339: warning: 'includeantruntime' was
>> not set, defaulting to build.sysclasspath=last; set to false for
>> repeatable builds
>>   [javac] Compiling 237 source files to mr-279/hdfs/build/classes
>>   [javac] mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java:119:
>> cannot find symbol
>>   ...
>>
>> Now the issue seems to be how to build avro-maven-plugin-1.4.0.
>>
>> Executing the commands
>>
>>    export MAVEN_OPTS=-Xmx512m
>>    mvn clean install assembly:assembly
>>
>> produces this error:
>>
>> [ERROR] Failed to execute goal
>> org.codehaus.mojo:exec-maven-plugin:1.2:exec (generate-sources) on
>> project yarn-api: Command execution failed. Process exited with an
>> error: 127(Exit value: 127) -> [Help 1]
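>>
>> (Exit code 127 from the shell normally means the program being executed
>> was not found on PATH. If the generate-sources step here shells out to
>> protoc (my assumption), a quick check would be
>>
>>    which protoc
>>    protoc --version
>>
>> to confirm the protobuf compiler is installed and visible.)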
>>
>> But the avro maven plugin obtained from
>> https://github.com/phunt/avro-maven-plugin.git only provides 1.0. Would
>> modifying the version in the pom (pointing it to 1.4.0) work? Or what is
>> the next step for building yarn-api?
>>
>> Thanks for the help.
>>
>> On Fri, Jun 17, 2011 at 9:09 PM, Thomas Graves <tgraves@yahoo-inc.com> wrote:
>>> Did you build common and hdfs before running mvn install in mapreduce? You
>>> have to build them in order, as stated in the INSTALL doc: common, hdfs, then
>>> mapreduce.
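>>>
>>> In command form that order is roughly (a sketch of the steps already
>>> mentioned in this thread; the INSTALL doc is the authoritative reference):
>>>
>>>    cd common && ant veryclean mvn-install
>>>    cd ../hdfs && ant veryclean mvn-install -Dresolvers=internal
>>>    cd ../mapreduce && mvn clean install assembly:assembly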
>>>
>>> Tom
>>>
>>>
>>> On 6/17/11 3:33 AM, "Thomas Anderson" <t.dt.aanderson@gmail.com> wrote:
>>>
>>>> I was not aware that the source I downloaded (a few months ago) was
>>>> obsolete. I have now switched by doing an svn update, which solves the
>>>> stale code issue.
>>>>
>>>> Then I followed the instructions at mapreduce/INSTALL [1], which point
>>>> to installing the dependencies for yarn first, per the README [2]. From
>>>> searching the mailing list, it seems the avro plugin does not need to
>>>> be installed manually. So I only installed protobuf 2.4.1 (configure/
>>>> make/ make install works ok). But after that, running mvn install under
>>>> mapreduce produces this error:
>>>>
>>>>  Failed to execute goal on project yarn-api: Could not resolve
>>>> dependencies for project org.apache.hadoop:yarn-api:jar:1.0-SNAPSHOT:
>>>> Failure to find org.apache.hadoop:hadoop-hdfs:jar:0.22.0-SNAPSHOT in
>>>> https://repository.jboss.org/nexus/content/groups/public-jboss/ was
>>>> cached in the local repository, resolution will not be reattempted
>>>> until the update interval of jboss-public-repository-group has elapsed
>>>> or updates are forced -> [Help 1]
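>>>>
>>>> (As an aside, the "resolution will not be reattempted" part of that
>>>> message can be worked around by forcing an update check, e.g.
>>>>
>>>>    mvn -U clean install
>>>>
>>>> but that only helps once the hadoop-hdfs snapshot actually exists
>>>> somewhere maven can find it, such as the local m2 repository.)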
>>>>
>>>> That looks like the hdfs jar artifact is missing. So I cd to mr-279/hdfs
>>>> and execute `ant clean package`, which generates the error message shown
>>>> in the compile-hdfs-classes section below.
>>>>
>>>> What is the right order/procedure to successfully compile mr-279? Or
>>>> will there be an updated version of the instructions?
>>>>
>>>> Thanks for the help.
>>>>
>>>> [1]. mapreduce/INSTALL.
>>>> http://svn.apache.org/repos/asf/hadoop/common/branches/MR-279/mapreduce/INSTALL
>>>> [2]. README.
>>>> http://svn.apache.org/repos/asf/hadoop/common/branches/MR-279/mapreduce/yarn/README
>>>>
>>>>
>>>> compile-hdfs-classes:
>>>>     [javac] mr-279/hdfs/build.xml:339: warning: 'includeantruntime'
>>>> was not set, defaulting to build.sysclasspath=last; set to false for
>>>> repeatable builds
>>>>     [javac] Compiling 237 source files to mr-279/hdfs/build/classes
>>>>     [javac]
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java:119:
>>>> cannot find symbol
>>>>     [javac] symbol  : class ProtocolSignature
>>>>     [javac] location: package org.apache.hadoop.ipc
>>>>     [javac] import org.apache.hadoop.ipc.ProtocolSignature;
>>>>     [javac]                             ^
>>>>     [javac]
>>>>
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java:2308:
>>>> cannot find symbol
>>>>     [javac] symbol  : class ProtocolSignature
>>>>     [javac] location: class org.apache.hadoop.hdfs.server.datanode.DataNode
>>>>     [javac]   public ProtocolSignature getProtocolSignature(String protocol,
>>>>     [javac]          ^
>>>>     [javac]
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/namenode/NameNode.java:85:
>>>> cannot find symbol
>>>>     [javac] symbol  : class ProtocolSignature
>>>>     [javac] location: package org.apache.hadoop.ipc
>>>>     [javac] import org.apache.hadoop.ipc.ProtocolSignature;
>>>>     [javac]                             ^
>>>>     [javac]
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/namenode/NameNode.java:194:
>>>> cannot find symbol
>>>>     [javac] symbol  : class ProtocolSignature
>>>>     [javac] location: class org.apache.hadoop.hdfs.server.namenode.NameNode
>>>>     [javac]   public ProtocolSignature getProtocolSignature(String protocol,
>>>>     [javac]          ^
>>>>     [javac] mr-279/hdfs/src/java/org/apache/hadoop/fs/Hdfs.java:389:
>>>> cannot find symbol
>>>>     [javac] symbol  : method getCanonicalServiceName()
>>>>     [javac] location: class org.apache.hadoop.fs.Hdfs
>>>>     [javac]     result.setService(new Text(this.getCanonicalServiceName()));
>>>>     [javac]                                    ^
>>>>     [javac] mr-279/hdfs/src/java/org/apache/hadoop/fs/Hdfs.java:385:
>>>> method does not override or implement a method from a supertype
>>>>     [javac]   @Override //AbstractFileSystem
>>>>     [javac]   ^
>>>>     [javac] mr-279/hdfs/src/java/org/apache/hadoop/hdfs/DFSClient.java:736:
>>>> cannot find symbol
>>>>     [javac] symbol  : method
>>>> validate(java.util.EnumSet<org.apache.hadoop.fs.CreateFlag>)
>>>>     [javac] location: class org.apache.hadoop.fs.CreateFlag
>>>>     [javac]     CreateFlag.validate(flag);
>>>>     [javac]               ^
>>>>     [javac]
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/namenode/FSNamesystem.java:5210:
>>>> cannot find symbol
>>>>     [javac] symbol  : method isRpcInvocation()
>>>>     [javac] location: class org.apache.hadoop.ipc.Server
>>>>     [javac]     return Server.isRpcInvocation();
>>>>     [javac]                  ^
>>>>     [javac]
>>>>
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java:102:
>>>> cannot find symbol
>>>>     [javac] symbol  : method listFiles(java.io.File)
>>>>     [javac] location: class org.apache.hadoop.fs.FileUtil
>>>>     [javac]         File[] files = FileUtil.listFiles(dir);
>>>>     [javac]                                ^
>>>>     [javac]
>>>>
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java:190:
>>>> cannot find symbol
>>>>     [javac] symbol  : method listFiles(java.io.File)
>>>>     [javac] location: class org.apache.hadoop.fs.FileUtil
>>>>     [javac]       File files[] = FileUtil.listFiles(dir);
>>>>     [javac]                              ^
>>>>     [javac]
>>>>
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java:425:
>>>> cannot find symbol
>>>>     [javac] symbol  : method listFiles(java.io.File)
>>>>     [javac] location: class org.apache.hadoop.fs.FileUtil
>>>>     [javac]       File blockFiles[] = FileUtil.listFiles(dir);
>>>>     [javac]                                   ^
>>>>     [javac]
>>>>
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java:726:
>>>> cannot find symbol
>>>>     [javac] symbol  : method list(java.io.File)
>>>>     [javac] location: class org.apache.hadoop.fs.FileUtil
>>>>     [javac]       if (finalizedDir.exists() &&
>>>> FileUtil.list(finalizedDir).length != 0) {
>>>>     [javac]                                        ^
>>>>     [javac]
>>>>
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java:729:
>>>> cannot find symbol
>>>>     [javac] symbol  : method list(java.io.File)
>>>>     [javac] location: class org.apache.hadoop.fs.FileUtil
>>>>     [javac]       if (rbwDir.exists() && FileUtil.list(rbwDir).length != 0) {
>>>>     [javac]                                      ^
>>>>     [javac]
>>>>
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java:758:
>>>> cannot find symbol
>>>>     [javac] symbol  : method listFiles(java.io.File)
>>>>     [javac] location: class org.apache.hadoop.fs.FileUtil
>>>>     [javac]         for (File f : FileUtil.listFiles(bpCurrentDir)) {
>>>>     [javac]                               ^
>>>>     [javac]
>>>>
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/FSDataset.java:766:
>>>> cannot find symbol
>>>>     [javac] symbol  : method listFiles(java.io.File)
>>>>     [javac] location: class org.apache.hadoop.fs.FileUtil
>>>>     [javac]         for (File f : FileUtil.listFiles(bpDir)) {
>>>>     [javac]                               ^
>>>>     [javac]
>>>>
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java:2310:
>>>> cannot find symbol
>>>>     [javac] symbol  : variable ProtocolSignature
>>>>     [javac] location: class org.apache.hadoop.hdfs.server.datanode.DataNode
>>>>     [javac]     return ProtocolSignature.getProtocolSigature(
>>>>     [javac]            ^
>>>>     [javac]
>>>>
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/DataNode.java:2307:
>>>> method does not override or implement a method from a supertype
>>>>     [javac]   @Override
>>>>     [javac]   ^
>>>>     [javac]
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/DataStorage.java:505:
>>>> cannot find symbol
>>>>     [javac] symbol  : method list(java.io.File)
>>>>     [javac] location: class org.apache.hadoop.fs.FileUtil
>>>>     [javac]         if (FileUtil.list(detachDir).length != 0 ) {
>>>>     [javac]                     ^
>>>>     [javac]
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/DirectoryScanner.java:490:
>>>> cannot find symbol
>>>>     [javac] symbol  : method listFiles(java.io.File)
>>>>     [javac] location: class org.apache.hadoop.fs.FileUtil
>>>>     [javac]         files = FileUtil.listFiles(dir);
>>>>     [javac]                         ^
>>>>     [javac]
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/datanode/BlockPoolSliceStorage.java:355:
>>>> cannot find symbol
>>>>     [javac] symbol  : method list(java.io.File)
>>>>     [javac] location: class org.apache.hadoop.fs.FileUtil
>>>>     [javac]       if (FileUtil.list(detachDir).length != 0) {
>>>>     [javac]                   ^
>>>>     [javac]
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/DistributedFileSystem.java:812:
>>>> method does not override or implement a method from a supertype
>>>>     [javac]   @Override // FileSystem
>>>>     [javac]   ^
>>>>     [javac]
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/namenode/NameNode.java:196:
>>>> cannot find symbol
>>>>     [javac] symbol  : variable ProtocolSignature
>>>>     [javac] location: class org.apache.hadoop.hdfs.server.namenode.NameNode
>>>>     [javac]     return ProtocolSignature.getProtocolSigature(
>>>>     [javac]            ^
>>>>     [javac]
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/server/namenode/NameNode.java:193:
>>>> method does not override or implement a method from a supertype
>>>>     [javac]   @Override
>>>>     [javac]   ^
>>>>     [javac]
>>>> mr-279/hdfs/src/java/org/apache/hadoop/hdfs/tools/DFSAdmin.java:273:
>>>> cannot find symbol
>>>>     [javac] symbol  : method getFS()
>>>>     [javac] location: class org.apache.hadoop.hdfs.tools.DFSAdmin
>>>>     [javac]     FileSystem fs = getFS();
>>>>     [javac]                     ^
>>>>     [javac] Note: Some input files use or override a deprecated API.
>>>>     [javac] Note: Recompile with -Xlint:deprecation for details.
>>>>     [javac] 24 errors
>>>>
>>>>
>>>>
>>>>
>>>> On Thu, Jun 16, 2011 at 4:33 PM, Luke Lu <llu@vicaya.com> wrote:
>>>>> Why are you compiling branch HDFS-1052? It was a temporary branch for
>>>>> merging federated NN changes into trunk. All the changes in that
>>>>> branch have since been merged into trunk and MR-279. These stale
>>>>> branches should be deleted, IMO.
>>>>>
>>>>> MR-279 only supports the common and hdfs in the same branch, though
>>>>> we're moving to trunk soon.
>>>>>
>>>>> On Thu, Jun 16, 2011 at 12:04 AM, Thomas Anderson
>>>>> <t.dt.aanderson@gmail.com> wrote:
>>>>>> When following mr-279/INSTALL to compile the source, it throws the
>>>>>> following error in the second step, compiling HDFS-1052. How can I
>>>>>> solve this problem?
>>>>>> ...
>>>>>
>>>
>>>
>>
>
