hadoop-mapreduce-dev mailing list archives

From Praveen Sripati <praveensrip...@gmail.com>
Subject Re: Building and Deploying MRv2
Date Sun, 19 Jun 2011 06:06:37 GMT

Thanks to the group. Finally, I got the build and deployment going, with 
the following exception during the common build:

      [exec] validate-sitemap:
      [exec] /home/praveensripati/Installations/apache-forrest-0.8/main/webapp/resources/schema/relaxng/sitemap-v06.rng:72:31:
      error: datatype library "http://www.w3.org/2001/XMLSchema-datatypes" not recognized

      [exec] BUILD FAILED
      [exec] /home/praveensripati/Installations/apache-forrest-0.8/main/targets/validate.xml:158: Validation failed, messages should have been provided.
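
(As Luke notes further down the thread, the docs/tar targets are what pull 
in forrest; when the docs aren't needed, a common build along these lines 
should sidestep the sitemap validation - targets and flags as used 
elsewhere in this thread:)

      ant -Dresolvers=internal veryclean mvn-install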

------

The INSTALL doc 
(http://svn.apache.org/repos/asf/hadoop/common/branches/MR-279/mapreduce/INSTALL) 
is incomplete. Here are the additional steps:

- The dependency on apache-forrest-0.8.tar.gz / apache-forrest-0.9.tar.gz 
is causing ClassNotFoundExceptions for the Common build.

- The hadoop-mapred-examples-0.22.0-SNAPSHOT.jar is not present in 
$HADOOP_MAPRED_HOME/build; "ant examples -Dresolvers=internal" had to be 
run.

- The configuration files in the folders below have to be copied into a 
single configuration folder, and that folder should be set as the 
HADOOP_CONF_DIR variable and exported (a sketch follows the list).

     - MR-279/common/conf
     - MR-279/hdfs/conf
     - MR-279/mapreduce/conf
     - $YARN_INSTALL/conf
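
   For example, something along these lines (the target folder is my 
   choice - any folder exported as HADOOP_CONF_DIR should do; later copies 
   overwrite earlier files with the same name):

      mkdir -p ~/Hadoop/conf
      cp MR-279/common/conf/* ~/Hadoop/conf/
      cp MR-279/hdfs/conf/* ~/Hadoop/conf/
      cp MR-279/mapreduce/conf/* ~/Hadoop/conf/
      cp $YARN_INSTALL/conf/* ~/Hadoop/conf/
      export HADOOP_CONF_DIR=~/Hadoop/conf
      export YARN_CONF_DIR=$HADOOP_CONF_DIR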

- Had to install autoconf ("sudo apt-get install autoconf") on Ubuntu 11.04.

Thanks,
Praveen


On Saturday 18 June 2011 05:22 PM, Praveen Sripati wrote:
> Hi,
>
> I have got the code from the svn into the 
> /home/praveensripati/Hadoop/ directory and untar'd the 
> hadoop-mapreduce-1.0-SNAPSHOT-all.tar.gz file in the 
> /home/praveensripati/Hadoop folder. The INSTALL document says to 
> export the following variables. What should the HADOOP_CONF_DIR 
> variable be set to - the common, hdfs, or mapreduce conf folder? The 
> YARN_CONF_DIR again points to HADOOP_CONF_DIR.
>
> export HADOOP_MAPRED_HOME=/home/praveensripati/Hadoop/MR-279/mapreduce
> export HADOOP_COMMON_HOME=/home/praveensripati/Hadoop/MR-279/common
> export HADOOP_HDFS_HOME=/home/praveensripati/Hadoop/MR-279/hdfs
> export YARN_HOME=/home/praveensripati/Hadoop/hadoop-mapreduce-1.0-SNAPSHOT
> export HADOOP_CONF_DIR=
> export YARN_CONF_DIR=$HADOOP_CONF_DIR
> Thanks,
> Praveen
>
> On Saturday 18 June 2011 08:52 AM, Praveen Sripati wrote:
>>
>> Hi,
>>
>> Finally, got all the jars built. Now it is time to run MRv2.
>>
>> It would be nice if the documentation below got updated:
>>
>> http://svn.apache.org/repos/asf/hadoop/common/branches/MR-279/mapreduce/INSTALL
>>
>> ----------
>>
>> yarn build
>>
>> Ubuntu had an older version of the protoc binary. I put the latest 
>> protoc binary, which I built myself, in the PATH, and the following 
>> error no longer appears:
>>
>> yarn_protos.proto:4:8: Option "java_generate_equals_and_hash" unknown.
>>
>> Then I had to install autoconf ("sudo apt-get install autoconf"), and the 
>> hadoop-mapreduce-1.0-SNAPSHOT-all.tar.gz file got generated.
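>>
>> Roughly what I did (the install prefix is an assumption; use whatever 
>> prefix you built protoc with - the point is just that the 2.4.x protoc 
>> comes first on the PATH):
>>
>>    export PATH=/usr/local/bin:$PATH  # directory holding the newly built protoc
>>    protoc --version                  # should now report libprotoc 2.4.x
>>    sudo apt-get install autoconf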
>>
>> ----------
>>
>> common build
>>
>> After including forrest 0.8, the following error no longer appears 
>> (HADOOP-7394 has been created for the same):
>>
>>       [exec] Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/fop/messaging/MessageHandler
>>       [exec]     at org.apache.cocoon.serialization.FOPSerializer.configure(FOPSerializer.java:122)
>>
>>
>> But now I am getting the following error (the jars are being generated 
>> in the build folder):
>>
>>      [exec] validate-sitemap:
>>      [exec] /home/praveensripati/Installations/apache-forrest-0.8/main/webapp/resources/schema/relaxng/sitemap-v06.rng:72:31:
>>      error: datatype library "http://www.w3.org/2001/XMLSchema-datatypes" not recognized
>>
>>      [exec] BUILD FAILED
>>      [exec] /home/praveensripati/Installations/apache-forrest-0.8/main/targets/validate.xml:158: Validation failed, messages should have been provided.
>>
>> Thanks,
>> Praveen
>>
>> On Saturday 18 June 2011 03:44 AM, Siddharth Seth wrote:
>>> Ubuntu seems to install the protocol buffer library (protobuf-compiler) as
>>> part of the standard install. Can you run 'protoc --version' to figure out
>>> which version is being used?
>>> If you've installed it separately, you could play around with the path,
>>> remove the package installed by Ubuntu, etc., to make sure protoc 2.4 is used.
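>>>
>>> For example, something along these lines (removing the Ubuntu package is 
>>> only safe once your own 2.4.x build is on the path):
>>>
>>>    which protoc
>>>    protoc --version
>>>    sudo apt-get remove protobuf-compiler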
>>>
>>> - Sid
>>>
>>> On Fri, Jun 17, 2011 at 1:15 AM, Luke Lu<llu@vicaya.com>  wrote:
>>>
>>>> MR-279 actually works fine with maven 3.0.3 (sans a few (IMO bogus)
>>>> warnings). You can leave out the "tar" target (which depends on the
>>>> "docs" target, which requires forrest 0.8) to unblock the progress, as
>>>> mvn-install would suffice for common and hdfs builds.
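>>>>
>>>> i.e., instead of "ant veryclean mvn-install tar", something like:
>>>>
>>>>    ant veryclean mvn-install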
>>>>
>>>> On Thu, Jun 16, 2011 at 7:55 PM, Praveen Sripati
>>>> <praveensripati@gmail.com>  wrote:
>>>>> Tom,
>>>>>
>>>>> I downgraded maven and also changed from open-jdk to sun-jdk, and there 
>>>>> is no progress. I am using Ubuntu 11.04 and could not find sun-java5-jdk 
>>>>> in the Ubuntu repositories, so I installed sun-java6-jdk.
>>>>>
>>>>> praveensripati@praveensripati:~$ java -version
>>>>> java version "1.6.0_24"
>>>>> Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
>>>>>
>>>>> praveensripati@praveensripati:~$ mvn -version
>>>>> Apache Maven 2.2.1 (r801777; 2009-08-07 00:46:01+0530)
>>>>>
>>>>> Thanks,
>>>>> Praveen
>>>>>
>>>>>
>>>>> On Friday 17 June 2011 02:04 AM, Thomas Graves wrote:
>>>>>> I know at one time maven 3.x didn't work, so I've been using maven 2.x.
>>>>>>
>>>>>> Well, I've never tried using java6 for java5 home, but I would think it
>>>>>> wouldn't work. I thought it was forrest that required java5. I would
>>>>>> suggest using java5.
>>>>>>
>>>>>> Tom
>>>>>>
>>>>>>
>>>>>> On 6/16/11 12:24 PM, "Praveen Sripati" <praveensripati@gmail.com> wrote:
>>>>>>> Tom,
>>>>>>>
>>>>>>>>> Note, it looks like your java5.home is pointing to java6?
>>>>>>> I have java6 on my laptop and pointed the java5.home variable to 
>>>>>>> java6. The hadoop doc says "Java 1.6.x - preferable from Sun". Is 
>>>>>>> this the problem?
>>>>>>>>> What version of protobufs are you using?
>>>>>>> I have protobuf 2.4.1.
>>>>>>>
>>>>>>>>> What about mvn version?
>>>>>>> Apache Maven 3.0.3 (r1075438; 2011-02-28 23:01:09+0530)
>>>>>>>
>>>>>>>>> So you had both common and hdfs built before doing mapreduce and 
>>>>>>>>> common built before building hdfs? Or was common failing with 
>>>>>>>>> the error you mention below? If you haven't already you might 
>>>>>>>>> simply try veryclean on everything and go again in order.
>>>>>>> I tried common first and there were some errors related to fop, 
>>>>>>> but the common jars were created, so I started with hdfs, and it 
>>>>>>> was successful. Then I started the yarn build, which led to the 
>>>>>>> java_generate_equals_and_hash error.
>>>>>>>
>>>>>>> Thanks,
>>>>>>> Praveen
>>>>>>>
>>>>>>>
>>>>>>> On Thursday 16 June 2011 09:54 PM, Thomas Graves wrote:
>>>>>>>> Note, it looks like your java5.home is pointing to java6?
>>>>>>>>
>>>>>>>> I've never seen this particular error. The 
>>>>>>>> java_generate_equals_and_hash option seems to have been added in 
>>>>>>>> protobuf 2.4.0. What version of protobufs are you using? The 
>>>>>>>> instructions say to use at least 2.4.0a; I'm using 2.4.1 right now.
>>>>>>>>
>>>>>>>> You need to define the following (I use a build.properties file). 
>>>>>>>> These are the versions I'm currently using. All of these are just 
>>>>>>>> downloaded from the corresponding website. Some links to those can 
>>>>>>>> be found here:
>>>>>>>> http://yahoo.github.com/hadoop-common/installing.html
>>>>>>>>
>>>>>>>> java5.home=/home/tgraves/hadoop/jdk1.5.0_22/
>>>>>>>> forrest.home=/home/tgraves/hadoop/apache-forrest-0.8
>>>>>>>> ant.home=/home/tgraves/hadoop/apache-ant-1.8.2
>>>>>>>> xercescroot=/home/tgraves/hadoop/xerces-c-src_2_8_0
>>>>>>>> eclipse.home=/home/tgraves/hadoop/eclipse
>>>>>>>> findbugs.home=/home/tgraves/hadoop/findbugs-1.3.9
>>>>>>>>
>>>>>>>> I thought this was the same as for trunk, but perhaps I'm mistaken.
>>>>>>>>
>>>>>>>> What about mvn version?
>>>>>>>> /home/y/libexec/maven/bin/mvn --version
>>>>>>>> Apache Maven 2.2.1 (r801777; 2009-08-06 19:16:01+0000)
>>>>>>>>
>>>>>>>> So you had both common and hdfs built before doing mapreduce, and 
>>>>>>>> common built before building hdfs? Or was common failing with the 
>>>>>>>> error you mention below? If you haven't already you might simply 
>>>>>>>> try veryclean on everything and go again in order.
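>>>>>>>>
>>>>>>>> In other words, roughly this order (commands as used elsewhere in 
>>>>>>>> this thread; the ant properties are omitted here for brevity):
>>>>>>>>
>>>>>>>>    cd common       && ant veryclean mvn-install
>>>>>>>>    cd ../hdfs      && ant veryclean mvn-install
>>>>>>>>    cd ../mapreduce && mvn clean install assembly:assembly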
>>>>>>>>
>>>>>>>> Tom
>>>>>>>>
>>>>>>>>
>>>>>>>> On 6/16/11 8:10 AM, "Praveen Sripati" <praveensripati@gmail.com> wrote:
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> The hdfs build was successful after adding the -Dforrest.home 
>>>>>>>>> property to the ant command.
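>>>>>>>>>
>>>>>>>>> (For completeness, an invocation along these lines - the forrest 
>>>>>>>>> path here is just an example:)
>>>>>>>>>
>>>>>>>>> ant -Dforrest.home=/home/praveensripati/Installations/apache-forrest-0.8 veryclean mvn-install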
>>>>>>>>>
>>>>>>>>> ***********
>>>>>>>>>
>>>>>>>>> When I started the mapreduce build, I got the below error.
>>>>>>>>>
>>>>>>>>> mvn clean install assembly:assembly
>>>>>>>>>
>>>>>>>>> Downloaded:
>>>>>>>>> http://repo1.maven.org/maven2/org/apache/commons/commons-exec/1.0.1/commons-exec-1.0.1.jar
>>>>>>>>> (49 KB at 24.4 KB/sec)
>>>>>>>>> yarn_protos.proto:4:8: Option "java_generate_equals_and_hash" unknown.
>>>>>>>>> [INFO]
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] Skipping hadoop-mapreduce
>>>>>>>>> [INFO] This project has been banned from the build due to previous failures.
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] Reactor Summary:
>>>>>>>>> [INFO]
>>>>>>>>> [INFO] yarn-api .......................................... FAILURE [13:23.081s]
>>>>>>>>> [INFO] yarn-common ....................................... SKIPPED
>>>>>>>>> [INFO] yarn-server-common ................................ SKIPPED
>>>>>>>>> [INFO] yarn-server-nodemanager ........................... SKIPPED
>>>>>>>>> [INFO] yarn-server-resourcemanager ....................... SKIPPED
>>>>>>>>> [INFO] yarn-server-tests ................................. SKIPPED
>>>>>>>>> [INFO] yarn-server ....................................... SKIPPED
>>>>>>>>> [INFO] yarn .............................................. SKIPPED
>>>>>>>>> [INFO] hadoop-mapreduce-client-core ...................... SKIPPED
>>>>>>>>> [INFO] hadoop-mapreduce-client-common .................... SKIPPED
>>>>>>>>> [INFO] hadoop-mapreduce-client-shuffle ................... SKIPPED
>>>>>>>>> [INFO] hadoop-mapreduce-client-app ....................... SKIPPED
>>>>>>>>> [INFO] hadoop-mapreduce-client-hs ........................ SKIPPED
>>>>>>>>> [INFO] hadoop-mapreduce-client-jobclient ................. SKIPPED
>>>>>>>>> [INFO] hadoop-mapreduce-client ........................... SKIPPED
>>>>>>>>> [INFO] hadoop-mapreduce .................................. SKIPPED
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] BUILD FAILURE
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] Total time: 13:45.437s
>>>>>>>>> [INFO] Finished at: Thu Jun 16 18:30:48 IST 2011
>>>>>>>>> [INFO] Final Memory: 6M/15M
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [ERROR] Failed to execute goal
>>>>>>>>> org.codehaus.mojo:exec-maven-plugin:1.2:exec (generate-sources) on
>>>>>>>>> project yarn-api: Command execution failed. Process exited with an
>>>>>>>>> error: 1 (Exit value: 1) -> [Help 1]
>>>>>>>>> [ERROR]
>>>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>>>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>>>>> [ERROR]
>>>>>>>>> [ERROR] For more information about the errors and possible solutions,
>>>>>>>>> please read the following articles:
>>>>>>>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
>>>>>>>>> ***********
>>>>>>>>>
>>>>>>>>> I started building the commons and had to include the -Djava5.home 
>>>>>>>>> and -Dforrest.home properties in the ant command.
>>>>>>>>>
>>>>>>>>> ant -Djava5.home=/usr/lib/jvm/java-6-openjdk
>>>>>>>>> -Dforrest.home=/home/praveensripati/Installations/apache-forrest-0.9
>>>>>>>>> veryclean mvn-install tar
>>>>>>>>>
>>>>>>>>> And then I get the below error and the build hangs, but I see 4 
>>>>>>>>> jars in the build folder, including hadoop-common-0.22.0-SNAPSHOT.jar.
>>>>>>>>>
>>>>>>>>>        [exec] Cocoon will report the status of each document:
>>>>>>>>>        [exec]   - in column 1: *=okay X=brokenLink ^=pageSkipped (see FAQ).
>>>>>>>>>        [exec]
>>>>>>>>>        [exec] ------------------------------------------------------------------------
>>>>>>>>>        [exec] cocoon 2.1.12-dev
>>>>>>>>>        [exec] Copyright (c) 1999-2007 Apache Software Foundation. All rights reserved.
>>>>>>>>>        [exec] ------------------------------------------------------------------------
>>>>>>>>>        [exec]
>>>>>>>>>        [exec]
>>>>>>>>>        [exec] * [1/29]    [29/29]   6.547s 9.4Kb   linkmap.html
>>>>>>>>>        [exec] * [2/29]    [1/28]    1.851s 22.3Kb  hdfs_shell.html
>>>>>>>>>        [exec] * [4/28]    [1/28]    1.156s 21.1Kb  distcp.html
>>>>>>>>>        [exec] * [5/27]    [0/0]     0.306s 0b      distcp.pdf
>>>>>>>>>        [exec] Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/fop/messaging/MessageHandler
>>>>>>>>>        [exec]  at org.apache.cocoon.serialization.FOPSerializer.configure(FOPSerializer.java:122)
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> I included the following in the common/ivy.xml, and the 
>>>>>>>>> ./common/build/ivy/lib/Hadoop-Common/common/fop-0.93.jar file is there:
>>>>>>>>>        <dependency org="org.apache.xmlgraphics"
>>>>>>>>>          name="fop"
>>>>>>>>>          rev="${fop.version}"
>>>>>>>>>          conf="common->default"/>
>>>>>>>>>
>>>>>>>>> and the following in the common/ivy/libraries.properties, and I 
>>>>>>>>> still get the same error:
>>>>>>>>>
>>>>>>>>> fop.version=0.93
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>> Praveen
>>>>>>>>>
>>>>>>>>> On Thursday 16 June 2011 07:55 AM, Luke Lu wrote:
>>>>>>>>>> On Wed, Jun 15, 2011 at 6:45 PM, Praveen Sripati
>>>>>>>>>> <praveensripati@gmail.com>     wrote:
>>>>>>>>>>> Do I need the avro-maven-plugin? When I ran the below command, I 
>>>>>>>>>>> got the error that the pom file was not found. Where do I get 
>>>>>>>>>>> the jar and the pom files for the avro-maven-plugin? I was able 
>>>>>>>>>>> to get the source code for them, but not the binaries.
>>>>>>>>>>>
>>>>>>>>>>> mvn install:install-file
>>>>>>>>>>> -Dfile=./avro-maven-plugin/avro-maven-plugin-1.4.0-SNAPSHOT.jar
>>>>>>>>>>> -DpomFile=./avro-maven-plugin/avro-maven-plugin-1.4.0-SNAPSHOT.pom
>>>>>>>>>> No, you no longer need to install avro-maven-plugin manually. It's 
>>>>>>>>>> automatically installed via maven, as we switched to avro 1.5.1. 
>>>>>>>>>> We'll fix the instructions.
>>>>>>>>>>
>>>>>>>>>> __Luke
