hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "HarmonyHdfs" by GuillermoCabrera
Date Tue, 16 Nov 2010 16:19:59 GMT

New page:
= Hadoop hdfs on Harmony/Harmony Select 6 =

Although these instructions look very similar to those for Hadoop common, there are some subtle differences. The build is done without the native libraries because of the issues described below. With the scripts and patches below there are currently 24 test failures for this project, 9 of which fail because of an explicit call to sun/reflect made in the test cases by the [[http://mockito.org/|mockito]] project.

=== Environment Setup ===

 1. Download the missing dependencies: Apache Forrest, Xerces-C++ Parser, Apache Ant, Apache Maven, and a Java 5.0 JRE (needed by Apache Forrest)
 2. Download the ecj jar from the [[http://download.eclipse.org/eclipse/downloads/drops/S-3.7M3-201010281441/index.php|3.7M3 build]] (this particular build contains a fix for a bug affecting the build process), place it in `$HARMONY_HOME/hdk/jdk/lib`, and add it to `bootclasspath.properties`
 3. Copy tools.jar from `$HARMONY_HOME/hdk/jdk/lib` to `$HARMONY_HOME/hdk/jdk/jre/lib/boot`. Then add an entry for tools.jar to `bootclasspath.properties` in `$HARMONY_HOME/hdk/jdk/jre/lib/boot`
 4. Create a jar file (sun-javadoc.jar) in `$HARMONY_HOME/hdk/jdk/jre/lib/boot` containing all javadoc-related classes from the Sun JDK 1.6. Then add an entry for sun-javadoc.jar to `bootclasspath.properties` in `$HARMONY_HOME/hdk/jdk/jre/lib/boot`
 5. Download Hadoop hdfs: {{{
 % svn checkout http://svn.apache.org/repos/asf/hadoop/hdfs/tags/release-0.21.0/ hdfs
}}}
 6. Download patches and place in appropriate directory (refer to script)
 7. Download, modify and run build script
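
Steps 2-4 above all follow the same pattern: place a jar in the boot directory and register it in `bootclasspath.properties`. A minimal sketch of the tools.jar case (the property key `bootclasspath.99` and the temp-dir stand-in for `$HARMONY_HOME` are illustrative assumptions; use the key numbering already present in your file):

```shell
#!/bin/sh
# Sketch of the tools.jar step. A temp dir stands in for
# $HARMONY_HOME so the sketch runs anywhere; the property key
# "bootclasspath.99" is an assumption -- pick the next index after
# the entries already in your bootclasspath.properties.
HARMONY_HOME=$(mktemp -d)
BOOT="$HARMONY_HOME/hdk/jdk/jre/lib/boot"
mkdir -p "$HARMONY_HOME/hdk/jdk/lib" "$BOOT"
: > "$HARMONY_HOME/hdk/jdk/lib/tools.jar"    # empty placeholder jar for the demo
cp "$HARMONY_HOME/hdk/jdk/lib/tools.jar" "$BOOT/"
echo "bootclasspath.99=tools.jar" >> "$BOOT/bootclasspath.properties"
cat "$BOOT/bootclasspath.properties"
```

The same copy-and-register pattern applies to the ecj jar and sun-javadoc.jar steps.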

=== Testing ===

 1. Copy swing.jar from Harmony 6 into Harmony Select's `jre/lib/ext` (Ivy requires Swing to run the testing framework)
 2. Copy rmi.jar from Harmony 6 into Harmony Select's `jre/lib/boot` and add entry into `bootclasspath.properties`
 3. Comment out the line `xmlsec-1.4.3/commons-logging.jar` in Harmony Select's `jre/lib/boot/bootclasspath.properties`
 4. Copy native libraries (libhdfs.*) from Apache Hadoop 0.21.0 release into `/hdfs/build/c++/Linux-x86-32/lib`
and `/hdfs/build/c++/lib`
 5. Create a jar file (sunExtra.jar) in `$HARMONY_HOME/hdk/jdk/jre/lib/boot` containing all the sun/reflect and sun/misc classes from the Sun JDK 1.6. Then add an entry for sunExtra.jar to `bootclasspath.properties`
 6. Create a soft link libjvm.so pointing to libharmonyvm.so in Harmony Select's `jre/bin/default`
 7. Download, modify and run test script
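
The soft link in step 6 lets native code that links against -ljvm resolve to the Harmony VM library. A minimal sketch, using a temporary directory in place of Harmony Select's `jre/bin/default` and an empty placeholder for the library:

```shell
#!/bin/sh
# Sketch of step 6: expose the Harmony VM library under the name
# libjvm.so. A temp dir stands in for jre/bin/default so the
# sketch is runnable anywhere.
DEFAULT_DIR=$(mktemp -d)                          # stands in for jre/bin/default
: > "$DEFAULT_DIR/libharmonyvm.so"                # placeholder library file
ln -s libharmonyvm.so "$DEFAULT_DIR/libjvm.so"    # relative link, same directory
readlink "$DEFAULT_DIR/libjvm.so"
```

Using a relative link target keeps the link valid if the directory is moved.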

=== Patches ===

 * [[https://issues.apache.org/jira/browse/HDFS-1494|HDFS-1494]]

=== Build Script ===

{{{
#!/bin/sh
export SUBPROJECT=hdfs
export VERSION=0.21.0
export PATCH_DIR=/home/harmony/Hadoop-Patches/Harmony/$VERSION-$SUBPROJECT
# Folder containing a clean version of Hadoop hdfs needed to install patches
export PRISTINE=/home/harmony/Hadoop-Versions/pristine
export JAVA_HOME=/home/harmony/Java-Versions/harmony6-1022137/java6/target/hdk/jdk
export HADOOP_INSTALL=/home/harmony/Hadoop-Versions/hadoop-$VERSION
export FORREST_INSTALL=/home/harmony/Test-Dependencies/apache-forrest-0.8
export XERCES_INSTALL=/home/harmony/Test-Dependencies/xerces-c_2_8_0
export ANT_HOME=/home/harmony/Test-Dependencies/apache-ant-1.8.1
# Java 5 required by Forrest
export JAVA5=/home/harmony/Java-Versions/ibm-java2-i386-50/jre
export PATH=$PATH:$ANT_HOME/bin

# Clean (clean targets are necessary to apply patches)
echo "Cleaning and Copying From Pristine"

# Apply patches
echo "Applying Patches"

# Remove TestFiHFlush because of HDFS-1421
echo "Deleting TestFiHFlush Test Case"
rm src/test/aop/org/apache/hadoop/hdfs/TestFiHFlush.java

patch -p0 < $PATCH_DIR/HDFS-1494.patch

# Clean, build and run the core (non-contrib) unit tests
echo "Starting Build"
ant -Dversion=$VERSION -Dxercescroot=$XERCES_INSTALL -Dforrest.home=$FORREST_INSTALL \
    -Djava5.home=$JAVA5 -Dresolvers=internal mvn-install \
    > /home/harmony/Test-Scripts/Hadoop-$VERSION/Harmonybuild-$SUBPROJECT-noNative.out
}}}

=== Test Script ===
The following script only runs (does not compile) the tests in the test-core and fault-injection targets. Running this script assumes that you have already built Hadoop hdfs with another JDK.
{{{
#!/bin/sh
export SUBPROJECT=hdfs
export VERSION=0.21.0
export JAVA_HOME=/home/harmony/Java-Versions/harmonySelect6-1022137/java6/target/hdk/jdk/jre
export HADOOP_INSTALL=/home/harmony/Hadoop-Versions/hadoop-$VERSION
export ANT_HOME=/home/harmony/Test-Dependencies/apache-ant-1.8.1
export PATH=$PATH:$ANT_HOME/bin

echo "Testing Hadoop hdfs"
ant -Dsun.arch.data.model=32 -Dversion=$VERSION -Dresolvers=internal \
    run-test-core-nocompile run-test-hdfs-fault-inject-nocompile \
    > /home/harmony/Test-Scripts/Hadoop-$VERSION/HSTest-$SUBPROJECT.out
}}}
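
The redirected .out file can then be scanned for failing suites. A sketch, assuming the usual `[junit]` summary lines that ant's JUnit task writes (a small fake log stands in for the real output so the snippet is self-contained):

```shell
#!/bin/sh
# Sketch: count junit summary lines that report failures or errors
# in the ant output log. A fake log stands in for the real .out file.
LOG=$(mktemp)
cat > "$LOG" <<'EOF'
    [junit] Tests run: 12, Failures: 1, Errors: 0, Time elapsed: 3.2 sec
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 1.1 sec
    [junit] Tests run: 5, Failures: 0, Errors: 2, Time elapsed: 0.9 sec
EOF
grep -cE 'Failures: [1-9]|Errors: [1-9]' "$LOG"   # lines with a nonzero count
```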

Note: to run a single test case, add the `-Dtestcase=testClass` property to the ant command line.

=== Issues ===

==== Building Native ====

To build using native libraries, you need the following additions to your build script: {{{
export CFLAGS=-m32
export CXXFLAGS=-m32
}}}

Also, please note the guidelines for building native libraries in the [[http://hadoop.apache.org/common/docs/current/native_libraries.html|Native Libraries Guide]].

 * javah is not available in Harmony 6. You need to copy tools.jar from the Sun JDK into Harmony and add it to the bootclasspath.properties file
 * Even with the soft link (libjvm.so) to libharmonyvm.so created, the configure script in `common/src/native` still cannot find -ljvm.

==== Others ====

 * Two test cases (`TestArrayFile` and `TestMapFile`) crash during the full test run but succeed when run individually.
 * TestSaslRPC reports `Unable to find SASL client implementation`
