hadoop-common-dev mailing list archives

From: Andrei Dragomir <adrag...@adobe.com>
Subject: Building hadoop-common, hadoop-hdfs, hadoop-mapreduce from source, without using the hadoop Ivy / Maven repositories
Date: Thu, 20 May 2010 14:28:11 GMT
I'm having some issues compiling hadoop-common, hdfs and mapreduce from source.

Right now the hdfs and mapreduce builds are broken because of MAPREDUCE-1803 and HDFS-1166.
However, even if those builds worked, they still wouldn't do what I need. What I want is to
compile everything using ONLY the hadoop source code.

Up until now we have been using some recipes written by the Cloudera guys, which basically did
the following (a rough shell sketch of the same steps follows the list):

1. build hadoop-common
2. copy hadoop-common/build/*.jar to hdfs/lib and mapreduce/lib. This step no longer works,
because the hadoop-common dependency is now resolved through Ivy, which automagically downloads
it from a remote repository. I'd like to be able to shortcut that and just use the jars I built
in step 1.
3. compile hadoop-hdfs
4. copy hadoop-hdfs/build/*.jar to hadoop-mapreduce/lib
5. compile hadoop-mapreduce
6. copy everything in hadoop-hdfs/build and hadoop-mapreduce/build to hadoop-common/build
7. execute ant tar in hadoop-common
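
In shell terms, the recipe looked roughly like the sketch below. This is from memory, so the
exact jar names and ant targets (I'm assuming a plain "ant jar" in each project) may need
adjusting for your checkout:

  # 1. build hadoop-common
  cd hadoop-common
  ant jar

  # 2. hand the freshly built common jars to the other two builds
  cp build/hadoop-common-*.jar ../hadoop-hdfs/lib/
  cp build/hadoop-common-*.jar ../hadoop-mapreduce/lib/

  # 3 + 4. build hdfs and hand its jars to mapreduce
  cd ../hadoop-hdfs
  ant jar
  cp build/hadoop-hdfs-*.jar ../hadoop-mapreduce/lib/

  # 5. build mapreduce
  cd ../hadoop-mapreduce
  ant jar

  # 6. collect everything back into the common build directory
  cp -r ../hadoop-hdfs/build/* build/* ../hadoop-common/build/

  # 7. package the whole thing
  cd ../hadoop-common
  ant tar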

My question is: how does one create a hadoop-common, hdfs and mapreduce build completely from
source, using the Ivy / Maven repositories ONLY for third-party libraries and NOT for the
hadoop core libraries that hdfs and mapreduce depend on?
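
Ideally I'd picture something along the lines of the sketch below, assuming the build files
offer (or could offer) a way to publish the freshly built artifacts to the local Maven
repository and resolve against it. The target and property names here are guesses on my part,
not something I've verified against the current build.xml files:

  # hypothetical flow -- "mvn-install" and "resolvers=internal" are guessed names
  cd hadoop-common
  ant mvn-install                        # publish the common jars into the local ~/.m2 repo

  cd ../hadoop-hdfs
  ant -Dresolvers=internal mvn-install   # resolve hadoop-common locally, then publish hdfs

  cd ../hadoop-mapreduce
  ant -Dresolvers=internal tar           # resolve common + hdfs locally and build the tarball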

Thank you, 
  Andrei Dragomir.