hadoop-common-dev mailing list archives

From Michael Stack <st...@duboce.net>
Subject Contrib jars and mapreduce jobs
Date Tue, 17 Jul 2007 16:45:06 GMT
A MapReduce job that depends on a contrib jar such as hbase must take
the extra step of either building the contrib jar into its job jar or
copying the contrib jar into ${HADOOP_HOME}/lib on every node of the
cluster.  While this is a relatively minor inconvenience, what do folks
think of just adding the contrib classes/jars to the CLASSPATH built by
bin/hadoop?
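
For concreteness, the workaround today looks roughly like the sketch
below (the hbase jar path, job jar layout, and file names are
illustrative, not the exact artifact names):

  # (a) bundle the contrib jar under lib/ inside the job jar, which, as I
  #     understand it, ends up on the task classpath when the job jar is
  #     unpacked on the tasktrackers
  mkdir -p jobdir/lib
  cp build/contrib/hbase/hadoop-*-hbase.jar jobdir/lib/
  cp -r build/classes/* jobdir/
  jar cf myjob.jar -C jobdir .

  # (b) or push the contrib jar into lib/ on every node
  for h in `cat conf/slaves`; do
    scp build/contrib/hbase/hadoop-*-hbase.jar $h:$HADOOP_HOME/lib/
  done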

Thanks,
St.Ack

P.S.  Below is a suggested (untested) patch:

Index: bin/hadoop
===================================================================
--- bin/hadoop  (revision 556809)
+++ bin/hadoop  (working copy)
@@ -95,7 +95,13 @@
 if [ -d "$HADOOP_HOME/build/test/classes" ]; then
   CLASSPATH=${CLASSPATH}:$HADOOP_HOME/build/test/classes
 fi
+# add contrib classes to CLASSPATH when running from a build tree
+for d in $HADOOP_HOME/build/contrib/*/classes; do
+  if [ -d "$d" ]; then
+    CLASSPATH=${CLASSPATH}:$d
+  fi
+done
 
 # so that filenames w/ spaces are handled correctly in loops below
 IFS=
 
@@ -106,6 +112,9 @@
 for f in $HADOOP_HOME/hadoop-*-core.jar; do
   CLASSPATH=${CLASSPATH}:$f;
 done
+for f in $HADOOP_HOME/contrib/*.jar; do
+  CLASSPATH=${CLASSPATH}:$f;
+done
 
 # add libs to CLASSPATH
 for f in $HADOOP_HOME/lib/*.jar; do
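
If something along these lines went in, the hope is that a job using
contrib classes could be submitted as-is, assuming the patched
bin/hadoop and the built contrib jars are present on each node, e.g.
(job jar and class name made up):

  bin/hadoop jar myjob.jar org.example.MyHBaseJob input output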


