hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "HowToSetupYourDevelopmentEnvironment" by TomWhite
Date Thu, 08 Sep 2011 22:46:29 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "HowToSetupYourDevelopmentEnvironment" page has been changed by TomWhite:
http://wiki.apache.org/hadoop/HowToSetupYourDevelopmentEnvironment?action=diff&rev1=26&rev2=27

   * attempt to run ''ant test''
    *  If you get any strange errors (other than JUnit test failures and errors), then consult
the ''Build Errors'' section below.
   * run ''ant'' to compile (this may not be necessary if you've already run ''ant test'')
-  * follow GettingStartedWithHadoop to learn how to run Hadoop (use this guide if you use
Ubuntu: http://wiki.apache.org/hadoop/Running_Hadoop_On_Ubuntu_Linux_%28Single-Node_Cluster%29)
+  * follow GettingStartedWithHadoop or the instructions below to learn how to run Hadoop
(use this guide if you use Ubuntu: http://wiki.apache.org/hadoop/Running_Hadoop_On_Ubuntu_Linux_%28Single-Node_Cluster%29)
    *  Use the hadoop-core-trunk folder just as you would a downloaded version of Hadoop (symlink
hadoop-core-trunk to hadoop)
   *  If you run into any problems, refer to the ''Runtime Errors'' section below, along with the
troubleshooting document here: TroubleShooting
+ 
+ = Run HDFS in pseudo-distributed mode from the dev tree =
+ 
+ Build the packaging from the top level. This will build the distribution in an exploded
format that we can run directly (i.e. no need to untar):
+ {{{
+ mvn clean package -Pdist -DskipTests -P-cbuild
+ }}}
+ 
+ {{{
+ export HADOOP_COMMON_HOME=$(pwd)/$(ls -d hadoop-common-project/hadoop-common/target/hadoop-common-*-SNAPSHOT)
+ export HADOOP_HDFS_HOME=$(pwd)/$(ls -d hadoop-hdfs-project/hadoop-hdfs/target/hadoop-hdfs-*-SNAPSHOT)
+ export PATH=$HADOOP_COMMON_HOME/bin:$HADOOP_HDFS_HOME/bin:$PATH
+ }}}
+ 
+ Set {{{fs.default.name}}} to point to the local HDFS instance:
+ 
+ {{{
+ cat > $HADOOP_COMMON_HOME/etc/hadoop/core-site.xml  << EOF
+ <?xml version="1.0"?><!-- core-site.xml -->
+ <configuration>
+   <property>
+     <name>fs.default.name</name>
+     <value>hdfs://localhost/</value>
+   </property>
+ </configuration>
+ EOF
+ }}}
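+ 
+ Since this setup runs only a single datanode, you may also want to lower the replication
factor to 1 so that writes are not flagged as under-replicated (a suggested addition, placed
in the same conf directory as {{{core-site.xml}}} above):
+ 
+ {{{
+ cat > $HADOOP_COMMON_HOME/etc/hadoop/hdfs-site.xml << EOF
+ <?xml version="1.0"?><!-- hdfs-site.xml -->
+ <configuration>
+   <property>
+     <name>dfs.replication</name>
+     <value>1</value>
+   </property>
+ </configuration>
+ EOF
+ }}}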
+ 
+ You can now run HDFS daemons. E.g.:
+ 
+ {{{
+ # Format the namenode
+ hdfs namenode -format
+ # Start the namenode
+ hdfs namenode
+ # Start a datanode
+ hdfs datanode
+ }}}
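+ 
+ With the namenode and datanode each running in its own terminal, a quick sanity check is
to create a directory and list the filesystem root (the {{{/user}}} path here is just an
example; any path will do):
+ 
+ {{{
+ hdfs dfs -mkdir /user
+ hdfs dfs -ls /
+ }}}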
+ 
+ Note that the {{{start-dfs.sh}}} script will not work with this setup, since it assumes
that HADOOP_COMMON_HOME and HADOOP_HDFS_HOME are the same directory.
  
  = Build Errors =
  
