hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "HowToContribute" by SteveLoughran
Date Thu, 12 Jan 2012 15:01:44 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "HowToContribute" page has been changed by SteveLoughran:
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=75&rev2=76

Comment:
fix some link entries

  First of all, you need the Hadoop source code. The official location for Hadoop is the Apache SVN repository; Git is also supported, and useful if you want to make lots of local changes and keep those changes under some form of private or public revision control.
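Since Git access is mentioned above, a read-only clone can be sketched as a shell session. The mirror URL below is an assumption (the git.apache.org mirror naming of this period); check the project's version-control page for the current address:

```shell
# Read-only clone of an Apache Git mirror of hadoop-common.
# NOTE: the mirror URL is an assumption -- verify it before use.
git clone git://git.apache.org/hadoop-common.git hadoop-trunk
cd hadoop-trunk
git branch -a   # list branches; most development happens on trunk
```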
  
  ==== SVN Access ====
- Get the source code on your local drive using [[http://hadoop.apache.org/core/version_control.html|SVN]]. Most development is done on the "trunk":
+ Get the source code on your local drive using [[http://hadoop.apache.org/core/version_control.html|SVN]].  Most development is done on the "trunk":
  
  {{{
  svn checkout http://svn.apache.org/repos/asf/hadoop/common/trunk/ hadoop-trunk
@@ -71, +71 @@

  
  === Building ProtocolBuffers (for 0.23+) ===
  
- Hadoop 0.23+ must have Google's ProtocolBuffers for compilation to work. These are native binaries which need to be downloaded, compiled and then installed locally.  See [[http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-mapreduce-project/hadoop-yarn/README|YARN Readme]
+ Hadoop 0.23+ must have Google's ProtocolBuffers for compilation to work. These are native binaries which need to be downloaded, compiled and then installed locally.  See [[http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-mapreduce-project/hadoop-yarn/README|YARN Readme]]
  
  This is a good opportunity to get the GNU C/C++ toolchain installed, which is useful for working on the native code used in the HDFS project.
  
@@ -85, +85 @@

  
   * you need a copy of GCC 4.1+ including the {{{g++}}} C++ compiler, {{{make}}} and the rest of the GNU C++ development chain.
   * Linux: you need a copy of autoconf installed, which your local package manager will do along with automake.
-  * Download the version of protocol buffers that the YARN readme recommends from [[http://code.google.com/p/protobuf/|the protocol buffers project]].
+  * Download the version of protocol buffers that the YARN readme recommends from [[http://code.google.com/p/protobuf/|the protocol buffers project]].
   * unzip it/untar it
   * {{{cd}}} into the directory that has been created.
   * run {{{./configure}}}.
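The bullet steps above can be collected into one shell session. The release number and download URL here are assumptions; substitute whichever version the YARN README actually recommends:

```shell
# Sketch of the ProtocolBuffers build steps above.
# PROTOBUF_VERSION and the download URL are assumptions -- pick the
# release the YARN README recommends.
PROTOBUF_VERSION=2.4.1
TARBALL="protobuf-${PROTOBUF_VERSION}.tar.gz"
wget "http://protobuf.googlecode.com/files/${TARBALL}"  # download the release
tar xzf "${TARBALL}"                                    # untar it
cd "protobuf-${PROTOBUF_VERSION}"                       # enter the created directory
./configure                                             # run ./configure
make                                                    # build the native binaries
sudo make install                                       # install protoc and libprotobuf
```

Afterwards {{{protoc --version}}} should report the installed release.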
@@ -108, +108 @@

  protoc: error while loading shared libraries: libprotobuf.so.7: cannot open shared object file: No such file or directory
  }}}
  
- This is a [[http://code.google.com/p/protobuf/issues/detail?id=213|known issue]] for Linux, and is caused by a stale cache of libraries. Run {{{ldconfig}}} and try again.
+ This is a [[http://code.google.com/p/protobuf/issues/detail?id=213|known issue]] for Linux, and is caused by a stale cache of libraries. Run {{{ldconfig}}} and try again.
  
  === Making Changes ===
  Before you start, send a message to the [[http://hadoop.apache.org/core/mailing_lists.html|Hadoop developer mailing list]], or file a bug report in [[Jira]].  Describe your proposed changes and check that they fit in with what others are doing and have planned for the project.  Be patient; it may take folks a while to understand your requirements.
