hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "HowToContribute" by SteveLoughran
Date Wed, 28 Sep 2011 16:28:52 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "HowToContribute" page has been changed by SteveLoughran:
http://wiki.apache.org/hadoop/HowToContribute?action=diff&rev1=67&rev2=68

  === Other items ===
  
   * A Java Development Kit must be installed and on the executable path. The Hadoop developers recommend the Sun JDK.
-  * To install and use ProtocolBuffers: A copy of GCC 4.1+ including the {{{g++}}} C++ compiler, {{{make}}} and the rest of the GNU C++ development chain. If the ProtocolBuffers build fails with "C++ preprocessor "/lib/cpp" fails sanity check", that means you don't have g++ installed.

-  * For MapReduce in 0.23+: [[http://code.google.com/p/protobuf/|protocol buffers]] (see the [[http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-mapreduce-project/hadoop-yarn/README|YARN Readme]]). The test for this is verifying that {{{protoc}}} is on the path.
   * The source code of projects that you depend on. Avro, Jetty, Log4J are some examples.
This isn't compulsory, but as the source is there, it helps you see what is going on.
   * The source code of the Java version that you are using. Again: handy.
   * The Java API javadocs. 
@@ -70, +68 @@

  
  ==== Git Access ====
  See GitAndHadoop
+ 
+ === Building ProtocolBuffers (for 0.23+) ===
+ 
+ Hadoop 0.23+ requires Google's ProtocolBuffers to compile. ProtocolBuffers ships as native code which needs to be downloaded, compiled and then installed locally. See the [[http://svn.apache.org/repos/asf/hadoop/common/trunk/hadoop-mapreduce-project/hadoop-yarn/README|YARN Readme]].
+ 
+ This is a good opportunity to get the GNU C/C++ toolchain installed, which is useful for
working on the native code used in the HDFS project.
+ 
+ To install and use ProtocolBuffers:
+ 
+ ==== Linux ====
+ 
+ Install your distribution's protobuf packages ''provided they are current enough''; see the README file for the required version. If they are too old, uninstall any version you have and follow the local build instructions below.
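
A quick way to check whether the distribution's package is current enough is to compare the installed {{{protoc}}} version against the one the README asks for. The sketch below is illustrative only: the required version number is a placeholder (check the YARN README for the real requirement), and it assumes GNU {{{sort -V}}} is available.

```shell
# version_ge A B: succeeds (exit 0) if version A >= version B.
# Relies on GNU "sort -V" (version sort), present in coreutils 7.0+.
version_ge() {
  [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n 1)" = "$2" ]
}

REQUIRED="2.4.0"   # placeholder; take the real number from the YARN README
if command -v protoc >/dev/null 2>&1; then
  INSTALLED="$(protoc --version | awk '{print $2}')"
  if version_ge "$INSTALLED" "$REQUIRED"; then
    echo "protoc $INSTALLED is current enough"
  else
    echo "protoc $INSTALLED is too old; build and install a newer one"
  fi
else
  echo "protoc is not on the path"
fi
```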
+ 
+ ==== Local build and installation ====
+ 
+  * you need a copy of GCC 4.1+ including the {{{g++}}} C++ compiler, {{{make}}} and the
rest of the GNU C++ development chain. 
+  * Download the version of protocol buffers that the YARN readme recommends from [[http://code.google.com/p/protobuf/|the
protocol buffers project]].
+  * untar/unzip it.
+  * {{{cd}}} into the directory that has been created.
+  * run {{{./configure}}}.
+  * If configure fails with "C++ preprocessor "/lib/cpp" fails sanity check" that means you
don't have g++ installed. Install it.
+  * run {{{make}}} to build the libraries.
+  * on a Unix system, after building the libraries, you must install them ''as root'': {{{su}}} to root, then run {{{make install}}}.
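
Put together, the build steps above look something like the following shell session. This is a sketch, not an exact transcript: the tarball name is a placeholder, so substitute the version the YARN README recommends, and adjust the install step ({{{su}}} vs {{{sudo}}}) to taste.

```shell
# Sketch of the local build-and-install sequence described above; assumes
# the GNU toolchain (g++, make) is installed. "protobuf-X.Y.Z.tar.gz" is a
# placeholder for whichever version the YARN README recommends.

# Derive the unpacked directory name from the tarball name.
tarball_dir() {
  basename "$1" .tar.gz
}

TARBALL="protobuf-X.Y.Z.tar.gz"      # placeholder; download the real tarball first
if [ -f "$TARBALL" ]; then
  tar xzf "$TARBALL"                 # unpack the source
  cd "$(tarball_dir "$TARBALL")"     # cd into the directory that was created
  ./configure                        # "/lib/cpp fails sanity check" here means g++ is missing
  make                               # build the libraries
  su -c 'make install'               # install as root (sudo make install also works)
fi
```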
+ 
+ ==== Testing your installation ====
+ 
+ Test the installation by verifying that {{{protoc}}} is on the path. You should expect something like
+ 
+ {{{
+ $ protoc
+ Missing input file.
+ }}}
+ 
+ You may see the error message 
+ {{{
+ $ protoc
+ protoc: error while loading shared libraries: libprotobuf.so.7: cannot open shared object
file: No such file or directory
+ }}}
+ 
+ This is a [[http://code.google.com/p/protobuf/issues/detail?id=213|known issue]] for Linux,
and is caused by a stale cache of libraries. Run {{{ldconfig}}} and try again.
+ 
  
  === Making Changes ===
  Before you start, send a message to the [[http://hadoop.apache.org/core/mailing_lists.html|Hadoop
developer mailing list]], or file a bug report in [[Jira]].  Describe your proposed changes
and check that they fit in with what others are doing and have planned for the project.  Be
patient, it may take folks a while to understand your requirements.
