hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Trivial Update of "FAQ" by SomeOtherAccount
Date Mon, 03 Jan 2011 21:31:51 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "FAQ" page has been changed by SomeOtherAccount.
http://wiki.apache.org/hadoop/FAQ?action=diff&rev1=88&rev2=89

--------------------------------------------------

  
  = Platform Specific =
  == Mac OS X ==
- === Building on Mac OS X 10.6 ==
+ === Building on Mac OS X 10.6 ===
  
  Be aware that Apache Hadoop 0.22 and earlier require Apache Forrest to build the documentation.
 As of Snow Leopard, Apple no longer ships the Java 1.5 runtime that Apache Forrest requires,
so Java 1.5 must be installed manually.  This can be accomplished either by copying
/System/Library/Frameworks/JavaVM.framework/Versions/1.5 and 1.5.0 from a 10.5 machine or by
using a utility like Pacifist to install from an official Apple package.
http://chxor.chxo.com/post/183013153/installing-java-1-5-on-snow-leopard provides
step-by-step directions.
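+ 
+ As a rough sketch of the copy-based approach (assuming the 10.5 system's disk is mounted
at /Volumes/Leopard, which is an illustrative path, and that sudo access is available):
+ 
+ {{{
+ # Copy the Java 1.5.0 framework version from a 10.5 (Leopard) system.
+ # /Volumes/Leopard is an assumption; adjust to wherever the 10.5
+ # filesystem is actually mounted.
+ sudo cp -R /Volumes/Leopard/System/Library/Frameworks/JavaVM.framework/Versions/1.5.0 \
+     /System/Library/Frameworks/JavaVM.framework/Versions/
+ # Recreate the 1.5 entry as a symlink pointing at 1.5.0.
+ sudo ln -s 1.5.0 /System/Library/Frameworks/JavaVM.framework/Versions/1.5
+ }}}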
+ 
+ 
+ == Solaris ==
+ === Why do files and directories show up as DrWho and/or user names are missing/weird? ===
+ Prior to 0.22, Hadoop uses the whoami and id commands to determine the user and groups
of the running process. On Solaris, whoami ships as part of the BSD compatibility package and
is normally not in the path, and the id command's output is System V-style whereas Hadoop
expects POSIX output.  Two changes to the environment are required to fix this:
+ 
+  1. Make sure /usr/ucb/whoami is installed and in the path, either by including /usr/ucb
at the tail end of the PATH environment or symlinking /usr/ucb/whoami directly.
+  1. In hadoop-env.sh, change the HADOOP_IDENT_STRING thusly:
+ 
+ {{{
+ export HADOOP_IDENT_STRING=`/usr/xpg4/bin/id -u -n`
+ }}}
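+ 
+ Putting both steps together, the relevant hadoop-env.sh additions might look like the
following sketch (the paths are the standard Solaris locations):
+ 
+ {{{
+ # Put the BSD-compatibility whoami at the tail end of the PATH.
+ export PATH=$PATH:/usr/ucb
+ # Use the POSIX (XPG4) id so Hadoop gets the output format it expects.
+ export HADOOP_IDENT_STRING=`/usr/xpg4/bin/id -u -n`
+ }}}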
+ === Reported disk capacities are wrong ===
+ Hadoop uses du and df to determine disk space used.  On pooled storage systems such as ZFS,
these commands report the total capacity of the entire pool rather than that of the individual
filesystem, which easily confuses Hadoop.  Users have reported that setting fixed quotas on
the HDFS and MapReduce directories eliminates much of this confusion.
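+ 
+ For example, on ZFS a fixed quota can be set on the dataset backing the HDFS data directory
(the dataset name and size here are hypothetical):
+ 
+ {{{
+ # Cap the dataset that holds the DataNode storage at 500 GB so that
+ # df reports that size rather than the capacity of the whole pool.
+ zfs set quota=500G tank/hdfs-data
+ }}}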
  
  == Windows ==
  === Building / Testing Hadoop on Windows ===
@@ -282, +296 @@

  }}}
  Other targets work similarly. I just wanted to document this because I spent some time trying
to figure out why the ant build would not run from a Cygwin command prompt window. If you
are building/testing on Windows and haven't figured it out yet, this should get you started.
  
- == Solaris ==
- === Why do files and directories show up as DrWho and/or user names are missing/weird? ===
- Prior to 0.22, Hadoop uses the 'whoami' and id commands to determine the user and groups
of the running process. whoami ships as part of the BSD compatibility package and is normally
not in the path.  The id command's output is System V-style whereas Hadoop expects POSIX.
 Two changes to the environment are required to fix this:
- 
-  1. Make sure /usr/ucb/whoami is installed and in the path, either by including /usr/ucb
at the tail end of the PATH environment or symlinking /usr/ucb/whoami directly.
-  1. In hadoop-env.sh, change the HADOOP_IDENT_STRING thusly:
- 
- {{{
- export HADOOP_IDENT_STRING=`/usr/xpg4/bin/id -u -n`
- }}}
- === Reported disk capacities are wrong ===
- Hadoop uses du and df to determine disk space used.  On pooled storage systems that report
total capacity of the entire pool (such as ZFS) rather than the filesystem, Hadoop gets easily
confused.  Users have reported that using fixed quota sizes for HDFS and MapReduce directories
helps eliminate a lot of this confusion.
- 
