hbase-commits mailing list archives

From st...@apache.org
Subject svn commit: r648746 - in /hadoop/hbase: branches/0.1/src/java/overview.html trunk/src/java/overview.html
Date Wed, 16 Apr 2008 16:45:04 GMT
Author: stack
Date: Wed Apr 16 09:45:00 2008
New Revision: 648746

URL: http://svn.apache.org/viewvc?rev=648746&view=rev
Log:
Add to 'getting started' note about hbase being file handles hog.

Modified:
    hadoop/hbase/branches/0.1/src/java/overview.html
    hadoop/hbase/trunk/src/java/overview.html

Modified: hadoop/hbase/branches/0.1/src/java/overview.html
URL: http://svn.apache.org/viewvc/hadoop/hbase/branches/0.1/src/java/overview.html?rev=648746&r1=648745&r2=648746&view=diff
==============================================================================
--- hadoop/hbase/branches/0.1/src/java/overview.html (original)
+++ hadoop/hbase/branches/0.1/src/java/overview.html Wed Apr 16 09:45:00 2008
@@ -32,6 +32,11 @@
     ssh must be installed and sshd must be running to use Hadoop's
     scripts to manage remote Hadoop daemons.
   </li>
+  <li>HBase currently is a file handle hog.  The usual default of
+  1024 on *nix systems is insufficient if you are loading any significant
+  amount of data into regionservers.  See the
+  <a href="http://wiki.apache.org/hadoop/Hbase/FAQ#6">FAQ: Why do I see "java.io.IOException...(Too many open files)" in my logs?</a>
+  for how to up the limit.</li>
 </ul>
 
 <h2><a name="getting_started" >Getting Started</a></h2>

Modified: hadoop/hbase/trunk/src/java/overview.html
URL: http://svn.apache.org/viewvc/hadoop/hbase/trunk/src/java/overview.html?rev=648746&r1=648745&r2=648746&view=diff
==============================================================================
--- hadoop/hbase/trunk/src/java/overview.html (original)
+++ hadoop/hbase/trunk/src/java/overview.html Wed Apr 16 09:45:00 2008
@@ -32,6 +32,11 @@
     ssh must be installed and sshd must be running to use Hadoop's
     scripts to manage remote Hadoop daemons.
   </li>
+  <li>HBase currently is a file handle hog.  The usual default of
+  1024 on *nix systems is insufficient if you are loading any significant
+  amount of data into regionservers.  See the
+  <a href="http://wiki.apache.org/hadoop/Hbase/FAQ#6">FAQ: Why do I see "java.io.IOException...(Too many open files)" in my logs?</a>
+  for how to up the limit.</li>
 </ul>
 
 <h2><a name="getting_started" >Getting Started</a></h2>

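The note added in both files tells users to raise the per-process open-file limit above the common *nix default of 1024. As a rough sketch of what that looks like in practice (the commands below are illustrative and not part of the commit; the FAQ linked in the diff is the authoritative reference):

```shell
# Show the current soft and hard per-process open-file limits.
# On many Linux systems the soft limit defaults to 1024.
ulimit -S -n
ulimit -H -n

# Raise the soft limit up to the hard limit for this shell session.
# Raising the hard limit itself typically requires root privileges
# or an entry in /etc/security/limits.conf.
ulimit -S -n "$(ulimit -H -n)"
ulimit -S -n
```

Any long-running regionserver process inherits the limit of the shell that launched it, so the raised limit must be in effect in the environment that starts the daemon.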

