incubator-accumulo-commits mailing list archives

From build...@apache.org
Subject svn commit: r803663 - /websites/staging/accumulo/trunk/content/accumulo/example/wikisearch.html
Date Tue, 31 Jan 2012 17:50:11 GMT
Author: buildbot
Date: Tue Jan 31 17:50:11 2012
New Revision: 803663

Log:
Staging update by buildbot for accumulo

Modified:
    websites/staging/accumulo/trunk/content/accumulo/example/wikisearch.html

Modified: websites/staging/accumulo/trunk/content/accumulo/example/wikisearch.html
==============================================================================
--- websites/staging/accumulo/trunk/content/accumulo/example/wikisearch.html (original)
+++ websites/staging/accumulo/trunk/content/accumulo/example/wikisearch.html Tue Jan 31 17:50:11 2012
@@ -102,6 +102,10 @@
 <li>Custom load balancing, which ensures that a table is evenly distributed on all
region servers</li>
 </ol>
 <p>In the example, Accumulo tracks the cardinality of all terms as elements are ingested.
 If the cardinality is small enough, it will track the set of documents by term directly.
 For example:</p>
+<style type="text/css">
+table td,th {padding-right: 10px;}
+</style>
+
 <table>
 <tr>
 <th>Row (word)</th>
@@ -333,7 +337,7 @@ For comparison, these are the cold start
 <td>8.13
 </table>
 
-<p>Random Query Load</p>
+<h3 id="random_query_load">Random Query Load</h3>
 <p>Random queries were generated using common English words.  A uniform random sample of 3 to 5 words, taken from the 10000 most common words in Project Gutenberg's online text collection, was joined with “and”.  Words containing anything other than letters (such as contractions) were not used.  A client was started simultaneously on each of the 10 servers, and each ran 100 random queries (1000 queries total).</p>
 <table>
 <tr>
@@ -353,7 +357,7 @@ For comparison, these are the cold start
 <td>275655
 </table>
 
-<p>Query Load During Ingest</p>
+<h3 id="query_load_during_ingest">Query Load During Ingest</h3>
 <p>The English Wikipedia data was re-ingested on top of the existing, compacted data. The following query samples were taken in 5-minute intervals while ingesting 132 articles/second:</p>
 <table>
 <tr>

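The query-generation scheme described in the diff above (a uniform sample of 3 to 5 purely alphabetic words drawn from a common-word list, joined with “and”) can be sketched as follows. This is an illustrative sketch only; the function name and the small word list are assumptions, not part of the Accumulo wikisearch example itself:

```python
import random

def random_query(common_words, rng):
    # Drop words containing anything other than letters (e.g. contractions).
    candidates = [w for w in common_words if w.isalpha()]
    # Uniformly pick a query length of 3 to 5 words, then sample without replacement.
    k = rng.randint(3, 5)
    return " and ".join(rng.sample(candidates, k))

# A stand-in word list; the benchmark drew from the 10000 most common
# words in Project Gutenberg's online text collection.
words = ["the", "of", "and", "don't", "time", "people", "water", "sound"]
print(random_query(words, random.Random(0)))
```

Each of the 10 clients in the benchmark would then issue 100 such conjunctive queries, for 1000 queries in total.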

