hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "PoweredBy" by spookysam
Date Fri, 13 May 2011 18:46:13 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "PoweredBy" page has been changed by spookysam.
http://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=282&rev2=283

--------------------------------------------------

  
    * Our production cluster has been running since Oct 2008.
  
-  * [[http://www.adyard.de|adyard]]
+  * [[http://dentaldentistsolutions.blogspot.com/2009/10/process-and-pictures-dental-implants.html|Dental Implants]]
   * We use Flume, Hadoop and Pig for log storage and report generation as well as ad targeting.
    * We currently have 12 nodes running HDFS and Pig and plan to add more from time to time.
   * 50% of our recommender system is pure Pig because of its ease of use.
@@ -309, +309 @@

   * We have multiple grids divided up based upon purpose. They are composed of the following types of hardware:
    * 120 Nehalem-based nodes, with 2x4 cores, 24GB RAM, 8x1TB storage using ext4 in a JBOD configuration on CentOS 5.5.
    * 520 Westmere-based nodes, with 2x4 cores, 24GB RAM, 6x2TB storage using ext4 in a JBOD configuration on CentOS 5.5.
-   * We use modified versions of Apache's Hadoop and Pig distributions for discovering People You May Know and [[http://www.linkedin.com/careerexplorer/dashboard|other]] [[http://inmaps.linkedinlabs.com/|fun]] [[http://www.linkedin.com/skills/|facts]].
+   * We use modified versions of Apache's Hadoop and Pig distributions for discovering People You May Know and [[http://twitter.com/007simple|other]] [[http://itshumour.blogspot.com/2010/06/twenty-hilarious-funny-quotes.html|fun]] [[http://identi.ca/simple007|facts]].
  
   * [[http://www.lookery.com|Lookery]]
    * We use Hadoop to process clickstream and demographic data in order to create web analytic reports.
@@ -545, +545 @@

    * We also use Hadoop for filtering and indexing listings, processing log analysis, and for recommendation data.
  
  = W =
-  * [[http://www.web-alliance.fr|Web Alliance]]
+  * [[http://wiki.citizen.apps.gov/pillbox/bin/view/Main/SussaneNg|Web Alliance]]
    * We use Hadoop for our internal search engine optimization (SEO) tools. It allows us to store, index, and search data much faster.
    * We also use it for log analysis and trend prediction.
   * [[http://www.worldlingo.com/|WorldLingo]]
