hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "PoweredBy" by spookysam
Date Fri, 02 Sep 2011 07:01:24 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "PoweredBy" page has been changed by spookysam:
http://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=332&rev2=333

   * 50% of our recommender system is pure Pig because of its ease of use.
   * Some of our more deeply integrated tasks use the streaming API and Ruby, as well
as the excellent Wukong library.
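The entry above mentions the Hadoop Streaming API with Ruby and the Wukong library. As a rough illustration of the contract that API imposes, here is a minimal word-count sketch in Python (the function names and the two-phase command-line convention are illustrative assumptions, not taken from the page): the framework pipes input records to the mapper's stdin and the mapper's sorted output to the reducer's stdin, one tab-separated record per line.

```python
# Sketch of the Hadoop Streaming stdin/stdout contract (word count).
# Illustrative only: function names and the `map`/`reduce` CLI argument
# are assumptions, not part of the Hadoop distribution itself.
import sys
from itertools import groupby

def map_words(lines):
    """Emit (word, 1) pairs as tab-separated lines, like a streaming mapper."""
    for line in lines:
        for word in line.split():
            yield f"{word}\t1"

def reduce_counts(lines):
    """Sum counts per word from sorted mapper output, like a streaming reducer."""
    pairs = (line.rstrip("\n").split("\t") for line in lines)
    for word, group in groupby(pairs, key=lambda kv: kv[0]):
        total = sum(int(count) for _, count in group)
        yield f"{word}\t{total}"

if __name__ == "__main__":
    # Run as either phase, mirroring how the streaming jar wires
    # -mapper and -reducer commands to separate processes.
    phase = sys.argv[1] if len(sys.argv) > 1 else "map"
    step = map_words if phase == "map" else reduce_counts
    for record in step(sys.stdin):
        print(record)
```

In a real job, Hadoop sorts the mapper output by key between the two phases; the test below mimics that with `sorted()`.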
  
-  * [[http://www.ablegrape.com/|Able Grape]] - Vertical search engine for trustworthy wine
information
+  * [[http://is.gd/mfZPcY|Able Grape]] - Vertical search engine for trustworthy wine information
   * We have one of the world's smaller Hadoop clusters (2 nodes @ 8 CPUs/node)
    * Hadoop and Nutch used to analyze and index textual information
  
@@ -317, +317 @@

    * total of 48TB of HDFS storage
    * HBase & Hadoop version 0.20
  
-  * [[http://www.legolas-media.com|Legolas Media]]
-  * [[http://www.vittleende.se|Tandblekning CO]]
+  * [[http://is.gd/IxkJ8x|Funny Jokes]]
+  * [[http://is.gd/i7jP4M|funny facebook statuses]]
  
   * [[http://www.linkedin.com|LinkedIn]]
   * We have multiple grids, divided by purpose, composed of the following types of
hardware:
     * 120 Nehalem-based nodes, with 2x4 cores, 24GB RAM, 8x1TB storage using ext4 in a JBOD
configuration on CentOS 5.5.
     * 520 Westmere-based nodes, with 2x4 cores, 24GB RAM, 6x2TB storage using ext4 in a JBOD
configuration on CentOS 5.5.
-   * We use modified versions of Apache's Hadoop and Pig distributions for discovering People
You May Know and [[http://www.linkedin.com/careerexplorer/dashboard|other]] [[http://inmaps.linkedinlabs.com/|fun]]
[[http://www.linkedin.com/skills/|facts]].
+   * We use modified versions of Apache's Hadoop and Pig distributions for discovering People
You May Know and [[http://www.linkedin.com/careerexplorer/dashboard|other]] [[http://is.gd/IxkJ8x|fun]]
[[http://www.linkedin.com/skills/|facts]].
  
   * [[http://www.lookery.com|Lookery]]
   * We use Hadoop to process clickstream and demographic data in order to create web
analytics reports.
@@ -368, +368 @@

    * 18 node cluster (Quad-Core AMD Opteron 2347, 1TB/node storage)
    * Powers data for search and aggregation
  
-  * [[http://metrixcloud.com/|MetrixCloud]] - provides commercial support, installation,
and hosting of Hadoop Clusters. [[http://metrixcloud.com/contact.php|Contact Us.]]
+  * [[http://tinyurl.com/453ulmu|MetrixCloud]] - provides commercial support, installation,
and hosting of Hadoop Clusters. [[http://metrixcloud.com/contact.php|Contact Us.]]
  
  = N =
   * [[http://www.navteqmedia.com|NAVTEQ Media Solutions]]
