hadoop-common-commits mailing list archives

From: Apache Wiki <wikidi...@apache.org>
Subject: [Hadoop Wiki] Update of "PoweredBy" by SomeOtherAccount
Date: Tue, 04 Oct 2011 07:21:22 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "PoweredBy" page has been changed by SomeOtherAccount:
http://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=346&rev2=347

Comment:
update linkedin's stats

   * [[http://www.legolas-media.com|Legolas Media]]
  
   * [[http://www.linkedin.com|LinkedIn]]
-   * We have multiple grids divided up based upon purpose. They are composed of the following types of hardware:
-    * 120 Nehalem-based nodes, with 2x4 cores, 24GB RAM, 8x1TB storage using ext4 in a JBOD configuration on CentOS 5.5.
-    * 580 Westmere-based nodes, with 2x4 cores, 24GB RAM, 6x2TB storage using ext4 in a JBOD configuration on CentOS 5.5.
+   * We have multiple grids divided up based upon purpose.
+    * Hardware:
+     * 120 Nehalem-based Sun x4275, with 2x4 cores, 24GB RAM, 8x1TB SATA
+     * 580 Westmere-based HP SL 170x, with 2x4 cores, 24GB RAM, 6x2TB SATA
+     * 1200 Westmere-based SuperMicro X8DTT-H, with 2x6 cores, 24GB RAM, 6x2TB SATA
+    * Software:
+     * CentOS 5.5 -> RHEL 6.1
+     * Sun JDK 1.6.0_14 -> Sun JDK 1.6.0_20 -> Sun JDK 1.6.0_26
+     * Apache Hadoop 0.20.2+patches -> Apache Hadoop 0.20.204+patches
+     * Pig 0.9 heavily customized
+     * Azkaban for scheduling
+     * Hive, Avro, Kafka, and other bits and pieces...
-   * We use modified versions of Apache's Hadoop and Pig distributions for discovering People You May Know and [[http://www.linkedin.com/careerexplorer/dashboard|other]] [[http://inmaps.linkedinlabs.com/|fun]] [[http://www.linkedin.com/skills/|facts]].
+   * We use these things for discovering People You May Know and [[http://www.linkedin.com/careerexplorer/dashboard|other]] [[http://inmaps.linkedinlabs.com/|fun]] [[http://www.linkedin.com/skills/|facts]].
  
   * [[http://www.lookery.com|Lookery]]
   * We use Hadoop to process clickstream and demographic data in order to create web analytic reports.
