hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "PoweredBy" by ShadiSaba
Date Thu, 12 Nov 2009 01:03:59 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The "PoweredBy" page has been changed by ShadiSaba.
http://wiki.apache.org/hadoop/PoweredBy?action=diff&rev1=160&rev2=161

--------------------------------------------------

    * We have a 15-node Hadoop cluster where each machine has 8 cores, 8 GB RAM, and 3-4 TB of storage.
    * We use Hadoop for all of our analytics, and we use Pig to allow PMs and non-engineers the freedom to query the data in an ad-hoc manner.
  
+  * [[http://www.thestocksprofit.com/|The Stocks Profit - Technical Analysis]]
+   * Generating stock analysis on 23 nodes (dual 2.4GHz Xeon Processor, 2 GB RAM, 72GB Hard Drive)
+ 
   * [[http://www.weblab.infosci.cornell.edu/|Cornell University Web Lab]]
    * Generating web graphs on 100 nodes (dual 2.4GHz Xeon Processor, 2 GB RAM, 72GB Hard Drive)
  
   * [[http://www.deepdyve.com|Deepdyve]]
    * Elastic cluster with 5-80 nodes
@@ -88, +94 @@

    * We use Hadoop to filter and index our listings, removing exact duplicates and grouping similar ones (a sketch of the exact-duplicate step follows this entry).
    * We plan to use Pig very shortly to produce statistics.
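
The entry above only names the technique, so here is a minimal, purely illustrative sketch (not this site's actual job, and written against the current MapReduce API) of exact-duplicate removal in Hadoop: key each record on its full contents so the shuffle groups identical records, then have the reducer emit each distinct record once.

{{{
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical example class; any resemblance to the site's real job is assumed.
public class DedupListings {

  // Map: emit the whole listing line as the key; identical lines collapse in the shuffle.
  public static class DedupMapper extends Mapper<Object, Text, Text, NullWritable> {
    @Override
    protected void map(Object key, Text line, Context ctx)
        throws IOException, InterruptedException {
      ctx.write(line, NullWritable.get());
    }
  }

  // Reduce: each distinct line arrives exactly once as a key; write it out a single time.
  public static class DedupReducer extends Reducer<Text, NullWritable, Text, NullWritable> {
    @Override
    protected void reduce(Text line, Iterable<NullWritable> ignored, Context ctx)
        throws IOException, InterruptedException {
      ctx.write(line, NullWritable.get());
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = Job.getInstance(new Configuration(), "dedup-listings");
    job.setJarByClass(DedupListings.class);
    job.setMapperClass(DedupMapper.class);
    job.setCombinerClass(DedupReducer.class); // duplicates can also be dropped map-side
    job.setReducerClass(DedupReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(NullWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
}}}

Grouping merely "similar" listings would need a fuzzier key (e.g. a normalized title or a locality-sensitive hash) in place of the raw line, but the job shape stays the same.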
  
   * [[http://blog.espol.edu.ec/hadoop/|ESPOL University (Escuela Superior Politécnica del Litoral) in Guayaquil, Ecuador]]
    * 4-node proof-of-concept cluster.
    * We use Hadoop in a Data-Intensive Computing capstone course. The course projects cover topics like information retrieval, machine learning, social network analysis, business intelligence, and network security.
@@ -101, +108 @@

    * Facial similarity and recognition across large datasets.
    * Image-content-based advertising and auto-tagging for social media.
    * Image-based video copyright protection.
  
   * [[http://www.facebook.com/|Facebook]]
    * We use Hadoop to store copies of internal log and dimension data sources and use it as a source for reporting/analytics and machine learning.
