hadoop-common-commits mailing list archives

From Apache Wiki <wikidi...@apache.org>
Subject [Hadoop Wiki] Update of "PoweredBy" by EarlCahill
Date Fri, 24 Apr 2009 06:14:22 GMT
Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop Wiki" for change notification.

The following page has been changed by EarlCahill:
http://wiki.apache.org/hadoop/PoweredBy

------------------------------------------------------------------------------
  
   * [http://www.hadoop.tw/ Hadoop Taiwan User Group]
  
+  * [http://holaservers.com/ HolaServers.com]
+   * Hosting company
+   * Use Pig to provide traffic stats to users in near real time
+ 
+ 
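Aggregating per-site traffic stats with Pig typically reduces to a grouped aggregation over access logs, stored somewhere the user-facing site can read. The Pig Latin sketch below is purely illustrative; the paths, field names, and schema are assumptions rather than details from the HolaServers entry.

  -- Hypothetical input: one tab-separated access-log record per request, per hosted site.
  logs = LOAD '/logs/access/current' USING PigStorage('\t')
         AS (site:chararray, ts:long, bytes:long, status:int);

  -- Aggregate request count and bytes served per site.
  by_site = GROUP logs BY site;
  stats = FOREACH by_site GENERATE
          group AS site,
          COUNT(logs) AS hits,
          SUM(logs.bytes) AS bytes_served;

  -- Publish the per-site stats where the customer-facing dashboard can pick them up.
  STORE stats INTO '/stats/traffic/latest' USING PigStorage('\t');

Re-running such a script every few minutes, for example from cron or a workflow scheduler, is one common way to approximate "near real time" on Hadoop's batch infrastructure.
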
+  * [http://www.hostinghabitat.com/ Hosting Habitat]
+   * We use a customised version of Hadoop and Nutch in a currently experimental 6-node, dual-core cluster environment.
+   * We crawl our clients' websites and, from the information we gather, fingerprint old and out-of-date software packages in that shared hosting environment. After matching a signature against a database, we can inform our clients that they are running outdated software. With that information we know which sites require patching, offered as a free courtesy service to protect the majority of users. Without Nutch and Hadoop this would be a far harder task to accomplish.
+ 
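The matching step in the Hosting Habitat entry above, comparing crawled software fingerprints against a database of known-outdated versions, amounts to a join keyed on package and version. The entry names Hadoop and Nutch rather than Pig, so the Pig Latin sketch below only illustrates that one step; all paths, schemas, and field names are assumptions.

  -- Hypothetical output of the crawl/fingerprint stage: one record per package detected on a site.
  detected = LOAD '/crawl/fingerprints' USING PigStorage('\t')
             AS (site:chararray, package:chararray, version:chararray);

  -- Hypothetical signature database of known old or unpatched package versions.
  signatures = LOAD '/signatures/outdated' USING PigStorage('\t')
               AS (package:chararray, version:chararray, advisory:chararray);

  -- Sites running a package/version that matches an outdated signature.
  matches = JOIN detected BY (package, version), signatures BY (package, version);
  report = FOREACH matches GENERATE
           detected::site, detected::package, detected::version, signatures::advisory;

  -- The report drives the courtesy notifications to affected clients.
  STORE report INTO '/reports/sites_to_patch' USING PigStorage('\t');
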
   * [http://www.ibm.com IBM]
    * [http://www-03.ibm.com/press/us/en/pressrelease/22613.wss Blue Cloud Computing Clusters]
    * [http://www-03.ibm.com/press/us/en/pressrelease/22414.wss University Initiative to Address Internet-Scale Computing Challenges]
- 
-  * [http://www.hostinghabitat.com/ Hosting Habitat]
-   * We use a customised version of Hadoop and Nutch in a currently experimental 6-node, dual-core cluster environment.
-   * We crawl our clients' websites and, from the information we gather, fingerprint old and out-of-date software packages in that shared hosting environment. After matching a signature against a database, we can inform our clients that they are running outdated software. With that information we know which sites require patching, offered as a free courtesy service to protect the majority of users. Without Nutch and Hadoop this would be a far harder task to accomplish.
  
   * [http://www.iccs.informatics.ed.ac.uk/ ICCS]
    * We are using Hadoop and Nutch to crawl blog posts and process them later.
