hadoop-mapreduce-user mailing list archives

From Thoihen Maibam <thoihen...@gmail.com>
Subject Hadoop noob question
Date Sat, 11 May 2013 10:49:14 GMT
Hi All,

Can anyone help me understand how companies like Facebook, Yahoo, etc. upload
bulk files, say to the tune of 100 petabytes, to a Hadoop HDFS cluster, and
how, after processing, they download those files from HDFS back to the local
file system?

I don't think they would be using the command line hadoop fs -put to upload
the files, as it would take too long. Or do they divide them, say into 10
parts of 10 petabytes each, compress them, and then use the command line
hadoop fs -put?
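
For reference, this is the command I mean (the paths below are only
placeholders I made up, not what anyone actually uses):

    hadoop fs -put /local/data/bigfile.dat /user/hadoop/input/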

Or do they use some other tool to upload these huge files?

Please help me.

