hadoop-mapreduce-user mailing list archives

From Wellington Chevreuil <wellington.chevre...@gmail.com>
Subject Re: Hadoop file uploads
Date Tue, 04 Oct 2011 19:47:03 GMT
Yes, Sadak,

Within this API, you'll copy your files into HDFS just as you would
when writing to any OutputStream. The data will then be replicated
across your cluster's HDFS automatically, according to the configured
replication factor.
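For example, here is a minimal sketch of such an upload using the FileSystem and Path classes discussed below. The NameNode URI, local path, and HDFS path are hypothetical, and the Hadoop client jars must be on the classpath:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsUpload {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Point at your cluster's NameNode (hypothetical host/port).
        conf.set("fs.default.name", "hdfs://namenode:9000");
        FileSystem fs = FileSystem.get(conf);

        // copyFromLocalFile streams the local file into HDFS; the blocks
        // are then replicated per the cluster's dfs.replication setting.
        fs.copyFromLocalFile(new Path("/tmp/local-file.txt"),
                             new Path("/user/sadak/local-file.txt"));
        fs.close();
    }
}
```

Alternatively, fs.create(path) returns an FSDataOutputStream you can write to directly, which is the OutputStream-style usage mentioned above.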


2011/10/4 visioner sadak <visioner.sadak@gmail.com>:
> Hey thanks Wellington, just a thought: will my data be replicated as
> well? I thought that the mapper does the job of breaking data into
> pieces and distributing them, and the reducer does the joining and
> combining while fetching the data back; that's why I was confused
> about using MR. Can I use this API for uploading a large number of
> small files through my application, or should I use the SequenceFile
> class for that? I ask because I saw the small files problem in Hadoop
> mentioned in the link below:
> http://www.cloudera.com/blog/2009/02/the-small-files-problem/
> On Wed, Oct 5, 2011 at 12:54 AM, Wellington Chevreuil
> <wellington.chevreuil@gmail.com> wrote:
>> Hey Sadak,
>> you don't need to write an MR job for that. You can have your Java
>> program use the Hadoop Java API instead. You would need the FileSystem
>> (http://hadoop.apache.org/common/docs/current/api/org/apache/hadoop/fs/FileSystem.html)
>> and Path
>> (http://hadoop.apache.org/common/docs/current/api/index.html?org/apache/hadoop/fs/Path.html)
>> classes for that.
>> Cheers,
>> Wellington.
>> 2011/10/4 visioner sadak <visioner.sadak@gmail.com>:
>> > Hello guys,
>> >
>> >             I would like to know how to do file uploads into HDFS
>> > using Java. Does it have to be done with MapReduce? And if I have a
>> > large number of small files, should I use SequenceFile along with
>> > MapReduce? It would be great if you could provide some information...
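For the small-files question raised in the thread, one common pattern is to pack many small local files into a single SequenceFile (filename as key, contents as value) rather than creating one HDFS file per small file. A rough sketch, assuming the Hadoop 0.20-era API current at the time of this thread, with hypothetical paths:

```java
import java.io.File;
import java.io.FileInputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class SmallFilePacker {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        Path out = new Path("/user/sadak/smallfiles.seq"); // hypothetical

        SequenceFile.Writer writer = SequenceFile.createWriter(
                fs, conf, out, Text.class, BytesWritable.class);
        try {
            for (File f : new File("/tmp/smallfiles").listFiles()) {
                // Read the whole small file into memory.
                byte[] contents = new byte[(int) f.length()];
                FileInputStream in = new FileInputStream(f);
                try {
                    in.read(contents);
                } finally {
                    in.close();
                }
                // key = original filename, value = raw file contents.
                writer.append(new Text(f.getName()),
                              new BytesWritable(contents));
            }
        } finally {
            IOUtils.closeStream(writer);
        }
    }
}
```

The resulting single large file is splittable and avoids the per-file NameNode memory overhead described in the Cloudera post linked above.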
