flume-user mailing list archives

From Brock Noland <br...@cloudera.com>
Subject Re: How I can use flume to automatically upload files into HDFS
Date Sat, 17 Nov 2012 19:51:46 GMT
Hi,

This question is for the user@ list, not the dev@ list.  Sounds like
you want the Spooling Directory Source, which will be available in the
1.3 release. Another user has shared his configuration for that source
here:

http://s.apache.org/8Ea
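For reference, a minimal sketch of what such a configuration might look like, using the folder and name node address from the original question. The agent, source, channel, and sink names, the HDFS target directory, and the channel sizing are illustrative assumptions, not values from the linked configuration:

```
# Illustrative agent named "agent1" -- rename to match your deployment.
agent1.sources = spool-src
agent1.channels = mem-ch
agent1.sinks = hdfs-sink

# Spooling Directory Source: picks up completed files dropped into a local folder.
agent1.sources.spool-src.type = spooldir
agent1.sources.spool-src.spoolDir = /usr/datastorage/
agent1.sources.spool-src.channels = mem-ch

# In-memory channel buffering events between source and sink.
agent1.channels.mem-ch.type = memory
agent1.channels.mem-ch.capacity = 10000

# HDFS sink writing to the name node from the question.
# The /user/flume/data path is an assumed example destination.
agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.channel = mem-ch
agent1.sinks.hdfs-sink.hdfs.path = hdfs://hadoop1.example.com:8020/user/flume/data
agent1.sinks.hdfs-sink.hdfs.fileType = DataStream
```

Note that the spooling directory source expects files to be complete and immutable once they land in the spool directory; files still being written should be staged elsewhere and moved in when finished.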

Brock

On Sat, Nov 17, 2012 at 12:33 PM, kashif khan <drkashif8310@gmail.com> wrote:
> Hi,
>
> I am generating files continuously in a local folder on my base machine. How
> can I now use Flume to stream the generated files from that local folder to
> HDFS?
> I don't know exactly how to configure the sources, sinks, and HDFS.
>
> 1) location of folder where files are generating: /usr/datastorage/
> 2) name node address: hdfs://hadoop1.example.com:8020
>
> Please help me.
>
> Many thanks
>
> Best regards,
> KK



-- 
Apache MRUnit - Unit testing MapReduce - http://incubator.apache.org/mrunit/
