hadoop-common-user mailing list archives

From Mohammad Tariq <donta...@gmail.com>
Subject Re: How to copy log files from remote windows machine to Hadoop cluster
Date Thu, 17 Jan 2013 10:51:05 GMT
Yes, it is possible. I haven't tried the windows+flume+hadoop combo
personally, but it should work. You may find this helpful; it explains
nicely how to run Flume on a Windows box. If I get time I'll try to
simulate your use case and let you know.
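
Just to give you a rough idea (only a sketch, I have not tested this on
Windows; agent names, hosts, ports and paths below are placeholders you
would have to adapt): one common layout is an agent on the Windows box
that tails the log with an exec source and forwards events over Avro to
a collector agent near the cluster, which then writes into HDFS.

# Agent on the Windows machine (example names/paths)
win-agent.sources  = tail-src
win-agent.channels = mem-ch
win-agent.sinks    = avro-sink

# Exec source tailing the log file; on Windows this needs a tail.exe
# (e.g. from Cygwin or GnuWin32) available on the PATH
win-agent.sources.tail-src.type = exec
win-agent.sources.tail-src.command = tail -F C:/logs/app.log
win-agent.sources.tail-src.channels = mem-ch

win-agent.channels.mem-ch.type = memory
win-agent.channels.mem-ch.capacity = 10000

# Forward events to a collector agent running on/near the cluster
win-agent.sinks.avro-sink.type = avro
win-agent.sinks.avro-sink.hostname = collector.example.com
win-agent.sinks.avro-sink.port = 4141
win-agent.sinks.avro-sink.channel = mem-ch

# Collector agent on the cluster side, writing into HDFS
collector.sources  = avro-src
collector.channels = file-ch
collector.sinks    = hdfs-sink

collector.sources.avro-src.type = avro
collector.sources.avro-src.bind = 0.0.0.0
collector.sources.avro-src.port = 4141
collector.sources.avro-src.channels = file-ch

collector.channels.file-ch.type = file

collector.sinks.hdfs-sink.type = hdfs
collector.sinks.hdfs-sink.hdfs.path = hdfs://namenode:8020/flume/logs
collector.sinks.hdfs-sink.hdfs.fileType = DataStream
collector.sinks.hdfs-sink.channel = file-ch

Each agent is then started with something like
"flume-ng agent -n win-agent -f win-agent.conf" (and likewise for the
collector); flume-ng is a shell script, so on the Windows side you would
probably run it under Cygwin or an equivalent.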

BTW, could you please share with us whatever you have tried??
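
Also, for the once-a-day push you mention, the spooling directory source
would probably be a better fit than exec/tail, since it is reliable
across agent restarts. Again just a sketch, the directory is a
placeholder:

# Spooling directory source on the Windows agent instead of exec/tail
win-agent.sources.spool-src.type = spooldir
win-agent.sources.spool-src.spoolDir = C:/logs/outgoing
win-agent.sources.spool-src.channels = mem-ch

Note that the spooling directory source expects files to be complete and
immutable once they land in that directory, so you would copy the rotated
log file in once a day rather than pointing it at a file that is still
being written.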

Warm Regards,

On Thu, Jan 17, 2013 at 4:09 PM, Mahesh Balija wrote:

> I have studied Flume but I didn't find anything useful for my case.
> My requirement is that there is a directory on a Windows machine in which
> files are generated and kept updated with new logs. I want to have a
> tail-like mechanism (using an exec source) through which I can push the
> latest updates into the cluster.
> Or I could simply push to the cluster once a day using the spooling
> directory mechanism.
> Can somebody advise whether this is possible using Flume and, if so, the
> configuration needed, specific to a remote Windows machine?
> But
>> On Thu, Jan 17, 2013 at 3:48 PM, Mirko Kämpf <mirko.kaempf@gmail.com> wrote:
>> Give Flume (http://flume.apache.org/) a chance to collect your data.
>> Mirko
>> 2013/1/17 sirenfei <sirenxue@gmail.com>
>>> ftp auto upload?
>>> 2013/1/17 Mahesh Balija <balijamahesh.mca@gmail.com>:
>>> > the Hadoop cluster (HDFS) either in synchronous or asynchronous
