hadoop-mapreduce-user mailing list archives

From Mahesh Balija <balijamahesh....@gmail.com>
Subject Re: How to copy log files from remote windows machine to Hadoop cluster
Date Thu, 17 Jan 2013 10:39:56 GMT
I have studied Flume, but I didn't find anything useful for my case.
My requirement is this: there is a directory on a Windows machine in which
log files are generated and continually appended with new entries. I want a
tail-like mechanism (using the exec source) through which I can push the
latest updates into the cluster, or else simply push the files to the
cluster once a day using the spooling directory mechanism.
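
To make it concrete, this is roughly the agent definition I have in mind for
the tail case. It is only an untested sketch; the agent name, the log path,
the PowerShell command, and the NameNode address are all placeholders I made
up (Windows has no native tail, so I assume PowerShell's Get-Content -Wait
or a ported tail.exe as the tail-style command):

# Sketch: tail a Windows log file into HDFS with an exec source.
# Forward slashes in paths avoid backslash escaping in the properties file.
agent.sources = tailSrc
agent.channels = memCh
agent.sinks = hdfsSink

# exec source: runs the command and turns each output line into an event.
agent.sources.tailSrc.type = exec
agent.sources.tailSrc.command = powershell -Command "Get-Content C:/logs/app.log -Wait"
agent.sources.tailSrc.channels = memCh

# In-memory buffer between source and sink.
agent.channels.memCh.type = memory
agent.channels.memCh.capacity = 10000

# HDFS sink writing plain text into the cluster.
agent.sinks.hdfsSink.type = hdfs
agent.sinks.hdfsSink.channel = memCh
agent.sinks.hdfsSink.hdfs.path = hdfs://namenode:8020/flume/logs
agent.sinks.hdfsSink.hdfs.fileType = DataStream

I assume the agent would run on the Windows machine itself (it is just a
Java process) and write straight to the cluster over the network.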

Can somebody advise whether this is possible with Flume and, if so, what
configuration is needed, specifically for a remote Windows machine?
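
For the once-a-day push, I assume the spooling directory source (added in
Flume 1.3, as far as I can tell) would look something like the sketch below.
Again the spool directory and host name are placeholders, and as I
understand it the files must be complete and immutable once they land in
the spool directory:

# Sketch: daily ingest of finished log files via the spooldir source.
agent.sources = spoolSrc
agent.channels = fileCh
agent.sinks = hdfsSink

# spooldir source: picks up each file dropped into spoolDir and
# renames it with a .COMPLETED suffix after it is ingested.
agent.sources.spoolSrc.type = spooldir
agent.sources.spoolSrc.spoolDir = C:/logs/spool
agent.sources.spoolSrc.channels = fileCh

# Durable file channel so an agent restart does not lose a day's batch.
agent.channels.fileCh.type = file

agent.sinks.hdfsSink.type = hdfs
agent.sinks.hdfsSink.channel = fileCh
agent.sinks.hdfsSink.hdfs.path = hdfs://namenode:8020/flume/daily
agent.sinks.hdfsSink.hdfs.fileType = DataStream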

On Thu, Jan 17, 2013 at 3:48 PM, Mirko Kämpf <mirko.kaempf@gmail.com> wrote:

> Give Flume (http://flume.apache.org/) a chance to collect your data.
>
> Mirko
>
>
>
> 2013/1/17 sirenfei <sirenxue@gmail.com>
>
>> ftp auto upload?
>>
>>
>> 2013/1/17 Mahesh Balija <balijamahesh.mca@gmail.com>:
>> > the Hadoop cluster (HDFS) either in synchronous or asynchronous
>>
>
>
