flume-user mailing list archives

From ed <edor...@gmail.com>
Subject Re: How to generate logs from remote windows system
Date Thu, 13 Feb 2014 05:30:48 GMT

The avro source bind property (a2.sources.r1.bind) is for your local
machine (not the remote machine).  Typically you'll set this to localhost,
but if you have multiple network interfaces you might need to specify a
particular interface, or use "0.0.0.0" to listen on all interfaces.  The
remote machine needs to push/send its data to the machine running the
flume agent, on whatever port you specify.  Your Ubuntu box will be in
receiving mode; it does not connect to a remote box to pull data.
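
For example, the source side of your config might look like this (the
agent name and port are taken from your config below; the "0.0.0.0" bind
is just one option if localhost doesn't work for your setup):

a2.sources.r1.type = avro
a2.sources.r1.bind = 0.0.0.0
a2.sources.r1.port = 4444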

Windows Application ==(sends data to)==> Ubuntu Listening on Port 4444

You'll need another flume agent running on the Windows application server
that uses an AvroSink to send log events to the listening AvroSource on
your Ubuntu box.  I haven't tried installing Flume on Windows but it looks
like it is possible, although probably not typical.  Maybe someone else on
the mailing list can comment on any experience they've had using Flume on
Windows.
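
A rough sketch of what that Windows-side agent's config might look like
(the agent name, the spooling-directory source, the log path, and the
placeholder hostname are my assumptions, not something from your setup):

a1.sources = r1
a1.sinks = k1
a1.channels = c1
a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = C:\path\to\your\app\logs
a1.sinks.k1.type = avro
a1.sinks.k1.hostname = <ubuntu-box-ip>
a1.sinks.k1.port = 4444
a1.channels.c1.type = memory
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

Note that the spooling directory source only picks up completed files
dropped into that directory; it won't tail a file that is still being
written to.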

If you don't use a Flume agent on the Windows box then you could look at
syslog or something like nxlog to get your log data to the Flume agent
running on Ubuntu.  In this case you will probably want to use a
SyslogSource or the NetcatSource.
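
If you go that route, the receiving side might use something like the
following instead of the avro source (syslogtcp is a real source type,
but the port here is an assumption; match it to whatever nxlog/syslog
is configured to send to):

a2.sources.r1.type = syslogtcp
a2.sources.r1.port = 5140
a2.sources.r1.host = 0.0.0.0
a2.sources.r1.channels = c1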



On Thu, Feb 13, 2014 at 1:48 PM, nagarjuna sarpuru <
nagarjuna.sarpuru@gmail.com> wrote:

> Hi,
> Could you please guide me how to take logs from my remote
> system,requirements as follows:
> I have Single node cluster Hadoop 1.2.1 ,Flume 1.4.1 running on ubuntu
> system,Now my requirement is to take the logs of a particular application
> running on My remote machine (Windows).
> For this I have written .conf file as follows
> # Name the components on this agent
> a2.sources = r1
> a2.sinks = k1
> a2.channels = c1
> # Describe/configure the source
> a2.sources.r1.type = avro
> a2.sources.r1.bind = localhost  //// remote machine ip address (is it
> correct?)
> a2.sources.r1.port = 4444  //// port number
> # Describe the sink
> a2.sinks.k1.type = hdfs
> a2.sinks.k1.hdfs.path= hdfs://localhost:54310/exec
> # Use a channel which buffers events in memory
> a2.channels.c1.type = memory
> a2.channels.c1.capacity = 1000
> a2.channels.c1.transactionCapacity = 100
> # Bind the source and sink to the channel
> a2.sources.r1.channels = c1
> a2.sinks.k1.channel = c1
> Apart from that, what are the things that need to be taken care of here,
> like where to declare the client agent and the flume agent?
> Do I need to write any Java program additionally (as I am not good at
> Java programming)?
> I would be so thankful if I get the help.
> --
> thanks&regards
> Nagarjuna.S
