flume-user mailing list archives

From Ashutoshsharma(오픈플랫폼개발팀) <sharma.ashut...@kt.com>
Subject RE: multiple agents
Date Mon, 12 Nov 2012 02:07:16 GMT
Hi All,

Thank you for your responses!

As I already mentioned, I am running multiple flows from a single Flume configuration file. The configuration
file works fine with the Apache “1.2” release, but when I use the same configuration
file with the CDH4 release it doesn’t work. I don’t understand this behavior, and there is
very little information in the flume.log file to identify the problem. It seems to me that the agent
is stuck somewhere. Below is my log file content:
2012-11-12 09:20:40,237 INFO node.FlumeNode: Flume node stopping - agent
2012-11-12 09:20:40,237 INFO lifecycle.LifecycleSupervisor: Stopping lifecycle supervisor
9
2012-11-12 09:20:40,238 INFO properties.PropertiesFileConfigurationProvider: Configuration
provider stopping
2012-11-12 09:20:40,238 DEBUG properties.PropertiesFileConfigurationProvider: Configuration
provider stopped
2012-11-12 09:20:40,238 INFO nodemanager.DefaultLogicalNodeManager: Node manager stopping
2012-11-12 09:20:40,238 INFO lifecycle.LifecycleSupervisor: Stopping lifecycle supervisor
9
2012-11-12 09:20:40,238 DEBUG nodemanager.DefaultLogicalNodeManager: Node manager stopped
2012-11-12 09:21:41,681 INFO lifecycle.LifecycleSupervisor: Starting lifecycle supervisor
1
2012-11-12 09:21:41,683 INFO node.FlumeNode: Flume node starting - agent
2012-11-12 09:21:41,688 INFO nodemanager.DefaultLogicalNodeManager: Node manager starting
2012-11-12 09:21:41,688 INFO lifecycle.LifecycleSupervisor: Starting lifecycle supervisor
10
2012-11-12 09:21:41,690 INFO properties.PropertiesFileConfigurationProvider: Configuration
provider starting
2012-11-12 09:21:41,691 DEBUG nodemanager.DefaultLogicalNodeManager: Node manager started
2012-11-12 09:21:41,692 DEBUG properties.PropertiesFileConfigurationProvider: Configuration
provider started
2012-11-12 09:21:41,693 DEBUG properties.PropertiesFileConfigurationProvider: Checking file:/etc/flume-ng/conf/flume.conf
for changes
2012-11-12 09:21:41,693 INFO properties.PropertiesFileConfigurationProvider: Reloading configuration
file:/etc/flume-ng/conf/flume.conf
2012-11-12 09:21:41,698 INFO conf.FlumeConfiguration: Processing:tx-es-sink
2012-11-12 09:21:41,698 DEBUG conf.FlumeConfiguration: Created context for tx-es-sink: cluster
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:tx-hdfs-sink
2012-11-12 09:21:41,699 DEBUG conf.FlumeConfiguration: Created context for tx-hdfs-sink: hdfs.fileType
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:web-hdfs-sink
2012-11-12 09:21:41,699 DEBUG conf.FlumeConfiguration: Created context for web-hdfs-sink:
hdfs.rollCount
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:tx-es-sink
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:dev-es-sink
2012-11-12 09:21:41,699 DEBUG conf.FlumeConfiguration: Created context for dev-es-sink: type
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:dev-hdfs-sink
2012-11-12 09:21:41,699 DEBUG conf.FlumeConfiguration: Created context for dev-hdfs-sink:
hdfs.path
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:tx-hdfs-sink
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:web-hdfs-sink
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:dev-hdfs-sink
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:tx-hdfs-sink
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:tx-es-sink
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:web-hdfs-sink
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:tx-hdfs-sink
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:tx-es-sink
2012-11-12 09:21:41,699 INFO conf.FlumeConfiguration: Processing:dev-hdfs-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:tx-es-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:dev-es-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:web-es-sink
2012-11-12 09:21:41,700 DEBUG conf.FlumeConfiguration: Created context for web-es-sink: typeName
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:dev-hdfs-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:web-es-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:tx-hdfs-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:dev-es-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:dev-es-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Added sinks: tx-es-sink tx-hdfs-sink
dev-es-sink dev-hdfs-sink web-es-sink web-hdfs-sink Agent: agent
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:web-hdfs-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:web-hdfs-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:web-hdfs-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:web-es-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:dev-hdfs-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:tx-hdfs-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:dev-es-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:dev-hdfs-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:dev-es-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:tx-hdfs-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:web-hdfs-sink
2012-11-12 09:21:41,700 INFO conf.FlumeConfiguration: Processing:dev-hdfs-sink
2012-11-12 09:21:41,701 INFO conf.FlumeConfiguration: Processing:dev-hdfs-sink
2012-11-12 09:21:41,701 INFO conf.FlumeConfiguration: Processing:tx-hdfs-sink
2012-11-12 09:21:41,701 INFO conf.FlumeConfiguration: Processing:tx-hdfs-sink
2012-11-12 09:21:41,701 INFO conf.FlumeConfiguration: Processing:web-es-sink
2012-11-12 09:21:41,701 INFO conf.FlumeConfiguration: Processing:web-hdfs-sink
2012-11-12 09:21:41,701 INFO conf.FlumeConfiguration: Processing:web-es-sink
2012-11-12 09:21:41,701 INFO conf.FlumeConfiguration: Processing:web-hdfs-sink
2012-11-12 09:21:41,701 INFO conf.FlumeConfiguration: Processing:tx-es-sink
2012-11-12 09:21:41,701 INFO conf.FlumeConfiguration: Processing:dev-hdfs-sink
2012-11-12 09:21:41,701 INFO conf.FlumeConfiguration: Processing:web-es-sink
2012-11-12 09:21:41,701 DEBUG conf.FlumeConfiguration: Starting validation of configuration
for agent: agent, initial-configuration: AgentConfiguration[agent]
SOURCES: {tx-avro={ parameters:{interceptors.log.preserveExisting=false, port=35853, interceptors=log,
channels=tx-mem-channel tx-file-channel, interceptors.log.type=org.flumeng.TransLogInterceptor$Builder,
type=avro, selector.type=replicating, bind=comflusvrintb01} }, web-avro={ parameters:{interceptors.log.preserveExisting=false,
port=35855, interceptors=log, channels=web-mem-channel web-file-channel, type=avro, interceptors.log.type=org.flumeng.AccessLogInterceptor$Builder,
selector.type=replicating, bind=comflusvrintb01} }, dev-avro={ parameters:{port=35854, interceptors.log.preserveExisting=false,
interceptors=isdigit log, channels=dev-mem-channel dev-file-channel, type=avro, interceptors.log.type=org.flumeng.LogInterceptor$Builder,
selector.type=replicating, bind=comflusvrintb01, interceptors.isdigit.type=org.flumeng.DigitInterceptor$Builder}
}}
CHANNELS: {dev-mem-channel={ parameters:{transactionCapacity=20, capacity=10000, type=memory}
}, web-mem-channel={ parameters:{transactionCapacity=20, capacity=10000, type=memory} }, dev-file-channel={
parameters:{checkpointDir=//flume/agent/dev-file-channel/checkpoint, dataDirs=//flume/agent/dev-file-channel/data,
maxFileSize=1073741824, type=FILE} }, tx-file-channel={ parameters:{checkpointDir=//flume/agent/tx-file-channel/checkpoint,
dataDirs=//flume/agent/tx-file-channel/data, maxFileSize=1073741824, type=FILE} }, tx-mem-channel={
parameters:{transactionCapacity=20, capacity=10000, type=memory} }, web-file-channel={ parameters:{checkpointDir=//flume/agent/web-file-channel/checkpoint,
dataDirs=//flume/agent/web-file-channel/data, maxFileSize=1073741824, type=FILE} }}
SINKS: {web-es-sink={ parameters:{typeName=access, host=pisblkwasintb01, indexName=weblog,
cluster=es-, type= org.flume.sink.ElasticSearchSink, channel=web-mem-channel} }, dev-es-sink={
parameters:{typeName=developer, host=pisblkwasintb01, indexName=devlog, cluster=es-, type=..rdc.flume.sink.ElasticSearchSink,
channel=dev-mem-channel} }, web-hdfs-sink={ parameters:{hdfs.fileType=DataStream, hdfs.path=hdfs://comlogsvrintb01/logs/web/%{hostname}/%Y-%m-%d,
hdfs.rollInterval=600, hdfs.rollSize=0, hdfs.writeFormat=Text, hdfs.filePrefix=web, type=hdfs,
channel=web-file-channel, hdfs.rollCount=0} }, dev-hdfs-sink={ parameters:{hdfs.fileType=DataStream,
hdfs.path=hdfs://comlogsvrintb01/logs/developer/%{hostname}/%Y-%m-%d, hdfs.rollInterval=600,
hdfs.rollSize=0, hdfs.writeFormat=Text, hdfs.filePrefix=developer, type=hdfs, hdfs.rollCount=0,
channel=dev-file-channel} }, tx-hdfs-sink={ parameters:{hdfs.fileType=DataStream, hdfs.path=hdfs://comlogsvrintb01/logs/transaction/%{hostname}/%Y-%m-%d,
hdfs.rollInterval=600, hdfs.rollSize=0, hdfs.writeFormat=Text, hdfs.filePrefix=transaction,
type=hdfs, channel=tx-file-channel, hdfs.rollCount=0} }, tx-es-sink={ parameters:{typeName=transaction,
host=pisblkwasintb01, indexName=txlog, cluster=es-, type=org.flume.sink.ElasticSearchSink,
channel=tx-mem-channel} }}

2012-11-12 09:21:41,706 DEBUG conf.FlumeConfiguration: Created channel dev-mem-channel
2012-11-12 09:21:41,706 DEBUG conf.FlumeConfiguration: Created channel web-mem-channel
2012-11-12 09:21:41,706 DEBUG conf.FlumeConfiguration: Created channel dev-file-channel
2012-11-12 09:21:41,707 DEBUG conf.FlumeConfiguration: Created channel tx-file-channel
2012-11-12 09:21:41,707 DEBUG conf.FlumeConfiguration: Created channel tx-mem-channel
2012-11-12 09:21:41,708 DEBUG conf.FlumeConfiguration: Created channel web-file-channel
2012-11-12 09:21:41,730 DEBUG conf.FlumeConfiguration: Creating sink: web-es-sink using OTHER
2012-11-12 09:21:41,731 DEBUG conf.FlumeConfiguration: Creating sink: dev-es-sink using OTHER
2012-11-12 09:21:41,731 DEBUG conf.FlumeConfiguration: Creating sink: web-hdfs-sink using
HDFS
2012-11-12 09:21:41,732 DEBUG conf.FlumeConfiguration: Creating sink: dev-hdfs-sink using
HDFS
2012-11-12 09:21:41,732 DEBUG conf.FlumeConfiguration: Creating sink: tx-hdfs-sink using HDFS
2012-11-12 09:21:41,733 DEBUG conf.FlumeConfiguration: Creating sink: tx-es-sink using OTHER
2012-11-12 09:21:41,733 DEBUG conf.FlumeConfiguration: Post validation configuration for agent
AgentConfiguration created without Configuration stubs for which only basic syntactical validation
was performed[agent]
SOURCES: {tx-avro={ parameters:{interceptors.log.preserveExisting=false, port=35853, interceptors=log,
channels=tx-mem-channel tx-file-channel, interceptors.log.type=org.flumeng.TransLogInterceptor$Builder,
type=avro, selector.type=replicating, bind=comflusvrintb01} }, web-avro={ parameters:{interceptors.log.preserveExisting=false,
port=35855, interceptors=log, channels=web-mem-channel web-file-channel, type=avro, interceptors.log.type=org.flumeng.AccessLogInterceptor$Builder,
selector.type=replicating, bind=comflusvrintb01} }, dev-avro={ parameters:{port=35854, interceptors.log.preserveExisting=false,
interceptors=isdigit log, channels=dev-mem-channel dev-file-channel, type=avro, interceptors.log.type=org.flumeng.LogInterceptor$Builder,
selector.type=replicating, bind=comflusvrintb01, interceptors.isdigit.type=org.flumeng.DigitInterceptor$Builder}
}}
CHANNELS: {dev-mem-channel={ parameters:{transactionCapacity=20, capacity=10000, type=memory}
}, web-mem-channel={ parameters:{transactionCapacity=20, capacity=10000, type=memory} }, dev-file-channel={
parameters:{checkpointDir=//flume/agent/dev-file-channel/checkpoint, dataDirs=//flume/agent/dev-file-channel/data,
maxFileSize=1073741824, type=FILE} }, tx-file-channel={ parameters:{checkpointDir=//flume/agent/tx-file-channel/checkpoint,
dataDirs=//flume/agent/tx-file-channel/data, maxFileSize=1073741824, type=FILE} }, tx-mem-channel={
parameters:{transactionCapacity=20, capacity=10000, type=memory} }, web-file-channel={ parameters:{checkpointDir=//flume/agent/web-file-channel/checkpoint,
dataDirs=//flume/agent/web-file-channel/data, maxFileSize=1073741824, type=FILE} }}
SINKS: {dev-es-sink={ parameters:{typeName=developer, host=pisblkwasintb01, indexName=devlog,
cluster=es-, type=org.flume.sink.ElasticSearchSink, channel=dev-mem-channel} }, web-es-sink={
parameters:{typeName=access, host=pisblkwasintb01, indexName=weblog, cluster=es-, type=org.flume.sink.ElasticSearchSink,
channel=web-mem-channel} }, web-hdfs-sink={ parameters:{hdfs.fileType=DataStream, hdfs.path=hdfs://comlogsvrintb01/web/%{hostname}/%Y-%m-%d,
hdfs.rollInterval=600, hdfs.rollSize=0, hdfs.writeFormat=Text, hdfs.filePrefix=web, type=hdfs,
channel=web-file-channel, hdfs.rollCount=0} }, dev-hdfs-sink={ parameters:{hdfs.fileType=DataStream,
hdfs.path=hdfs://comlogsvrintb01/logs/%{hostname}/%Y-%m-%d, hdfs.rollInterval=600, hdfs.rollSize=0,
hdfs.writeFormat=Text, hdfs.filePrefix=developer, type=hdfs, hdfs.rollCount=0, channel=dev-file-channel}
}, tx-hdfs-sink={ parameters:{hdfs.fileType=DataStream, hdfs.path=hdfs://comlogsvrintb01/logs/%{hostname}/%Y-%m-%d,
hdfs.rollInterval=600, hdfs.rollSize=0, hdfs.writeFormat=Text, hdfs.filePrefix=transaction,
type=hdfs, channel=tx-file-channel, hdfs.rollCount=0} }, tx-es-sink={ parameters:{typeName=transaction,
host=pisblkwasintb01, indexName=txlog, cluster=es-, type=org.flume.sink.ElasticSearchSink,
channel=tx-mem-channel} }}

2012-11-12 09:21:41,733 DEBUG conf.FlumeConfiguration: Channels:dev-mem-channel web-mem-channel
dev-file-channel tx-file-channel tx-mem-channel web-file-channel

2012-11-12 09:21:41,733 DEBUG conf.FlumeConfiguration: Sinks web-es-sink dev-es-sink web-hdfs-sink
dev-hdfs-sink tx-hdfs-sink tx-es-sink

2012-11-12 09:21:41,733 DEBUG conf.FlumeConfiguration: Sources tx-avro web-avro dev-avro

2012-11-12 09:21:41,734 INFO conf.FlumeConfiguration: Post-validation flume configuration
contains configuration  for agents: [agent]
2012-11-12 09:21:41,734 INFO properties.PropertiesFileConfigurationProvider: Creating channels
2012-11-12 09:21:41,734 DEBUG channel.DefaultChannelFactory: Creating instance of channel
dev-mem-channel type memory
2012-11-12 09:21:41,737 INFO properties.PropertiesFileConfigurationProvider: created channel
dev-mem-channel
2012-11-12 09:21:41,738 DEBUG channel.DefaultChannelFactory: Creating instance of channel
web-mem-channel type memory
2012-11-12 09:21:41,738 INFO properties.PropertiesFileConfigurationProvider: created channel
web-mem-channel
2012-11-12 09:21:41,738 DEBUG channel.DefaultChannelFactory: Creating instance of channel
dev-file-channel type FILE
2012-11-12 09:21:41,740 INFO properties.PropertiesFileConfigurationProvider: created channel
dev-file-channel
2012-11-12 09:21:41,740 DEBUG channel.DefaultChannelFactory: Creating instance of channel
tx-file-channel type FILE
2012-11-12 09:21:41,740 INFO properties.PropertiesFileConfigurationProvider: created channel
tx-file-channel
2012-11-12 09:21:41,740 DEBUG channel.DefaultChannelFactory: Creating instance of channel
tx-mem-channel type memory
2012-11-12 09:21:41,740 INFO properties.PropertiesFileConfigurationProvider: created channel
tx-mem-channel
2012-11-12 09:21:41,740 DEBUG channel.DefaultChannelFactory: Creating instance of channel
web-file-channel type FILE
2012-11-12 09:21:41,740 INFO properties.PropertiesFileConfigurationProvider: created channel
web-file-channel
2012-11-12 09:21:41,740 DEBUG source.DefaultSourceFactory: Creating instance of source tx-avro,
type avro
2012-11-12 09:21:41,760 DEBUG source.DefaultSourceFactory: Creating instance of source web-avro,
type avro
2012-11-12 09:21:41,762 DEBUG source.DefaultSourceFactory: Creating instance of source dev-avro,
type avro
2012-11-12 09:21:41,765 INFO sink.DefaultSinkFactory: Creating instance of sink web-es-sink
typeorg.flume.sink.ElasticSearchSink
2012-11-12 09:21:41,765 DEBUG sink.DefaultSinkFactory: Sink type org.flume.sink.ElasticSearchSink
is a custom type
2012-11-12 09:21:41,774 INFO sink.DefaultSinkFactory: Creating instance of sink dev-es-sink
typeorg.flume.sink.ElasticSearchSink
2012-11-12 09:21:41,774 DEBUG sink.DefaultSinkFactory: Sink type org.flume.sink.ElasticSearchSink
is a custom type
2012-11-12 09:21:41,774 INFO sink.DefaultSinkFactory: Creating instance of sink web-hdfs-sink
typehdfs

After this there is no more information; the agent appears to be stuck at this point.

----------------------------------------
Thanks & Regards,
Ashutosh Sharma
----------------------------------------

From: Juhani Connolly [mailto:juhani_connolly@cyberagent.co.jp]
Sent: Friday, November 09, 2012 7:08 PM
To: user@flume.apache.org
Subject: Re: multiple agents

I can't see any obvious problem with your config.

When you start up, check your logs to confirm that all the components were correctly configured and started.
You may need to adjust the log4j configuration in your conf directory.
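For reference, a minimal sketch of such an adjustment, assuming the stock log4j.properties that ships with Flume NG (the property and appender names may differ in your distribution):

```properties
# Hypothetical log4j.properties fragment: raise verbosity so component
# startup/failure details reach flume.log.
flume.root.logger=DEBUG,LOGFILE
log4j.logger.org.apache.flume=DEBUG
```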

Are all your file channels configured to write to different directories? If they share the
same checkpoint or data directories, things aren't going to work well.
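To illustrate the point, each file channel needs its own checkpointDir and dataDirs, along the lines of the config quoted later in this thread (paths here are only an example):

```properties
# Each FILE channel gets distinct checkpoint and data directories.
agent.channels.tx-file-channel.checkpointDir = /flume/agent/tx-file-channel/checkpoint
agent.channels.tx-file-channel.dataDirs = /flume/agent/tx-file-channel/data
agent.channels.dev-file-channel.checkpointDir = /flume/agent/dev-file-channel/checkpoint
agent.channels.dev-file-channel.dataDirs = /flume/agent/dev-file-channel/data
```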

On 11/09/2012 05:43 PM, Ashutoshsharma(오픈플랫폼개발팀) wrote:
Hi,

Can I define the multiple flows with different sources, sinks and channels as below:

agent.sources = tx-avro dev-avro web-avro
agent.sinks = tx-es-sink tx-hdfs-sink dev-es-sink dev-hdfs-sink web-es-sink web-hdfs-sink
agent.channels = tx-mem-channel tx-file-channel dev-mem-channel dev-file-channel web-mem-channel
web-file-channel

##### Flow1 - Start #################################
## Define Avro source
agent.sources.tx-avro.type = avro
agent.sources.tx-avro.bind = 0.0.0.0
agent.sources.tx-avro.port = 35853
agent.sources.tx-avro.channels = tx-mem-channel tx-file-channel
agent.sources.tx-avro.selector.type = replicating

## Define HDFS sink
agent.sinks.tx-hdfs-sink.type = hdfs
agent.sinks.tx-hdfs-sink.hdfs.path = hdfs://…/%{hostname}/%Y-%m-%d
agent.sinks.tx-hdfs-sink.hdfs.fileType = DataStream
agent.sinks.tx-hdfs-sink.hdfs.writeFormat = Text
agent.sinks.tx-hdfs-sink.hdfs.filePrefix = transaction
agent.sinks.tx-hdfs-sink.channel = tx-file-channel
agent.sinks.tx-hdfs-sink.hdfs.rollCount = 0
agent.sinks.tx-hdfs-sink.hdfs.rollSize = 0
agent.sinks.tx-hdfs-sink.hdfs.rollInterval = 600

## Define es sink
agent.sinks.tx-es-sink.type = org.flume.sink.ESSink
agent.sinks.tx-es-sink.indexName = txlog
agent.sinks.tx-es-sink.typeName = tx
agent.sinks.tx-es-sink.cluster = es-cluster
agent.sinks.tx-es-sink.host = 9.127.216.198
agent.sinks.tx-es-sink.channel = tx-mem-channel

## Define the memory channel
agent.channels.tx-mem-channel.type = memory
agent.channels.tx-mem-channel.capacity = 10000
agent.channels.tx-mem-channel.transactionCapacity = 20

## Define the file channel
agent.channels.tx-file-channel.type = FILE
agent.channels.tx-file-channel.checkpointDir = /flume/agent/tx-file-channel/checkpoint
agent.channels.tx-file-channel.dataDirs = /flume/agent/tx-file-channel/data

Flow2 and Flow3 are defined the same way as Flow1 (#### Flow2 #####... #### Flow3 ######...), each with a different port
for its Avro source. I am using flow1, flow2, and flow3 for three different types of logs,
storing each separately, i.e. in different locations.
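For completeness, a sketch of how Flow2's source differs from Flow1, only the port and component names change (these names match the log output quoted above):

```properties
## Flow2 - Avro source on a different port
agent.sources.dev-avro.type = avro
agent.sources.dev-avro.bind = 0.0.0.0
agent.sources.dev-avro.port = 35854
agent.sources.dev-avro.channels = dev-mem-channel dev-file-channel
agent.sources.dev-avro.selector.type = replicating
```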

When I defined flume.conf (on the collector) as above, the agents failed to connect to the
Avro sources, returning an RPC connection error. However, I verified that an agent is able
to send events to the collector if I specify only one Avro source.

So, the question is: can I use the configuration above to define multiple agents (flows)?

----------------------------------------
Thanks & Regards,
Ashutosh Sharma
----------------------------------------

From: Juhani Connolly [mailto:juhani_connolly@cyberagent.co.jp]
Sent: Thursday, November 08, 2012 5:07 PM
To: user@flume.apache.org
Subject: Re: multiple agents

Hi Ashutosh,

As was pointed out, one configuration will work fine.

There is nothing stopping you from running multiple background processes, but that won't be possible
with the service scripts that come with the Flume packaged in CDH; you'd have to write your
own service scripts. But really, I can't think of a use case where you would want multiple
processes.

On 11/08/2012 10:39 AM, Ashutoshsharma(오픈플랫폼개발팀) wrote:
Hi,

I have sources that generate multiple types of logs (mainly three types). Most of them generate
at least two types of logs; that is, a single server generates two types of log. For my use case,
I created two separate agents running on each server to collect the logs. I am running these
agents in the foreground using the “flume-ng agent -n agent1” command, so there are two Flume processes.

Now I am unsure: should I merge these two agent configurations into a single flume.conf file?
If I want to continue using two different conf files, how can I run the two agents in the
background? Are there any known performance issues with either approach?
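The two-process approach can be kept running in the background with standard shell tools; a minimal sketch (the agent names, conf paths, and log paths are assumptions — adjust to your installation):

```shell
# Start each agent as a background process with its own conf file,
# redirecting stdout/stderr to a per-agent log and detaching from the terminal.
nohup flume-ng agent -n agent1 -c /etc/flume-ng/conf \
    -f /etc/flume-ng/conf/agent1.conf > /var/log/flume/agent1.out 2>&1 &
nohup flume-ng agent -n agent2 -c /etc/flume-ng/conf \
    -f /etc/flume-ng/conf/agent2.conf > /var/log/flume/agent2.out 2>&1 &
```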

Please share your suggestions and thoughts.

----------------------------------------
Thanks & Regards,
Ashutosh Sharma
----------------------------------------



This E-mail may contain confidential information and/or copyright material. This email is
intended for the use of the addressee only. If you receive this email by mistake, please either
delete it without reproducing, distributing or retaining copies thereof or notify the sender
immediately.





