flume-user mailing list archives

From =?gb2312?B?6UY=?= <cuirong198...@hotmail.com>
Subject Flume-kafka problem
Date Fri, 29 Aug 2014 09:27:15 GMT




Hi,
In my project I use Flume with a KafkaSink, but I see a strange phenomenon: Flume stops
working after about an hour, meaning the files in the Flume spooldir are no longer processed.
Something seems to be wrong, but there are no errors; the flume-ng server log contains nothing
except this startup info:
(Sourcing environment configuration script /../flume-1.5.0/conf/flume-env.sh+ exec /../jdk1.7.0_67/bin/java
-Xms8192m -Xmx8192m -Xss256k -Xmn2g  -Dflume.root.logger=INFO -cp '/../flume-1.5.0/conf:/../flume-1.5.0/lib/*'
-Djava.library.path= org.apache.flume.node.Application --conf-file conf/flume-conf.properties
--name producer)
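Since the log shows nothing after startup, a thread dump of the agent JVM is usually the
quickest way to see whether the spooldir source or the KafkaSink thread is blocked. The agent's
main class is org.apache.flume.node.Application (visible in the startup line above), so `jps`
lists it as "Application". A hedged sketch using the JDK's `jps`/`jstack` tools from Python
(the helper name `flume_pid` is my own):

```python
import subprocess

def flume_pid(jps_output):
    """Pick the Flume agent PID out of `jps` output.

    `jps` prints one "PID MainClass" pair per line; the Flume agent's
    main class org.apache.flume.node.Application shows up as "Application".
    """
    for line in jps_output.splitlines():
        pid, _, name = line.partition(" ")
        if name.strip() == "Application":
            return int(pid)
    return None

if __name__ == "__main__":
    out = subprocess.run(["jps"], capture_output=True, text=True).stdout
    pid = flume_pid(out)
    if pid is not None:
        # Dump all thread stacks; look for the spooldir/KafkaSink threads
        # sitting in BLOCKED or WAITING states.
        print(subprocess.run(["jstack", str(pid)], capture_output=True, text=True).stdout)
```

If the sink thread is stuck in a Kafka producer send while the channel is full, that would
explain files piling up in the spooldir with no errors logged.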
Data volume is roughly 10 MB per minute, and the memory use of the Flume process looks OK,
less than 2 GB.
here is the flume config:

producer.sources.s.type=spooldir
producer.sources.s.channels=c
producer.sources.s.spoolDir=/../flumeSource
producer.sources.s.fileSuffix=.do
producer.sources.s.deserializer=LINE
producer.sources.s.deserializer.maxLineLength=65535
producer.sources.s.deletePolicy=immediate
producer.sources.s.decodeErrorPolicy=IGNORE

# Each sink's type must be defined
producer.sinks.r.type=org.apache.flume.plugins.KafkaSink
producer.sinks.r.metadata.broker.list=..
producer.sinks.r.partition.key=0
producer.sinks.r.partitioner.class=org.apache.flume.plugins.SinglePartition
producer.sinks.r.serializer.class=kafka.serializer.StringEncoder
producer.sinks.r.request.required.acks=0
producer.sinks.r.max.message.size=1000000
producer.sinks.r.producer.type=sync
producer.sinks.r.custom.encoding=UTF-8
producer.sinks.r.custom.topic.name=..

# Specify the channel the sink should use
producer.sinks.r.channel = c

# Each channel's type is defined.
producer.channels.c.type = memory
producer.channels.c.capacity = 1000
######################## consumer config ########################
consumer.sources = s
consumer.channels = c
consumer.sinks = r

consumer.sources.s.type = seq
consumer.sources.s.channels = c
consumer.sinks.r.type = logger
consumer.sinks.r.channel = c
consumer.channels.c.type = memory
consumer.channels.c.capacity = 100

consumer.sources.s.type = org.apache.flume.plugins.KafkaSource
consumer.sources.s.zookeeper.connect=..
consumer.sources.s.group.id=..
consumer.sources.s.zookeeper.session.timeout.ms=400
consumer.sources.s.zookeeper.sync.time.ms=200
consumer.sources.s.auto.commit.interval.ms=1000
consumer.sources.s.custom.topic.name=..
consumer.sources.s.custom.thread.per.consumer=4
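One thing worth checking with this config: the memory channel has capacity 1000, and at
roughly 10 MB/min of input a stalled sink would fill it quickly, making the agent appear hung
without logging errors. Flume's built-in HTTP monitoring can show this: start the agent with
`-Dflume.monitoring.type=http -Dflume.monitoring.port=34545` and poll the `/metrics`
endpoint. A minimal Python sketch for watching the channel fill, assuming the standard
metrics JSON shape with a `CHANNEL.c` key (the URL and port here are illustrative):

```python
import json
from urllib.request import urlopen

# Port chosen via -Dflume.monitoring.port when starting the agent (assumption).
METRICS_URL = "http://localhost:34545/metrics"

def channel_fill_percentage(metrics, channel_name):
    """Extract ChannelFillPercentage for one channel from Flume's metrics JSON.

    Flume reports counter values as strings, so convert to float.
    """
    key = "CHANNEL." + channel_name
    return float(metrics[key]["ChannelFillPercentage"])

def fetch_metrics(url=METRICS_URL):
    """Fetch and decode the agent's metrics JSON."""
    with urlopen(url) as resp:
        return json.load(resp)

if __name__ == "__main__":
    metrics = fetch_metrics()
    print("channel c fill: %.1f%%" % channel_fill_percentage(metrics, "c"))
```

If the fill percentage sits at 100 while the spooldir stops draining, the problem is the
sink side (the KafkaSink plugin) rather than the source.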
env:
apache-flume-1.5.0-bin
flumeng-kafka-plugin-master (https://github.com/beyondj2ee/flumeng-kafka-plugin)
kafka_2.9.2-0.8.1
What should I do about it? Kafka itself works normally. Thanks a lot for any help.

