flume-user mailing list archives

From "鹰" <980548...@qq.com>
Subject set flume send logs to hdfs error
Date Thu, 14 May 2015 02:38:25 GMT
Hi all,
I want to set up Flume to send data to HDFS. My configuration file is like this:
tier1.sources=source1  
tier1.channels=channel1  
tier1.sinks=sink1  

tier1.sources.source1.type=avro  
tier1.sources.source1.bind=0.0.0.0  
tier1.sources.source1.port=44444  
tier1.sources.source1.channels=channel1  

tier1.channels.channel1.type=memory  
tier1.channels.channel1.capacity=10000  
tier1.channels.channel1.transactionCapacity=1000  
tier1.channels.channel1.keep-alive=30  

tier1.sinks.sink1.type=hdfs  
tier1.sinks.sink1.channel=channel1  
tier1.sinks.sink1.hdfs.path=hdfs://hadoop-home.com:9000/user/hadoop/ 
tier1.sinks.sink1.hdfs.fileType=DataStream  
tier1.sinks.sink1.hdfs.writeFormat=Text  
tier1.sinks.sink1.hdfs.rollInterval=0  
tier1.sinks.sink1.hdfs.rollSize=10240  
tier1.sinks.sink1.hdfs.rollCount=0  
tier1.sinks.sink1.hdfs.idleTimeout=60  

When I start Flume with this configuration file and send data to port 44444, I get this error:
org.apache.avro.AvroRuntimeException: Excessively large list allocation request detected:
154218761 items! Connection closed;
Does anybody have any idea what is wrong? Thanks.
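For what it's worth, this error often appears when the bytes arriving at an Avro source are not framed with the Avro RPC protocol (for example, raw text sent with netcat or a plain socket): Avro then interprets the first bytes of the stream as a huge array length. A minimal sketch for testing the source with properly framed events is Flume's bundled avro-client; the host and file path below are placeholders, assuming `flume-ng` is on the PATH:

```shell
# Send each line of a local file as an Avro event to the source.
# Host/port must match tier1.sources.source1.bind/port in the config;
# /tmp/events.log is a hypothetical sample file.
flume-ng avro-client \
  --host localhost \
  --port 44444 \
  --filename /tmp/events.log
```

If the avro-client works but your own producer triggers the error, the producer is most likely not speaking the Avro RPC protocol (e.g. it should use Flume's RpcClient / an Avro client rather than writing raw bytes to the socket).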