flume-user mailing list archives

From Hari Shreedharan <hshreedha...@cloudera.com>
Subject Re: FlumeNG Error while going with Avro (Source) and HDFS (Sink)
Date Wed, 04 Jul 2012 18:07:52 GMT
Are you behind a NAT? If you are, then you need to set up port forwarding on your NAT device.
An easy way to check is to start an nc server: nc -l <port_number> 

and then try connecting to it using: nc 107.108.199.29 <port_number>. 
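For instance, with the Avro source below bound on port 41414, a quick sketch of that check (using the IPs quoted in this thread; substitute your actual bind address and port, and note that some nc variants need "nc -l -p <port>"):

    # on the machine running the Flume agent: listen on the source port
    nc -l 41414

    # from the machine running the avro-client: try to reach that port
    nc 107.108.199.29 41414

If the second command cannot connect, the RPC connection error is most likely a network/port-forwarding problem rather than a Flume configuration issue.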

Thanks
Hari

-- 
Hari Shreedharan


On Wednesday, July 4, 2012 at 5:22 AM, Amit Handa wrote:

> Hi,
> 
> 1) While using Avro as the source and HDFS as the sink, I am getting the error "[ERROR -
> org.apache.flume.api.NettyAvroRpcClient.connect(NettyAvroRpcClient.java:125)] RPC connection
> error :". Kindly help in resolving it.
> 
> 1 a) flume.conf file is as follows
> 
> # Define a memory channel called ch1 on agent1
> agent1.channels.ch1.type = memory
> 
> # Define an Avro source called avro-source1 on agent1 and tell it
> # to bind to 0.0.0.0:41414. Connect it to channel ch1.
> agent1.sources.avro-source1.channels = ch1
> agent1.sources.avro-source1.type = avro
> agent1.sources.avro-source1.bind = 107.101.199.29
> agent1.sources.avro-source1.port = 41414
> 
> # Define a hdfs sink that simply logs all events it receives
> # and connect it to the other end of the same channel.
> agent1.sinks.HDFS.channel = ch1
> agent1.sinks.HDFS.type = hdfs
> agent1.sinks.HDFS.hdfs.path = hdfs://107.101.199.29:54310/user/hadoop-node1/flume-test/
> agent1.sinks.HDFS.hdfs.fileType = DataStream
> 
> 
> # Finally, now that we've defined all of our components, tell
> # agent1 which ones we want to activate.
> agent1.channels = ch1
> agent1.sources = avro-source1
> agent1.sinks = HDFS
> 
> 2) It is a single-node Hadoop setup. The commands executed for Flume are:
>             bin/flume-ng agent --conf ./conf/ -f conf/flumeAVRO_HDFS.conf -n agent1
>             (in one console)
>             bin/flume-ng avro-client --conf conf -H 107.108.199.29 -p 41414 -F /home/hadoop-node1/Desktop/my.txt
>             (in the other console, for the Avro client) [also tried the command with -H localhost, but got the same error]
> 
> 3) Hadoop version 0.20 is used.
> 
> 
> 
> 
> With Regards,
> Amit
> 
> 
> Attachments: 
> - error.log
> 


