hadoop-hdfs-user mailing list archives

From Peyman Mohajerian <mohaj...@gmail.com>
Subject Re: Realtime sensor's tcpip data to hadoop
Date Wed, 07 May 2014 20:18:06 GMT
Flume is not just for log files; you can wire up a Flume source for this
purpose. There are also alternative open-source solutions for data
streaming, e.g. Apache Storm or Kafka.
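
For example, here is a minimal single-agent sketch, assuming Flume 1.x with
the bundled syslogtcp source and HBase sink. The agent name, port, table,
and column family below are placeholders, not a definitive setup:

# Hypothetical agent "a1": TCP source -> memory channel -> HBase sink
a1.sources = tcp-in
a1.channels = mem
a1.sinks = hbase-out

# Listen for newline-delimited sensor records on TCP port 5140
a1.sources.tcp-in.type = syslogtcp
a1.sources.tcp-in.host = 0.0.0.0
a1.sources.tcp-in.port = 5140
a1.sources.tcp-in.channels = mem

# Buffer events in memory (switch to a file channel if durability matters)
a1.channels.mem.type = memory
a1.channels.mem.capacity = 10000

# Write each event into an HBase table (table/columnFamily are illustrative)
a1.sinks.hbase-out.type = hbase
a1.sinks.hbase-out.table = sensor_data
a1.sinks.hbase-out.columnFamily = d
a1.sinks.hbase-out.serializer = org.apache.flume.sink.hbase.SimpleHbaseEventSerializer
a1.sinks.hbase-out.channel = mem

You would start it with something like:
flume-ng agent --conf conf --conf-file sensor.conf --name a1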


On Tue, May 6, 2014 at 10:48 PM, Alex Lee <eliyart@hotmail.com> wrote:

> Sensors may send TCP/IP data to a server. Each sensor may stream its data
> to the server continuously, and both the number of sensors and the data
> rate are high.
>
> Firstly, how can the data from TCP/IP be put into Hadoop? It needs some
> processing and then storage in HBase. Does it have to be saved to data
> files first and then loaded into Hadoop, or can it be ingested directly
> from TCP/IP? Is there any software module that can take care of this?
> Searching suggested that Ganglia, Nagios, and Flume may do it, but on
> closer inspection, Ganglia and Nagios are more for monitoring the Hadoop
> cluster itself, and Flume seems to be for log files.
>
> Secondly, if the total network traffic from the sensors exceeds the
> capacity of one LAN port, how can the load be shared? Is there any
> component in Hadoop that handles this automatically?
>
> Any suggestions, thanks.
>
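
On the second question: Hadoop itself does not balance ingest traffic, but
if you go with Flume you can tier agents and spread events across several
collector machines with a load-balancing sink group. A minimal sketch,
assuming two collectors; the hostnames and port are placeholders:

# Hypothetical first-tier agent fanning events out across two collectors
a1.sinks = s1 s2
a1.sinkgroups = g1
a1.sinkgroups.g1.sinks = s1 s2
a1.sinkgroups.g1.processor.type = load_balance
a1.sinkgroups.g1.processor.selector = round_robin
a1.sinkgroups.g1.processor.backoff = true

# Avro sinks forwarding to downstream collector agents (placeholder hosts)
a1.sinks.s1.type = avro
a1.sinks.s1.hostname = collector1.example.com
a1.sinks.s1.port = 4141
a1.sinks.s1.channel = mem

a1.sinks.s2.type = avro
a1.sinks.s2.hostname = collector2.example.com
a1.sinks.s2.port = 4141
a1.sinks.s2.channel = mem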
