hadoop-hdfs-user mailing list archives

From Harsh J <ha...@cloudera.com>
Subject Re: Realtime sensor's tcpip data to hadoop
Date Sun, 11 May 2014 11:44:15 GMT
Apache Flume isn't just 'for log files' - it is an event collection
framework and would fit your use-case.

For further questions over Flume, please ask on user@flume.apache.org.
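As a rough illustration of how Flume can receive TCP events directly and land them in HBase, here is a minimal single-agent configuration sketch. It assumes Flume 1.x with the built-in `syslogtcp` source and HBase sink; the agent name, table name, column family, and port are all hypothetical placeholders, not values from this thread.

```properties
# Hypothetical Flume 1.x agent: TCP source -> memory channel -> HBase sink
agent1.sources = tcpSrc
agent1.channels = memCh
agent1.sinks = hbaseSink

# Listen for events arriving over TCP (syslog framing assumed here)
agent1.sources.tcpSrc.type = syslogtcp
agent1.sources.tcpSrc.host = 0.0.0.0
agent1.sources.tcpSrc.port = 5140
agent1.sources.tcpSrc.channels = memCh

# Buffer events in memory between source and sink
agent1.channels.memCh.type = memory
agent1.channels.memCh.capacity = 10000

# Write each event into an HBase table (names are placeholders)
agent1.sinks.hbaseSink.type = hbase
agent1.sinks.hbaseSink.table = sensor_events
agent1.sinks.hbaseSink.columnFamily = d
agent1.sinks.hbaseSink.channel = memCh
```

If the sensors do not speak syslog, a custom source or an Avro/Thrift client writing to an `avro` source would play the same role; the channel-and-sink wiring stays the same.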

On Wed, May 7, 2014 at 8:18 AM, Alex Lee <eliyart@hotmail.com> wrote:
> Sensors may send TCP/IP data to the server. Each sensor may send TCP/IP data
> as a stream to the server; the number of sensors and the data rate are both
> high.
> Firstly, how can the data arriving over TCP/IP be put into Hadoop? It needs
> some processing and should be stored in HBase. Does it need to be saved to
> data files first and then put into Hadoop, or can this be done directly from
> TCP/IP? Is there any software module that can take care of this? I found that
> Ganglia, Nagios, and Flume may do it, but on looking into the details,
> Ganglia and Nagios are more for monitoring the Hadoop cluster itself, and
> Flume is for log files.
> Secondly, if the total network traffic from the sensors exceeds the limit of
> one LAN port, how can the load be shared? Is there any component in Hadoop
> that handles this automatically?
> Any suggestions? Thanks.
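On the load-sharing question quoted above, Flume itself (rather than Hadoop) offers one answer: a first-tier agent can spread its traffic across several collector agents using a sink group with the `load_balance` sink processor. A sketch, assuming Flume 1.x and two hypothetical collector hosts:

```properties
# Hypothetical tier-1 agent balancing across two collector agents
agent1.sinks = avroSink1 avroSink2
agent1.sinkgroups = g1
agent1.sinkgroups.g1.sinks = avroSink1 avroSink2
agent1.sinkgroups.g1.processor.type = load_balance
agent1.sinkgroups.g1.processor.selector = round_robin

# Each sink forwards events to a collector's Avro source
agent1.sinks.avroSink1.type = avro
agent1.sinks.avroSink1.hostname = collector1.example.com
agent1.sinks.avroSink1.port = 4141
agent1.sinks.avroSink1.channel = memCh

agent1.sinks.avroSink2.type = avro
agent1.sinks.avroSink2.hostname = collector2.example.com
agent1.sinks.avroSink2.port = 4141
agent1.sinks.avroSink2.channel = memCh
```

With `round_robin` selection, events are distributed evenly across the collectors, so no single LAN port carries the whole stream; the processor also fails over to a healthy sink if one collector goes down.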

Harsh J
