hadoop-hdfs-user mailing list archives

From alex kamil <alex.ka...@gmail.com>
Subject Re: Realtime sensor's tcpip data to hadoop
Date Thu, 15 May 2014 01:22:25 GMT
or you can use a combination of Kafka <http://kafka.apache.org/> +
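Since the Kafka suggestion above is cut off, here is a hedged sketch of how sensor readings are typically keyed for Kafka: using the sensor ID as the message key makes Kafka hash all of one sensor's readings to the same partition, preserving per-sensor ordering. The wire format `sensor_id,timestamp,value` and the function name are assumptions, not from the thread, and the actual producer call is shown only as a comment because it needs a running broker.

```python
# Sketch: turn one line of TCP sensor input into a keyed Kafka record.
# Assumed wire format: "sensor_id,timestamp,value" (not specified in the thread).

def to_kafka_record(line):
    """Parse a sensor line into (key, value) bytes for a Kafka producer.

    Keying by sensor_id keeps each sensor's readings in one partition,
    so their relative order is preserved.
    """
    sensor_id, timestamp, value = line.strip().split(",")
    key = sensor_id.encode("utf-8")
    value_bytes = ("%s,%s" % (timestamp, value)).encode("utf-8")
    return key, value_bytes

# With a running broker this could then be sent via a Kafka client, e.g.:
#   from kafka import KafkaProducer                      # kafka-python
#   producer = KafkaProducer(bootstrap_servers="broker:9092")
#   producer.send("sensor-readings", key=key, value=value_bytes)

if __name__ == "__main__":
    k, v = to_kafka_record("s42,1400115745,23.7\n")
    print(k, v)
```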

On Wed, May 7, 2014 at 8:55 PM, Azuryy Yu <azuryyyu@gmail.com> wrote:

> Hi Alex,
> you can try Apache Flume.
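To make the Flume suggestion concrete, a minimal agent sketch that listens on a TCP port and writes into an HBase table might look like the following. The port, table name, and column family are placeholders, and at a high sensor rate the memory channel would likely need tuning (or replacing with a file channel for durability):

```properties
# Hypothetical Flume agent: TCP in -> memory channel -> HBase out
a1.sources = r1
a1.channels = c1
a1.sinks = k1

a1.sources.r1.type = syslogtcp
a1.sources.r1.host = 0.0.0.0
a1.sources.r1.port = 5140

a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000

a1.sinks.k1.type = hbase
a1.sinks.k1.table = sensor_data
a1.sinks.k1.columnFamily = d
a1.sinks.k1.serializer = org.apache.flume.sink.hbase.SimpleHbaseEventSerializer

a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

This avoids the detour through intermediate files: events go from the TCP source through the channel straight into HBase.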
> On Wed, May 7, 2014 at 10:48 AM, Alex Lee <eliyart@hotmail.com> wrote:
>> Sensors may send TCP/IP data to the server. Each sensor may send its data
>> as a stream, and both the number of sensors and the data rate are high.
>> Firstly, how can the data arriving over TCP/IP be put into Hadoop? It
>> needs some processing and then storage in HBase. Does it have to be saved
>> to data files first and then loaded into Hadoop, or can it go in directly
>> from TCP/IP? Is there a software module that can take care of this?
>> Searching suggested Ganglia, Nagios, and Flume, but on closer inspection
>> Ganglia and Nagios are more for monitoring the Hadoop cluster itself, and
>> Flume is for log files.
>> Secondly, if the total network traffic from the sensors exceeds the
>> capacity of one LAN port, how can the load be shared? Is there any
>> component in Hadoop that handles this automatically?
>> Any suggestions? Thanks.
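On the second question: neither HDFS nor HBase balances ingest network load by itself; the usual approach is to run several collector nodes and partition the sensor streams across them, which is what Kafka partitions (or multiple Flume agents behind a load balancer) provide. A minimal sketch of deterministic hash partitioning, with made-up collector host names:

```python
# Sketch: spread sensor streams across several ingest nodes so no single
# LAN port carries all the traffic. Host names are hypothetical.
import hashlib

INGEST_NODES = ["collector-1", "collector-2", "collector-3"]

def node_for(sensor_id, nodes=INGEST_NODES):
    """Deterministically map a sensor to one ingest node.

    Hashing the sensor ID keeps each sensor's whole stream on a single
    node (preserving per-sensor ordering) while spreading sensors
    roughly evenly across nodes.
    """
    digest = hashlib.md5(sensor_id.encode("utf-8")).digest()
    index = int.from_bytes(digest[:4], "big") % len(nodes)
    return nodes[index]

if __name__ == "__main__":
    for sid in ["s1", "s2", "s3", "s4"]:
        print(sid, "->", node_for(sid))
```

Because the mapping depends only on the sensor ID, every sender and receiver can compute it independently without coordination.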
