hadoop-user mailing list archives

From Narasingu Ramesh <ramesh.narasi...@gmail.com>
Subject Re: how to execute different tasks on data nodes(simultaneously in hadoop).
Date Tue, 04 Sep 2012 08:03:52 GMT
Hi Users,
              Hadoop distributes all the data across HDFS, and inside MapReduce
the tasks then work on it together. Which task goes to which data node, and how
it all runs, is something the framework maintains itself: each task gets its own
JVM on its data node. Those JVMs let a huge amount of data be processed across
all the data nodes, with each task handling its own part, which simplifies each task.
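A minimal sketch, in Hadoop Streaming style, of the pattern the replies below recommend: every mapper applies the whole filter set, but only to the slice of the input that is local to its node. The two filter functions are hypothetical stand-ins, not anything from the original thread.

```python
# Sketch: each mapper runs ALL filters over its local data slice.
# The filters here are hypothetical placeholders.
import sys

def filter_a(record):
    # Hypothetical filter: keep records containing "foo".
    return "foo" in record

def filter_b(record):
    # Hypothetical filter: keep records longer than ten characters.
    return len(record) > 10

FILTERS = {"A": filter_a, "B": filter_b}

def map_record(record):
    """Emit a (filter_name, record) pair for every filter that matches."""
    return [(name, record) for name, f in FILTERS.items() if f(record)]

# In a real streaming job the mapper would loop over sys.stdin and
# print one tab-separated pair per matching filter, e.g.:
#   for line in sys.stdin:
#       for name, rec in map_record(line.rstrip("\n")):
#           print(name + "\t" + rec)
```

Because every node knows every filter, Hadoop is free to schedule map tasks wherever the data already lives, which is what gives the sequential-read and data-locality benefits mentioned below.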
Thanks & Regards,
Ramesh.Narasingu

On Mon, Sep 3, 2012 at 11:11 PM, Bertrand Dechoux <dechouxb@gmail.com> wrote:

> Hi,
>
> Assuming you have to compute these values for every RGB pixel.
> Why couldn't you compute all these values at the same time on the same
> node?
>
> Hadoop lets you distribute your computation, but it doesn't mean each node
> has to compute only a part of the equations.
> Each node can compute all equations but for a 'small' part of the data.
> That's Hadoop's strategy. That way, sequential reads and data locality will
> improve your performance.
>
> Regards
>
> Bertrand
>
>
> On Mon, Sep 3, 2012 at 6:35 PM, mallik arjun <mallik.cloud@gmail.com> wrote:
>
>> [image: Inline image 1]
>>
>> On Mon, Sep 3, 2012 at 10:01 PM, Bertrand Dechoux <dechouxb@gmail.com> wrote:
>>
>>> You can check the value of "map.input.file" in order to apply
>>> different logic to each type of file (in the mapper).
>>> More information about your problem/context would help readers
>>> provide a more extensive reply.
>>>
>>> Regards
>>>
>>> Bertrand
>>>
>>> each data node has to process one of the above equations simultaneously.
>>
>>
>>> On Mon, Sep 3, 2012 at 6:25 PM, Michael Segel <michael_segel@hotmail.com
>>> > wrote:
>>>
>>>> Not sure what you are trying to do...
>>>>
>>>> You want to pass through the entire data set on all nodes where each
>>>> node runs a single filter?
>>>>
>>>> Your thinking is orthogonal to how Hadoop works.
>>>>
>>>> You would be better off letting each node run the entire filter set
>>>> on the portion of the data which is local to that node.
>>>>
>>>>
>>>> On Sep 3, 2012, at 11:19 AM, mallik arjun <mallik.cloud@gmail.com>
>>>> wrote:
>>>>
>>>> > Generally, in Hadoop the map function is executed by all the data
>>>> nodes on the input data set. Against this, how can I do the following?
>>>> > I have some filter programs, and what I want is for each data
>>>> node (slave) to execute one filter algorithm simultaneously, different
>>>> from what the other data nodes execute.
>>>> >
>>>> > Thanks in advance.
>>>> >
>>>> >
>>>>
>>>>
>>>
>>>
>>> --
>>> Bertrand Dechoux
>>>
>>
>>
>
>
> --
> Bertrand Dechoux
>
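Bertrand's "map.input.file" suggestion above can be sketched in Hadoop Streaming terms: streaming exposes job properties to the mapper process as environment variables with dots replaced by underscores, so a Python mapper can read `map_input_file` and branch on it. The per-type handlers below are hypothetical placeholders, not part of the original thread.

```python
# Sketch: pick per-file-type logic inside a streaming mapper by
# inspecting the map.input.file property, which Hadoop Streaming
# exports to the mapper as the env var map_input_file.
import os

def pick_logic(input_file):
    """Choose a processing function based on the input file's name."""
    if input_file.endswith(".png"):
        return lambda rec: "image:" + rec   # hypothetical image filter
    if input_file.endswith(".txt"):
        return lambda rec: "text:" + rec    # hypothetical text filter
    return lambda rec: rec                  # default: pass through

def current_logic():
    # Inside a streaming job Hadoop sets this variable; outside one it
    # is simply absent, so default to an empty string.
    return pick_logic(os.environ.get("map_input_file", ""))
```

Note that all mappers still run the same program; only the branch taken differs per input file, which keeps the job compatible with Hadoop's locality-based scheduling.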
