hadoop-hdfs-user mailing list archives

From Hadoop Explorer <hadoopexplo...@outlook.com>
Subject Will an application with two maps but no reduce be suitable for Hadoop?
Date Thu, 18 Apr 2013 11:49:24 GMT
I have an application that evaluates a graph using this algorithm:

- use a parallel for loop to evaluate all nodes in the graph (to evaluate a node, an image is
read, and then the result of that node is calculated)

- use a second parallel for loop to evaluate all edges in the graph.  The function takes the
results from both endpoint nodes of an edge and then calculates the answer for that edge
(roughly sketched below)
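
Concretely, I picture each phase as its own Mapper, something like the sketch below (class
names, record formats, and the evaluate* helpers are just placeholders, not working code):

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Phase 1: input is one line per node; the mapper reads that node's image
// and emits (nodeId, nodeResult).
class NodeEvalMapper extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable offset, Text nodeLine, Context context)
            throws IOException, InterruptedException {
        String nodeId = nodeLine.toString().trim();
        String nodeResult = evaluateNode(nodeId);   // placeholder: load image, compute result
        context.write(new Text(nodeId), new Text(nodeResult));
    }

    private String evaluateNode(String nodeId) {
        return "";  // placeholder for the real image-based evaluation
    }
}

// Phase 2: input is one line per edge ("nodeA<TAB>nodeB"); the mapper combines the two
// node results and emits (edgeId, edgeResult).
class EdgeEvalMapper extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable offset, Text edgeLine, Context context)
            throws IOException, InterruptedException {
        String[] nodes = edgeLine.toString().split("\t");
        // placeholder: needs the two node results from the phase-1 output,
        // which is exactly the data-transfer question I ask below
        String edgeResult = evaluateEdge(nodes[0], nodes[1]);
        context.write(new Text(nodes[0] + "-" + nodes[1]), new Text(edgeResult));
    }

    private String evaluateEdge(String nodeA, String nodeB) {
        return "";  // placeholder
    }
}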


As you can see, the above algorithm would use two map functions but no reduce function.
The total data size can be very large (say 100 GB).  Also, the workload of each node and each
edge is highly irregular, so a load-balancing mechanism is essential.
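
If Hadoop does fit, I assume the driver would just chain two map-only jobs, setting the number
of reducers to zero so the map output is written straight to HDFS; something like this sketch
(the paths and class names are only placeholders):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class GraphEvalDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Job 1: evaluate every node (map-only).
        Job nodeJob = Job.getInstance(conf, "evaluate nodes");
        nodeJob.setJarByClass(GraphEvalDriver.class);
        nodeJob.setMapperClass(NodeEvalMapper.class);
        nodeJob.setNumReduceTasks(0);               // no reduce phase at all
        nodeJob.setOutputKeyClass(Text.class);
        nodeJob.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(nodeJob, new Path("/graph/nodes"));          // placeholder path
        FileOutputFormat.setOutputPath(nodeJob, new Path("/graph/node-results")); // placeholder path
        if (!nodeJob.waitForCompletion(true)) System.exit(1);

        // Job 2: evaluate every edge (map-only), using the node results produced above.
        Job edgeJob = Job.getInstance(conf, "evaluate edges");
        edgeJob.setJarByClass(GraphEvalDriver.class);
        edgeJob.setMapperClass(EdgeEvalMapper.class);
        edgeJob.setNumReduceTasks(0);
        edgeJob.setOutputKeyClass(Text.class);
        edgeJob.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(edgeJob, new Path("/graph/edges"));          // placeholder path
        FileOutputFormat.setOutputPath(edgeJob, new Path("/graph/edge-results")); // placeholder path
        System.exit(edgeJob.waitForCompletion(true) ? 0 : 1);
    }
}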

In this case, will Hadoop suit this application?  If so, what will the architecture of my program
look like?  And will Hadoop be able to strike a balance between good load balancing of the
second map function and minimizing the transfer of the results from the first map function?

