hadoop-common-user mailing list archives

From Ted Dunning <tdunn...@maprtech.com>
Subject Re: breadth-first search
Date Tue, 21 Dec 2010 05:02:48 GMT
On Mon, Dec 20, 2010 at 8:16 PM, Peng, Wei <Wei.Peng@xerox.com> wrote:

> ... My question is really about what is the efficient way for graph
> computation, matrix computation, algorithms that need many iterations to
> converge (with intermediate results).
>

Large graph computations usually assume a sparse graph, for historical
reasons.  A key property of scalable algorithms is that time and space are
linear in the input size.  Most all-pairs path algorithms are not linear,
because the result is n x n and dense.
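To make the contrast concrete, here is a minimal sketch (not from the
original thread; names and the toy graph are my own): a single-source
breadth-first search over an adjacency list runs in O(V + E), linear for a
sparse graph, whereas materializing distances for every pair of nodes is
inherently O(V^2) output.

```python
from collections import deque

def bfs_distances(adj, source):
    """Breadth-first distances from `source` over an adjacency-list graph.

    Runs in O(V + E) time and space -- linear in the input for a sparse
    graph.  Doing this for every source gives an n x n dense result,
    which is why all-pairs computations do not scale linearly.
    """
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# A small sparse graph as an adjacency list.
adj = {0: [1, 2], 1: [3], 2: [3], 3: [4]}
print(bfs_distances(adj, 0))  # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}
```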

Some graph path computations can be done indirectly by spectral methods.
With good random projection algorithms for sparse matrix decomposition,
approximate versions of some of these algorithms can be phrased in a
scalable fashion.  It isn't an easy task, however.
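The core trick behind such spectral approaches is that each step touches the
matrix only through sparse matrix-vector products, which cost O(E).  A toy
illustration (my own sketch, not code from the thread) is power iteration
to approximate the dominant eigenvector of a graph's adjacency matrix:

```python
import math
import random

def power_iteration(adj, n, iters=100, seed=0):
    """Approximate the dominant eigenvector of a symmetric adjacency
    matrix given in adjacency-list form.

    Each iteration is one sparse matvec, O(E) work -- the building block
    that lets spectral methods scale to large sparse graphs.
    """
    rng = random.Random(seed)
    v = [rng.random() for _ in range(n)]
    for _ in range(iters):
        w = [0.0] * n
        for u, nbrs in adj.items():
            for x in nbrs:
                w[x] += v[u]          # accumulate (A v)[x] edge by edge
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]     # renormalize to unit length
    return v

# Undirected 4-cycle with a chord between nodes 1 and 3; both edge
# directions are listed.  The dominant (Perron) eigenvector puts more
# weight on the better-connected nodes 1 and 3.
adj = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}
vec = power_iteration(adj, 4)
```

The randomized-projection decompositions Ted alludes to (e.g. randomized
SVD) build on the same primitive, replacing the single vector with a small
random block to capture several leading directions at once.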


> HAMA looks like a very good solution, but can we use it now and how to
> use it?
>
>
I don't think that Hama has produced any usable software yet.
