accumulo-user mailing list archives

From Keith Turner <>
Subject Re: Using Accumulo as input to a MapReduce job frequently hangs due to lost Zookeeper connection
Date Tue, 21 Aug 2012 12:23:31 GMT
Yeah, that would certainly work.

You could run two map-only jobs (which could run concurrently): one job that
reads D1 and writes to Table3, and another that reads D2 and writes to
Table3.  MapReduce may be faster, unless you want the final result in
Accumulo, in which case this approach may be faster.  The two MapReduce jobs
could also produce files to bulk import into Table3.
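The two-map-only-jobs idea above can be sketched outside of Hadoop with plain dictionaries. Everything here is a hypothetical stand-in: D1, D2, and Table3 are only the names used in this thread, and the sample data and `map_only_job` helper are invented for illustration; a real job would read and write through Accumulo's MapReduce input/output formats rather than Python dicts.

```python
# Hypothetical source datasets, keyed by a shared foreign key.
D1 = {"fk1": {"name": "alice"}, "fk2": {"name": "bob"}}
D2 = {"fk1": {"score": 10}, "fk2": {"score": 7}}


def map_only_job(source, column_family, table):
    """Emulates one map-only job: one output row per input record,
    written under the foreign key so both jobs share row IDs."""
    for foreign_key, columns in source.items():
        row = table.setdefault(foreign_key, {})
        # Namespace each job's output under its own column family,
        # mirroring how two jobs can safely share one Accumulo table.
        row[column_family] = dict(columns)


# The two "jobs" can run in either order (or concurrently);
# they write disjoint column families of the same rows.
Table3 = {}
map_only_job(D1, "d1", Table3)
map_only_job(D2, "d2", Table3)

# Scanning the materialized view now yields the joined records
# without doing a join at query time.
for row_id in sorted(Table3):
    print(row_id, Table3[row_id])
```

The point of the sketch is that neither job needs to see the other's data: each writes its columns under the shared row ID, and the "join" happens for free when a scan reads the whole row.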


On Mon, Aug 20, 2012 at 8:26 PM, David Medinets
<> wrote:
> Can you use a new table to join and then scan the new table? Use the foreign
> key as the rowid. Basically create your own materialized view.
