accumulo-user mailing list archives

From Huanchen Zhang <iamzhan...@gmail.com>
Subject Re: Difference between InsertWithBatchWriter and InsertWithOutputFormat
Date Wed, 17 Oct 2012 02:48:43 GMT
Hello, Corey,

Thank you for your answer.

Can I use InsertWithBatchWriter for this task? I mean, use context.write to write to HDFS,
and use batchwriter.addMutation to write to Accumulo.
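
For concreteness, a rough sketch of that hybrid approach could look like the Reducer below: it opens its own BatchWriter in setup() and still uses context.write for the job's normal HDFS output. The instance, zookeepers, table, and credentials are placeholders, and the Connector/createBatchWriter calls are shown with the older (1.4-style) signatures, which may differ in other versions:

import java.io.IOException;

import org.apache.accumulo.core.client.BatchWriter;
import org.apache.accumulo.core.client.Connector;
import org.apache.accumulo.core.client.Instance;
import org.apache.accumulo.core.client.MutationsRejectedException;
import org.apache.accumulo.core.client.ZooKeeperInstance;
import org.apache.accumulo.core.data.Mutation;
import org.apache.accumulo.core.data.Value;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

public class HybridWriteReducer extends Reducer<Text, Text, Text, Text> {

  private BatchWriter writer;

  @Override
  protected void setup(Context context) throws IOException, InterruptedException {
    try {
      // Placeholder instance name, zookeepers, credentials, and table.
      Instance instance = new ZooKeeperInstance("myInstance", "zkhost:2181");
      Connector connector = instance.getConnector("user", "secret".getBytes());
      // 10 MB buffer, 10 s max latency, 4 write threads.
      writer = connector.createBatchWriter("myTable", 10000000L, 10000L, 4);
    } catch (Exception e) {
      throw new IOException(e);
    }
  }

  @Override
  protected void reduce(Text key, Iterable<Text> values, Context context)
      throws IOException, InterruptedException {
    for (Text value : values) {
      // Side-channel write to Accumulo through the BatchWriter.
      Mutation m = new Mutation(key);
      m.put(new Text("cf"), new Text("cq"), new Value(value.toString().getBytes()));
      try {
        writer.addMutation(m);
      } catch (MutationsRejectedException e) {
        throw new IOException(e);
      }
      // Normal write to HDFS through the job's configured output format
      // (e.g. TextOutputFormat).
      context.write(key, value);
    }
  }

  @Override
  protected void cleanup(Context context) throws IOException, InterruptedException {
    try {
      writer.close();
    } catch (MutationsRejectedException e) {
      throw new IOException(e);
    }
  }
}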

Huanchen

On Oct 16, 2012, at 10:25 PM, Corey Nolet wrote:

> You can extend the output format to write to both and have the resulting record writer
> underneath write to the correct endpoint depending on the items submitted from the job.
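
As a rough illustration of that suggestion (the DualOutputFormat class name, the HDFS output path, and the routing rule of sending Mutation values to Accumulo and everything else to a plain HDFS file are all hypothetical choices here, not an existing Accumulo API), a composite output format might look like:

import java.io.IOException;

import org.apache.accumulo.core.client.mapreduce.AccumuloOutputFormat;
import org.apache.accumulo.core.data.Mutation;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.OutputCommitter;
import org.apache.hadoop.mapreduce.OutputFormat;
import org.apache.hadoop.mapreduce.RecordWriter;
import org.apache.hadoop.mapreduce.TaskAttemptContext;

public class DualOutputFormat extends OutputFormat<Text, Writable> {

  private final AccumuloOutputFormat accumuloFormat = new AccumuloOutputFormat();

  @Override
  public RecordWriter<Text, Writable> getRecordWriter(TaskAttemptContext context)
      throws IOException, InterruptedException {
    // Writer for the Accumulo side (key = table name, value = Mutation).
    final RecordWriter<Text, Mutation> accumuloWriter = accumuloFormat.getRecordWriter(context);
    // Plain HDFS file for the non-Accumulo side (placeholder path, one file per task attempt).
    FileSystem fs = FileSystem.get(context.getConfiguration());
    final FSDataOutputStream hdfsOut =
        fs.create(new Path("/tmp/job-output/part-" + context.getTaskAttemptID()));

    return new RecordWriter<Text, Writable>() {
      @Override
      public void write(Text key, Writable value) throws IOException, InterruptedException {
        if (value instanceof Mutation) {
          // Route Mutations to Accumulo.
          accumuloWriter.write(key, (Mutation) value);
        } else {
          // Route everything else to the HDFS file as tab-separated text.
          hdfsOut.writeBytes(key + "\t" + value + "\n");
        }
      }

      @Override
      public void close(TaskAttemptContext ctx) throws IOException, InterruptedException {
        accumuloWriter.close(ctx);
        hdfsOut.close();
      }
    };
  }

  @Override
  public void checkOutputSpecs(JobContext context) throws IOException, InterruptedException {
    accumuloFormat.checkOutputSpecs(context);
  }

  @Override
  public OutputCommitter getOutputCommitter(TaskAttemptContext context)
      throws IOException, InterruptedException {
    return accumuloFormat.getOutputCommitter(context);
  }
}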
> 
> On Oct 16, 2012, at 10:16 PM, Huanchen Zhang wrote:
> 
>> Hello,
>> 
>> Here I have a MapReduce job which needs to write to Accumulo. I checked the examples.
>> It seems there are two different ways to write to Accumulo: one is InsertWithBatchWriter,
>> the other is InsertWithOutputFormat.
>> 
>> So, what is the difference between them? Which one should I choose?
>> 
>> I actually need to write to Accumulo and HDFS in the same job. It seems InsertWithOutputFormat
>> cannot do this, because it needs to set the job's output format to "AccumuloOutputFormat.class",
>> and so can only write to Accumulo in that job, right?
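
(For reference, the setup being referred to looks roughly like the following. The instance, credential, and table values are placeholders, and the static configuration methods are shown in their older (1.4-style) form, so exact signatures vary between Accumulo versions.)

Job job = new Job(conf, "insert with AccumuloOutputFormat");
job.setOutputFormatClass(AccumuloOutputFormat.class);
AccumuloOutputFormat.setZooKeeperInstance(job.getConfiguration(), "myInstance", "zkhost:2181");
AccumuloOutputFormat.setOutputInfo(job.getConfiguration(), "user", "secret".getBytes(),
    true /* createTables */, "myTable");
// With this configuration, every context.write(tableName, mutation) goes through
// AccumuloOutputFormat, so the job by itself has no second (HDFS) output path.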
>> 
>> Thank you.
>> 
>> Best,
>> Huanchen
> 

