hbase-user mailing list archives

From Jean-Daniel Cryans <jdcry...@apache.org>
Subject Re: Single Job to put Data into Hbase+MySQL
Date Wed, 27 Oct 2010 21:16:31 GMT
Using the HBase API in your mapper:
http://hbase.apache.org/docs/current/api/org/apache/hadoop/hbase/client/HTable.html#put(java.util.List)
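For example, a rough sketch (not tested; it assumes a table named "logs" with a
column family "cf", names made up for illustration, and that the job's
Configuration already carries your HBase/ZooKeeper settings): open the HTable
once in setup(), buffer Puts in map(), flush them in cleanup(), and keep
writing the normal output through context.write() so your existing file + Sqoop
path to MySQL stays untouched.

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class DualSinkMapper extends Mapper<LongWritable, Text, Text, Text> {

  private HTable table;
  private List<Put> buffer = new ArrayList<Put>();

  @Override
  protected void setup(Context context) throws IOException {
    // Table name is hypothetical; the job's Configuration is assumed to
    // contain the HBase connection settings (hbase.zookeeper.quorum, etc.).
    table = new HTable(context.getConfiguration(), "logs");
  }

  @Override
  protected void map(LongWritable key, Text value, Context context)
      throws IOException, InterruptedException {
    String line = value.toString();

    // HBase side: build a Put and buffer it, flushing in batches
    // through HTable.put(List<Put>).
    Put put = new Put(Bytes.toBytes(key.get()));
    put.add(Bytes.toBytes("cf"), Bytes.toBytes("line"), Bytes.toBytes(line));
    buffer.add(put);
    if (buffer.size() >= 1000) {
      table.put(buffer);
      buffer.clear();
    }

    // File side: emit as usual so the job's OutputFormat still writes
    // the files that Sqoop later loads into MySQL.
    context.write(new Text(line), new Text(""));
  }

  @Override
  protected void cleanup(Context context) throws IOException {
    if (!buffer.isEmpty()) {
      table.put(buffer);
    }
    table.close();
  }
}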

J-D

On Wed, Oct 27, 2010 at 2:10 PM, Shuja Rehman <shujamughal@gmail.com> wrote:
> Actually, I am not using reducers; only mappers work for me.
> Secondly, the procedure so far has been: the mapper output is saved to files,
> which are then transferred to MySQL using Sqoop. So here I need to save the
> output to files and also send data to HBase. Suppose I use the output format
> to save data into files, then how do I send data to HBase from MapReduce
> manually?
>
> On Thu, Oct 28, 2010 at 1:47 AM, Jean-Daniel Cryans <jdcryans@apache.org>
> wrote:
>>
>> Do both insertions in your reducer by either not using the output
>> formats at all, or using one of them and doing the other insert by hand.
>>
>> J-D
>>
>> On Wed, Oct 27, 2010 at 1:44 PM, Shuja Rehman <shujamughal@gmail.com>
>> wrote:
>> > Hi Folks
>> >
>> > I am wondering if anyone has the answer to this question. I am processing
>> > log files using MapReduce and need to put some of the resulting data into
>> > MySQL and the rest into HBase. At the moment I am running two separate
>> > jobs to do this, so the same file is read twice to dump the data. My
>> > question is: is it possible to run a single job to achieve this?
>> >
>> > --
>> > Regards
>> > Shuja-ur-Rehman Baig
>> > http://pk.linkedin.com/in/shujamughal
>> > Cell: +92 3214207445
>> >
>
>
>
> --
> Regards
> Shuja-ur-Rehman Baig
> http://pk.linkedin.com/in/shujamughal
> Cell: +92 3214207445
>
