hbase-user mailing list archives

From yeshwanth kumar <yeshwant...@gmail.com>
Subject Re: writing to multiple hbase tables in a mapreduce job
Date Tue, 26 Aug 2014 18:29:03 GMT
Hi Ted,

I need to process the data in table i1 and then write the results to both
tables i1 and i2, so the input for the mapper in my MapReduce job comes from
HBase table i1, whereas WALPlayer's input is HLogInputFormat.

If I remove that call as you suggested and set the input format to
TableInputFormat myself, the job throws a "No table was provided" exception.
If I specify the input table via the call below,

TableMapReduceUtil.initTableMapperJob(otherArgs[0], scan,
    EntitySearcherMapper.class, ImmutableBytesWritable.class, Put.class,
    job); // otherArgs[0] = i1

the mapper never writes to the other table.
Any suggestions on how to resolve this issue?
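
For reference, here is the driver wiring I am aiming for, as a rough sketch
(conf and the job name are placeholders; everything else comes from my code
in the thread below). My guess is that the initTableReducerJob call was
switching the job back to a single-table TableOutputFormat pointed at i1,
which might also explain the NoSuchColumnFamily error:

Job job = new Job(conf, "entity-searcher");
job.setJarByClass(EntitySearcherMR.class);

Scan scan = new Scan();
scan.setCacheBlocks(false);

// keep table i1 as the mapper input; this sets TableInputFormat internally,
// which avoids the "No table was provided" exception
TableMapReduceUtil.initTableMapperJob(otherArgs[0], scan,
    EntitySearcherMapper.class, ImmutableBytesWritable.class, Put.class, job);

// set MultiTableOutputFormat last so nothing overrides it; each
// context.write(tableName, put) then routes to the table named by the key
job.setOutputFormatClass(MultiTableOutputFormat.class);
job.setNumReduceTasks(0); // map-only job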

Thanks,
Yeshwanth




On Tue, Aug 26, 2014 at 10:39 PM, Ted Yu <yuzhihong@gmail.com> wrote:

> Please take a look at WALPlayer.java in HBase, where you can find an example
> of how MultiTableOutputFormat is used.
>
> Cheers
>
>
> On Tue, Aug 26, 2014 at 10:04 AM, yeshwanth kumar <yeshwanth43@gmail.com>
> wrote:
>
> > Hi Ted,
> >
> > How can we initialize the mapper if I comment out those lines?
> >
> >
> >
> > On Tue, Aug 26, 2014 at 10:08 PM, Ted Yu <yuzhihong@gmail.com> wrote:
> >
> > > TableMapReduceUtil.initTableMapperJob(otherArgs[0], scan,
> > >     EntitySearcherMapper.class, ImmutableBytesWritable.class, Put.class,
> > >     job); // otherArgs[0] = i1
> > >
> > > You're initializing with table 'i1'.
> > > Please remove the above call and try again.
> > >
> > > Cheers
> > >
> > >
> > > On Tue, Aug 26, 2014 at 9:18 AM, yeshwanth kumar <yeshwanth43@gmail.com>
> > > wrote:
> > >
> > > > Hi, I am running HBase 0.94.20 on Hadoop 2.2.0.
> > > >
> > > > I am using MultiTableOutputFormat to write processed output to two
> > > > different tables in HBase.
> > > >
> > > > Here's the code snippet:
> > > >
> > > > private ImmutableBytesWritable tab_cr = new ImmutableBytesWritable(
> > > >     Bytes.toBytes("i1"));
> > > > private ImmutableBytesWritable tab_cvs = new ImmutableBytesWritable(
> > > >     Bytes.toBytes("i2"));
> > > >
> > > > @Override
> > > > public void map(ImmutableBytesWritable row, final Result value,
> > > >     final Context context) throws IOException, InterruptedException {
> > > >
> > > >   // -----------------------------------------
> > > >   Put pcvs = new Put(entry.getKey().getBytes());
> > > >   pcvs.add("cf".getBytes(), "type".getBytes(), column.getBytes());
> > > >   Put put = new Put(value.getRow());
> > > >   put.add("Entity".getBytes(), "json".getBytes(),
> > > >       entry.getValue().getBytes());
> > > >   context.write(tab_cr, put);   // table i1
> > > >   context.write(tab_cvs, pcvs); // table i2
> > > > }
> > > >
> > > > job.setJarByClass(EntitySearcherMR.class);
> > > > job.setMapperClass(EntitySearcherMapper.class);
> > > > job.setOutputFormatClass(MultiTableOutputFormat.class);
> > > > Scan scan = new Scan();
> > > > scan.setCacheBlocks(false);
> > > > TableMapReduceUtil.initTableMapperJob(otherArgs[0], scan,
> > > >     EntitySearcherMapper.class, ImmutableBytesWritable.class, Put.class,
> > > >     job); // otherArgs[0] = i1
> > > > TableMapReduceUtil.initTableReducerJob(otherArgs[0], null, job);
> > > > job.setNumReduceTasks(0);
> > > >
> > > > The MapReduce job fails with a NoSuchColumnFamilyException for "cf" in
> > > > table i1.
> > > > I am writing data to two different column families, one in each table;
> > > > "cf" belongs to table i2.
> > > > Do the column families have to be present in both tables?
> > > > Is there anything I am missing?
> > > > Can someone point me in the right direction?
> > > >
> > > > Thanks,
> > > > Yeshwanth.
> > > >
> > >
> >
>
