hbase-user mailing list archives

From Jean-Marc Spaggiari <jean-m...@spaggiari.org>
Subject Re: Job MapReduce to populate HBase Table
Date Mon, 13 Apr 2015 11:52:50 GMT
Oh, Shahab is right! That's what happens when you write emails before your
coffee ;) I got confused with your "Put" key ;) Looked too quickly...

JM

2015-04-13 7:46 GMT-04:00 Shahab Yunus <shahab.yunus@gmail.com>:

> For the null key you should use the NullWritable class, as discussed here:
>
> http://stackoverflow.com/questions/16198752/advantages-of-using-nullwritable-in-hadoop
>
> Regards,
> Shahab
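
(A minimal sketch of the NullWritable pattern referenced above, assuming the
reducer's output key type is changed to NullWritable; TableOutputFormat only
persists the Put, so the key mainly has to satisfy the compiler:)

public static class Reduce extends TableReducer<Text, BytesWritable, NullWritable> {
  @Override
  protected void reduce(Text key, Iterable<BytesWritable> values, Context context)
      throws IOException, InterruptedException {
    Put put = new Put(Bytes.toBytes(key.toString()));
    // ... add columns to the Put here ...
    // NullWritable.get() returns the singleton instance, so no raw null is passed
    context.write(NullWritable.get(), put);
  }
}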
>
> On Mon, Apr 13, 2015 at 7:01 AM, Jean-Marc Spaggiari <
> jean-marc@spaggiari.org> wrote:
>
> > Hi Silvio,
> >
> > What is the key you are trying to write into your HBase table? From your
> > code, it sounds like you want your key to be null for all your values,
> > which is not possible in HBase.
> >
> > JM
> >
> > 2015-04-13 6:37 GMT-04:00 Silvio Di gregorio <silvio.digregorio@gmail.com>:
> >
> > > Hi,
> > > In the Reduce phase, when I write to the HBase table "PFTableNa":
> > > context.write(null, put);
> > > Eclipse tells me:
> > > "The method write(ImmutableBytesWritable, Writable) in the type
> > > TaskInputOutputContext<Text,BytesWritable,ImmutableBytesWritable,Writable>
> > > is not applicable for the arguments (null, Put)"
> > >
> > > put is org.apache.hadoop.hbase.client.Put, constructed with Put(byte[] row):
> > > byte[] rowkey = key.getBytes();
> > > Put put = new Put(rowkey);
> > >
> > > the signature of the reduce method:
> > > reduce(Text key, Iterable<BytesWritable> values, Context context)
> > >
> > > and
> > >
> > > public static class Reduce extends TableReducer<Text, BytesWritable,
> > > ImmutableBytesWritable>{
> > >
> > > in the main method:
> > > Configuration conf = HBaseConfiguration.create();
> > > Job job = new Job(conf, "LetturaFileHDFS2HBase");
> > > ...
> > > TableMapReduceUtil.initTableReducerJob("PFTableNa", Reduce.class, job);
> > >
> > > Thanks a lot
> > > Silvio
> > >
> >
>
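
(For completeness, a minimal sketch of the reduce method keeping the declared
output key type ImmutableBytesWritable from the question, with the row key
wrapped so the call matches write(ImmutableBytesWritable, ...); the column
family "cf" and qualifier "q" are placeholders, and a consistent set of HBase
client/mapreduce jars is assumed:)

public static class Reduce
    extends TableReducer<Text, BytesWritable, ImmutableBytesWritable> {
  @Override
  protected void reduce(Text key, Iterable<BytesWritable> values, Context context)
      throws IOException, InterruptedException {
    // Re-encode the Text key; Text.getBytes() may return a padded buffer
    byte[] rowkey = Bytes.toBytes(key.toString());
    Put put = new Put(rowkey);
    for (BytesWritable value : values) {
      // "cf"/"q" are placeholder column family and qualifier names
      put.add(Bytes.toBytes("cf"), Bytes.toBytes("q"),
              java.util.Arrays.copyOfRange(value.getBytes(), 0, value.getLength()));
    }
    // TableOutputFormat ignores the output key; the wrapped row key just satisfies the signature
    context.write(new ImmutableBytesWritable(rowkey), put);
  }
}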
