hbase-user mailing list archives

From Shahab Yunus <shahab.yu...@gmail.com>
Subject Re: Job MapReduce to populate HBase Table
Date Mon, 13 Apr 2015 13:11:03 GMT
Silvio, did your problem get resolved? I am assuming you have already
seen example 7.2.4 here:
http://hbase.apache.org/0.94/book/mapreduce.example.html

There seems to be a type mismatch in the job setup, along with a Hadoop
version issue.

If you still have an issue, can you paste your code? Both the job
configuration and your reducer. Thanks.
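The type mismatch discussed in this thread seems, as far as I can tell, to come from Put itself: in hbase-0.94 Put implements Hadoop's Writable interface, while the 0.96 client moved to protobuf-based serialization and Put no longer does, so a method declared write(ImmutableBytesWritable, Writable) stops matching. A minimal, self-contained sketch of that applicability rule, using simplified stand-in classes (Writable, ImmutableBytesWritable, Put, and Context here are hypothetical stand-ins, not the real Hadoop/HBase types):

```java
// Stand-in types (hypothetical, simplified from the Hadoop/HBase APIs).
interface Writable {}

class ImmutableBytesWritable implements Writable {
    private final byte[] bytes;
    ImmutableBytesWritable(byte[] b) { bytes = b; }
    byte[] get() { return bytes; }
}

// Modeled on hbase-0.94, where Put implements Writable; against the
// 0.96 client jar Put no longer implements Writable, and the write(...)
// call below would fail with "not applicable for the arguments".
class Put implements Writable {
    final byte[] row;
    Put(byte[] row) { this.row = row; }
}

// Simplified stand-in for the reducer Context.
class Context<K, V> {
    K lastKey;
    V lastValue;
    void write(K key, V value) { lastKey = key; lastValue = value; }
}

public class WriteSignatureSketch {
    public static void main(String[] args) {
        Context<ImmutableBytesWritable, Writable> ctx = new Context<>();
        byte[] rowkey = "PF0001".getBytes();
        Put put = new Put(rowkey);
        // null compiles in the key slot, but HBase needs a real row key,
        // so wrapping the row bytes in the key type is the safer pattern:
        ctx.write(new ImmutableBytesWritable(rowkey), put);
        System.out.println(new String(ctx.lastKey.get()));
    }
}
```

In a real TableReducer the analogous call would be context.write(new ImmutableBytesWritable(rowkey), put).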

Regards,
Shahab

On Mon, Apr 13, 2015 at 8:21 AM, Silvio Di gregorio <
silvio.digregorio@gmail.com> wrote:

> the signature of the write method is:
> write(ImmutableBytesWritable arg0, Writable arg1)
>
> arg0 doesn't accept NullWritable.get(), but it does accept "null"
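A small sketch of why that first parameter takes the null literal but not NullWritable.get(): the literal null converts to any reference type, whereas NullWritable is a sibling implementation of Writable, not a subtype of ImmutableBytesWritable. (NullWritable and ImmutableBytesWritable below are simplified stand-ins, not the real Hadoop classes.)

```java
// Simplified stand-ins for the Hadoop types, just to show the typing rule.
interface Writable {}

final class NullWritable implements Writable {
    private static final NullWritable INSTANCE = new NullWritable();
    private NullWritable() {}
    static NullWritable get() { return INSTANCE; }
}

class ImmutableBytesWritable implements Writable {
    private final byte[] bytes;
    ImmutableBytesWritable(byte[] b) { bytes = b; }
    byte[] get() { return bytes; }
}

public class KeyArgSketch {
    // Mirrors the write(ImmutableBytesWritable, Writable) shape.
    static String write(ImmutableBytesWritable key, Writable value) {
        return key == null ? "(null key)" : new String(key.get());
    }

    public static void main(String[] args) {
        // The null literal converts to ImmutableBytesWritable, so this compiles:
        System.out.println(write(null, NullWritable.get()));
        // write(NullWritable.get(), ...) would NOT compile:
        // NullWritable is a Writable, not an ImmutableBytesWritable.
        System.out.println(write(new ImmutableBytesWritable("row1".getBytes()),
                                 NullWritable.get()));
    }
}
```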
>
> 2015-04-13 14:13 GMT+02:00 Silvio Di gregorio <silvio.digregorio@gmail.com
> >:
>
> > in the documentation (
> > http://hbase.apache.org/0.94/book/mapreduce.example.html),
> > when the Reducer extends the TableReducer class, the write method
> > puts the key as null, or rather NullWritable, as Shahab says.
> > However, the error disappeared when I removed
> > "hbase-client-0.96.0-hadoop1.jar" and added "hbase-0.94.6-cdh4.3.0.jar".
> > Sorry, but I don't understand: I only knew that the Context object would
> > not accept a Put where a Writable was required with
> > "hbase-client-0.96.0-hadoop1.jar" on the classpath. With
> > "hbase-0.94.6-cdh4.3.0.jar" it is possible.
> >
> > 2015-04-13 13:52 GMT+02:00 Jean-Marc Spaggiari <jean-marc@spaggiari.org
> >:
> >
> >> Oh, Shahab is right! That's what happens when you write emails before
> >> your coffee ;) I confused it with your "Put" key ;)  Looked too quickly...
> >>
> >> JM
> >>
> >> 2015-04-13 7:46 GMT-04:00 Shahab Yunus <shahab.yunus@gmail.com>:
> >>
> >> > For the null key you should use the NullWritable class, as discussed here:
> >> >
> >> >
> >>
> http://stackoverflow.com/questions/16198752/advantages-of-using-nullwritable-in-hadoop
> >> >
> >> > Regards,
> >> > Shahab
> >> >
> >> > On Mon, Apr 13, 2015 at 7:01 AM, Jean-Marc Spaggiari <
> >> > jean-marc@spaggiari.org> wrote:
> >> >
> >> > > Hi Silvio,
> >> > >
> >> > > What is the key you are trying to write into your HBase table? From
> >> > > your code, it sounds like you want your key to be null for all your
> >> > > values, which is not possible in HBase.
> >> > >
> >> > > JM
> >> > >
> >> > > 2015-04-13 6:37 GMT-04:00 Silvio Di gregorio <
> >> > silvio.digregorio@gmail.com
> >> > > >:
> >> > >
> >> > > > Hi,
> >> > > > In the Reduce phase, when I write to the HBase table "PFTableNa":
> >> > > > context.write(null, put);
> >> > > > Eclipse tells me:
> >> > > > *"The method write(ImmutableBytesWritable, Writable) in the type
> >> > > > TaskInputOutputContext<Text,BytesWritable,ImmutableBytesWritable,Writable>
> >> > > > is not applicable for the arguments (null, Put)"*
> >> > > >
> >> > > > *put* is org.apache.hadoop.hbase.client.Put(byte[] row):
> >> > > > byte[] rowkey = key.getBytes();
> >> > > > Put put = new Put(rowkey);
> >> > > >
> >> > > > the signature of the reduce method:
> >> > > > reduce(Text key, Iterable<BytesWritable> values, Context context)
> >> > > >
> >> > > > and
> >> > > >
> >> > > > public static class Reduce extends TableReducer<Text, BytesWritable,
> >> > > > ImmutableBytesWritable> {
> >> > > >
> >> > > > in the main method:
> >> > > > Configuration conf = HBaseConfiguration.create();
> >> > > > Job job = new Job(conf, "LetturaFileHDFS2HBase");
> >> > > > ...
> >> > > > TableMapReduceUtil.initTableReducerJob("PFTableNa", Reduce.class, job);
> >> > > >
> >> > > > Thanks a lot
> >> > > > Silvio
> >> > > >
> >> > >
> >> >
> >>
> >
> >
>
