hbase-user mailing list archives

From Stack <st...@duboce.net>
Subject Re: Type mismatch
Date Mon, 07 Feb 2011 16:53:52 GMT
Thanks for writing back to the list, Mark (I should have spotted that -- sorry).
St.Ack

On Sun, Feb 6, 2011 at 8:11 PM, Mark Kerzner <markkerzner@gmail.com> wrote:
> And the correct answer is... instead of this signature
>
>    public static class RowCounterReducer
>            extends TableReducer<Text, IntWritable, ImmutableBytesWritable>
>    {
>        public void reduce(Text key,
>                Iterable<IntWritable> values,
>                Reducer.Context context)
>                throws IOException,
>                InterruptedException {
>
> (WRONG!)
>
> I used this signature
>
>     public static class RowCounterReducer extends
>             TableReducer<ImmutableBytesWritable, IntWritable, ImmutableBytesWritable> {
>
>        @Override
>        public void reduce(ImmutableBytesWritable key, Iterable<IntWritable> values,
>                Context context) throws IOException, InterruptedException {
>
> RIGHT!
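>
> For the record, a fuller sketch of the working reducer (the column family,
> qualifier, and the summing below are just illustrative, not my actual code)
> would be roughly:
>
>     public static class RowCounterReducer extends
>             TableReducer<ImmutableBytesWritable, IntWritable, ImmutableBytesWritable> {
>
>         @Override
>         public void reduce(ImmutableBytesWritable key, Iterable<IntWritable> values,
>                 Context context) throws IOException, InterruptedException {
>             int sum = 0;
>             for (IntWritable value : values) {
>                 sum += value.get();
>             }
>             // The value written to the context must be a Put (or a Delete);
>             // "details"/"total" are made-up family/qualifier names.
>             Put put = new Put(key.get());
>             put.add(Bytes.toBytes("details"), Bytes.toBytes("total"), Bytes.toBytes(sum));
>             context.write(key, put);
>         }
>     }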
>
> Thank you, all.
>
> Mark
>
> On Fri, Feb 4, 2011 at 2:01 PM, Stack <stack@duboce.net> wrote:
>
>> (Thanks Sujee)
>>
>> What did you change in your src to get it going?
>>
>> St.Ack
>>
>> On Fri, Feb 4, 2011 at 10:56 AM, Mark Kerzner <markkerzner@gmail.com> wrote:
>> > I found an example that works and uses the latest HBase API:
>> > http://sujee.net/tech/articles/hbase-map-reduce-freq-counter/. You might
>> > know about it, but for me it was very helpful.
>> >
>> > Mark
>> >
>> > On Fri, Feb 4, 2011 at 11:55 AM, Stack <stack@duboce.net> wrote:
>> >
>> >> It's just an issue of matching your outputs to TOF (TableOutputFormat).
>> >> There are examples of Reducer usage in the mapreduce package.  They declare
>> >> their types differently from how you have them.  See PutSortReducer and
>> >> ImportTsv, which uses it (and configures it up).
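>> >>
>> >> A rough sketch of the wiring (the table name, Scan, and output classes here
>> >> are placeholders -- match them to what your mapper actually emits):
>> >>
>> >>     Job job = new Job(conf, "rowcounter");
>> >>     Scan scan = new Scan();
>> >>     TableMapReduceUtil.initTableMapperJob("mytable", scan, RowCounterMapper.class,
>> >>         ImmutableBytesWritable.class, IntWritable.class, job);
>> >>     TableMapReduceUtil.initTableReducerJob("mytable", RowCounterReducer.class, job);
>> >>     job.waitForCompletion(true);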
>> >>
>> >> St.Ack
>> >>
>> >> On Fri, Feb 4, 2011 at 7:08 AM, Mark Kerzner <markkerzner@gmail.com> wrote:
>> >> > I tried 0.90 - same error. I am going to try to build HBase from source
>> >> > and include this code in my debugging session, to step through it. But I
>> >> > must be doing something wrong.
>> >> >
>> >> > How does one write to HBase in the Reducer? Is there any example?
>> >> >
>> >> > Thank you!
>> >> >
>> >> > Mark
>> >> >
>> >> > On Fri, Feb 4, 2011 at 12:38 AM, Stack <stack@duboce.net> wrote:
>> >> >
>> >> >> I'm not sure what's up w/ your sample above.  Here are some observations
>> >> >> that might help.
>> >> >>
>> >> >> Here is the code.  Our line numbers differ.  You are not on 0.90.0?
>> >> >> That's not important.  You are in this method, it seems:
>> >> >>
>> >> >> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableOutputFormat.html#124
>> >> >>
>> >> >> See the message at the end.  You should submit a patch where we add to the
>> >> >> IOException message a toString on the value passed, so we have a better
>> >> >> clue as to where we are off here -- so you can see the class of the object
>> >> >> submitted (debugging, I'd add this to the log message).
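>> >> >>
>> >> >> Paraphrasing (this is not the exact source), write() is doing roughly the
>> >> >> following, and the patch would just tack the value's class onto that last
>> >> >> message:
>> >> >>
>> >> >>     public void write(KEY key, Writable value) throws IOException {
>> >> >>       if (value instanceof Put) this.table.put(new Put((Put) value));
>> >> >>       else if (value instanceof Delete) this.table.delete(new Delete((Delete) value));
>> >> >>       // Suggested patch: include the offending class in the message.
>> >> >>       else throw new IOException("Pass a Delete or a Put, not a " + value.getClass());
>> >> >>     }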
>> >> >>
>> >> >> Looking at how you declare TOF, it doesn't look right (this helps with
>> >> >> that:
>> >> >> http://hbase.apache.org/xref/org/apache/hadoop/hbase/mapreduce/TableReducer.html ).
>> >> >> It seems like the declaration should be <KEYIN, VALUEIN, KEYOUT>, but
>> >> >> you are outputting a Text for KEYOUT, not the declared Put.  This is
>> >> >> probably not your prob. though.
>> >> >>
>> >> >> Looking at IdentityTableReducer, it just passes Writables with the
>> >> >> value a Delete or Put.
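>> >> >>
>> >> >> In essence its reduce is just (paraphrasing):
>> >> >>
>> >> >>     for (Writable putOrDelete : values) {
>> >> >>       context.write(key, putOrDelete);  // value must already be a Put or a Delete
>> >> >>     }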
>> >> >>
>> >> >> St.Ack
>> >> >>
>> >> >>
>> >> >>
>> >> >>
>> >> >>
>> >> >> On Thu, Feb 3, 2011 at 10:00 PM, Mark Kerzner <markkerzner@gmail.com> wrote:
>> >> >> > Thank you, St.Ack, it is very nice of you to keep helping me. Here is
>> >> >> > the stack :) trace, but as you can see, it is the internal Hadoop code.
>> >> >> > I see this code and I see the message - I am not passing it the right
>> >> >> > object - but how DO I pass the right object?
>> >> >> >
>> >> >> > M
>> >> >> >
>> >> >> >
>> >> >> >         at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:106)
>> >> >> >         at org.apache.hadoop.hbase.mapreduce.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:65)
>> >> >> >         at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:512)
>> >> >> >         at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
>> >> >> >         at org.apache.hadoop.mapreduce.Reducer.reduce(Reducer.java:156)
>> >> >> >         at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
>> >> >> >         at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:570)
>> >> >> >         at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:412)
>> >> >> >         at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:258)
>> >> >> >
>> >> >> > On Thu, Feb 3, 2011 at 11:52 PM, Stack <stack@duboce.net> wrote:
>> >> >> >
>> >> >> >> Look at the stack trace.  See where it's being thrown.  Look at that
>> >> >> >> src code at that line offset.  Should give you a clue.
>> >> >> >> St.Ack
>> >> >> >>
>> >> >> >> On Thu, Feb 3, 2011 at 9:36 PM, Mark Kerzner <markkerzner@gmail.com> wrote:
>> >> >> >> > Thank you, that helped, but now I get this error on trying to write
>> >> >> >> > back to HBase:
>> >> >> >> >
>> >> >> >> > java.io.IOException: Pass a Delete or a Put
>> >> >> >> >
>> >> >> >> > Here is a fragment of my code. Again, thanks a bunch!
>> >> >> >> >
>> >> >> >> >    public static class RowCounterReducer
>> >> >> >> >            extends TableReducer<Text, IntWritable, Put>
>> >> >> >> >    {
>> >> >> >> >        public void reduce(Text key,
>> >> >> >> >                Iterable<IntWritable> values,
>> >> >> >> >                Reducer.Context context)
>> >> >> >> >                throws IOException,
>> >> >> >> >                InterruptedException {
>> >> >> >> >            Iterator<IntWritable> iterator = values.iterator();
>> >> >> >> >            while (iterator.hasNext()) {
>> >> >> >> >                IntWritable value = iterator.next();
>> >> >> >> >                Put put = new Put();
>> >> >> >> >                context.write(key, put);
>> >> >> >> >            }
>> >> >> >> >        }
>> >> >> >> >    }
>> >> >> >> >
>> >> >> >> >
>> >> >> >> > On Thu, Feb 3, 2011 at 2:50 PM, Stack <stack@duboce.net> wrote:
>> >> >> >> >
>> >> >> >> >> You are emitting a Text type.  Try just passing 'row' to the context,
>> >> >> >> >> the one passed in to your map.
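>> >> >> >> >>
>> >> >> >> >> Roughly (you would also need to declare the mapper as
>> >> >> >> >> TableMapper<ImmutableBytesWritable, IntWritable> so the types line up,
>> >> >> >> >> and adjust your reducer's input key to match):
>> >> >> >> >>
>> >> >> >> >>     public void map(ImmutableBytesWritable row, Result values, Context context)
>> >> >> >> >>             throws IOException, InterruptedException {
>> >> >> >> >>         context.write(row, ONE);
>> >> >> >> >>     }
>> >> >> >> >>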
>> >> >> >> >> St.Ack
>> >> >> >> >>
>> >> >> >> >> On Thu, Feb 3, 2011 at 12:23 PM, Mark Kerzner <markkerzner@gmail.com> wrote:
>> >> >> >> >> > Hi,
>> >> >> >> >> >
>> >> >> >> >> > I have this code to read and write to HBase from MR, and it works
>> >> >> >> >> > fine with 0 reducers, but it gives a type mismatch error with 1
>> >> >> >> >> > reducer. What should I look at? *Thank you!*
>> >> >> >> >> >
>> >> >> >> >> > *Code:*
>> >> >> >> >> >
>> >> >> >> >> >    static class RowCounterMapper
>> >> >> >> >> >            extends TableMapper<Text, IntWritable> {
>> >> >> >> >> >
>> >> >> >> >> >        private static enum Counters {
>> >> >> >> >> >            ROWS
>> >> >> >> >> >        }
>> >> >> >> >> >
>> >> >> >> >> >        @Override
>> >> >> >> >> >        public void map(ImmutableBytesWritable row, Result values, Context context)
>> >> >> >> >> >                throws IOException, InterruptedException {
>> >> >> >> >> >            for (KeyValue value : values.list()) {
>> >> >> >> >> >                if (value.getValue().length > 0) {
>> >> >> >> >> >                    Text key = new Text(value.getValue());
>> >> >> >> >> >                    context.write(key, ONE);
>> >> >> >> >> >                }
>> >> >> >> >> >            }
>> >> >> >> >> >        }
>> >> >> >> >> >    }
>> >> >> >> >> >
>> >> >> >> >> > *Error:*
>> >> >> >> >> >
>> >> >> >> >> > java.io.IOException: Type mismatch in key from map: expected
>> >> >> >> >> > org.apache.hadoop.hbase.io.ImmutableBytesWritable, recieved
>> >> >> >> >> > org.apache.hadoop.io.Text
>> >> >> >> >> >
>> >> >> >> >>
>> >> >> >> >
>> >> >> >>
>> >> >> >
>> >> >>
>> >> >
>> >>
>> >
>>
>
