hbase-dev mailing list archives

From stack <st...@duboce.net>
Subject Re: HBase as MapReduce job data source and sink in hadoop: example errors
Date Wed, 17 Dec 2008 21:59:01 GMT
Sorry the doc failed for you. Looking at it now in 0.18, it looks like it is
out of date with the API. The main change in 0.18.x not reflected in the
documentation was the move away from the Text type to byte arrays -- that is,
ImmutableBytesWritable -- instead. See how TableOutputFormat#TableRecordWriter
expects an ImmutableBytesWritable key, not a Text. Changing your mapper to
output ImmutableBytesWritable for the key instead of Text should fix the
errors below. As Tim Robinson suggests, check out the adjacent classes for
working examples.
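
For illustration, here is a minimal, untested sketch of what the reduce side
might look like against the 0.18 mapred API. The TableUploader class name and
the MapWritable value type are taken from the stack trace and the mapper
declaration quoted below; the BatchUpdate output type and the generic
TableReduce base class are assumptions about the 0.18 classes, so check them
against the SampleUploader that ships with the docs:

// Untested sketch only -- assumes the 0.18-era TableReduce base class
// (a Reducer with ImmutableBytesWritable/BatchUpdate output types); the
// class name and value types mirror the stack trace below, not real code.
import java.io.IOException;
import java.util.Iterator;
import java.util.Map;

import org.apache.hadoop.hbase.io.BatchUpdate;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapred.TableReduce;
import org.apache.hadoop.io.MapWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reporter;

public class TableUploader extends TableReduce<Text, MapWritable> {

  public void reduce(Text key, Iterator<MapWritable> values,
      OutputCollector<ImmutableBytesWritable, BatchUpdate> output,
      Reporter reporter) throws IOException {
    // The reduce output key must be an ImmutableBytesWritable: the
    // TableRecordWriter casts whatever key it is handed, which is why
    // collecting a Text key throws the ClassCastException seen below.
    byte[] rowKey = key.toString().getBytes();
    ImmutableBytesWritable row = new ImmutableBytesWritable(rowKey);
    while (values.hasNext()) {
      // Repack each map of "family:qualifier" -> value cells (assumed
      // Text on both sides) into a BatchUpdate for the table.
      BatchUpdate update = new BatchUpdate(rowKey);
      for (Map.Entry<Writable, Writable> e : values.next().entrySet()) {
        update.put(e.getKey().toString().getBytes(),
            e.getValue().toString().getBytes());
      }
      output.collect(row, update);
    }
  }
}

Either approach works -- have the mapper emit ImmutableBytesWritable keys
directly, or wrap the Text key in the reduce as above -- as long as the key
handed to output.collect() is an ImmutableBytesWritable by the time
TableOutputFormat sees it.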

St.Ack



Genady wrote:
> Hi,
>
> I'm trying to test HBase 0.18.1 as a data source for terabytes of
> distributed data using Hadoop 0.18.2. It seems I have a version problem,
> because the example from the HBase API docs
> (<http://hadoop.apache.org/hbase/docs/current/api/org/apache/hadoop/hbase/mapred/package-summary.html#package_description>)
> isn't working for me. The only change I made to fix compile problems was to
> replace Mapper with Mapper<LongWritable, Text, Text, MapWritable>.
>
> If anyone could point me in any direction toward a solution I'd much
> appreciate it; all the Hadoop examples work fine. The errors print:
>
> Line 99 in SampleUploader.java is: output.collect(k, v.next());
>
> java.lang.ClassCastException: org.apache.hadoop.io.Text
>         at org.apache.hadoop.hbase.mapred.TableOutputFormat$TableRecordWriter.write(TableOutputFormat.java:53)
>         at org.apache.hadoop.mapred.ReduceTask$3.collect(ReduceTask.java:300)
>         at org.exelate.SampleUploader$TableUploader.reduce(SampleUploader.java:99)
>         at org.exelate.SampleUploader$TableUploader.reduce(SampleUploader.java:1)
>         at org.apache.hadoop.hbase.mapred.TableReduce.reduce(TableReduce.java:42)
>         at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:318)
>         at org.apache.hadoop.mapred.TaskTracker$Child.main(TaskTracker.java:2207)
>
> 08/12/17 09:50:11 INFO mapred.JobClient: Task Id : attempt_200812170342_0017_r_000000_1, Status : FAILED
> java.lang.ClassCastException: org.apache.hadoop.io.Text [same stack trace as above]
>
> 08/12/17 09:50:19 INFO mapred.JobClient: Task Id : attempt_200812170342_0017_r_000000_2, Status : FAILED
> java.lang.ClassCastException: org.apache.hadoop.io.Text [same stack trace as above]
>
> java.io.IOException: Job failed!
>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1113)
>         at org.exelate.SampleUploader.run(SampleUploader.java:119)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.exelate.SampleUploader.main(SampleUploader.java:132)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>         at java.lang.reflect.Method.invoke(Method.java:585)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
>         at org.apache.hadoop.mapred.JobShell.run(JobShell.java:54)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>         at org.apache.hadoop.mapred.JobShell.main(JobShell.java:68)
>
> Genady Gilin
> R&D Team
> eXelate Media Ltd. | 20 Ha'magshimim St. Ground Floor | Petah-Tikva, Israel
> T: +1.646.237.4206 | M: +972.54.4824533 | www.exelate.com | genadyg@exelate.com

