hbase-user mailing list archives

From Shuja Rehman <shujamug...@gmail.com>
Subject Re: java.io.IOException: One family only at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat$1.write(HFileOutputFormat.java:102)
Date Sat, 10 Sep 2011 10:57:11 GMT
Thanks, Leif. Yes, the map was emitting more than one column family. I rectified
it and that solved the problem.
Regards
Shuja
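For readers hitting the same error: in this HBase version, HFileOutputFormat's RecordWriter supports a single column family, and writing a KeyValue whose family differs from the one already seen raises IOException("One family only"). The sketch below is a hypothetical, simplified model of that check (the class name OneFamilyCheck and the byte-array-only interface are illustrative, not HBase's actual API), showing why a mapper that emits cells for two families fails at write time.

```java
import java.io.IOException;
import java.util.Arrays;

// Hypothetical simplified model of the single-family constraint enforced
// by HFileOutputFormat's RecordWriter: the first cell written fixes the
// column family; any later cell with a different family is rejected.
public class OneFamilyCheck {
    private byte[] currentFamily; // family seen on the first write, if any

    // Stand-in for RecordWriter.write(): takes only the cell's family bytes.
    public void write(byte[] family) throws IOException {
        if (currentFamily == null) {
            currentFamily = family.clone(); // first cell fixes the family
        } else if (!Arrays.equals(currentFamily, family)) {
            throw new IOException("One family only");
        }
        // a real writer would append the KeyValue to the HFile here
    }

    public static void main(String[] args) throws IOException {
        OneFamilyCheck writer = new OneFamilyCheck();
        writer.write("cf1".getBytes());
        writer.write("cf1".getBytes()); // same family: accepted
        try {
            writer.write("cf2".getBytes()); // second family: rejected
        } catch (IOException e) {
            System.out.println(e.getMessage()); // prints "One family only"
        }
    }
}
```

So the fix on the mapper side is to emit KeyValues for exactly one family (or run one bulk-load job per family).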

On Sat, Sep 10, 2011 at 2:32 AM, Leif Wickland <leifwickland@gmail.com> wrote:

> Shuja,
>
> That would seem to indicate that your map phase is emitting rows in more
> than one column family.  Are you certain that's not what's happening?
>
>
>
> On Fri, Sep 9, 2011 at 9:54 AM, Shuja Rehman <shujamughal@gmail.com>
> wrote:
>
> > Hi All,
> >
> > I am trying to generate HFiles for bulk import, but I am getting the
> > following exception. The HBase table has only one column family.
> >
> >
> >
> > attempt_201109091959_0002_r_000000_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
> > attempt_201109091959_0002_r_000000_0: log4j:WARN Please initialize the log4j system properly.
> > 11/09/09 20:29:09 INFO mapred.JobClient:  map 100% reduce 0%
> > 11/09/09 20:29:16 INFO mapred.JobClient:  map 100% reduce 33%
> > 11/09/09 20:29:18 INFO mapred.JobClient: Task Id : attempt_201109091959_0002_r_000000_1, Status : FAILED
> > java.io.IOException: One family only
> >     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat$1.write(HFileOutputFormat.java:102)
> >     at org.apache.hadoop.hbase.mapreduce.HFileOutputFormat$1.write(HFileOutputFormat.java:82)
> >     at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:513)
> >     at org.apache.hadoop.mapreduce.TaskInputOutputContext.write(TaskInputOutputContext.java:80)
> >     at org.apache.hadoop.hbase.mapreduce.KeyValueSortReducer.reduce(KeyValueSortReducer.java:46)
> >     at org.apache.hadoop.hbase.mapreduce.KeyValueSortReducer.reduce(KeyValueSortReducer.java:35)
> >     at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:176)
> >     at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:571)
> >     at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:413)
> >     at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
> >     at java.security.AccessController.doPrivileged(Native Method)
> >     at javax.security.auth.Subject.doAs(Subject.java:396)
> >     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1115)
> >     at org.apache.hadoop.mapred.Child.main(Child.java:262)
> >
> > Here is the configuration:
> >
> > Configuration conf = HBaseConfiguration.create();
> > conf.set("fs.default.name", FS_DEFAULT_NAME);
> > conf.setBoolean("mapred.speculative.execution", false);
> > conf.set("hbase.zookeeper.quorum", HBASE_ZOOKEEPER_QUORUM);
> > conf.set("hbase.zookeeper.property.clientPort", HBASE_ZOOKEEPER_PROPERTY_CLIENTPORT);
> >
> > Job job = new Job(conf);
> > job.setJarByClass(BulkImport.class);
> > Path p1 = new Path("/user/root/URI/InputFiles/");
> > FileInputFormat.setInputPaths(job, p1);
> > job.setInputFormatClass(TextInputFormat.class);
> > job.setMapperClass(Map.class);
> >
> > HTable htable = new HTable(conf, "mytab1");
> > job.setReducerClass(KeyValueSortReducer.class);
> > Path p2 = new Path("/user/root/URI/OutputFiles5/");
> > FileOutputFormat.setOutputPath(job, p2);
> > job.setOutputKeyClass(ImmutableBytesWritable.class);
> > job.setOutputValueClass(KeyValue.class);
> >
> > HFileOutputFormat.configureIncrementalLoad(job, htable);
> > System.exit(job.waitForCompletion(true) ? 0 : 1);
> >
> >
> > Any clue?
> > Thanks
> > --
> > Regards
> > Shuja-ur-Rehman Baig
> > <http://pk.linkedin.com/in/shujamughal>
> >
>



-- 
Regards
Shuja-ur-Rehman Baig
<http://pk.linkedin.com/in/shujamughal>
