hadoop-common-user mailing list archives

From Amandeep Khurana <ama...@gmail.com>
Subject Re: unable to write in hbase using mapreduce hadoop 0.20 and hbase 0.20
Date Sat, 05 Dec 2009 01:40:11 GMT
Try printing the data you are parsing from the xml to stdout. Maybe it's
not getting any data at all?
One more thing you can try is to drop the Vector and see whether the
individual Puts are getting committed or not.
Use System.out.println calls to see what's happening in the program. The code seems correct.
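
A minimal sketch of that per-Put debugging variant of your map() method. It reuses the `table`, `RowKeyConverter`, `msg`, `s`, and column value from your code below, so it is not standalone; the print statements and the single-Put commit are the only changes:

```java
// Sketch only: same map() as in your code below, but printing what the
// mapper receives and committing each Put directly instead of batching
// them in a Vector.
public void map(NullWritable key, Text value, Context context)
        throws IOException {
    String xmlrecord = value.toString();
    // 1. Confirm the mapper is actually receiving XML records.
    System.out.println("map() got record of length " + xmlrecord.length());

    byte[] rowKey =
        RowKeyConverter.makeObservationRowKey(msg.getLongdatetime(), s);
    Put p = new Put(rowKey);
    p.add(Bytes.toBytes("column_family1"), Bytes.toBytes("column1"),
          Bytes.toBytes(columnValue));

    // 2. Commit the single Put directly; with autoFlush on, it goes
    //    straight to the region server, so a failure surfaces right here.
    table.put(p);
    System.out.println("committed row " + Bytes.toStringBinary(rowKey));
}
```

Note that stdout from a mapper ends up in that task's stdout log (visible through the JobTracker web UI), not on the console you launched the job from.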


On Fri, Dec 4, 2009 at 12:13 PM, Vipul Sharma <sharmavipul@gmail.com> wrote:

> Hi all,
>
> I am developing an application to populate hbase table with some data that
> I
> am getting after parsing some xml files. I have a mapreduce job using new
> hadoop 0.20 api and i am using hbase 0.20.2. Here is my mapreduce job
>
> public class MsgEventCollector {
>     private static Logger logger = Logger.getLogger("mr");
>
>     static class MsgEventCollectorMapper
>             extends Mapper<NullWritable, Text, Text, Writable> {
>         private HTable table;
>         XmlParser parser = new XmlParser();
>
>         public MsgEventCollectorMapper() {
>             try {
>                 table = new HTable(new HBaseConfiguration(), "EventTable");
>                 table.setAutoFlush(true);
>             } catch (Exception e) {
>                 e.printStackTrace();
>             }
>         }
>
>         public void map(NullWritable key, Text value, Context context)
>                 throws IOException {
>             String xmlrecord = value.toString();
>             Vector<Put> rowlist = new Vector<Put>();
>             // then I create several rowKey
>             byte[] rowKey =
>                 RowKeyConverter.makeObservationRowKey(msg.getLongdatetime(), s);
>             Put p = new Put(rowKey);
>             // add data in p per column family
>             p.add(Bytes.toBytes("column_family1"), Bytes.toBytes("column1"),
>                   Bytes.toBytes(columnValue));
>             // add Put p to the Put vector rowlist
>             rowlist.add(p);
>             // commit rowlist to the hbase table
>             table.put(rowlist);
>         }
>     }
>
>     // main job setup
>     public static void main(String[] args) throws Exception {
>         PropertyConfigurator.configure("./log4j.xml");
>         logger.info("Setting up job to Populate MsgEventTable");
>         Job job = new Job();
>         job.setJarByClass(MsgEventCollector.class);
>
>         FileInputFormat.addInputPath(job, new Path(args[0]));
>         job.setInputFormatClass(SequenceFileInputFormat.class);
>         job.setMapperClass(MsgEventCollectorMapper.class);
>         job.setNumReduceTasks(0);
>         job.setOutputFormatClass(NullOutputFormat.class);
>         logger.info("Waiting for Job to Finish");
>         int exitCode = job.waitForCompletion(true) ? 0 : 1;
>         System.exit(exitCode);
>     }
> }
>
>
> My mapreduce job runs without any errors, but I see no data in the table.
> A few more details: I have the hbase and zookeeper jars in the hadoop
> classpath on all servers, and I can add data to the table by hand.
>
> Please let me know if I am doing anything wrong here. Thanks for your help
> in advance.
>
> --
> Vipul Sharma
> sharmavipul AT gmail DOT com
>
