hbase-user mailing list archives

From Jean-Marc Spaggiari <jean-m...@spaggiari.org>
Subject Re: [Error]Finding average using hbase hadoop
Date Sat, 17 Aug 2013 11:54:38 GMT
Hi Manish.

Looking a bit more at this, I think the issue is that your "floats" are
written in your table as strings and not as floats....

Can you try something like:
context.write(stock_symbol, new FloatWritable(
        Float.parseFloat(Bytes.toString(val))));
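
For context, here is a minimal sketch of the map body with that
conversion (the null/empty check is my addition, not from your code,
and it assumes the cells hold ASCII text like "2.5", as your scan
output shows):

    byte[] val = value.getValue("stocks".getBytes(), "open".getBytes());
    if (val != null && val.length > 0) {
        // The cell stores the text "2.5", not a 4-byte IEEE 754 encoding,
        // so decode it to a String first and parse that.
        float open = Float.parseFloat(Bytes.toString(val));
        try {
            context.write(stock_symbol, new FloatWritable(open));
        } catch (InterruptedException e) {
            throw new IOException(e);
        }
    }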

Also, as asked previously: "can you please paste your code on pastebin?
Same for the exception."

Thanks,

JM

2013/8/17 manish dunani <manishd207@gmail.com>

> Hey Jean,
> I did as you suggested.
> I converted as you told me, but I still face the same error.
>
> And Ted, I am new to this and don't see how I can use this method. Can
> you please show me?
>
> Your help will be appreciated.
>
>
>
> On Fri, Aug 16, 2013 at 10:04 PM, Ted Yu <yuzhihong@gmail.com> wrote:
>
> > Here is javadoc for toFloat():
> >
> >    * Presumes float encoded as IEEE 754 floating-point "single format"
> >    * @param bytes byte array
> >    * @return Float made from passed byte array.
> >    */
> >   public static float toFloat(byte [] bytes) {
> >
> > So for values of '2.5', toFloat() is not the proper method.
> >
> > You can use the float conversion provided by Java.
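> >
> > For example (just a sketch; it assumes the cell stores the ASCII text
> > "2.5", as the scan output shows):
> >
> >   byte[] val = Bytes.toBytes("2.5");               // 3 bytes: '2', '.', '5'
> >   float f = Float.parseFloat(Bytes.toString(val)); // 2.5f, plain Java parsing
> >   // Bytes.toFloat(val) would throw here: it expects exactly 4 bytes.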
> >
> >
> > Cheers
> >
> >
> > On Fri, Aug 16, 2013 at 6:57 AM, Ted Yu <yuzhihong@gmail.com> wrote:
> >
> > > Here is code from Bytes:
> > >
> > >   public static float toFloat(byte [] bytes, int offset) {
> > >     return Float.intBitsToFloat(toInt(bytes, offset, SIZEOF_INT));
> > >
> > > Looking at your sample data:
> > >
> > >  2010-02-04           column=stocks:open, timestamp=1376567559424,
> > > value=*2.5*
> > >
> > > The length of that value (3 bytes) doesn't match SIZEOF_INT (4 bytes).
> > >
> > > It seems you need to validate the values first.
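> > >
> > > One way to validate (a sketch only, reusing the val and stock_symbol
> > > variables from the mapper; keep the existing try/catch around the
> > > write):
> > >
> > >   if (val != null && val.length == Bytes.SIZEOF_FLOAT) {
> > >       // only safe when the cell really holds a 4-byte IEEE 754 float
> > >       context.write(stock_symbol, new FloatWritable(Bytes.toFloat(val)));
> > >   }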
> > >
> > >
> > > Cheers
> > >
> > >
> > > On Fri, Aug 16, 2013 at 3:42 AM, manish dunani <manishd207@gmail.com> wrote:
> > >
> > >> hello,
> > >>
> > >> I am using Apache Hadoop 1.1.2 and HBase 0.94.9 in pseudo-distributed mode.
> > >>
> > >> I am trying to find the average of the open stock values.
> > >>
> > >> *sample dataset in hbase:* (table name: nyse4)
> > >>
> > >>
> > >>  2010-02-04           column=stocks:open, timestamp=1376567559424,
> > >> value=2.5
> > >>  2010-02-04           column=stocks:symbol, timestamp=1376567559424,
> > >> value=QXM
> > >>  2010-02-05           column=stocks:open, timestamp=1376567559429,
> > >> value=2.42
> > >>  2010-02-05           column=stocks:symbol, timestamp=1376567559429,
> > >> value=QXM
> > >>  2010-02-08           column=stocks:open, timestamp=1376567559431,
> > >> value=2.33
> > >>  2010-02-08           column=stocks:symbol, timestamp=1376567559431,
> > >> value=QXM
> > >>
> > >> *code:* (please ignore the commented-out lines)
> > >>
> > >>
> > >> > package com.maddy;
> > >> >
> > >> > import java.io.IOException;
> > >> >
> > >> > import org.apache.hadoop.conf.Configuration;
> > >> > import org.apache.hadoop.fs.Path;
> > >> > import org.apache.hadoop.hbase.HBaseConfiguration;
> > >> > import org.apache.hadoop.hbase.client.Put;
> > >> > import org.apache.hadoop.hbase.client.Result;
> > >> > import org.apache.hadoop.hbase.client.Scan;
> > >> > import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
> > >> > import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
> > >> > import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
> > >> > import org.apache.hadoop.hbase.mapreduce.TableMapper;
> > >> > import org.apache.hadoop.hbase.mapreduce.TableReducer;
> > >> > import org.apache.hadoop.hbase.util.Bytes;
> > >> > //import org.apache.hadoop.io.DoubleWritable;
> > >> > import org.apache.hadoop.io.FloatWritable;
> > >> > import org.apache.hadoop.mapreduce.Job;
> > >> > import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
> > >> >
> > >> > public class openaveragestock
> > >> > {
> > >> >     public static class map extends
> > >> >             TableMapper<ImmutableBytesWritable, FloatWritable>
> > >> >     {
> > >> >         @Override
> > >> >         public void map(ImmutableBytesWritable row, Result value,
> > >> >                 Context context) throws IOException
> > >> >         {
> > >> >             byte[] val = value.getValue("stocks".getBytes(),
> > >> >                     "open".getBytes());
> > >> >             //byte[] val1 = value.getValue("stocks".getBytes(),
> > >> >             //        "symbol".getBytes());
> > >> >
> > >> >             ImmutableBytesWritable stock_symbol =
> > >> >                     new ImmutableBytesWritable("symbol".getBytes());
> > >> >
> > >> >             try
> > >> >             {
> > >> >                 context.write(stock_symbol,
> > >> >                         new FloatWritable(Bytes.toFloat(val)));
> > >> >             }
> > >> >             catch (InterruptedException e)
> > >> >             {
> > >> >                 throw new IOException(e);
> > >> >             }
> > >> >         }
> > >> >     }
> > >> >
> > >> >     public static class reduce extends
> > >> >             TableReducer<ImmutableBytesWritable, FloatWritable,
> > >> >                     ImmutableBytesWritable>
> > >> >     {
> > >> >         @Override
> > >> >         public void reduce(ImmutableBytesWritable key,
> > >> >                 Iterable<FloatWritable> values, Context context)
> > >> >                 throws IOException, InterruptedException
> > >> >         {
> > >> >             float sum = 0;
> > >> >             int count = 0;
> > >> >             //float average = 0;
> > >> >             for (FloatWritable val : values)
> > >> >             {
> > >> >                 sum += val.get();
> > >> >                 count++;
> > >> >             }
> > >> >             //average = (sum/count);
> > >> >             Put put = new Put(key.get());
> > >> >             put.add(Bytes.toBytes("stocks_output"),
> > >> >                     Bytes.toBytes("average"),
> > >> >                     Bytes.toBytes(sum / count));
> > >> >             System.out.println("For\t" + count + "\t average is:" + (sum / count));
> > >> >             context.write(key, put);
> > >> >         }
> > >> >     }
> > >> >
> > >> >     public static void main(String args[]) throws IOException,
> > >> >             ClassNotFoundException, InterruptedException
> > >> >     {
> > >> >         Configuration config = HBaseConfiguration.create();
> > >> >         config.addResource(
> > >> >                 "/home/manish/workspace/hbase project/bin/hbase-site.xml");
> > >> >         Job job = new Job(config, "openstockaverage1");
> > >> >
> > >> >         Scan scan = new Scan();
> > >> >         scan.addFamily("stocks".getBytes());
> > >> >         scan.setFilter(new FirstKeyOnlyFilter());
> > >> >
> > >> >         TableMapReduceUtil.initTableMapperJob("nyse4",
> > >> >                 scan,
> > >> >                 map.class,
> > >> >                 ImmutableBytesWritable.class,
> > >> >                 FloatWritable.class,
> > >> >                 job);
> > >> >
> > >> >         TableMapReduceUtil.initTableReducerJob("nyse5",
> > >> >                 reduce.class,
> > >> >                 job);
> > >> >         //job.setReducerClass(reduce.class);
> > >> >
> > >> >         FileOutputFormat.setOutputPath(job, new Path(
> > >> >                 "hdfs://localhost:54310/user/manish/edurekahbasehadoop1"));
> > >> >         job.waitForCompletion(true);
> > >> >     }
> > >> > }
> > >> >
> > >> >
> > >> > *===> Got stuck on this error:*
> > >> >
> > >> > 13/08/16 03:21:45 INFO mapred.JobClient: Running job: job_local_0001
> > >> > 13/08/16 03:21:46 INFO mapred.JobClient:  map 0% reduce 0%
> > >> > 13/08/16 03:21:46 INFO mapreduce.TableOutputFormat: Created table instance for nyse5
> > >> > 13/08/16 03:21:46 INFO util.ProcessTree: setsid exited with exit code 0
> > >> > 13/08/16 03:21:47 INFO mapred.Task:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@50b77c
> > >> > 13/08/16 03:21:47 INFO mapred.MapTask: io.sort.mb = 100
> > >> > 13/08/16 03:21:53 INFO mapred.MapTask: data buffer = 79691776/99614720
> > >> > 13/08/16 03:21:53 INFO mapred.MapTask: record buffer = 262144/327680
> > >> > 13/08/16 03:21:54 WARN mapred.LocalJobRunner: job_local_0001
> > >> > java.lang.IllegalArgumentException: offset (0) + length (4) exceed the capacity of the array: 3
> > >> >     at org.apache.hadoop.hbase.util.Bytes.explainWrongLengthOrOffset(Bytes.java:543)
> > >> >     at org.apache.hadoop.hbase.util.Bytes.toInt(Bytes.java:690)
> > >> >     at org.apache.hadoop.hbase.util.Bytes.toFloat(Bytes.java:584)
> > >> >     at org.apache.hadoop.hbase.util.Bytes.toFloat(Bytes.java:574)
> > >> >     at com.maddy.openaveragestock$map.map(openaveragestock.java:41)
> > >> >     at com.maddy.openaveragestock$map.map(openaveragestock.java:1)
> > >> >     at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:144)
> > >> >     at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
> > >> >     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:370)
> > >> >     at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:214)
> > >> >
> > >> >
> > >> I cannot find where it fails.
> > >> Can you please tell me where I went wrong?
> > >>
> > >>
> > >> Your help will be appreciated.
> > >>
> > >> --
> > >> Regards
> > >>
> > >> *Manish Dunani*
> > >> *Contact No* : +91 9408329137
> > >> *skype id* : manish.dunani
> > >>
> > >
> > >
> >
>
>
>
> --
> Regards
>
> *Manish Dunani*
> *Contact No* : +91 9408329137
> *skype id* : manish.dunani
>
