hadoop-mapreduce-user mailing list archives

From Marco Shaw <marco.s...@gmail.com>
Subject Re: Best class for currency
Date Wed, 26 Feb 2014 23:47:35 GMT
Unless I'm missing something, I can't find a BigDecimalWritable class in Hadoop. 

Hive appears to have such a class...

I'll keep searching...
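
If it comes to writing one by hand, I'm guessing something along these lines would do (a rough, untested sketch on my part; the class name and the serialization format are my own, not an existing Hadoop API):

// Minimal Writable wrapping java.math.BigDecimal (illustrative sketch only).
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.math.BigDecimal;
import java.math.BigInteger;
import org.apache.hadoop.io.WritableComparable;

public class BigDecimalWritable implements WritableComparable<BigDecimalWritable> {

    private BigDecimal value = BigDecimal.ZERO;

    public BigDecimalWritable() {}

    public BigDecimalWritable(BigDecimal value) { this.value = value; }

    public BigDecimal get() { return value; }

    public void set(BigDecimal value) { this.value = value; }

    @Override
    public void write(DataOutput out) throws IOException {
        // Serialize as scale + unscaled bytes so no precision is lost.
        byte[] unscaled = value.unscaledValue().toByteArray();
        out.writeInt(value.scale());
        out.writeInt(unscaled.length);
        out.write(unscaled);
    }

    @Override
    public void readFields(DataInput in) throws IOException {
        int scale = in.readInt();
        byte[] unscaled = new byte[in.readInt()];
        in.readFully(unscaled);
        value = new BigDecimal(new BigInteger(unscaled), scale);
    }

    @Override
    public int compareTo(BigDecimalWritable other) {
        return value.compareTo(other.value);
    }

    @Override
    public String toString() {
        // toPlainString() avoids scientific notation in text output.
        return value.toPlainString();
    }
}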

Marco

> On Feb 26, 2014, at 7:06 PM, Manoj Khangaonkar <khangaonkar@gmail.com> wrote:
> 
> In Java, for precision, you need to use BigDecimal.
> 
> I believe there must be a BigDecimalWritable in Hadoop.
> 
> regards
> 
> 
>> On Wed, Feb 26, 2014 at 2:58 PM, Marco Shaw <marco.shaw@gmail.com> wrote:
>> (If code is required, I can send it along later.)
>> 
>> I'm a beginner and I'm having issues with MR when trying to read values of the form 99.99.
>> 
>> I'm reading up as much as I can. I wanted to use Java's own types to decide between a DoubleWritable and a FloatWritable, but after some research, Java's documentation seems to recommend against using either of these for representing currency.
>> 
>> I'm using the FileInputFormat class, which I understand defaults to a LongWritable,Text <K,V> pair.
>> 
>> What Hadoop data type should I use with my Driver section/Job class to work with currency?
>> 
>> I've tried FloatWritable and DoubleWritable and my output always ends up as String,Number.0 in the output file ("something[tab]55.0", for example, when I know 55.0 is wrong).
>> 
>> Marco
> 
> 
> 
> -- 
> http://khangaonkar.blogspot.com/
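
As a quick illustration of the precision point made in the quoted thread (why double/float are usually avoided for money), here is a small standalone Java check; it is purely illustrative and nothing Hadoop-specific:

import java.math.BigDecimal;

public class CurrencyPrecisionDemo {
    public static void main(String[] args) {
        // Summing 0.10 ten times with double accumulates binary rounding error.
        double d = 0.0;
        for (int i = 0; i < 10; i++) {
            d += 0.10;
        }
        System.out.println(d);   // prints 0.9999999999999999, not 1.0

        // BigDecimal built from the String keeps the exact decimal value.
        BigDecimal b = BigDecimal.ZERO;
        for (int i = 0; i < 10; i++) {
            b = b.add(new BigDecimal("0.10"));
        }
        System.out.println(b);   // prints 1.00
    }
}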
