From: Manoj Khangaonkar <khangaonkar@gmail.com>
To: user@hadoop.apache.org
Date: Wed, 26 Feb 2014 15:06:42 -0800
Subject: Re: Best class for currency
In Java, for precision, you need to use BigDecimal.
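A quick plain-JDK demonstration of why: binary floating point cannot represent most decimal fractions exactly, so doubles drift, while BigDecimal keeps the decimal value intact.

```java
// Why double is a poor fit for currency: 0.1 and 0.2 have no exact
// binary representation, so their sum drifts; BigDecimal does not.
import java.math.BigDecimal;

public class CurrencyPrecision {
    public static void main(String[] args) {
        double d = 0.1 + 0.2;
        System.out.println(d);   // 0.30000000000000004

        BigDecimal b = new BigDecimal("0.1").add(new BigDecimal("0.2"));
        System.out.println(b);   // 0.3
    }
}
```

Note the String constructor: `new BigDecimal("0.1")` is exact, whereas `new BigDecimal(0.1)` would inherit the double's error.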

I don't believe Hadoop ships a BigDecimalWritable out of the box, but it is straightforward to write one.
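A minimal sketch of what the serialization logic of such a class could look like. The class name BigDecimalWritable is hypothetical (Hadoop core does not provide it); a real version would implement org.apache.hadoop.io.Writable, whose write(DataOutput)/readFields(DataInput) contract matches the two methods below, so this sketch uses only the JDK.

```java
// Hypothetical BigDecimalWritable: serializes a BigDecimal losslessly as
// its scale plus its unscaled-value bytes -- the exact decomposition
// BigDecimal is built from -- so values like 99.99 round-trip intact.
import java.io.DataInput;
import java.io.DataOutput;
import java.io.IOException;
import java.math.BigDecimal;
import java.math.BigInteger;

public class BigDecimalWritable {
    private BigDecimal value = BigDecimal.ZERO;

    public void set(BigDecimal v) { value = v; }
    public BigDecimal get()       { return value; }

    // In real Hadoop code this would be Writable.write(DataOutput).
    public void write(DataOutput out) throws IOException {
        byte[] unscaled = value.unscaledValue().toByteArray();
        out.writeInt(value.scale());
        out.writeInt(unscaled.length);
        out.write(unscaled);
    }

    // In real Hadoop code this would be Writable.readFields(DataInput).
    public void readFields(DataInput in) throws IOException {
        int scale = in.readInt();
        byte[] unscaled = new byte[in.readInt()];
        in.readFully(unscaled);
        value = new BigDecimal(new BigInteger(unscaled), scale);
    }
}
```

A Hadoop value class would also need a no-arg constructor (already implicit here) and, if used as a key, a compareTo; this sketch omits the key case.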
regards


On Wed, Feb 26, 2014 at 2:58 PM, Marco Shaw <marco.shaw@gmail.com> wrote:
> (If code is required, I can send it along later.)
>
> I'm a beginner and I'm having issues with MR when trying to read values
> of the form 99.99.
>
> I'm reading up as much as I can. I wanted to try to use Java's types to
> determine whether to use a DoubleWritable or FloatWritable, but after
> some research, Java seems to recommend against using either of these for
> representing currency.
>
> I'm using the FileInputFormat class, which I understand defaults to a
> LongWritable,Text <K,V> pair.
>
> What Hadoop data type should I use with my Driver section/Job class to
> work with currency?
>
> I've tried FloatWritable and DoubleWritable and my output always ends up
> as String,Number.0 in the output file ("something[tab]55.0", for example,
> when I know 55.0 is wrong).
>
> Marco



--
http://khangaonkar.blogspot.com/