hadoop-user mailing list archives

From Bart Verwilst <li...@verwilst.be>
Subject Re: Mapping MySQL schema to Avro
Date Sat, 24 Nov 2012 13:21:24 GMT

Hey Ruslan, 

I'm not using Sqoop because my Avro schema consists of a couple of subsets,
which (afaik) are impossible to reproduce with a single query, and thus with
Sqoop. Or am I wrong?
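
For illustration, a minimal sketch of that kind of multi-query, nested import
using the Avro Java API. The table names fmsession and fmsswitchvalues come
from the thread below; the column names, connection settings and the two-level
schema itself are placeholders, not the schema from the attachment:

    import java.io.File;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.ArrayList;
    import java.util.List;

    import org.apache.avro.Schema;
    import org.apache.avro.SchemaBuilder;
    import org.apache.avro.file.DataFileWriter;
    import org.apache.avro.generic.GenericData;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;

    public class NestedSessionImport {
        public static void main(String[] args) throws Exception {
            // Hypothetical two-level schema: one record per session, with its
            // switch values embedded as an array of sub-records.
            Schema valueSchema = SchemaBuilder.record("SwitchValue").fields()
                    .requiredString("name")
                    .requiredString("value")
                    .endRecord();
            Schema sessionSchema = SchemaBuilder.record("Session").fields()
                    .requiredLong("id")
                    .name("switchValues").type().array().items(valueSchema).noDefault()
                    .endRecord();

            try (Connection conn = DriverManager.getConnection(
                         "jdbc:mysql://localhost/mydb", "user", "password");
                 Statement parentQuery = conn.createStatement();
                 ResultSet sessions = parentQuery.executeQuery("SELECT id FROM fmsession");
                 PreparedStatement childQuery = conn.prepareStatement(
                         "SELECT name, value FROM fmsswitchvalues WHERE session_id = ?");
                 DataFileWriter<GenericRecord> out = new DataFileWriter<>(
                         new GenericDatumWriter<GenericRecord>(sessionSchema))) {

                out.create(sessionSchema, new File("sessions.avro"));

                while (sessions.next()) {
                    long id = sessions.getLong("id");

                    // A second query per parent row pulls the child rows, which is
                    // exactly the part a single flat SELECT cannot express.
                    List<GenericRecord> values = new ArrayList<>();
                    childQuery.setLong(1, id);
                    try (ResultSet rs = childQuery.executeQuery()) {
                        while (rs.next()) {
                            GenericRecord v = new GenericData.Record(valueSchema);
                            v.put("name", rs.getString("name"));
                            v.put("value", rs.getString("value"));
                            values.add(v);
                        }
                    }

                    GenericRecord session = new GenericData.Record(sessionSchema);
                    session.put("id", id);
                    session.put("switchValues", values);
                    out.append(session);
                }
            }
        }
    }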

Kind regards, 


Al-Fakikh wrote on 24.11.2012 01:06:

> Hey Bart,
>
> Why not use Sqoop for transforming data from/to Hadoop to/from MySQL?
> http://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html
> See the parameter: --as-avrodatafile
>
> Ruslan
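
For reference, a sketch of what such an import might look like when driven
from Java via Sqoop's programmatic entry point (assuming Sqoop 1.4.x, where
org.apache.sqoop.Sqoop.runTool is available); the connection string,
credentials, table name and target directory are placeholders:

    import org.apache.sqoop.Sqoop;

    public class SqoopAvroImport {
        public static void main(String[] args) {
            // Placeholder connection details, credentials and table name;
            // --as-avrodatafile writes the imported rows as Avro data files.
            String[] sqoopArgs = {
                "import",
                "--connect", "jdbc:mysql://localhost/mydb",
                "--username", "user",
                "--password", "password",
                "--table", "fmsession",
                "--as-avrodatafile",
                "--target-dir", "/user/bart/fmsession"
            };
            System.exit(Sqoop.runTool(sqoopArgs));
        }
    }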
> On Sat, Nov 24, 2012 at 2:20 AM, Ted Dunning <tdunning@maprtech.com> wrote:
>
>> This is probably the wrong list for your question.
>>
>> And, no, I don't think that your conversion is correct.
>>
>> To me it looks like you have lots of values in a fmsession. In the Avro
>> version, those values appear to be repeated in the fmsswitchvalues. That
>> seems wrong.
>> On Fri, Nov 23, 2012 at 5:12 AM, Bart Verwilst <lists@verwilst.be> wrote:
>>
>>> Hello!
>>>
>>> I'm currently writing an importer to import our MySQL data into Hadoop
>>> (as Avro files). Attached you can find the schema I'm converting to Avro,
>>> along with the corresponding Avro schema I would like to use for my
>>> imported data. I was wondering if you guys could go over the schema and
>>> determine if this is sane/optimal, and if not, how I should improve it.
>>>
>>> As a sidenote, I converted bigints to long, and had one occurrence of
>>> double, which I also converted to long in the Avro; not sure if that's
>>> the correct type?
>>>
>>> Thanks in advance for your expert opinions! ;)
>>>
>>> Kind regards,
>>> Bart
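
On the type sidenote in the quoted message above: Avro's primitive types
include both long and double, so BIGINT maps naturally to Avro long, whereas
mapping a MySQL DOUBLE column to long would drop its fractional part;
declaring it as Avro double keeps it intact. A minimal sketch with Avro's
SchemaBuilder, using placeholder field names:

    import org.apache.avro.Schema;
    import org.apache.avro.SchemaBuilder;

    public class TypeMappingSketch {
        public static void main(String[] args) {
            // Placeholder field names; MySQL BIGINT -> Avro long,
            // MySQL DOUBLE -> Avro double (converting it to long would
            // truncate the fractional part).
            Schema schema = SchemaBuilder.record("Example").fields()
                    .requiredLong("someBigintColumn")
                    .requiredDouble("someDoubleColumn")
                    .endRecord();
            System.out.println(schema.toString(true));
        }
    }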

