I'm not using Sqoop because my Avro data consists of a couple of subsets, which (afaik) are impossible to reproduce with a single query, and thus with Sqoop. Or am I wrong?
Ruslan Al-Fakikh wrote on 24.11.2012 01:06:
Hey Bart,

Why not use Sqoop for transferring data between Hadoop and MySQL? See the parameter: -as-avrodatafile

Ruslan
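For reference, a rough sketch of what such a Sqoop import might look like; the connection string, credentials, and target directory below are placeholders, not details from this thread:

```shell
# Sketch of a Sqoop import that writes Avro data files.
# All connection details here are illustrative placeholders.
sqoop import \
  --connect jdbc:mysql://db.example.com/mydb \
  --username myuser -P \
  --table fmsession \
  --as-avrodatafile \
  --target-dir /data/avro/fmsession
```

Sqoop infers an Avro schema from the table's column types, so this only covers the single-table case; it would not by itself reproduce a hand-written nested schema.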
On Sat, Nov 24, 2012 at 2:20 AM, Ted Dunning <email@example.com> wrote:
This is probably the wrong list for your question.

And, no, I don't think that your conversion is correct. To me it looks like you have lots of values in a fmsession. In the Avro version, those values appear to be repeated in the fmsswitchvalues. That seems wrong.
On Fri, Nov 23, 2012 at 5:12 AM, Bart Verwilst <firstname.lastname@example.org> wrote:
I'm currently writing an importer to load our MySQL data into Hadoop (as Avro files). Attached you can find the schema I'm converting to Avro, along with the corresponding Avro schema I would like to use for my imported data. I was wondering if you could go over the schema and determine whether it is sane/optimal, and if not, how I should improve it.
As a side note, I converted bigints to long, and had one occurrence of double, which I also converted to long in the Avro schema; I'm not sure if that's the correct type?
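On the type question, a minimal sketch of a common MySQL-to-Avro primitive mapping (the helper name and the table below are illustrative, not from any library). Note that Avro has a native double primitive, so mapping a MySQL DOUBLE column to long would drop the fractional part:

```python
# Hypothetical helper (not part of Avro or Sqoop): maps MySQL column
# types to Avro primitive type names.
MYSQL_TO_AVRO = {
    "bigint": "long",     # 64-bit integer -> Avro long
    "int": "int",
    "double": "double",   # keep floating point; "long" would truncate
    "varchar": "string",
    "datetime": "long",   # e.g. epoch millis; a convention, not a rule
}

def avro_type(mysql_type: str) -> str:
    """Return the Avro primitive for a MySQL column type (sketch only)."""
    return MYSQL_TO_AVRO[mysql_type.lower()]
```

So under this mapping, BIGINT becomes long, while DOUBLE stays double rather than long.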
Thanks in advance for your expert opinions! ;)