hive-user mailing list archives

From 朱 偉民 <shu-i...@iij.ad.jp>
Subject An error occurred when writing tables to Avro files
Date Wed, 27 May 2015 08:26:53 GMT
Hi,

I created an Avro-format table following the wiki:
https://cwiki.apache.org/confluence/display/Hive/AvroSerDe#AvroSerDe-Hive0.14andlater

An error occurred when inserting data from another table created in the previous steps.
I am using hive-0.14.0/hive-1.2.0 + hadoop-2.6.0.
Do you have any idea?

hive> CREATE TABLE as_avro(string1 STRING,
    >                      int1 INT,
    >                      tinyint1 TINYINT,
    >                      smallint1 SMALLINT,
    >                      bigint1 BIGINT,
    >                      boolean1 BOOLEAN,
    >                      float1 FLOAT,
    >                      double1 DOUBLE,
    >                      list1 ARRAY<STRING>,
    >                      map1 MAP<STRING,INT>,
    >                      struct1 STRUCT<sint:INT,sboolean:BOOLEAN,sstring:STRING>,
    >                      union1 uniontype<FLOAT, BOOLEAN, STRING>,
    >                      enum1 STRING,
    >                      nullableint INT,
    >                      bytes1 BINARY,
    >                      fixed1 BINARY)
    > STORED AS AVRO;
OK
Time taken: 0.11 seconds
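
For reference, the column types Hive actually recorded for the new table can be checked with DESCRIBE. This is only a sketch of what I would expect to see, not actual output from my session:

hive> DESCRIBE as_avro;
-- I would expect union1 to already show the extra void branch here,
-- i.e. something like:
--   union1    uniontype<void,float,boolean,string>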

hive> INSERT OVERWRITE TABLE as_avro SELECT * FROM test_serializer;
FAILED: SemanticException [Error 10044]: Line 1:23 Cannot insert into target table because
column number/types are different 'as_avro': Cannot convert column 11 from uniontype<float,boolean,string>
to uniontype<void,float,boolean,string>.

I do not understand why the column union1 ends up looking like this:
       uniontype<void,float,boolean,string>
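
My only guess (unconfirmed): STORED AS AVRO derives an Avro schema in which every column is nullable, so "null" gets added to the union branches for union1, and Avro null maps back to the Hive type void. The derived Avro field would then look roughly like this (my assumption, not actual Hive output):

  { "name" : "union1",
    "type" : [ "null", "float", "boolean", "string" ] }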

Thanks
zhuweimin

