hive-dev mailing list archives

From Mohammad Islam <misla...@yahoo.com.INVALID>
Subject Schema evolution for parquet table
Date Mon, 05 Oct 2015 22:40:59 GMT
Hi,

Does a Parquet table support automatic casting to wider data types? For example, suppose I have
a Parquet table where some of the underlying Parquet data files store a field as "int" while other
files store the same field as "long", and the table schema declares that field as "bigint".
Can Hive read the files that were written with type "int"?
I got this exception:

Failed with exception java.io.IOException: org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.ClassCastException: org.apache.hadoop.io.IntWritable cannot be cast to org.apache.hadoop.io.LongWritable
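For what it's worth, the exception itself comes from ordinary Java cast semantics rather than anything Parquet-specific: a consumer expecting the "long" container receives the "int" container and casts it, and Java never widens across distinct wrapper classes, only across primitives. Below is a minimal sketch of that behavior using plain Integer/Long as stand-ins for Hadoop's IntWritable/LongWritable (an analogy for illustration, not Hive's actual reader code):

```java
// Sketch: a cast between distinct wrapper classes fails at runtime even
// though the underlying primitive (int -> long) would widen implicitly.
// Integer/Long stand in for IntWritable/LongWritable here.
public class WidenCastDemo {
    public static void main(String[] args) {
        Object fromIntFile = Integer.valueOf(42); // value read back in an "int" container

        try {
            Long asLong = (Long) fromIntFile;     // what a bigint-expecting consumer does
            System.out.println("cast ok: " + asLong);
        } catch (ClassCastException e) {
            System.out.println("ClassCastException caught");
        }

        // An explicit conversion works; the reader has to perform it,
        // the JVM will not do it for boxed/Writable types.
        long widened = ((Number) fromIntFile).longValue();
        System.out.println("widened: " + widened); // prints "widened: 42"
    }
}
```

So a reader that honors the table schema would need to convert the IntWritable to a LongWritable itself before handing the value up the stack.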
Regards,
Mohammad
