hive-dev mailing list archives

From "Mithun Radhakrishnan (JIRA)" <j...@apache.org>
Subject [jira] [Created] (HIVE-14274) When columns are added to structs in a Hive table, HCatLoader breaks.
Date Mon, 18 Jul 2016 22:17:20 GMT
Mithun Radhakrishnan created HIVE-14274:
-------------------------------------------

             Summary: When columns are added to structs in a Hive table, HCatLoader breaks.
                 Key: HIVE-14274
                 URL: https://issues.apache.org/jira/browse/HIVE-14274
             Project: Hive
          Issue Type: Bug
          Components: HCatalog
    Affects Versions: 2.1.0, 1.2.1
            Reporter: Mithun Radhakrishnan
            Assignee: Mithun Radhakrishnan


Consider this sequence of table/partition creation and schema evolution:
{code:sql}
-- Create table.
CREATE EXTERNAL TABLE `simple_text` (
                foo STRING,
                bar STRUCT<goo:STRING,moo:STRING>
                )
PARTITIONED BY ( dt STRING )
ROW FORMAT DELIMITED
        FIELDS TERMINATED BY '\t'
        COLLECTION ITEMS TERMINATED BY ':'
STORED AS TEXTFILE ;

-- Add partition.
ALTER TABLE simple_text ADD PARTITION ( dt='0' );

-- Alter the struct-column to add a new sub-field.
ALTER TABLE simple_text CHANGE COLUMN bar bar STRUCT<goo:STRING, moo:STRING, zoo:STRING>;
{code}

The {{dt='0'}} partition's schema still indicates only 2 sub-fields in {{bar}}. The data in that partition can be read through Hive, but not through HCatLoader, which fails as follows:

{noformat}
org.apache.pig.backend.executionengine.ExecException: ERROR 0: Exception while executing (Name: data_raw: Store(hdfs://dilithiumblue-nn1.blue.ygrid.yahoo.com:8020/tmp/temp-643668868/tmp-1639945319:org.apache.pig.impl.io.TFileStorage) - scope-1 Operator Key: scope-1): org.apache.pig.backend.executionengine.ExecException: ERROR 0: org.apache.pig.backend.executionengine.ExecException: ERROR 6018: Error converting read value to tuple
	at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:314)
	at org.apache.pig.backend.hadoop.executionengine.tez.plan.operator.POStoreTez.getNextTuple(POStoreTez.java:123)
	at org.apache.pig.backend.hadoop.executionengine.tez.runtime.PigProcessor.runPipeline(PigProcessor.java:376)
	at org.apache.pig.backend.hadoop.executionengine.tez.runtime.PigProcessor.run(PigProcessor.java:241)
	at org.apache.tez.runtime.LogicalIOProcessorRuntimeTask.run(LogicalIOProcessorRuntimeTask.java:362)
	at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:179)
	at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable$1.run(TezTaskRunner.java:171)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1679)
	at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:171)
	at org.apache.tez.runtime.task.TezTaskRunner$TaskRunnerCallable.callInternal(TezTaskRunner.java:167)
	at org.apache.tez.common.CallableWithNdc.call(CallableWithNdc.java:36)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 0: org.apache.pig.backend.executionengine.ExecException: ERROR 6018: Error converting read value to tuple
	at org.apache.pig.backend.hadoop.executionengine.tez.plan.operator.POSimpleTezLoad.getNextTuple(POSimpleTezLoad.java:160)
	at org.apache.pig.backend.hadoop.executionengine.physicalLayer.PhysicalOperator.processInput(PhysicalOperator.java:305)
	... 16 more
Caused by: org.apache.pig.backend.executionengine.ExecException: ERROR 6018: Error converting read value to tuple
	at org.apache.hive.hcatalog.pig.HCatBaseLoader.getNext(HCatBaseLoader.java:76)
	at org.apache.hive.hcatalog.pig.HCatLoader.getNext(HCatLoader.java:63)
	at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigRecordReader.nextKeyValue(PigRecordReader.java:204)
	at org.apache.tez.mapreduce.lib.MRReaderMapReduce.next(MRReaderMapReduce.java:118)
	at org.apache.pig.backend.hadoop.executionengine.tez.plan.operator.POSimpleTezLoad.getNextTuple(POSimpleTezLoad.java:140)
	... 17 more
Caused by: java.lang.IndexOutOfBoundsException: Index: 2, Size: 2
	at java.util.ArrayList.rangeCheck(ArrayList.java:653)
	at java.util.ArrayList.get(ArrayList.java:429)
	at org.apache.hive.hcatalog.pig.PigHCatUtil.transformToTuple(PigHCatUtil.java:468)
	at org.apache.hive.hcatalog.pig.PigHCatUtil.transformToTuple(PigHCatUtil.java:451)
	at org.apache.hive.hcatalog.pig.PigHCatUtil.extractPigObject(PigHCatUtil.java:410)
	at org.apache.hive.hcatalog.pig.PigHCatUtil.transformToTuple(PigHCatUtil.java:468)
	at org.apache.hive.hcatalog.pig.PigHCatUtil.transformToTuple(PigHCatUtil.java:386)
	at org.apache.hive.hcatalog.pig.HCatBaseLoader.getNext(HCatBaseLoader.java:64)
	... 21 more
{noformat}
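
For what it's worth, the tail of the trace is consistent with the tuple conversion walking the table-level struct schema (now 3 sub-fields) while the struct deserialized from the old {{dt='0'}} partition still holds only 2 values. Below is a minimal, self-contained sketch of that pattern; the class, method, and variable names are made up for illustration and are not the actual {{PigHCatUtil.transformToTuple()}} code:

{code:java}
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of the suspected failure pattern: iterate the (new) table-level struct
// schema while indexing into a value list written under the (old) partition schema.
public class StructEvolutionSketch {

    // Walks the struct's sub-field list taken from the *table* schema and pulls
    // the value at the same position out of the struct deserialized with the
    // *old partition* schema.
    static List<Object> toTuple(List<String> tableStructFields, List<Object> structValues) {
        List<Object> out = new ArrayList<>(tableStructFields.size());
        for (int i = 0; i < tableStructFields.size(); i++) {
            // With 3 sub-fields in the table schema but only 2 values in the old
            // data, i == 2 throws the IndexOutOfBoundsException seen in the trace.
            out.add(structValues.get(i));
        }
        return out;
    }

    public static void main(String[] args) {
        List<String> tableSchema = Arrays.asList("goo", "moo", "zoo");           // post-ALTER schema
        List<Object> oldRowData  = new ArrayList<>(Arrays.asList("aaa", "bbb")); // row written pre-ALTER
        toTuple(tableSchema, oldRowData); // throws java.lang.IndexOutOfBoundsException at index 2
    }
}
{code}

If that reading is right, padding missing trailing sub-fields with null (which is effectively what a plain Hive {{SELECT}} on this partition already does) would avoid the exception; that is a guess at the direction of a fix, not a finished patch.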



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
