hive-user mailing list archives

From "Markovitz, Dudu" <dmarkov...@paypal.com>
Subject RE: same hdfs location with different schema exception
Date Tue, 14 Jun 2016 07:03:08 GMT
Hi

Can you please share the query?

Thanks

Dudu

From: 赵升/赵荣生 [mailto:roncenzhao@qq.com]
Sent: Tuesday, June 14, 2016 5:26 AM
To: user <user@hive.apache.org>
Subject: same hdfs location with different schema exception

Hi all:
  I have a question when using hive. It's described as follows:

  First, I create two tables:
    CREATE TABLE `roncen_tmp`(
    `a` bigint,
    `b` bigint,
    `c` string);

    CREATE EXTERNAL TABLE `ext_roncen`(
    `aaa` bigint)
    LOCATION 'hdfs://xxx/user/hive/warehouse/roncen_tmp';

  As you can see, the two tables share the same HDFS location but have different schemas.

  Then, when I run a SQL query that references both tables, the following exception occurs:


2016-06-14 10:16:56,807 INFO [main] org.apache.hadoop.hive.ql.exec.MapJoinOperator: Initializing
child 2 MAPJOIN

2016-06-14 10:16:56,807 INFO [main] org.apache.hadoop.hive.ql.exec.MapJoinOperator: Initializing
Self MAPJOIN[2]

2016-06-14 10:16:56,815 ERROR [main] org.apache.hadoop.hive.ql.exec.HashTableDummyOperator:
Generating output obj inspector from dummy object error

java.lang.RuntimeException: cannot find field aaa from [0:a, 1:b, 2:c]

        at org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorUtils.getStandardStructFieldRef(ObjectInspectorUtils.java:410)

        at org.apache.hadoop.hive.serde2.BaseStructObjectInspector.getStructFieldRef(BaseStructObjectInspector.java:133)

        at org.apache.hadoop.hive.ql.exec.ExprNodeColumnEvaluator.initialize(ExprNodeColumnEvaluator.java:55)

        at org.apache.hadoop.hive.ql.exec.JoinUtil.getObjectInspectorsFromEvaluators(JoinUtil.java:68)

        at org.apache.hadoop.hive.ql.exec.AbstractMapJoinOperator.initializeOp(AbstractMapJoinOperator.java:68)

        at org.apache.hadoop.hive.ql.exec.MapJoinOperator.initializeOp(MapJoinOperator.java:95)

        at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)

        at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:469)

        at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:425)

        at org.apache.hadoop.hive.ql.exec.HashTableDummyOperator.initializeOp(HashTableDummyOperator.java:40)

        at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:385)

        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.configure(ExecMapper.java:144)

        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

        at java.lang.reflect.Method.invoke(Method.java:606)

        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)

        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)

        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)

        at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:38)

        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)

        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)

        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

        at java.lang.reflect.Method.invoke(Method.java:606)

        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)

        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)

        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:436)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:352)

        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:415)

        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1680)

        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)

2016-06-14 10:16:56,820 INFO [main] org.apache.hadoop.hive.ql.exec.HashTableDummyOperator:
Initialization Done 5 HASHTABLEDUMMY

2016-06-14 10:16:56,876 INFO [main] org.apache.hadoop.hive.ql.log.PerfLogger: <PERFLOG
method=LoadHashtable from=org.apache.hadoop.hive.ql.exec.MapJoinOperator>

2016-06-14 10:16:56,876 FATAL [main] org.apache.hadoop.hive.ql.exec.mr.ExecMapper: java.lang.NullPointerException

        at org.apache.hadoop.hive.ql.exec.MapJoinOperator.loadHashTable(MapJoinOperator.java:189)

        at org.apache.hadoop.hive.ql.exec.MapJoinOperator.cleanUpInputFileChangedOp(MapJoinOperator.java:216)

        at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1051)

        at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1055)

        at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1055)

        at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1055)

        at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:486)

        at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:176)

        at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)

        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:440)

        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:352)

        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)

        at java.security.AccessController.doPrivileged(Native Method)

        at javax.security.auth.Subject.doAs(Subject.java:415)

        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1680)

        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)


  I found that the reason is that MapOperator uses a <MapInputPath, MapOpCtx> map to store
schema information.
  When the same location corresponds to different schemas, only the first schema [a, b, c]
is stored, so the column 'aaa' cannot be found.
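  The mechanism described above can be illustrated with a small sketch (plain Python, not Hive's actual code; the function names `register_table` and `resolve_column` are made up for illustration). A map keyed by input path keeps only the first schema registered for that path, so resolving a column that belongs to the second table fails with a message of the same shape as the one in the log:

```python
# Illustrative sketch of the described behavior (NOT Hive's actual code):
# per-input state is keyed by HDFS path, so two tables sharing one path
# collide and only the first schema registered for the path survives.

path_to_schema = {}  # analogous to the <MapInputPath, MapOpCtx> map

def register_table(path, schema):
    # setdefault keeps the existing entry: the second registration
    # for the same path is silently ignored.
    path_to_schema.setdefault(path, schema)

def resolve_column(path, column):
    schema = path_to_schema[path]
    if column not in schema:
        fields = ', '.join(f'{i}:{c}' for i, c in enumerate(schema))
        raise RuntimeError(f'cannot find field {column} from [{fields}]')
    return schema.index(column)

loc = 'hdfs://xxx/user/hive/warehouse/roncen_tmp'
register_table(loc, ['a', 'b', 'c'])  # roncen_tmp
register_table(loc, ['aaa'])          # ext_roncen -- ignored, path taken

print(resolve_column(loc, 'a'))       # resolves fine against roncen_tmp
try:
    resolve_column(loc, 'aaa')        # ext_roncen's column is unknown here
except RuntimeError as e:
    print(e)
```

  Under this (simplified) model, the second lookup raises "cannot find field aaa from [0:a, 1:b, 2:c]", matching the trace above.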

  Do you guys have any idea about this case?



                                                                                         
   Best Wishes!




