hive-issues mailing list archives

From "Sushanth Sowmyan (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-10727) Import throws error message "org.apache.thrift.protocol.TProtocolException: Required field 'filesAdded' is unset!"
Date Mon, 18 May 2015 22:20:59 GMT

    [ https://issues.apache.org/jira/browse/HIVE-10727?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14549372#comment-14549372 ]

Sushanth Sowmyan commented on HIVE-10727:
-----------------------------------------

Relevant stacktrace:

{noformat}
2015-05-12 15:50:25,729 WARN  [main]: metastore.RetryingMetaStoreClient (RetryingMetaStoreClient.java:invoke(184)) - MetaStoreClient lost connection. Attempting to reconnect.
org.apache.thrift.protocol.TProtocolException: Required field 'filesAdded' is unset! Struct:InsertEventRequestData(filesAdded:null)
	at org.apache.hadoop.hive.metastore.api.InsertEventRequestData.validate(InsertEventRequestData.java:310)
	at org.apache.hadoop.hive.metastore.api.InsertEventRequestData$InsertEventRequestDataStandardScheme.write(InsertEventRequestData.java:378)
	at org.apache.hadoop.hive.metastore.api.InsertEventRequestData$InsertEventRequestDataStandardScheme.write(InsertEventRequestData.java:338)
	at org.apache.hadoop.hive.metastore.api.InsertEventRequestData.write(InsertEventRequestData.java:288)
	at org.apache.hadoop.hive.metastore.api.FireEventRequestData.standardSchemeWriteValue(FireEventRequestData.java:168)
	at org.apache.thrift.TUnion$TUnionStandardScheme.write(TUnion.java:244)
	at org.apache.thrift.TUnion$TUnionStandardScheme.write(TUnion.java:213)
	at org.apache.thrift.TUnion.write(TUnion.java:152)
	at org.apache.hadoop.hive.metastore.api.FireEventRequest$FireEventRequestStandardScheme.write(FireEventRequest.java:748)
	at org.apache.hadoop.hive.metastore.api.FireEventRequest$FireEventRequestStandardScheme.write(FireEventRequest.java:667)
	at org.apache.hadoop.hive.metastore.api.FireEventRequest.write(FireEventRequest.java:577)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$fire_listener_event_args$fire_listener_event_argsStandardScheme.write(ThriftHiveMetastore.java)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$fire_listener_event_args$fire_listener_event_argsStandardScheme.write(ThriftHiveMetastore.java)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$fire_listener_event_args.write(ThriftHiveMetastore.java)
	at org.apache.thrift.TServiceClient.sendBase(TServiceClient.java:63)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.send_fire_listener_event(ThriftHiveMetastore.java:4176)
	at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.fire_listener_event(ThriftHiveMetastore.java:4168)
	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.fireListenerEvent(HiveMetaStoreClient.java:1954)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
	at com.sun.proxy.$Proxy5.fireListenerEvent(Unknown Source)
	at org.apache.hadoop.hive.ql.metadata.Hive.fireInsertEvent(Hive.java:1945)
	at org.apache.hadoop.hive.ql.metadata.Hive.getPartition(Hive.java:1876)
	at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1407)
	at org.apache.hadoop.hive.ql.metadata.Hive.loadPartition(Hive.java:1324)
	at org.apache.hadoop.hive.ql.exec.MoveTask.execute(MoveTask.java:438)
	at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
	at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
	at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1650)
	at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1409)
	at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1192)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
	at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
	at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
	at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
	at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
	at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
{noformat}
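For context: Thrift-generated structs defer required-field checks to validate(), which only runs when the struct is serialized. A minimal standalone analogue of the failing check (class and field names here are illustrative, not the actual generated Hive code):

```java
// Minimal analogue of a Thrift-generated struct with a required field.
// Names are illustrative only; the real generated class is
// org.apache.hadoop.hive.metastore.api.InsertEventRequestData.
import java.util.List;

public class Main {
    static class InsertEventData {
        private List<String> filesAdded; // 'required' in the IDL, enforced only at write time

        void setFilesAdded(List<String> files) { this.filesAdded = files; }

        // Thrift calls validate() just before serializing a struct, so an
        // unset required field surfaces at RPC time, not at construction.
        void validate() {
            if (filesAdded == null) {
                throw new IllegalStateException(
                    "Required field 'filesAdded' is unset! Struct:InsertEventData(filesAdded:null)");
            }
        }
    }

    public static void main(String[] args) {
        InsertEventData data = new InsertEventData();
        try {
            data.validate(); // no setFilesAdded(...) call preceded this
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

This matches the trace above: InsertEventRequestData.validate() fires from write() during send_fire_listener_event, so a null filesAdded list is only caught at RPC time. One plausible hardening (an assumption on my part, not a confirmed fix) would be for the caller, e.g. Hive.fireInsertEvent(), to skip firing the event or pass an empty list when no file list is available.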

> Import throws error message "org.apache.thrift.protocol.TProtocolException: Required field 'filesAdded' is unset!"
> ------------------------------------------------------------------------------------------------------------------
>
>                 Key: HIVE-10727
>                 URL: https://issues.apache.org/jira/browse/HIVE-10727
>             Project: Hive
>          Issue Type: Bug
>            Reporter: Balu Vellanki
>            Assignee: Sushanth Sowmyan
>         Attachments: hive.log
>
>
> Here are the steps to reproduce. Set up two Hive warehouses with hive.metastore.event.listeners set to org.apache.hive.hcatalog.listener.DbNotificationListener. On warehouse 1, run the following as user hive.
> {code}
> -- create table
> CREATE TABLE page_view4(viewTime INT, userid BIGINT,
>      page_url STRING, referrer_url STRING,
>      ip STRING COMMENT 'IP Address of the User')
>  COMMENT 'This is the page view table'
>  PARTITIONED BY(dt STRING, country STRING)
>  STORED AS SEQUENCEFILE;
> -- Add partitions
> alter table page_view4 add partition (dt="1", country="usa");
> alter table page_view4 add partition (dt="2", country="india");
> insert into table page_view4 PARTITION (dt="1", country="usa") VALUES (1, 1, "url1", "referurl1", "ip1");
> -- Export table
> export table page_view4 to '/tmp/export4' for replication('foo');
> {code}
> '/tmp/export4' is created with owner hive and group hdfs. '/apps/hive/warehouse/page_view4/' is created with owner hive and group users.
> Copy the exported data in '/tmp/export4' to HDFS in warehouse 2. The data is still owned by hive and belongs to group hdfs. Change the group of the directory '/tmp/export4' to users:

> {code}
> bash# su - hdfs
> hdfs : bash# hadoop fs -chown -R hive:users /tmp/export4
> {code}
> As user hive, do the following
> {code}
> hive> import table page_view4 from '/tmp/export4' ;
> Copying data from hdfs://node-4.example.com:8020/tmp/export4/dt=1/country=usa
> ....
> Loading data to table default.page_view4 partition (country=usa, dt=1)
> Failed with exception org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.thrift.protocol.TProtocolException: Required field 'filesAdded' is unset! Struct:InsertEventRequestData(filesAdded:null)
> FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask
> {code}
> The import failed. Attaching the logs from /tmp/hive/hive.log for further debugging.




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
