hive-dev mailing list archives

From "Carl Steinbach (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (HIVE-1480) CREATE TABLE IF NOT EXISTS get incorrect table name
Date Tue, 05 Apr 2011 22:11:06 GMT

     [ https://issues.apache.org/jira/browse/HIVE-1480?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Carl Steinbach updated HIVE-1480:
---------------------------------

    Component/s: Query Processor

> CREATE TABLE IF NOT EXISTS get incorrect table name
> ---------------------------------------------------
>
>                 Key: HIVE-1480
>                 URL: https://issues.apache.org/jira/browse/HIVE-1480
>             Project: Hive
>          Issue Type: Bug
>          Components: Query Processor
>            Reporter: Ning Zhang
>            Assignee: Ning Zhang
>
> CREATE TABLE IF NOT EXISTS T AS SELECT ... gives the following error after the job succeeded (a minimal reproduction sketch follows the quoted log below):
> Setting total progress to 100
> 10/07/22 11:26:14 INFO exec.ExecDriver: Ended Job = job_201006221843_688872
> 10/07/22 11:26:14 INFO exec.FileSinkOperator: Moving tmp dir: hdfs://dfstmp.data.facebook.com:9000/tmp/hive-root/hive_2010-07-22_11-20-15_027_2717837693750284928/_tmp.10001 to: hdfs://dfstmp.data.facebook.com:9000/tmp/hive-root/hive_2010-07-22_11-20-15_027_2717837693750284928/_tmp.10001.intermediate
> 10/07/22 11:26:14 INFO exec.FileSinkOperator: Moving tmp dir: hdfs://dfstmp.data.facebook.com:9000/tmp/hive-root/hive_2010-07-22_11-20-15_027_2717837693750284928/_tmp.10001.intermediate to: hdfs://dfstmp.data.facebook.com:9000/tmp/hive-root/hive_2010-07-22_11-20-15_027_2717837693750284928/10001
> Moving data to: hdfs://dfstmp.data.facebook.com:9000/user/facebook/warehouse/ericm_budget_email_actua43
> 10/07/22 11:26:15 INFO exec.MoveTask: Moving data to: hdfs://dfstmp.data.facebook.com:9000/user/facebook/warehouse/ericm_budget_email_actua43 from hdfs://dfstmp.data.facebook.com:9000/tmp/hive-root/hive_2010-07-22_11-20-15_027_2717837693750284928/10001
> 10/07/22 11:26:15 WARN hdfs.DFSClient: File /user/facebook/warehouse/ericm_budget_email_actua43 is being deleted only through Trash org.apache.hadoop.fs.FsShell.delete because all deletes must go through Trash.
> 10/07/22 11:26:15 INFO hive.log: DDL: struct ericm_budget_email_actua43 { string acct_id, string first_name, string email, string campaign_name_list}
> 10/07/22 11:26:15 INFO metastore.HiveMetaStore: 0: create_table: db=default tbl=ericm_budget_email_actua43
> 10/07/22 11:26:15 INFO metastore.HiveMetaStore: 0: get_table : db=default tbl=ericm_budget_email_actua43
> 10/07/22 11:26:15 INFO hooks.HookUtils: Host:cdb067.snc1.facebook.com database:audit_silver
> 10/07/22 11:26:15 INFO hooks.HookUtils: Host:cdb067.snc1.facebook.com database:lineage_silver
> 10/07/22 11:26:15 INFO hooks.HookUtils: rows inserted: 1 sql: insert into snc1_command_log set command = ?, command_type = ?, inputs = ?, outputs = ?, queryId = ?, user_info = ?
> OK
> 10/07/22 11:26:15 INFO ql.Driver: OK
> 10/07/22 11:26:16 INFO ql.Context: getStream error: java.io.FileNotFoundException: File does not exist: hdfs://dfstmp.data.facebook.com:9000/tmp/hive-root/hive_2010-07-22_11-20-15_027_2717837693750284928/10000
> at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:457)
> at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:294)
> at org.apache.hadoop.hive.ql.Context.getStream(Context.java:386)
> at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:688)
> at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:146)
> at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
> at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:294)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
> at java.lang.reflect.Method.invoke(Method.java:597)
> at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>  
> Time taken: 361.26 seconds
> 10/07/22 11:26:16 INFO CliDriver: Time taken: 361.26 seconds
> Exit code: 0, 0
> dus: Cannot access /user/facebook/warehouse/IF: No such file or directory.
> tablesize cmd:/mnt/vol/hive/sites/silver.trunk/hadoop/bin/hadoop dfs -dus /user/facebook/warehouse/IF | cut -d$'\t' -f2
>  
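
For reference, the failing form is a CTAS (CREATE TABLE ... AS SELECT) combined with an IF NOT EXISTS clause. Below is a minimal reproduction sketch of that statement; the target column list is taken from the "DDL: struct ericm_budget_email_actua43" line in the quoted log, and the source table name is hypothetical.

    -- CTAS with IF NOT EXISTS: the statement form reported in the description.
    -- Target columns copied from the DDL log line above; the source table
    -- (budget_email_source) is a hypothetical stand-in.
    CREATE TABLE IF NOT EXISTS ericm_budget_email_actua43 AS
    SELECT acct_id, first_name, email, campaign_name_list
    FROM budget_email_source;

Per the quoted log, MoveTask copies the query output into the real warehouse directory (.../warehouse/ericm_budget_email_actua43), but the subsequent table-size check runs against /user/facebook/warehouse/IF, i.e. the literal keyword IF is picked up as the table name, which is the incorrect name this issue tracks.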

--
This message is automatically generated by JIRA.
For more information on JIRA, see: http://www.atlassian.com/software/jira
