From: sershe@apache.org
To: commits@hive.apache.org
Reply-To: hive-dev@hive.apache.org
Date: Tue, 10 Oct 2017 00:51:53 -0000
Message-Id: <8a7b14e9180e45b19cd29c99aed7530b@git.apache.org>
Subject: [19/61] [abbrv] hive git commit: HIVE-17652: retire ANALYZE TABLE ...
PARTIALSCAN (Zoltan Haindrich, reviewed by Ashutosh Chauhan)
archived-at: Tue, 10 Oct 2017 00:51:43 -0000

http://git-wip-us.apache.org/repos/asf/hive/blob/71004d2e/ql/src/test/results/clientnegative/stats_partialscan_autogether.q.out
----------------------------------------------------------------------
diff --git a/ql/src/test/results/clientnegative/stats_partialscan_autogether.q.out b/ql/src/test/results/clientnegative/stats_partialscan_autogether.q.out
deleted file mode 100644
index 680999f..0000000
--- a/ql/src/test/results/clientnegative/stats_partialscan_autogether.q.out
+++ /dev/null
@@ -1,75 +0,0 @@
-PREHOOK: query: CREATE table analyze_srcpart_partial_scan (key STRING, value STRING)
-partitioned by (ds string, hr string)
-stored as rcfile
-PREHOOK: type: CREATETABLE
-PREHOOK: Output: database:default
-PREHOOK: Output: default@analyze_srcpart_partial_scan
-POSTHOOK: query: CREATE table analyze_srcpart_partial_scan (key STRING, value STRING)
-partitioned by (ds string, hr string)
-stored as rcfile
-POSTHOOK: type: CREATETABLE
-POSTHOOK: Output: database:default
-POSTHOOK: Output: default@analyze_srcpart_partial_scan
-PREHOOK: query: insert overwrite table analyze_srcpart_partial_scan partition (ds, hr) select * from srcpart where ds is not null order by key
-PREHOOK: type: QUERY
-PREHOOK: Input: default@srcpart
-PREHOOK: Input: default@srcpart@ds=2008-04-08/hr=11
-PREHOOK: Input: default@srcpart@ds=2008-04-08/hr=12
-PREHOOK: Input: default@srcpart@ds=2008-04-09/hr=11
-PREHOOK: Input: default@srcpart@ds=2008-04-09/hr=12
-PREHOOK: Output: default@analyze_srcpart_partial_scan
-POSTHOOK: query: insert overwrite table analyze_srcpart_partial_scan partition (ds, hr) select * from srcpart where ds is not null order by key
-POSTHOOK: type: QUERY
-POSTHOOK: Input: default@srcpart
-POSTHOOK: Input: default@srcpart@ds=2008-04-08/hr=11
-POSTHOOK: Input: default@srcpart@ds=2008-04-08/hr=12
-POSTHOOK: Input: default@srcpart@ds=2008-04-09/hr=11
-POSTHOOK: Input: default@srcpart@ds=2008-04-09/hr=12
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=12
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-09/hr=11
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-09/hr=12
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=11).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=11).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=12).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=12).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=11).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=11).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=12).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=12).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-PREHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11)
-PREHOOK: type: DESCTABLE
-PREHOOK: Input: default@analyze_srcpart_partial_scan
-POSTHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11)
-POSTHOOK: type: DESCTABLE
-POSTHOOK: Input: default@analyze_srcpart_partial_scan
-# col_name data_type comment
-key string
-value string
-
-# Partition Information
-# col_name data_type comment
-ds string
-hr string
-
-# Detailed Partition Information
-Partition Value: [2008-04-08, 11]
-Database: default
-Table: analyze_srcpart_partial_scan
-#### A masked pattern was here ####
-Partition Parameters:
- numFiles 1
- totalSize 5077
-#### A masked pattern was here ####
-
-# Storage Information
-SerDe Library: org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe
-InputFormat: org.apache.hadoop.hive.ql.io.RCFileInputFormat
-OutputFormat: org.apache.hadoop.hive.ql.io.RCFileOutputFormat
-Compressed: No
-Num Buckets: -1
-Bucket Columns: []
-Sort Columns: []
-Storage Desc Params:
- serialization.format 1
-FAILED: SemanticException [Error 10233]: Analyze partialscan is not allowed if hive.stats.autogather is set to false

http://git-wip-us.apache.org/repos/asf/hive/blob/71004d2e/ql/src/test/results/clientnegative/stats_partialscan_non_external.q.out
----------------------------------------------------------------------
diff --git a/ql/src/test/results/clientnegative/stats_partialscan_non_external.q.out b/ql/src/test/results/clientnegative/stats_partialscan_non_external.q.out
deleted file mode 100644
index ce3073b..0000000
--- a/ql/src/test/results/clientnegative/stats_partialscan_non_external.q.out
+++ /dev/null
@@ -1,9 +0,0 @@
-PREHOOK: query: CREATE EXTERNAL TABLE external_table (key int, value string)
-PREHOOK: type: CREATETABLE
-PREHOOK: Output: database:default
-PREHOOK: Output: default@external_table
-POSTHOOK: query: CREATE EXTERNAL TABLE external_table (key int, value string)
-POSTHOOK: type: CREATETABLE
-POSTHOOK: Output: database:default
-POSTHOOK: Output: default@external_table
-FAILED: SemanticException [Error 10231]: ANALYZE TABLE PARTIALSCAN doesn't support external table: external_table
http://git-wip-us.apache.org/repos/asf/hive/blob/71004d2e/ql/src/test/results/clientnegative/stats_partialscan_non_native.q.out
----------------------------------------------------------------------
diff --git a/ql/src/test/results/clientnegative/stats_partialscan_non_native.q.out b/ql/src/test/results/clientnegative/stats_partialscan_non_native.q.out
deleted file mode 100644
index 9d46969..0000000
--- a/ql/src/test/results/clientnegative/stats_partialscan_non_native.q.out
+++ /dev/null
@@ -1,11 +0,0 @@
-PREHOOK: query: CREATE TABLE non_native1(key int, value string)
-STORED BY 'org.apache.hadoop.hive.ql.metadata.DefaultStorageHandler'
-PREHOOK: type: CREATETABLE
-PREHOOK: Output: database:default
-PREHOOK: Output: default@non_native1
-POSTHOOK: query: CREATE TABLE non_native1(key int, value string)
-STORED BY 'org.apache.hadoop.hive.ql.metadata.DefaultStorageHandler'
-POSTHOOK: type: CREATETABLE
-POSTHOOK: Output: database:default
-POSTHOOK: Output: default@non_native1
-FAILED: SemanticException [Error 10229]: ANALYZE TABLE PARTIALSCAN cannot be used for a non-native table non_native1

http://git-wip-us.apache.org/repos/asf/hive/blob/71004d2e/ql/src/test/results/clientnegative/stats_partscan_norcfile.q.out
----------------------------------------------------------------------
diff --git a/ql/src/test/results/clientnegative/stats_partscan_norcfile.q.out b/ql/src/test/results/clientnegative/stats_partscan_norcfile.q.out
deleted file mode 100644
index 38a62b0..0000000
--- a/ql/src/test/results/clientnegative/stats_partscan_norcfile.q.out
+++ /dev/null
@@ -1,36 +0,0 @@
-PREHOOK: query: create table analyze_srcpart_partial_scan like srcpart
-PREHOOK: type: CREATETABLE
-PREHOOK: Output: database:default
-PREHOOK: Output: default@analyze_srcpart_partial_scan
-POSTHOOK: query: create table analyze_srcpart_partial_scan like srcpart
-POSTHOOK: type: CREATETABLE
-POSTHOOK: Output: database:default
-POSTHOOK: Output: default@analyze_srcpart_partial_scan
-PREHOOK: query: insert overwrite table analyze_srcpart_partial_scan partition (ds, hr) select * from srcpart where ds is not null
-PREHOOK: type: QUERY
-PREHOOK: Input: default@srcpart
-PREHOOK: Input: default@srcpart@ds=2008-04-08/hr=11
-PREHOOK: Input: default@srcpart@ds=2008-04-08/hr=12
-PREHOOK: Input: default@srcpart@ds=2008-04-09/hr=11
-PREHOOK: Input: default@srcpart@ds=2008-04-09/hr=12
-PREHOOK: Output: default@analyze_srcpart_partial_scan
-POSTHOOK: query: insert overwrite table analyze_srcpart_partial_scan partition (ds, hr) select * from srcpart where ds is not null
-POSTHOOK: type: QUERY
-POSTHOOK: Input: default@srcpart
-POSTHOOK: Input: default@srcpart@ds=2008-04-08/hr=11
-POSTHOOK: Input: default@srcpart@ds=2008-04-08/hr=12
-POSTHOOK: Input: default@srcpart@ds=2008-04-09/hr=11
-POSTHOOK: Input: default@srcpart@ds=2008-04-09/hr=12
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=12
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-09/hr=11
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-09/hr=12
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=11).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=11).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=12).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=12).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=11).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=11).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=12).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=12).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-FAILED: SemanticException [Error 10230]: ANALYZE TABLE PARTIALSCAN doesn't support non-RCfile.

http://git-wip-us.apache.org/repos/asf/hive/blob/71004d2e/ql/src/test/results/clientpositive/llap/orc_analyze.q.out
----------------------------------------------------------------------
diff --git a/ql/src/test/results/clientpositive/llap/orc_analyze.q.out b/ql/src/test/results/clientpositive/llap/orc_analyze.q.out
index 2cecf4e..4e4e3e3 100644
--- a/ql/src/test/results/clientpositive/llap/orc_analyze.q.out
+++ b/ql/src/test/results/clientpositive/llap/orc_analyze.q.out
@@ -114,53 +114,6 @@ Bucket Columns: []
 Sort Columns: []
 Storage Desc Params:
  serialization.format 1
-PREHOOK: query: analyze table orc_create_people compute statistics partialscan
-PREHOOK: type: QUERY
-PREHOOK: Input: default@orc_create_people
-PREHOOK: Output: default@orc_create_people
-POSTHOOK: query: analyze table orc_create_people compute statistics partialscan
-POSTHOOK: type: QUERY
-POSTHOOK: Input: default@orc_create_people
-POSTHOOK: Output: default@orc_create_people
-PREHOOK: query: desc formatted orc_create_people
-PREHOOK: type: DESCTABLE
-PREHOOK: Input: default@orc_create_people
-POSTHOOK: query: desc formatted orc_create_people
-POSTHOOK: type: DESCTABLE
-POSTHOOK: Input: default@orc_create_people
-# col_name data_type comment
-id int
-first_name string
-last_name string
-address string
-salary decimal(10,0)
-start_date timestamp
-state string
-
-# Detailed Table Information
-Database: default
-#### A masked pattern was here ####
-Retention: 0
-#### A masked pattern was here ####
-Table Type: MANAGED_TABLE
-Table Parameters:
- COLUMN_STATS_ACCURATE {\"BASIC_STATS\":\"true\"}
- numFiles 1
- numRows 100
- rawDataSize 52600
- totalSize 3200
-#### A masked pattern was here ####
-
-# Storage Information
-SerDe Library: org.apache.hadoop.hive.ql.io.orc.OrcSerde
-InputFormat: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
-OutputFormat: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat
-Compressed: No
-Num Buckets: -1
-Bucket Columns: []
-Sort Columns: []
-Storage Desc Params:
- serialization.format 1
 PREHOOK: query: analyze table orc_create_people compute statistics noscan
 PREHOOK: type: QUERY
 PREHOOK: Input: default@orc_create_people
@@ -443,100 +396,6 @@ Bucket Columns: []
 Sort Columns: []
 Storage Desc Params:
  serialization.format 1
-PREHOOK: query: analyze table orc_create_people partition(state) compute statistics partialscan
-PREHOOK: type: QUERY
-PREHOOK: Input: default@orc_create_people
-PREHOOK: Output: default@orc_create_people
-PREHOOK: Output: default@orc_create_people@state=Ca
-PREHOOK: Output: default@orc_create_people@state=Or
-POSTHOOK: query: analyze table orc_create_people partition(state) compute statistics partialscan
-POSTHOOK: type: QUERY
-POSTHOOK: Input: default@orc_create_people
-POSTHOOK: Output: default@orc_create_people
-POSTHOOK: Output: default@orc_create_people@state=Ca
-POSTHOOK: Output: default@orc_create_people@state=Or
-PREHOOK: query: desc formatted orc_create_people partition(state="Ca")
-PREHOOK: type: DESCTABLE
-PREHOOK: Input: default@orc_create_people
-POSTHOOK: query: desc formatted orc_create_people partition(state="Ca")
-POSTHOOK: type: DESCTABLE
-POSTHOOK: Input: default@orc_create_people
-# col_name data_type comment
-id int
-first_name string
-last_name string
-address string
-salary decimal(10,0)
-start_date timestamp
-
-# Partition Information
-# col_name data_type comment
-state string
-
-# Detailed Partition Information
-Partition Value: [Ca]
-Database: default
-Table: orc_create_people
-#### A masked pattern was here ####
-Partition Parameters:
- COLUMN_STATS_ACCURATE {\"BASIC_STATS\":\"true\"}
- numFiles 1
- numRows 50
- rawDataSize 21950
- totalSize 2099
-#### A masked pattern was here ####
-
-# Storage Information
-SerDe Library: org.apache.hadoop.hive.ql.io.orc.OrcSerde
-InputFormat: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
-OutputFormat: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat
-Compressed: No
-Num Buckets: -1
-Bucket Columns: []
-Sort Columns: []
-Storage Desc Params:
- serialization.format 1
-PREHOOK: query: desc formatted orc_create_people partition(state="Or")
-PREHOOK: type: DESCTABLE
-PREHOOK: Input: default@orc_create_people
-POSTHOOK: query: desc formatted orc_create_people partition(state="Or")
-POSTHOOK: type: DESCTABLE
-POSTHOOK: Input: default@orc_create_people
-# col_name data_type comment
-id int
-first_name string
-last_name string
-address string
-salary decimal(10,0)
-start_date timestamp
-
-# Partition Information
-# col_name data_type comment
-state string
-
-# Detailed Partition Information
-Partition Value: [Or]
-Database: default
-Table: orc_create_people
-#### A masked pattern was here ####
-Partition Parameters:
- COLUMN_STATS_ACCURATE {\"BASIC_STATS\":\"true\"}
- numFiles 1
- numRows 50
- rawDataSize 22050
- totalSize 2114
-#### A masked pattern was here ####
-
-# Storage Information
-SerDe Library: org.apache.hadoop.hive.ql.io.orc.OrcSerde
-InputFormat: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
-OutputFormat: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat
-Compressed: No
-Num Buckets: -1
-Bucket Columns: []
-Sort Columns: []
-Storage Desc Params:
- serialization.format 1
 PREHOOK: query: analyze table orc_create_people partition(state) compute statistics noscan
 PREHOOK: type: QUERY
 PREHOOK: Input: default@orc_create_people
@@ -923,100 +782,6 @@ Bucket Columns: [first_name]
 Sort Columns: [Order(col:last_name, order:1)]
 Storage Desc Params:
  serialization.format 1
-PREHOOK: query: analyze table orc_create_people partition(state) compute statistics partialscan
-PREHOOK: type: QUERY
-PREHOOK: Input: default@orc_create_people
-PREHOOK: Output: default@orc_create_people
-PREHOOK: Output: default@orc_create_people@state=Ca
-PREHOOK: Output: default@orc_create_people@state=Or
-POSTHOOK: query: analyze table orc_create_people partition(state) compute statistics partialscan
-POSTHOOK: type: QUERY
-POSTHOOK: Input: default@orc_create_people
-POSTHOOK: Output: default@orc_create_people
-POSTHOOK: Output: default@orc_create_people@state=Ca
-POSTHOOK: Output: default@orc_create_people@state=Or
-PREHOOK: query: desc formatted orc_create_people partition(state="Ca")
-PREHOOK: type: DESCTABLE
-PREHOOK: Input: default@orc_create_people
-POSTHOOK: query: desc formatted orc_create_people partition(state="Ca")
-POSTHOOK: type: DESCTABLE
-POSTHOOK: Input: default@orc_create_people
-# col_name data_type comment
-id int
-first_name string
-last_name string
-address string
-salary decimal(10,0)
-start_date timestamp
-
-# Partition Information
-# col_name data_type comment
-state string
-
-# Detailed Partition Information
-Partition Value: [Ca]
-Database: default
-Table: orc_create_people
-#### A masked pattern was here ####
-Partition Parameters:
- COLUMN_STATS_ACCURATE {\"BASIC_STATS\":\"true\"}
- numFiles 4
- numRows 50
- rawDataSize 21975
- totalSize 5260
-#### A masked pattern was here ####
-
-# Storage Information
-SerDe Library: org.apache.hadoop.hive.ql.io.orc.OrcSerde
-InputFormat: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
-OutputFormat: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat
-Compressed: No
-Num Buckets: 4
-Bucket Columns: [first_name]
-Sort Columns: [Order(col:last_name, order:1)]
-Storage Desc Params:
- serialization.format 1
-PREHOOK: query: desc formatted orc_create_people partition(state="Or")
-PREHOOK: type: DESCTABLE
-PREHOOK: Input: default@orc_create_people
-POSTHOOK: query: desc formatted orc_create_people partition(state="Or")
-POSTHOOK: type: DESCTABLE
-POSTHOOK: Input: default@orc_create_people
-# col_name data_type comment
-id int
-first_name string
-last_name string
-address string
-salary decimal(10,0)
-start_date timestamp
-
-# Partition Information
-# col_name data_type comment
-state string
-
-# Detailed Partition Information
-Partition Value: [Or]
-Database: default
-Table: orc_create_people
-#### A masked pattern was here ####
-Partition Parameters:
- COLUMN_STATS_ACCURATE {\"BASIC_STATS\":\"true\"}
- numFiles 4
- numRows 50
- rawDataSize 22043
- totalSize 5331
-#### A masked pattern was here ####
-
-# Storage Information
-SerDe Library: org.apache.hadoop.hive.ql.io.orc.OrcSerde
-InputFormat: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
-OutputFormat: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat
-Compressed: No
-Num Buckets: 4
-Bucket Columns: [first_name]
-Sort Columns: [Order(col:last_name, order:1)]
-Storage Desc Params:
- serialization.format 1
 PREHOOK: query: analyze table orc_create_people partition(state) compute statistics noscan
 PREHOOK: type: QUERY
 PREHOOK: Input: default@orc_create_people
@@ -1362,59 +1127,6 @@ Bucket Columns: []
 Sort Columns: []
 Storage Desc Params:
  serialization.format 1
-PREHOOK: query: analyze table orc_create_people partition(state) compute statistics partialscan
-PREHOOK: type: QUERY
-PREHOOK: Input: default@orc_create_people
-PREHOOK: Output: default@orc_create_people
-PREHOOK: Output: default@orc_create_people@state=Ca
-PREHOOK: Output: default@orc_create_people@state=Or
-POSTHOOK: query: analyze table orc_create_people partition(state) compute statistics partialscan
-POSTHOOK: type: QUERY
-POSTHOOK: Input: default@orc_create_people
-POSTHOOK: Output: default@orc_create_people
-POSTHOOK: Output: default@orc_create_people@state=Ca
-POSTHOOK: Output: default@orc_create_people@state=Or
-PREHOOK: query: desc formatted orc_create_people partition(state="Ca")
-PREHOOK: type: DESCTABLE
-PREHOOK: Input: default@orc_create_people
-POSTHOOK: query: desc formatted orc_create_people partition(state="Ca")
-POSTHOOK: type: DESCTABLE
-POSTHOOK: Input: default@orc_create_people
-# col_name data_type comment
-id int
-first_name string
-last_name string
-address string
-salary decimal(10,0)
-start_date timestamp
-
-# Partition Information
-# col_name data_type comment
-state string
-
-# Detailed Partition Information
-Partition Value: [Ca]
-Database: default
-Table: orc_create_people
-#### A masked pattern was here ####
-Partition Parameters:
- COLUMN_STATS_ACCURATE {\"BASIC_STATS\":\"true\"}
- numFiles 1
- numRows 50
- rawDataSize 21950
- totalSize 2099
-#### A masked pattern was here ####
-
-# Storage Information
-SerDe Library: org.apache.hadoop.hive.ql.io.orc.OrcSerde
-InputFormat: org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
-OutputFormat: org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat
-Compressed: No
-Num Buckets: -1
-Bucket Columns: []
-Sort Columns: []
-Storage Desc Params:
- serialization.format 1
 PREHOOK: query: analyze table orc_create_people partition(state) compute statistics noscan
 PREHOOK: type: QUERY
 PREHOOK: Input: default@orc_create_people

http://git-wip-us.apache.org/repos/asf/hive/blob/71004d2e/ql/src/test/results/clientpositive/spark/stats_partscan_1_23.q.out
----------------------------------------------------------------------
diff --git a/ql/src/test/results/clientpositive/spark/stats_partscan_1_23.q.out b/ql/src/test/results/clientpositive/spark/stats_partscan_1_23.q.out
deleted file mode 100644
index fdf111e..0000000
--- a/ql/src/test/results/clientpositive/spark/stats_partscan_1_23.q.out
+++ /dev/null
@@ -1,181 +0,0 @@
-PREHOOK: query: CREATE table analyze_srcpart_partial_scan (key STRING, value STRING)
-partitioned by (ds string, hr string)
-stored as rcfile
-PREHOOK: type: CREATETABLE
-PREHOOK: Output: database:default
-PREHOOK: Output: default@analyze_srcpart_partial_scan
-POSTHOOK: query: CREATE table analyze_srcpart_partial_scan (key STRING, value STRING)
-partitioned by (ds string, hr string)
-stored as rcfile
-POSTHOOK: type: CREATETABLE
-POSTHOOK: Output: database:default
-POSTHOOK: Output: default@analyze_srcpart_partial_scan
-PREHOOK: query: insert overwrite table analyze_srcpart_partial_scan partition (ds, hr) select * from srcpart where ds is not null
-PREHOOK: type: QUERY
-PREHOOK: Input: default@srcpart
-PREHOOK: Input: default@srcpart@ds=2008-04-08/hr=11
-PREHOOK: Input: default@srcpart@ds=2008-04-08/hr=12
-PREHOOK: Input: default@srcpart@ds=2008-04-09/hr=11
-PREHOOK: Input: default@srcpart@ds=2008-04-09/hr=12
-PREHOOK: Output: default@analyze_srcpart_partial_scan
-POSTHOOK: query: insert overwrite table analyze_srcpart_partial_scan partition (ds, hr) select * from srcpart where ds is not null
-POSTHOOK: type: QUERY
-POSTHOOK: Input: default@srcpart
-POSTHOOK: Input: default@srcpart@ds=2008-04-08/hr=11
-POSTHOOK: Input: default@srcpart@ds=2008-04-08/hr=12
-POSTHOOK: Input: default@srcpart@ds=2008-04-09/hr=11
-POSTHOOK: Input: default@srcpart@ds=2008-04-09/hr=12
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=12
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-09/hr=11
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-09/hr=12
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=11).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=11).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=12).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=12).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=11).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=11).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=12).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=12).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-PREHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11)
-PREHOOK: type: DESCTABLE
-PREHOOK: Input: default@analyze_srcpart_partial_scan
-POSTHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11)
-POSTHOOK: type: DESCTABLE
-POSTHOOK: Input: default@analyze_srcpart_partial_scan
-# col_name data_type comment
-key string
-value string
-
-# Partition Information
-# col_name data_type comment
-ds string
-hr string
-
-# Detailed Partition Information
-Partition Value: [2008-04-08, 11]
-Database: default
-Table: analyze_srcpart_partial_scan
-#### A masked pattern was here ####
-Partition Parameters:
- numFiles 1
- totalSize 5293
-#### A masked pattern was here ####
-
-# Storage Information
-SerDe Library: org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe
-InputFormat: org.apache.hadoop.hive.ql.io.RCFileInputFormat
-OutputFormat: org.apache.hadoop.hive.ql.io.RCFileOutputFormat
-Compressed: No
-Num Buckets: -1
-Bucket Columns: []
-Sort Columns: []
-Storage Desc Params:
- serialization.format 1
-PREHOOK: query: explain
-analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan
-PREHOOK: type: QUERY
-POSTHOOK: query: explain
-analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan
-POSTHOOK: type: QUERY
-STAGE DEPENDENCIES:
-  Stage-2 is a root stage
-  Stage-1 depends on stages: Stage-0, Stage-2
-
-STAGE PLANS:
-  Stage: Stage-2
-      Partial Scan Statistics
-
-  Stage: Stage-1
-    Stats-Aggr Operator
-
-PREHOOK: query: analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan
-PREHOOK: type: QUERY
-PREHOOK: Input: default@analyze_srcpart_partial_scan
-PREHOOK: Input: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11
-PREHOOK: Output: default@analyze_srcpart_partial_scan
-PREHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11
-POSTHOOK: query: analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan
-POSTHOOK: type: QUERY
-POSTHOOK: Input: default@analyze_srcpart_partial_scan
-POSTHOOK: Input: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11
-POSTHOOK: Output: default@analyze_srcpart_partial_scan
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11
-PREHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11)
-PREHOOK: type: DESCTABLE
-PREHOOK: Input: default@analyze_srcpart_partial_scan
-POSTHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11)
-POSTHOOK: type: DESCTABLE
-POSTHOOK: Input: default@analyze_srcpart_partial_scan
-# col_name data_type comment
-key string
-value string
-
-# Partition Information
-# col_name data_type comment
-ds string
-hr string
-
-# Detailed Partition Information
-Partition Value: [2008-04-08, 11]
-Database: default
-Table: analyze_srcpart_partial_scan
-#### A masked pattern was here ####
-Partition Parameters:
- numFiles 1
- totalSize 5293
-#### A masked pattern was here ####
-
-# Storage Information
-SerDe Library: org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe
-InputFormat: org.apache.hadoop.hive.ql.io.RCFileInputFormat
-OutputFormat: org.apache.hadoop.hive.ql.io.RCFileOutputFormat
-Compressed: No
-Num Buckets: -1
-Bucket Columns: []
-Sort Columns: []
-Storage Desc Params:
- serialization.format 1
-PREHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-09',hr=11)
-PREHOOK: type: DESCTABLE
-PREHOOK: Input: default@analyze_srcpart_partial_scan
-POSTHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-09',hr=11)
-POSTHOOK: type: DESCTABLE
-POSTHOOK: Input: default@analyze_srcpart_partial_scan
-# col_name data_type comment
-key string
-value string
-
-# Partition Information
-# col_name data_type comment
-ds string
-hr string
-
-# Detailed Partition Information
-Partition Value: [2008-04-09, 11]
-Database: default
-Table: analyze_srcpart_partial_scan
-#### A masked pattern was here ####
-Partition Parameters:
- numFiles 1
- totalSize 5293
-#### A masked pattern was here ####
-
-# Storage Information
-SerDe Library: org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe
-InputFormat: org.apache.hadoop.hive.ql.io.RCFileInputFormat
-OutputFormat: org.apache.hadoop.hive.ql.io.RCFileOutputFormat
-Compressed: No
-Num Buckets: -1
-Bucket Columns: []
-Sort Columns: []
-Storage Desc Params:
- serialization.format 1
-PREHOOK: query: drop table analyze_srcpart_partial_scan
-PREHOOK: type: DROPTABLE
-PREHOOK: Input: default@analyze_srcpart_partial_scan
-PREHOOK: Output: default@analyze_srcpart_partial_scan
-POSTHOOK: query: drop table analyze_srcpart_partial_scan
-POSTHOOK: type: DROPTABLE
-POSTHOOK: Input: default@analyze_srcpart_partial_scan
-POSTHOOK: Output: default@analyze_srcpart_partial_scan

http://git-wip-us.apache.org/repos/asf/hive/blob/71004d2e/ql/src/test/results/clientpositive/stats_partscan_1.q.out
----------------------------------------------------------------------
diff --git a/ql/src/test/results/clientpositive/stats_partscan_1.q.out b/ql/src/test/results/clientpositive/stats_partscan_1.q.out
deleted file mode 100644
index e2a3bbd..0000000
--- a/ql/src/test/results/clientpositive/stats_partscan_1.q.out
+++ /dev/null
@@ -1,215 +0,0 @@
-PREHOOK: query: -- INCLUDE_HADOOP_MAJOR_VERSIONS(0.20,0.20S)
--- This test uses mapred.max.split.size/mapred.max.split.size for controlling
--- number of input splits, which is not effective in hive 0.20.
--- stats_partscan_1_23.q is the same test with this but has different result.
-
--- test analyze table ... compute statistics partialscan
-
--- 1. prepare data
-CREATE table analyze_srcpart_partial_scan (key STRING, value STRING)
-partitioned by (ds string, hr string)
-stored as rcfile
-PREHOOK: type: CREATETABLE
-PREHOOK: Output: database:default
-POSTHOOK: query: -- INCLUDE_HADOOP_MAJOR_VERSIONS(0.20,0.20S)
--- This test uses mapred.max.split.size/mapred.max.split.size for controlling
--- number of input splits, which is not effective in hive 0.20.
--- stats_partscan_1_23.q is the same test with this but has different result.
-
--- test analyze table ... compute statistics partialscan
-
--- 1. prepare data
-CREATE table analyze_srcpart_partial_scan (key STRING, value STRING)
-partitioned by (ds string, hr string)
-stored as rcfile
-POSTHOOK: type: CREATETABLE
-POSTHOOK: Output: database:default
-POSTHOOK: Output: default@analyze_srcpart_partial_scan
-PREHOOK: query: insert overwrite table analyze_srcpart_partial_scan partition (ds, hr) select * from srcpart where ds is not null
-PREHOOK: type: QUERY
-PREHOOK: Input: default@srcpart
-PREHOOK: Input: default@srcpart@ds=2008-04-08/hr=11
-PREHOOK: Input: default@srcpart@ds=2008-04-08/hr=12
-PREHOOK: Input: default@srcpart@ds=2008-04-09/hr=11
-PREHOOK: Input: default@srcpart@ds=2008-04-09/hr=12
-PREHOOK: Output: default@analyze_srcpart_partial_scan
-POSTHOOK: query: insert overwrite table analyze_srcpart_partial_scan partition (ds, hr) select * from srcpart where ds is not null
-POSTHOOK: type: QUERY
-POSTHOOK: Input: default@srcpart
-POSTHOOK: Input: default@srcpart@ds=2008-04-08/hr=11
-POSTHOOK: Input: default@srcpart@ds=2008-04-08/hr=12
-POSTHOOK: Input: default@srcpart@ds=2008-04-09/hr=11
-POSTHOOK: Input: default@srcpart@ds=2008-04-09/hr=12
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=12
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-09/hr=11
-POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-09/hr=12
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=11).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=11).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=12).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=12).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=11).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=11).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=12).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ]
-POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=12).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ]
-PREHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11)
-PREHOOK: type: DESCTABLE
-PREHOOK: Input: default@analyze_srcpart_partial_scan
-POSTHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11)
-POSTHOOK: type: DESCTABLE
-POSTHOOK: Input: default@analyze_srcpart_partial_scan
-# col_name data_type comment
-
-key string
-value string
-
-# Partition Information
-# col_name data_type comment
-
-ds string
-hr string
-
-# Detailed Partition Information
-Partition Value: [2008-04-08, 11]
-Database: default
-Table: analyze_srcpart_partial_scan
-#### A masked pattern was here ####
-Partition Parameters:
- COLUMN_STATS_ACCURATE false
- numFiles 1
- numRows -1
- rawDataSize -1
- totalSize 5293
-#### A masked pattern was here ####
-
-# Storage Information
-SerDe Library: org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe
-InputFormat: org.apache.hadoop.hive.ql.io.RCFileInputFormat
-OutputFormat: org.apache.hadoop.hive.ql.io.RCFileOutputFormat
-Compressed: No
-Num Buckets: -1
-Bucket Columns: []
-Sort Columns: []
-Storage Desc Params:
- serialization.format 1
-PREHOOK: query: -- 2.
partialscan -explain -analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan -PREHOOK: type: QUERY -POSTHOOK: query: -- 2. partialscan -explain -analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan -POSTHOOK: type: QUERY -STAGE DEPENDENCIES: - Stage-2 is a root stage - Stage-1 depends on stages: Stage-2 - -STAGE PLANS: - Stage: Stage-2 - Partial Scan Statistics - - Stage: Stage-1 - Stats-Aggr Operator - -PREHOOK: query: analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan -PREHOOK: type: QUERY -PREHOOK: Input: default@analyze_srcpart_partial_scan -PREHOOK: Input: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11 -PREHOOK: Output: default@analyze_srcpart_partial_scan -PREHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11 -POSTHOOK: query: analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan -POSTHOOK: type: QUERY -POSTHOOK: Input: default@analyze_srcpart_partial_scan -POSTHOOK: Input: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11 -POSTHOOK: Output: default@analyze_srcpart_partial_scan -POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11 -PREHOOK: query: -- 3. confirm result -describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) -PREHOOK: type: DESCTABLE -PREHOOK: Input: default@analyze_srcpart_partial_scan -POSTHOOK: query: -- 3. 
confirm result -describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) -POSTHOOK: type: DESCTABLE -POSTHOOK: Input: default@analyze_srcpart_partial_scan -# col_name data_type comment - -key string -value string - -# Partition Information -# col_name data_type comment - -ds string -hr string - -# Detailed Partition Information -Partition Value: [2008-04-08, 11] -Database: default -Table: analyze_srcpart_partial_scan -#### A masked pattern was here #### -Partition Parameters: - COLUMN_STATS_ACCURATE true - numFiles 1 - numRows 500 - rawDataSize 4812 - totalSize 5293 -#### A masked pattern was here #### - -# Storage Information -SerDe Library: org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe -InputFormat: org.apache.hadoop.hive.ql.io.RCFileInputFormat -OutputFormat: org.apache.hadoop.hive.ql.io.RCFileOutputFormat -Compressed: No -Num Buckets: -1 -Bucket Columns: [] -Sort Columns: [] -Storage Desc Params: - serialization.format 1 -PREHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-09',hr=11) -PREHOOK: type: DESCTABLE -PREHOOK: Input: default@analyze_srcpart_partial_scan -POSTHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-09',hr=11) -POSTHOOK: type: DESCTABLE -POSTHOOK: Input: default@analyze_srcpart_partial_scan -# col_name data_type comment - -key string -value string - -# Partition Information -# col_name data_type comment - -ds string -hr string - -# Detailed Partition Information -Partition Value: [2008-04-09, 11] -Database: default -Table: analyze_srcpart_partial_scan -#### A masked pattern was here #### -Partition Parameters: - COLUMN_STATS_ACCURATE false - numFiles 1 - numRows -1 - rawDataSize -1 - totalSize 5293 -#### A masked pattern was here #### - -# Storage Information -SerDe Library: org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe -InputFormat: org.apache.hadoop.hive.ql.io.RCFileInputFormat -OutputFormat: org.apache.hadoop.hive.ql.io.RCFileOutputFormat 
-Compressed: No -Num Buckets: -1 -Bucket Columns: [] -Sort Columns: [] -Storage Desc Params: - serialization.format 1 -PREHOOK: query: drop table analyze_srcpart_partial_scan -PREHOOK: type: DROPTABLE -PREHOOK: Input: default@analyze_srcpart_partial_scan -PREHOOK: Output: default@analyze_srcpart_partial_scan -POSTHOOK: query: drop table analyze_srcpart_partial_scan -POSTHOOK: type: DROPTABLE -POSTHOOK: Input: default@analyze_srcpart_partial_scan -POSTHOOK: Output: default@analyze_srcpart_partial_scan http://git-wip-us.apache.org/repos/asf/hive/blob/71004d2e/ql/src/test/results/clientpositive/stats_partscan_1_23.q.out ---------------------------------------------------------------------- diff --git a/ql/src/test/results/clientpositive/stats_partscan_1_23.q.out b/ql/src/test/results/clientpositive/stats_partscan_1_23.q.out deleted file mode 100644 index bc2fec5..0000000 --- a/ql/src/test/results/clientpositive/stats_partscan_1_23.q.out +++ /dev/null @@ -1,184 +0,0 @@ -PREHOOK: query: CREATE table analyze_srcpart_partial_scan (key STRING, value STRING) -partitioned by (ds string, hr string) -stored as rcfile -PREHOOK: type: CREATETABLE -PREHOOK: Output: database:default -PREHOOK: Output: default@analyze_srcpart_partial_scan -POSTHOOK: query: CREATE table analyze_srcpart_partial_scan (key STRING, value STRING) -partitioned by (ds string, hr string) -stored as rcfile -POSTHOOK: type: CREATETABLE -POSTHOOK: Output: database:default -POSTHOOK: Output: default@analyze_srcpart_partial_scan -PREHOOK: query: insert overwrite table analyze_srcpart_partial_scan partition (ds, hr) select * from srcpart where ds is not null -PREHOOK: type: QUERY -PREHOOK: Input: default@srcpart -PREHOOK: Input: default@srcpart@ds=2008-04-08/hr=11 -PREHOOK: Input: default@srcpart@ds=2008-04-08/hr=12 -PREHOOK: Input: default@srcpart@ds=2008-04-09/hr=11 -PREHOOK: Input: default@srcpart@ds=2008-04-09/hr=12 -PREHOOK: Output: default@analyze_srcpart_partial_scan -POSTHOOK: query: insert overwrite table 
analyze_srcpart_partial_scan partition (ds, hr) select * from srcpart where ds is not null -POSTHOOK: type: QUERY -POSTHOOK: Input: default@srcpart -POSTHOOK: Input: default@srcpart@ds=2008-04-08/hr=11 -POSTHOOK: Input: default@srcpart@ds=2008-04-08/hr=12 -POSTHOOK: Input: default@srcpart@ds=2008-04-09/hr=11 -POSTHOOK: Input: default@srcpart@ds=2008-04-09/hr=12 -POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11 -POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=12 -POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-09/hr=11 -POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-09/hr=12 -POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=11).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ] -POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=11).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ] -POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=12).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ] -POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-08,hr=12).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ] -POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=11).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ] -POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=11).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ] -POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=12).key SIMPLE [(srcpart)srcpart.FieldSchema(name:key, type:string, comment:default), ] -POSTHOOK: Lineage: analyze_srcpart_partial_scan PARTITION(ds=2008-04-09,hr=12).value SIMPLE [(srcpart)srcpart.FieldSchema(name:value, type:string, comment:default), ] 
-PREHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) -PREHOOK: type: DESCTABLE -PREHOOK: Input: default@analyze_srcpart_partial_scan -POSTHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) -POSTHOOK: type: DESCTABLE -POSTHOOK: Input: default@analyze_srcpart_partial_scan -# col_name data_type comment -key string -value string - -# Partition Information -# col_name data_type comment -ds string -hr string - -# Detailed Partition Information -Partition Value: [2008-04-08, 11] -Database: default -Table: analyze_srcpart_partial_scan -#### A masked pattern was here #### -Partition Parameters: - numFiles 1 - totalSize 5293 -#### A masked pattern was here #### - -# Storage Information -SerDe Library: org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe -InputFormat: org.apache.hadoop.hive.ql.io.RCFileInputFormat -OutputFormat: org.apache.hadoop.hive.ql.io.RCFileOutputFormat -Compressed: No -Num Buckets: -1 -Bucket Columns: [] -Sort Columns: [] -Storage Desc Params: - serialization.format 1 -PREHOOK: query: explain -analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan -PREHOOK: type: QUERY -POSTHOOK: query: explain -analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan -POSTHOOK: type: QUERY -STAGE DEPENDENCIES: - Stage-2 is a root stage - Stage-1 depends on stages: Stage-2 - -STAGE PLANS: - Stage: Stage-2 - Partial Scan Statistics - - Stage: Stage-1 - Stats-Aggr Operator - -PREHOOK: query: analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan -PREHOOK: type: QUERY -PREHOOK: Input: default@analyze_srcpart_partial_scan -PREHOOK: Input: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11 -PREHOOK: Output: default@analyze_srcpart_partial_scan -PREHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11 -POSTHOOK: 
query: analyze table analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) compute statistics partialscan -POSTHOOK: type: QUERY -POSTHOOK: Input: default@analyze_srcpart_partial_scan -POSTHOOK: Input: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11 -POSTHOOK: Output: default@analyze_srcpart_partial_scan -POSTHOOK: Output: default@analyze_srcpart_partial_scan@ds=2008-04-08/hr=11 -PREHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) -PREHOOK: type: DESCTABLE -PREHOOK: Input: default@analyze_srcpart_partial_scan -POSTHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-08',hr=11) -POSTHOOK: type: DESCTABLE -POSTHOOK: Input: default@analyze_srcpart_partial_scan -# col_name data_type comment -key string -value string - -# Partition Information -# col_name data_type comment -ds string -hr string - -# Detailed Partition Information -Partition Value: [2008-04-08, 11] -Database: default -Table: analyze_srcpart_partial_scan -#### A masked pattern was here #### -Partition Parameters: - COLUMN_STATS_ACCURATE {\"BASIC_STATS\":\"true\"} - numFiles 1 - numRows 500 - rawDataSize 4812 - totalSize 5293 -#### A masked pattern was here #### - -# Storage Information -SerDe Library: org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe -InputFormat: org.apache.hadoop.hive.ql.io.RCFileInputFormat -OutputFormat: org.apache.hadoop.hive.ql.io.RCFileOutputFormat -Compressed: No -Num Buckets: -1 -Bucket Columns: [] -Sort Columns: [] -Storage Desc Params: - serialization.format 1 -PREHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-09',hr=11) -PREHOOK: type: DESCTABLE -PREHOOK: Input: default@analyze_srcpart_partial_scan -POSTHOOK: query: describe formatted analyze_srcpart_partial_scan PARTITION(ds='2008-04-09',hr=11) -POSTHOOK: type: DESCTABLE -POSTHOOK: Input: default@analyze_srcpart_partial_scan -# col_name data_type comment -key string -value string - -# Partition 
Information -# col_name data_type comment -ds string -hr string - -# Detailed Partition Information -Partition Value: [2008-04-09, 11] -Database: default -Table: analyze_srcpart_partial_scan -#### A masked pattern was here #### -Partition Parameters: - numFiles 1 - totalSize 5293 -#### A masked pattern was here #### - -# Storage Information -SerDe Library: org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe -InputFormat: org.apache.hadoop.hive.ql.io.RCFileInputFormat -OutputFormat: org.apache.hadoop.hive.ql.io.RCFileOutputFormat -Compressed: No -Num Buckets: -1 -Bucket Columns: [] -Sort Columns: [] -Storage Desc Params: - serialization.format 1 -PREHOOK: query: drop table analyze_srcpart_partial_scan -PREHOOK: type: DROPTABLE -PREHOOK: Input: default@analyze_srcpart_partial_scan -PREHOOK: Output: default@analyze_srcpart_partial_scan -POSTHOOK: query: drop table analyze_srcpart_partial_scan -POSTHOOK: type: DROPTABLE -POSTHOOK: Input: default@analyze_srcpart_partial_scan -POSTHOOK: Output: default@analyze_srcpart_partial_scan
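For context, every deleted test above exercises the same statement that HIVE-17652 retires. A minimal sketch of the retired syntax next to the still-supported full-scan form, reusing the table and partition from the tests (exact behavior depends on the Hive version in use):

```
-- Retired by HIVE-17652: partial-scan statistics gathering (read file
-- metadata only; populated numRows/rawDataSize without a full read)
ANALYZE TABLE analyze_srcpart_partial_scan PARTITION(ds='2008-04-08', hr=11)
  COMPUTE STATISTICS PARTIALSCAN;

-- Remaining form: basic statistics gathered with a regular scan
ANALYZE TABLE analyze_srcpart_partial_scan PARTITION(ds='2008-04-08', hr=11)
  COMPUTE STATISTICS;
```

As the q.out output shows, a successful run flips COLUMN_STATS_ACCURATE for the partition and fills in numRows and rawDataSize, which otherwise remain -1.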