From: ravipesala@apache.org
To: commits@carbondata.apache.org
Reply-To: dev@carbondata.apache.org
Date: Fri, 22 Sep 2017 11:07:09 -0000
Message-Id: <4875b8bf6bbd433fbab1bc3c054a638b@git.apache.org>
Subject: [2/7] carbondata git commit: [CARBONDATA-1450] Support timestamp, int and Long as Dictionary Exclude

[CARBONDATA-1450] Support timestamp, int and Long as Dictionary Exclude

A direct-dictionary timestamp column can only cover a range of roughly 68 years. This PR removes that limitation so a timestamp column can hold any value.
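At the DDL level, the change looks roughly like the sketch below (an illustrative addition, not part of this commit: the table and column names are made up, and the CarbonSession builder call is assumed from the CarbonData 1.x quick start); the notes that follow spell out the exact rules.

  // Illustrative only -- not from this commit. Assumes a local CarbonData 1.x build
  // and the CarbonSession builder described in the quick-start docs.
  import org.apache.spark.sql.SparkSession
  import org.apache.spark.sql.CarbonSession._

  object TimestampDictionaryExample {
    def main(args: Array[String]): Unit = {
      val carbon = SparkSession.builder()
        .master("local[2]")
        .appName("TimestampDictionaryExample")
        .getOrCreateCarbonSession("/tmp/carbonstore")

      // Default after this change: TIMESTAMP is a no-dictionary column, so values
      // outside the ~68-year direct-dictionary window load as ordinary data.
      carbon.sql(
        """CREATE TABLE events_no_dict (id INT, event_time TIMESTAMP, name STRING)
          |STORED BY 'org.apache.carbondata.format'""".stripMargin)

      // Opting back into the old direct-dictionary behaviour (68-year range).
      carbon.sql(
        """CREATE TABLE events_direct_dict (id INT, event_time TIMESTAMP, name STRING)
          |STORED BY 'org.apache.carbondata.format'
          |TBLPROPERTIES('DICTIONARY_INCLUDE'='event_time')""".stripMargin)

      // Per the notes in this commit, int/long/bigint columns may now appear in
      // SORT_COLUMNS and DICTIONARY_EXCLUDE.
      carbon.sql(
        """CREATE TABLE counters_excluded (id INT, cnt BIGINT, name STRING)
          |STORED BY 'org.apache.carbondata.format'
          |TBLPROPERTIES('SORT_COLUMNS'='id,cnt', 'DICTIONARY_EXCLUDE'='id,cnt')""".stripMargin)

      carbon.stop()
    }
  }

The first table is what the default now gives; the second keeps the pre-change direct-dictionary behaviour for data known to stay within the 68-year window.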
To be noted:
(1) By default, a timestamp column is now a no-dictionary column and can hold any timestamp without limitation.
(2) If a 68-year range is sufficient, the timestamp column can be listed explicitly in DICTIONARY_INCLUDE (it then stays a direct-dictionary column).
(3) SORT_COLUMNS now supports int, long and bigint columns.
(4) int, long and bigint columns can be DICTIONARY_EXCLUDE columns.
(5) A timestamp column that is to be partitioned should be a DICTIONARY_INCLUDE column. (Partitioning on a dictionary-exclude timestamp column does not throw an exception, but it is not supported.)

This closes #1322

Project: http://git-wip-us.apache.org/repos/asf/carbondata/repo
Commit: http://git-wip-us.apache.org/repos/asf/carbondata/commit/36ceb59f
Tree: http://git-wip-us.apache.org/repos/asf/carbondata/tree/36ceb59f
Diff: http://git-wip-us.apache.org/repos/asf/carbondata/diff/36ceb59f

Branch: refs/heads/branch-1.2
Commit: 36ceb59f014f7369575f433064e88aa07a7de48e
Parents: 8b83f58
Author: dhatchayani
Authored: Tue Sep 5 15:54:28 2017 +0530
Committer: Venkata Ramana G
Committed: Mon Sep 18 22:34:21 2017 +0530

----------------------------------------------------------------------
 .../core/constants/CarbonCommonConstants.java   |   6 +
 .../carbondata/core/datastore/TableSpec.java    |   4 +
 ...feVariableLengthDimensionDataChunkStore.java |  11 +-
 ...afeVariableLengthDimesionDataChunkStore.java |   7 +-
 .../DictionaryBasedVectorResultCollector.java   |   3 +
 .../RestructureBasedRawResultCollector.java     |   5 +-
 .../RestructureBasedVectorResultCollector.java  |  27 +-
 .../executor/impl/AbstractQueryExecutor.java    |  18 +-
 .../scan/executor/util/RestructureUtil.java     |  21 +-
 .../scan/filter/FilterExpressionProcessor.java  |  24 +-
 .../carbondata/core/scan/filter/FilterUtil.java |  13 +-
 .../executer/RangeValueFilterExecuterImpl.java  |  10 +-
 .../executer/RestructureEvaluatorImpl.java      |   5 +-
 .../executer/RowLevelFilterExecuterImpl.java    |   2 +
 ...velRangeLessThanEqualFilterExecuterImpl.java |   2 +
 .../RowLevelRangeLessThanFiterExecuterImpl.java |   2 +
 .../RowLevelRangeFilterResolverImpl.java        |  15 +-
 .../carbondata/core/util/DataTypeUtil.java      |  43 +-
 .../sdv/generated/DataLoadingTestCase.scala     |  14 +-
 .../sdv/generated/QueriesBVATestCase.scala      |   2 +-
 .../sdv/generated/QueriesBasicTestCase.scala    |   2 +-
 .../generated/QueriesCompactionTestCase.scala   |   2 +-
 .../QueriesExcludeDictionaryTestCase.scala      |   2 +-
 .../SortColumnExcudeDictTestCase.scala          | 433 +++++++++++++++++++
 .../src/test/resources/data_beyond68yrs.csv     |  11 +
 .../spark/testsuite/datetype/DateTypeTest.scala |  51 +--
 .../RangeFilterAllDataTypesTestCases.scala      |   1 +
 .../TimestampNoDictionaryColumnTestCase.scala   |  93 ++++
 .../partition/TestDDLForPartitionTable.scala    |   2 +-
 ...ForPartitionTableWithDefaultProperties.scala |   5 +-
 .../testsuite/sortcolumns/TestSortColumns.scala |  43 ++
 .../spark/sql/catalyst/CarbonDDLSqlParser.scala |   9 +-
 .../command/carbonTableSchemaCommon.scala       |  10 +-
 .../execution/command/carbonTableSchema.scala   |   3 +-
 .../createtable/TestCreateTableSyntax.scala     |  15 +-
 .../partition/TestAlterPartitionTable.scala     |   6 +-
 .../AlterTableValidationTestCase.scala          |  26 +-
 .../impl/NonDictionaryFieldConverterImpl.java   |  31 +-
 38 files changed, 855 insertions(+), 124 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java
----------------------------------------------------------------------
diff --git
a/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java b/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java index 3bc1bcc..36d73d7 100644 --- a/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java +++ b/core/src/main/java/org/apache/carbondata/core/constants/CarbonCommonConstants.java @@ -190,6 +190,12 @@ public final class CarbonCommonConstants { * Bytes for string 0, it is used in codegen in case of null values. */ public static final byte[] ZERO_BYTE_ARRAY = "0".getBytes(Charset.forName(DEFAULT_CHARSET)); + + /** + * Empty byte array + */ + public static final byte[] EMPTY_BYTE_ARRAY = new byte[0]; + /** * FILE STATUS IN-PROGRESS */ http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/datastore/TableSpec.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/datastore/TableSpec.java b/core/src/main/java/org/apache/carbondata/core/datastore/TableSpec.java index 5492f7b..2fdf82b 100644 --- a/core/src/main/java/org/apache/carbondata/core/datastore/TableSpec.java +++ b/core/src/main/java/org/apache/carbondata/core/datastore/TableSpec.java @@ -62,6 +62,10 @@ public class TableSpec { if (dimension.isComplex()) { DimensionSpec spec = new DimensionSpec(ColumnType.COMPLEX, dimension); dimensionSpec[dimIndex++] = spec; + } else if (dimension.getDataType() == DataType.TIMESTAMP && !dimension + .isDirectDictionaryEncoding()) { + DimensionSpec spec = new DimensionSpec(ColumnType.PLAIN_VALUE, dimension); + dimensionSpec[dimIndex++] = spec; } else if (dimension.isDirectDictionaryEncoding()) { DimensionSpec spec = new DimensionSpec(ColumnType.DIRECT_DICTIONARY, dimension); dimensionSpec[dimIndex++] = spec; http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/impl/safe/SafeVariableLengthDimensionDataChunkStore.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/impl/safe/SafeVariableLengthDimensionDataChunkStore.java b/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/impl/safe/SafeVariableLengthDimensionDataChunkStore.java index 2079811..7ce3a1d 100644 --- a/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/impl/safe/SafeVariableLengthDimensionDataChunkStore.java +++ b/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/impl/safe/SafeVariableLengthDimensionDataChunkStore.java @@ -29,6 +29,7 @@ import org.apache.spark.sql.types.IntegerType; import org.apache.spark.sql.types.LongType; import org.apache.spark.sql.types.ShortType; import org.apache.spark.sql.types.StringType; +import org.apache.spark.sql.types.TimestampType; /** * Below class is responsible to store variable length dimension data chunk in @@ -140,11 +141,13 @@ public class SafeVariableLengthDimensionDataChunkStore extends SafeAbsractDimens // for last record length = (short) (this.data.length - currentDataOffset); } - if (ByteUtil.UnsafeComparer.INSTANCE.equals(CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY, 0, - CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY.length, data, currentDataOffset, length)) { + DataType dt = vector.getType(); + if ((!(dt instanceof StringType) && length == 0) || ByteUtil.UnsafeComparer.INSTANCE + .equals(CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY, 0, + 
CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY.length, data, currentDataOffset, + length)) { vector.putNull(vectorRow); } else { - DataType dt = vector.getType(); if (dt instanceof StringType) { vector.putBytes(vectorRow, currentDataOffset, length, data); } else if (dt instanceof BooleanType) { @@ -155,6 +158,8 @@ public class SafeVariableLengthDimensionDataChunkStore extends SafeAbsractDimens vector.putInt(vectorRow, ByteUtil.toInt(data, currentDataOffset, length)); } else if (dt instanceof LongType) { vector.putLong(vectorRow, ByteUtil.toLong(data, currentDataOffset, length)); + } else if (dt instanceof TimestampType) { + vector.putLong(vectorRow, ByteUtil.toLong(data, currentDataOffset, length) * 1000L); } } } http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/impl/unsafe/UnsafeVariableLengthDimesionDataChunkStore.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/impl/unsafe/UnsafeVariableLengthDimesionDataChunkStore.java b/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/impl/unsafe/UnsafeVariableLengthDimesionDataChunkStore.java index 6193804..c242752 100644 --- a/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/impl/unsafe/UnsafeVariableLengthDimesionDataChunkStore.java +++ b/core/src/main/java/org/apache/carbondata/core/datastore/chunk/store/impl/unsafe/UnsafeVariableLengthDimesionDataChunkStore.java @@ -30,6 +30,7 @@ import org.apache.spark.sql.types.IntegerType; import org.apache.spark.sql.types.LongType; import org.apache.spark.sql.types.ShortType; import org.apache.spark.sql.types.StringType; +import org.apache.spark.sql.types.TimestampType; /** * Below class is responsible to store variable length dimension data chunk in @@ -167,11 +168,11 @@ public class UnsafeVariableLengthDimesionDataChunkStore @Override public void fillRow(int rowId, CarbonColumnVector vector, int vectorRow) { byte[] value = getRow(rowId); - if (ByteUtil.UnsafeComparer.INSTANCE + DataType dt = vector.getType(); + if ((!(dt instanceof StringType) && value.length == 0) || ByteUtil.UnsafeComparer.INSTANCE .equals(CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY, value)) { vector.putNull(vectorRow); } else { - DataType dt = vector.getType(); if (dt instanceof StringType) { vector.putBytes(vectorRow, 0, value.length, value); } else if (dt instanceof BooleanType) { @@ -182,6 +183,8 @@ public class UnsafeVariableLengthDimesionDataChunkStore vector.putInt(vectorRow, ByteUtil.toInt(value, 0, value.length)); } else if (dt instanceof LongType) { vector.putLong(vectorRow, ByteUtil.toLong(value, 0, value.length)); + } else if (dt instanceof TimestampType) { + vector.putLong(vectorRow, ByteUtil.toLong(value, 0, value.length) * 1000L); } } } http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/DictionaryBasedVectorResultCollector.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/DictionaryBasedVectorResultCollector.java b/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/DictionaryBasedVectorResultCollector.java index c857a47..10888fe 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/DictionaryBasedVectorResultCollector.java +++ 
b/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/DictionaryBasedVectorResultCollector.java @@ -70,6 +70,9 @@ public class DictionaryBasedVectorResultCollector extends AbstractScannedResultC List complexList = new ArrayList<>(); List implictColumnList = new ArrayList<>(); for (int i = 0; i < queryDimensions.length; i++) { + if (!dimensionInfo.getDimensionExists()[i]) { + continue; + } if (queryDimensions[i].getDimension().hasEncoding(Encoding.IMPLICIT)) { ColumnVectorInfo columnVectorInfo = new ColumnVectorInfo(); implictColumnList.add(columnVectorInfo); http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/RestructureBasedRawResultCollector.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/RestructureBasedRawResultCollector.java b/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/RestructureBasedRawResultCollector.java index ea89ce5..45275a5 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/RestructureBasedRawResultCollector.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/RestructureBasedRawResultCollector.java @@ -26,6 +26,7 @@ import org.apache.carbondata.core.datastore.block.SegmentProperties; import org.apache.carbondata.core.keygenerator.KeyGenException; import org.apache.carbondata.core.keygenerator.KeyGenerator; import org.apache.carbondata.core.keygenerator.mdkey.MultiDimKeyVarLengthGenerator; +import org.apache.carbondata.core.metadata.datatype.DataType; import org.apache.carbondata.core.metadata.encoder.Encoding; import org.apache.carbondata.core.metadata.schema.table.column.CarbonDimension; import org.apache.carbondata.core.scan.executor.infos.BlockExecutionInfo; @@ -238,9 +239,11 @@ public class RestructureBasedRawResultCollector extends RawBasedResultCollector Object defaultValue = dimensionInfo.getDefaultValues()[i]; if (null != defaultValue) { newColumnDefaultValue = ((UTF8String) defaultValue).getBytes(); - } else { + } else if (actualQueryDimensions[i].getDimension().getDataType() == DataType.STRING) { newColumnDefaultValue = UTF8String.fromString(CarbonCommonConstants.MEMBER_DEFAULT_VAL).getBytes(); + } else { + newColumnDefaultValue = CarbonCommonConstants.EMPTY_BYTE_ARRAY; } noDictionaryKeyArrayWithNewlyAddedColumns[newKeyArrayIndex++] = newColumnDefaultValue; } http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/RestructureBasedVectorResultCollector.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/RestructureBasedVectorResultCollector.java b/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/RestructureBasedVectorResultCollector.java index 8ae0d96..65b9a17 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/RestructureBasedVectorResultCollector.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/collector/impl/RestructureBasedVectorResultCollector.java @@ -18,6 +18,7 @@ package org.apache.carbondata.core.scan.collector.impl; import java.util.List; +import org.apache.carbondata.core.keygenerator.directdictionary.DirectDictionaryKeyGeneratorFactory; import org.apache.carbondata.core.metadata.datatype.DataType; import org.apache.carbondata.core.metadata.encoder.Encoding; 
import org.apache.carbondata.core.metadata.schema.table.column.CarbonDimension; @@ -30,6 +31,7 @@ import org.apache.carbondata.core.scan.result.vector.CarbonColumnarBatch; import org.apache.carbondata.core.scan.result.vector.ColumnVectorInfo; import org.apache.spark.sql.types.Decimal; +import org.apache.spark.unsafe.types.UTF8String; /** * It is not a collector it is just a scanned result holder. @@ -57,6 +59,12 @@ public class RestructureBasedVectorResultCollector extends DictionaryBasedVector if (!dimensionInfo.getDimensionExists()[i]) { // add a dummy column vector result collector object ColumnVectorInfo columnVectorInfo = new ColumnVectorInfo(); + columnVectorInfo.dimension = queryDimensions[i]; + if (queryDimensions[i].getDimension().getDataType().equals(DataType.TIMESTAMP) + || queryDimensions[i].getDimension().getDataType().equals(DataType.DATE)) { + columnVectorInfo.directDictionaryGenerator = DirectDictionaryKeyGeneratorFactory + .getDirectDictionaryGenerator(queryDimensions[i].getDimension().getDataType()); + } allColumnInfo[queryDimensions[i].getQueryOrder()] = columnVectorInfo; } } @@ -71,6 +79,7 @@ public class RestructureBasedVectorResultCollector extends DictionaryBasedVector // add a dummy column vector result collector object ColumnVectorInfo columnVectorInfo = new ColumnVectorInfo(); allColumnInfo[queryMeasures[i].getQueryOrder()] = columnVectorInfo; + columnVectorInfo.measure = queryMeasures[i]; measureDefaultValues[i] = getMeasureDefaultValue(queryMeasures[i].getMeasure()); } } @@ -140,7 +149,7 @@ public class RestructureBasedVectorResultCollector extends DictionaryBasedVector } else { // fill no dictionary data fillNoDictionaryData(allColumnInfo[queryOrder].vector, allColumnInfo[queryOrder], - dimension.getDefaultValue()); + dimensionInfo.getDefaultValues()[i]); } } } @@ -186,9 +195,21 @@ public class RestructureBasedVectorResultCollector extends DictionaryBasedVector * @param defaultValue */ private void fillNoDictionaryData(CarbonColumnVector vector, ColumnVectorInfo columnVectorInfo, - byte[] defaultValue) { + Object defaultValue) { if (null != defaultValue) { - vector.putBytes(columnVectorInfo.vectorOffset, columnVectorInfo.size, defaultValue); + switch (columnVectorInfo.dimension.getDimension().getDataType()) { + case INT: + vector.putInts(columnVectorInfo.vectorOffset, columnVectorInfo.size, (int) defaultValue); + break; + case LONG: + case TIMESTAMP: + vector + .putLongs(columnVectorInfo.vectorOffset, columnVectorInfo.size, (long) defaultValue); + break; + default: + vector.putBytes(columnVectorInfo.vectorOffset, columnVectorInfo.size, + ((UTF8String) defaultValue).getBytes()); + } } else { vector.putNulls(columnVectorInfo.vectorOffset, columnVectorInfo.size); } http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/executor/impl/AbstractQueryExecutor.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/executor/impl/AbstractQueryExecutor.java b/core/src/main/java/org/apache/carbondata/core/scan/executor/impl/AbstractQueryExecutor.java index e8e7bfb..25c827b 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/executor/impl/AbstractQueryExecutor.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/executor/impl/AbstractQueryExecutor.java @@ -296,18 +296,12 @@ public abstract class AbstractQueryExecutor implements QueryExecutor { blockExecutionInfo.setFilterExecuterTree(FilterUtil 
.getFilterExecuterTree(queryModel.getFilterExpressionResolverTree(), segmentProperties, blockExecutionInfo.getComlexDimensionInfoMap())); - List listOfStartEndKeys = new ArrayList(2); - FilterUtil.traverseResolverTreeAndGetStartAndEndKey(segmentProperties, - queryModel.getFilterExpressionResolverTree(), listOfStartEndKeys); - startIndexKey = listOfStartEndKeys.get(0); - endIndexKey = listOfStartEndKeys.get(1); - } else { - try { - startIndexKey = FilterUtil.prepareDefaultStartIndexKey(segmentProperties); - endIndexKey = FilterUtil.prepareDefaultEndIndexKey(segmentProperties); - } catch (KeyGenException e) { - throw new QueryExecutionException(e); - } + } + try { + startIndexKey = FilterUtil.prepareDefaultStartIndexKey(segmentProperties); + endIndexKey = FilterUtil.prepareDefaultEndIndexKey(segmentProperties); + } catch (KeyGenException e) { + throw new QueryExecutionException(e); } //setting the start index key of the block node blockExecutionInfo.setStartKey(startIndexKey); http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/executor/util/RestructureUtil.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/executor/util/RestructureUtil.java b/core/src/main/java/org/apache/carbondata/core/scan/executor/util/RestructureUtil.java index aed2775..5e78741 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/executor/util/RestructureUtil.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/executor/util/RestructureUtil.java @@ -35,6 +35,7 @@ import org.apache.carbondata.core.scan.executor.infos.DimensionInfo; import org.apache.carbondata.core.scan.executor.infos.MeasureInfo; import org.apache.carbondata.core.scan.model.QueryDimension; import org.apache.carbondata.core.scan.model.QueryMeasure; +import org.apache.carbondata.core.util.ByteUtil; import org.apache.carbondata.core.util.CarbonUtil; import org.apache.carbondata.core.util.DataTypeUtil; @@ -157,7 +158,8 @@ public class RestructureUtil { } } else { // no dictionary - defaultValueToBeConsidered = getNoDictionaryDefaultValue(defaultValue); + defaultValueToBeConsidered = + getNoDictionaryDefaultValue(queryDimension.getDataType(), defaultValue); } return defaultValueToBeConsidered; } @@ -211,10 +213,23 @@ public class RestructureUtil { * @param defaultValue * @return */ - private static Object getNoDictionaryDefaultValue(byte[] defaultValue) { + private static Object getNoDictionaryDefaultValue(DataType datatype, byte[] defaultValue) { Object noDictionaryDefaultValue = null; if (!isDefaultValueNull(defaultValue)) { - noDictionaryDefaultValue = UTF8String.fromBytes(defaultValue); + switch (datatype) { + case INT: + noDictionaryDefaultValue = ByteUtil.toInt(defaultValue, 0, defaultValue.length); + break; + case LONG: + noDictionaryDefaultValue = ByteUtil.toLong(defaultValue, 0, defaultValue.length); + break; + case TIMESTAMP: + long timestampValue = ByteUtil.toLong(defaultValue, 0, defaultValue.length); + noDictionaryDefaultValue = timestampValue * 1000L; + break; + default: + noDictionaryDefaultValue = UTF8String.fromBytes(defaultValue); + } } return noDictionaryDefaultValue; } http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/filter/FilterExpressionProcessor.java ---------------------------------------------------------------------- diff --git 
a/core/src/main/java/org/apache/carbondata/core/scan/filter/FilterExpressionProcessor.java b/core/src/main/java/org/apache/carbondata/core/scan/filter/FilterExpressionProcessor.java index cfcf112..1290f8b 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/filter/FilterExpressionProcessor.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/filter/FilterExpressionProcessor.java @@ -113,23 +113,13 @@ public class FilterExpressionProcessor implements FilterProcessor { LOGGER.debug("preparing the start and end key for finding" + "start and end block as per filter resolver"); } - List listOfStartEndKeys = new ArrayList(2); - FilterUtil.traverseResolverTreeAndGetStartAndEndKey(tableSegment.getSegmentProperties(), - filterResolver, listOfStartEndKeys); - // reading the first value from list which has start key - IndexKey searchStartKey = listOfStartEndKeys.get(0); - // reading the last value from list which has end key - IndexKey searchEndKey = listOfStartEndKeys.get(1); - if (null == searchStartKey && null == searchEndKey) { - try { - // TODO need to handle for no dictionary dimensions - searchStartKey = - FilterUtil.prepareDefaultStartIndexKey(tableSegment.getSegmentProperties()); - // TODO need to handle for no dictionary dimensions - searchEndKey = FilterUtil.prepareDefaultEndIndexKey(tableSegment.getSegmentProperties()); - } catch (KeyGenException e) { - return listOfDataBlocksToScan; - } + IndexKey searchStartKey = null; + IndexKey searchEndKey = null; + try { + searchStartKey = FilterUtil.prepareDefaultStartIndexKey(tableSegment.getSegmentProperties()); + searchEndKey = FilterUtil.prepareDefaultEndIndexKey(tableSegment.getSegmentProperties()); + } catch (KeyGenException e) { + throw new RuntimeException(e); } if (LOGGER.isDebugEnabled()) { char delimiter = ','; http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java b/core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java index 01e1cfa..497ca8c 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/filter/FilterUtil.java @@ -414,14 +414,21 @@ public final class FilterUtil { String result = null; try { int length = evaluateResultListFinal.size(); + String timeFormat = CarbonProperties.getInstance() + .getProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, + CarbonCommonConstants.CARBON_TIMESTAMP_DEFAULT_FORMAT); for (int i = 0; i < length; i++) { result = evaluateResultListFinal.get(i); if (CarbonCommonConstants.MEMBER_DEFAULT_VAL.equals(result)) { - filterValuesList.add(CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY); + if (dataType == DataType.STRING) { + filterValuesList.add(CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY); + } else { + filterValuesList.add(CarbonCommonConstants.EMPTY_BYTE_ARRAY); + } continue; } - filterValuesList.add( - DataTypeUtil.getBytesBasedOnDataTypeForNoDictionaryColumn(result, dataType)); + filterValuesList.add(DataTypeUtil + .getBytesBasedOnDataTypeForNoDictionaryColumn(result, dataType, timeFormat)); } } catch (Throwable ex) { throw new FilterUnsupportedException("Unsupported Filter condition: " + result, ex); 
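For orientation, the round trip a no-dictionary timestamp value takes through this commit can be sketched standalone as below (illustrative, not from the commit; it only assumes carbondata-core on the classpath and uses the same ByteUtil calls that appear in the hunks of this patch; the literal timestamp and format string are examples).

  // Sketch of the no-dictionary timestamp round trip, mirroring the FilterUtil and
  // chunk-store changes in this commit: parse with the (non-lenient) carbon timestamp
  // format -> epoch millis -> comparable bytes via ByteUtil, and on the read path
  // millis * 1000L for the microsecond-based Spark TimestampType vector.
  import java.text.SimpleDateFormat
  import org.apache.carbondata.core.util.ByteUtil

  object NoDictionaryTimestampRoundTrip {
    def main(args: Array[String]): Unit = {
      val format = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss") // default carbon.timestamp.format
      format.setLenient(false)                                 // matches the DataTypeUtil change in this patch

      // Load/filter side: literal -> millis -> bytes used for binary comparison.
      val millis = format.parse("2112-01-01 12:07:28").getTime // well beyond the 68-year window
      val stored = ByteUtil.toBytes(millis)

      // Read side: bytes -> millis -> microseconds for the column vector.
      val micros = ByteUtil.toLong(stored, 0, stored.length) * 1000L
      println(s"${stored.length} bytes stored, $micros microseconds handed to Spark")
    }
  }
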
http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RangeValueFilterExecuterImpl.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RangeValueFilterExecuterImpl.java b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RangeValueFilterExecuterImpl.java index f2d5a69..c7a0ae7 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RangeValueFilterExecuterImpl.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RangeValueFilterExecuterImpl.java @@ -27,6 +27,7 @@ import org.apache.carbondata.core.datastore.chunk.impl.DimensionRawColumnChunk; import org.apache.carbondata.core.keygenerator.directdictionary.DirectDictionaryGenerator; import org.apache.carbondata.core.keygenerator.directdictionary.DirectDictionaryKeyGeneratorFactory; import org.apache.carbondata.core.metadata.AbsoluteTableIdentifier; +import org.apache.carbondata.core.metadata.datatype.DataType; import org.apache.carbondata.core.metadata.encoder.Encoding; import org.apache.carbondata.core.metadata.schema.table.column.CarbonDimension; import org.apache.carbondata.core.scan.expression.Expression; @@ -453,7 +454,8 @@ public class RangeValueFilterExecuterImpl extends ValueBasedFilterExecuterImpl { private void updateForNoDictionaryColumn(int start, int end, DimensionColumnDataChunk dataChunk, BitSet bitset) { for (int j = start; j <= end; j++) { - if (dataChunk.compareTo(j, CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY) == 0) { + if (dataChunk.compareTo(j, CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY) == 0 + || dataChunk.compareTo(j, CarbonCommonConstants.EMPTY_BYTE_ARRAY) == 0) { bitset.flip(j); } } @@ -562,7 +564,11 @@ public class RangeValueFilterExecuterImpl extends ValueBasedFilterExecuterImpl { defaultValue = ByteUtil.toBytes(key); } } else { - defaultValue = CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY; + if (dimColEvaluatorInfo.getDimension().getDataType() == DataType.STRING) { + defaultValue = CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY; + } else { + defaultValue = CarbonCommonConstants.EMPTY_BYTE_ARRAY; + } } // evaluate result for lower range value first and then perform and operation in the // upper range value in order to compute the final result http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RestructureEvaluatorImpl.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RestructureEvaluatorImpl.java b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RestructureEvaluatorImpl.java index d72b955..c570ed2 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RestructureEvaluatorImpl.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RestructureEvaluatorImpl.java @@ -21,6 +21,7 @@ import java.nio.charset.Charset; import java.util.List; import org.apache.carbondata.core.constants.CarbonCommonConstants; +import org.apache.carbondata.core.metadata.datatype.DataType; import org.apache.carbondata.core.metadata.encoder.Encoding; import org.apache.carbondata.core.metadata.schema.table.column.CarbonDimension; import org.apache.carbondata.core.metadata.schema.table.column.CarbonMeasure; @@ -53,10 +54,12 @@ public abstract class 
RestructureEvaluatorImpl implements FilterExecuter { if (!dimension.hasEncoding(Encoding.DICTIONARY)) { // for no dictionary cases // 3 cases: is NUll, is Not Null and filter on default value of newly added column - if (null == defaultValue) { + if (null == defaultValue && dimension.getDataType() == DataType.STRING) { // default value for case where user gives is Null condition defaultValue = CarbonCommonConstants.MEMBER_DEFAULT_VAL .getBytes(Charset.forName(CarbonCommonConstants.DEFAULT_CHARSET)); + } else if (null == defaultValue) { + defaultValue = CarbonCommonConstants.EMPTY_BYTE_ARRAY; } List noDictionaryFilterValuesList = filterValues.getNoDictionaryFilterValuesList(); for (byte[] filterValue : noDictionaryFilterValuesList) { http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelFilterExecuterImpl.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelFilterExecuterImpl.java b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelFilterExecuterImpl.java index 3f25d9b..b79f18d 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelFilterExecuterImpl.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelFilterExecuterImpl.java @@ -269,6 +269,8 @@ public class RowLevelFilterExecuterImpl implements FilterExecuter { if (null != memberBytes) { if (Arrays.equals(CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY, memberBytes)) { memberBytes = null; + } else if (memberBytes.length == 0) { + memberBytes = null; } record[dimColumnEvaluatorInfo.getRowIndex()] = DataTypeUtil .getDataBasedOnDataTypeForNoDictionaryColumn(memberBytes, http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java index 50231d6..f8886f9 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanEqualFilterExecuterImpl.java @@ -274,6 +274,8 @@ public class RowLevelRangeLessThanEqualFilterExecuterImpl extends RowLevelFilter } else { defaultValue = ByteUtil.toBytes(key); } + } else if (dimColEvaluatorInfoList.get(0).getDimension().getDataType() != DataType.STRING) { + defaultValue = CarbonCommonConstants.EMPTY_BYTE_ARRAY; } BitSet bitSet = null; if (dimensionColumnDataChunk.isExplicitSorted()) { http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanFiterExecuterImpl.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanFiterExecuterImpl.java b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanFiterExecuterImpl.java index 1972f8e..580f963 100644 --- 
a/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanFiterExecuterImpl.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/filter/executer/RowLevelRangeLessThanFiterExecuterImpl.java @@ -276,6 +276,8 @@ public class RowLevelRangeLessThanFiterExecuterImpl extends RowLevelFilterExecut } else { defaultValue = ByteUtil.toBytes(key); } + } else if (dimColEvaluatorInfoList.get(0).getDimension().getDataType() != DataType.STRING) { + defaultValue = CarbonCommonConstants.EMPTY_BYTE_ARRAY; } BitSet bitSet = null; if (dimensionColumnDataChunk.isExplicitSorted()) { http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/scan/filter/resolver/RowLevelRangeFilterResolverImpl.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/scan/filter/resolver/RowLevelRangeFilterResolverImpl.java b/core/src/main/java/org/apache/carbondata/core/scan/filter/resolver/RowLevelRangeFilterResolverImpl.java index 3e27594..c4df001 100644 --- a/core/src/main/java/org/apache/carbondata/core/scan/filter/resolver/RowLevelRangeFilterResolverImpl.java +++ b/core/src/main/java/org/apache/carbondata/core/scan/filter/resolver/RowLevelRangeFilterResolverImpl.java @@ -28,6 +28,7 @@ import org.apache.carbondata.core.datastore.block.SegmentProperties; import org.apache.carbondata.core.keygenerator.directdictionary.DirectDictionaryGenerator; import org.apache.carbondata.core.keygenerator.directdictionary.DirectDictionaryKeyGeneratorFactory; import org.apache.carbondata.core.metadata.AbsoluteTableIdentifier; +import org.apache.carbondata.core.metadata.datatype.DataType; import org.apache.carbondata.core.metadata.encoder.Encoding; import org.apache.carbondata.core.metadata.schema.table.column.CarbonDimension; import org.apache.carbondata.core.metadata.schema.table.column.CarbonMeasure; @@ -45,6 +46,7 @@ import org.apache.carbondata.core.scan.filter.intf.FilterExecuterType; import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.DimColumnResolvedFilterInfo; import org.apache.carbondata.core.scan.filter.resolver.resolverinfo.MeasureColumnResolvedFilterInfo; import org.apache.carbondata.core.util.ByteUtil; +import org.apache.carbondata.core.util.CarbonProperties; import org.apache.carbondata.core.util.DataTypeUtil; public class RowLevelRangeFilterResolverImpl extends ConditionalFilterResolverImpl { @@ -159,15 +161,22 @@ public class RowLevelRangeFilterResolverImpl extends ConditionalFilterResolverIm } List filterValuesList = new ArrayList(20); boolean invalidRowsPresent = false; + String timeFormat = CarbonProperties.getInstance() + .getProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, + CarbonCommonConstants.CARBON_TIMESTAMP_DEFAULT_FORMAT); for (ExpressionResult result : listOfExpressionResults) { try { if (result.getString() == null) { - filterValuesList.add(CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY); + if (result.getDataType() == DataType.STRING) { + filterValuesList.add(CarbonCommonConstants.MEMBER_DEFAULT_VAL_ARRAY); + } else { + filterValuesList.add(CarbonCommonConstants.EMPTY_BYTE_ARRAY); + } continue; } filterValuesList.add(DataTypeUtil - .getBytesBasedOnDataTypeForNoDictionaryColumn(result.getString(), - result.getDataType())); + .getBytesBasedOnDataTypeForNoDictionaryColumn(result.getString(), result.getDataType(), + timeFormat)); } catch (FilterIllegalMemberException e) { // Any invalid member while evaluation shall be ignored, 
system will log the // error only once since all rows the evaluation happens so inorder to avoid http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java ---------------------------------------------------------------------- diff --git a/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java b/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java index 2cd7ce5..2e65983 100644 --- a/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java +++ b/core/src/main/java/org/apache/carbondata/core/util/DataTypeUtil.java @@ -37,6 +37,7 @@ import org.apache.carbondata.core.datastore.page.ColumnPage; import org.apache.carbondata.core.keygenerator.directdictionary.DirectDictionaryGenerator; import org.apache.carbondata.core.keygenerator.directdictionary.DirectDictionaryKeyGeneratorFactory; import org.apache.carbondata.core.metadata.datatype.DataType; +import org.apache.carbondata.core.metadata.encoder.Encoding; import org.apache.carbondata.core.metadata.schema.table.column.CarbonDimension; import org.apache.carbondata.core.metadata.schema.table.column.CarbonMeasure; import org.apache.carbondata.core.metadata.schema.table.column.ColumnSchema; @@ -54,9 +55,11 @@ public final class DataTypeUtil { private static final ThreadLocal timeStampformatter = new ThreadLocal() { @Override protected DateFormat initialValue() { - return new SimpleDateFormat(CarbonProperties.getInstance() + DateFormat dateFormat = new SimpleDateFormat(CarbonProperties.getInstance() .getProperty(CarbonCommonConstants.CARBON_TIMESTAMP_FORMAT, CarbonCommonConstants.CARBON_TIMESTAMP_DEFAULT_FORMAT)); + dateFormat.setLenient(false); + return dateFormat; } }; @@ -367,7 +370,7 @@ public final class DataTypeUtil { } public static byte[] getBytesBasedOnDataTypeForNoDictionaryColumn(String dimensionValue, - DataType actualDataType) { + DataType actualDataType, String dateFormat) { switch (actualDataType) { case STRING: return ByteUtil.toBytes(dimensionValue); @@ -379,6 +382,20 @@ public final class DataTypeUtil { return ByteUtil.toBytes(Integer.parseInt(dimensionValue)); case LONG: return ByteUtil.toBytes(Long.parseLong(dimensionValue)); + case TIMESTAMP: + Date dateToStr = null; + DateFormat dateFormatter = null; + try { + if (null != dateFormat) { + dateFormatter = new SimpleDateFormat(dateFormat); + } else { + dateFormatter = timeStampformatter.get(); + } + dateToStr = dateFormatter.parse(dimensionValue); + return ByteUtil.toBytes(dateToStr.getTime()); + } catch (ParseException e) { + throw new NumberFormatException(e.getMessage()); + } default: return ByteUtil.toBytes(dimensionValue); } @@ -411,6 +428,8 @@ public final class DataTypeUtil { return ByteUtil.toInt(dataInBytes, 0, dataInBytes.length); case LONG: return ByteUtil.toLong(dataInBytes, 0, dataInBytes.length); + case TIMESTAMP: + return ByteUtil.toLong(dataInBytes, 0, dataInBytes.length) * 1000L; default: return ByteUtil.toString(dataInBytes, 0, dataInBytes.length); } @@ -679,12 +698,30 @@ public final class DataTypeUtil { return String.valueOf(Long.parseLong(data)) .getBytes(Charset.forName(CarbonCommonConstants.DEFAULT_CHARSET)); case DATE: - case TIMESTAMP: DirectDictionaryGenerator directDictionaryGenerator = DirectDictionaryKeyGeneratorFactory .getDirectDictionaryGenerator(columnSchema.getDataType()); int value = directDictionaryGenerator.generateDirectSurrogateKey(data); return String.valueOf(value) 
.getBytes(Charset.forName(CarbonCommonConstants.DEFAULT_CHARSET)); + case TIMESTAMP: + if (columnSchema.hasEncoding(Encoding.DIRECT_DICTIONARY)) { + DirectDictionaryGenerator directDictionaryGenerator1 = + DirectDictionaryKeyGeneratorFactory + .getDirectDictionaryGenerator(columnSchema.getDataType()); + int value1 = directDictionaryGenerator1.generateDirectSurrogateKey(data); + return String.valueOf(value1) + .getBytes(Charset.forName(CarbonCommonConstants.DEFAULT_CHARSET)); + } else { + try { + Date dateToStr = timeStampformatter.get().parse(data); + return ByteUtil.toBytes(dateToStr.getTime()); + } catch (ParseException e) { + LOGGER.error( + "Cannot convert value to Time/Long type value. Value is considered as null" + e + .getMessage()); + return null; + } + } case DECIMAL: String parsedValue = parseStringToBigDecimal(data, columnSchema); if (null == parsedValue) { http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/DataLoadingTestCase.scala ---------------------------------------------------------------------- diff --git a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/DataLoadingTestCase.scala b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/DataLoadingTestCase.scala index f32ae10..c8c88e2 100644 --- a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/DataLoadingTestCase.scala +++ b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/DataLoadingTestCase.scala @@ -61,7 +61,7 @@ class DataLoadingTestCase extends QueryTest with BeforeAndAfterAll { //Data load--->Action--->IGNORE--->Logger-->True test("BadRecord_Dataload_003", Include) { - sql(s"""CREATE TABLE uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format'""").collect + sql(s"""CREATE TABLE uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_INCLUDE'='DOB,DOJ')""").collect sql(s"""LOAD DATA INPATH '$resourcesPath/Data/InsertData/2000_UniqData.csv' into table uniqdata OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_LOGGER_ENABLE'='TRUE', 'BAD_RECORDS_ACTION'='IGNORE','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1')""").collect checkAnswer(s"""select count(*) from uniqdata""", Seq(Row(2010)), "DataLoadingTestCase-BadRecord_Dataload_003") @@ -72,7 +72,7 @@ class DataLoadingTestCase extends QueryTest with BeforeAndAfterAll { //Data load--->Action--->Ignore--->Logger-->False test("BadRecord_Dataload_004", Include) { sql(s"""drop table if exists uniqdata""").collect - sql(s""" CREATE TABLE uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 
decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format'""").collect + sql(s""" CREATE TABLE uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_INCLUDE'='DOB,DOJ')""").collect sql(s"""LOAD DATA INPATH '$resourcesPath/Data/InsertData/2000_UniqData.csv' into table uniqdata OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_LOGGER_ENABLE'='FALSE', 'BAD_RECORDS_ACTION'='IGNORE','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1')""").collect checkAnswer(s"""select count(*) from uniqdata""", Seq(Row(2010)), "DataLoadingTestCase-BadRecord_Dataload_004") @@ -94,7 +94,7 @@ class DataLoadingTestCase extends QueryTest with BeforeAndAfterAll { //Data load--->Action--->Redirect--->Logger-->False test("BadRecord_Dataload_006", Include) { sql(s"""drop table if exists uniqdata""").collect - sql(s""" CREATE TABLE uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format'""").collect + sql(s""" CREATE TABLE uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_INCLUDE'='DOB,DOJ')""").collect sql(s"""LOAD DATA INPATH '$resourcesPath/Data/InsertData/2000_UniqData.csv' into table uniqdata OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_LOGGER_ENABLE'='FALSE', 'BAD_RECORDS_ACTION'='REDIRECT','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1')""").collect checkAnswer(s"""select count(*) from uniqdata""", Seq(Row(2010)), "DataLoadingTestCase-BadRecord_Dataload_006") @@ -104,7 +104,7 @@ class DataLoadingTestCase extends QueryTest with BeforeAndAfterAll { //Data load-->Dictionary_Exclude test("BadRecord_Dataload_007", Include) { - sql(s"""CREATE TABLE uniq_exclude (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_EXCLUDE'='CUST_NAME,ACTIVE_EMUI_VERSION')""").collect + sql(s"""CREATE TABLE uniq_exclude (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_EXCLUDE'='CUST_NAME,ACTIVE_EMUI_VERSION','DICTIONARY_INCLUDE'='DOB,DOJ')""").collect sql(s"""LOAD DATA INPATH 
'$resourcesPath/Data/InsertData/2000_UniqData.csv' into table uniq_exclude OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_LOGGER_ENABLE'='TRUE', 'BAD_RECORDS_ACTION'='REDIRECT','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1')""").collect checkAnswer(s"""select count(*) from uniq_exclude""", Seq(Row(2010)), "DataLoadingTestCase-BadRecord_Dataload_007") @@ -718,7 +718,7 @@ class DataLoadingTestCase extends QueryTest with BeforeAndAfterAll { //Show loads--->Action=Fail--->Logger=True test("BadRecord_Dataload_024", Include) { dropTable("uniqdata") - sql(s"""CREATE TABLE uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format'""").collect + sql(s"""CREATE TABLE uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_INCLUDE'='DOB,DOJ')""").collect intercept[Exception] { sql(s"""LOAD DATA INPATH '$resourcesPath/Data/InsertData/2000_UniqData.csv' into table uniqdata OPTIONS('DELIMITER'=',' , 'QUOTECHAR'='"','BAD_RECORDS_ACTION'='FAIL','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1')""").collect } @@ -890,7 +890,7 @@ class DataLoadingTestCase extends QueryTest with BeforeAndAfterAll { //Check for the incremental load data DML without "DELIMITER" specified loading the data successfully. 
test("Incremental_Data_Load_001_001-001-TC-09_840", Include) { - sql(s"""create table DL_WithOutDELIMITER(Active_status String,Item_type_cd INT,Qty_day_avg INT,Qty_total INT,Sell_price BIGINT,Sell_pricep DOUBLE,Discount_price DOUBLE,Profit DECIMAL(3,2),Item_code String,Item_name String,Outlet_name String,Update_time TIMESTAMP,Create_date String)STORED BY 'org.apache.carbondata.format'""").collect + sql(s"""create table DL_WithOutDELIMITER(Active_status String,Item_type_cd INT,Qty_day_avg INT,Qty_total INT,Sell_price BIGINT,Sell_pricep DOUBLE,Discount_price DOUBLE,Profit DECIMAL(3,2),Item_code String,Item_name String,Outlet_name String,Update_time TIMESTAMP,Create_date String)STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_INCLUDE'='Update_time')""").collect sql(s"""LOAD DATA INPATH '$resourcesPath/Data/InsertData/T_Hive1.csv' INTO table DL_WithOutDELIMITER options ('QUOTECHAR'='\', 'FILEHEADER'='Active_status,Item_type_cd,Qty_day_avg,Qty_total,Sell_price,Sell_pricep,Discount_price,Profit,Item_code,Item_name,Outlet_name,Update_time,Create_date')""").collect sql(s"""LOAD DATA INPATH '$resourcesPath/Data/InsertData/T_Hive1.csv' INTO table DL_WithOutDELIMITER options ('QUOTECHAR'='\', 'FILEHEADER'='Active_status,Item_type_cd,Qty_day_avg,Qty_total,Sell_price,Sell_pricep,Discount_price,Profit,Item_code,Item_name,Outlet_name,Update_time,Create_date')""").collect checkAnswer(s"""select count(*) from DL_WithOutDELIMITER""", @@ -1406,7 +1406,7 @@ class DataLoadingTestCase extends QueryTest with BeforeAndAfterAll { //Check for the incremental load data DML without "QUOTECHAR" specified loading the data successfully. test("Incremental_Data_Load_001_001-001-TC-11_840", Include) { - sql(s"""CREATE TABLE DL_without_QUOTECHAR (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format'""").collect + sql(s"""CREATE TABLE DL_without_QUOTECHAR (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_INCLUDE'='DOB,DOJ')""").collect sql(s"""LOAD DATA INPATH '$resourcesPath/Data/InsertData/2000_UniqData.csv' into table DL_without_QUOTECHAR OPTIONS('DELIMITER'=',' , 'BAD_RECORDS_LOGGER_ENABLE'='TRUE', 'BAD_RECORDS_ACTION'='REDIRECT','FILEHEADER'='CUST_ID,CUST_NAME,ACTIVE_EMUI_VERSION,DOB,DOJ,BIGINT_COLUMN1,BIGINT_COLUMN2,DECIMAL_COLUMN1,DECIMAL_COLUMN2,Double_COLUMN1,Double_COLUMN2,INTEGER_COLUMN1')""").collect checkAnswer(s"""select count(*) from DL_without_QUOTECHAR""", Seq(Row(2010)), "DataLoadingTestCase_Incremental_Data_Load_001_001-001-TC-11_840") http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesBVATestCase.scala ---------------------------------------------------------------------- diff --git a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesBVATestCase.scala b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesBVATestCase.scala index 1aac73b..5e00bde 100644 
--- a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesBVATestCase.scala +++ b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesBVATestCase.scala @@ -33,7 +33,7 @@ class QueriesBVATestCase extends QueryTest with BeforeAndAfterAll { sql(s"""drop table if exists Test_Boundary""").collect sql(s"""drop table if exists Test_Boundary_hive""").collect - sql(s"""create table Test_Boundary (c1_int int,c2_Bigint Bigint,c3_Decimal Decimal(38,30),c4_double double,c5_string string,c6_Timestamp Timestamp,c7_Datatype_Desc string) STORED BY 'org.apache.carbondata.format'""").collect + sql(s"""create table Test_Boundary (c1_int int,c2_Bigint Bigint,c3_Decimal Decimal(38,30),c4_double double,c5_string string,c6_Timestamp Timestamp,c7_Datatype_Desc string) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_INCLUDE'='c6_Timestamp')""").collect sql(s"""create table Test_Boundary_hive (c1_int int,c2_Bigint Bigint,c3_Decimal Decimal(38,30),c4_double double,c5_string string,c6_Timestamp Timestamp,c7_Datatype_Desc string) ROW FORMAT DELIMITED FIELDS TERMINATED BY ','""").collect http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesBasicTestCase.scala ---------------------------------------------------------------------- diff --git a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesBasicTestCase.scala b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesBasicTestCase.scala index 362352b..1b525e3 100644 --- a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesBasicTestCase.scala +++ b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesBasicTestCase.scala @@ -39,7 +39,7 @@ class QueriesBasicTestCase extends QueryTest with BeforeAndAfterAll { sql(s"""drop table if exists uniqdata""").collect sql(s"""drop table if exists uniqdata_hive""").collect - sql(s"""CREATE TABLE uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format'""").collect + sql(s"""CREATE TABLE uniqdata (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_INCLUDE'='DOB,DOJ')""").collect sql(s"""CREATE TABLE uniqdata_hive (CUST_ID int,CUST_NAME String,ACTIVE_EMUI_VERSION string, DOB timestamp, DOJ timestamp, BIGINT_COLUMN1 bigint,BIGINT_COLUMN2 bigint,DECIMAL_COLUMN1 decimal(30,10), DECIMAL_COLUMN2 decimal(36,10),Double_COLUMN1 double, Double_COLUMN2 double,INTEGER_COLUMN1 int) ROW FORMAT DELIMITED FIELDS TERMINATED BY ','""").collect http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesCompactionTestCase.scala 
----------------------------------------------------------------------
diff --git a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesCompactionTestCase.scala b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesCompactionTestCase.scala
index 13115ff..5fdc098 100644
--- a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesCompactionTestCase.scala
+++ b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesCompactionTestCase.scala
@@ -274,7 +274,7 @@ class QueriesCompactionTestCase extends QueryTest with BeforeAndAfterAll {
 sql(s"""drop table if exists Comp_DICTIONARY_EXCLUDE""").collect
 sql(s"""drop table if exists Comp_DICTIONARY_EXCLUDE_hive""").collect
- sql(s"""create table Comp_DICTIONARY_EXCLUDE (imei string,deviceInformationId int,MAC string,deviceColor string,device_backColor string,modelId string,marketName string,AMSize string,ROMSize string,CUPAudit string,CPIClocked string,series string,productionDate timestamp,bomCode string,internalModels string, deliveryTime string, channelsId string, channelsName string , deliveryAreaId string, deliveryCountry string, deliveryProvince string, deliveryCity string,deliveryDistrict string, deliveryStreet string, oxSingleNumber string, ActiveCheckTime string, ActiveAreaId string, ActiveCountry string, ActiveProvince string, Activecity string, ActiveDistrict string, ActiveStreet string, ActiveOperatorId string, Active_releaseId string, Active_EMUIVersion string, Active_operaSysVersion string, Active_BacVerNumber string, Active_BacFlashVer string, Active_webUIVersion string, Active_webUITypeCarrVer string,Active_webTypeDataVerNumber string, Active_operatorsVersion string, Active_phonePADPartitionedVersions string, Latest_YEAR int, Latest_MONTH int, Latest_DAY Decimal(30,10), Latest_HOUR string, Latest_areaId string, Latest_country string, Latest_province string, Latest_city string, Latest_district string, Latest_street string, Latest_releaseId string, Latest_EMUIVersion string, Latest_operaSysVersion string, Latest_BacVerNumber string, Latest_BacFlashVer string, Latest_webUIVersion string, Latest_webUITypeCarrVer string, Latest_webTypeDataVerNumber string, Latest_operatorsVersion string, Latest_phonePADPartitionedVersions string, Latest_operatorId string, gamePointDescription string,gamePointId double,contractNumber BigInt) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_EXCLUDE'='imei')""").collect
+ sql(s"""create table Comp_DICTIONARY_EXCLUDE (imei string,deviceInformationId int,MAC string,deviceColor string,device_backColor string,modelId string,marketName string,AMSize string,ROMSize string,CUPAudit string,CPIClocked string,series string,productionDate timestamp,bomCode string,internalModels string, deliveryTime string, channelsId string, channelsName string , deliveryAreaId string, deliveryCountry string, deliveryProvince string, deliveryCity string,deliveryDistrict string, deliveryStreet string, oxSingleNumber string, ActiveCheckTime string, ActiveAreaId string, ActiveCountry string, ActiveProvince string, Activecity string, ActiveDistrict string, ActiveStreet string, ActiveOperatorId string, Active_releaseId string, Active_EMUIVersion string, Active_operaSysVersion string, Active_BacVerNumber string, Active_BacFlashVer string, Active_webUIVersion string, Active_webUITypeCarrVer string,Active_webTypeDataVerNumber string, Active_operatorsVersion string, Active_phonePADPartitionedVersions string, Latest_YEAR int, Latest_MONTH int, Latest_DAY Decimal(30,10), Latest_HOUR string, Latest_areaId string, Latest_country string, Latest_province string, Latest_city string, Latest_district string, Latest_street string, Latest_releaseId string, Latest_EMUIVersion string, Latest_operaSysVersion string, Latest_BacVerNumber string, Latest_BacFlashVer string, Latest_webUIVersion string, Latest_webUITypeCarrVer string, Latest_webTypeDataVerNumber string, Latest_operatorsVersion string, Latest_phonePADPartitionedVersions string, Latest_operatorId string, gamePointDescription string,gamePointId double,contractNumber BigInt) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_EXCLUDE'='imei', 'DICTIONARY_INCLUDE'='productionDate')""").collect
 sql(s"""create table Comp_DICTIONARY_EXCLUDE_hive (imei string,deviceInformationId int,MAC string,deviceColor string,device_backColor string,modelId string,marketName string,AMSize string,ROMSize string,CUPAudit string,CPIClocked string,series string,productionDate timestamp,bomCode string,internalModels string,deliveryTime string,channelsId string,channelsName string,deliveryAreaId string,deliveryCountry string,deliveryProvince string,deliveryCity string,deliveryDistrict string,deliveryStreet string,oxSingleNumber string,contractNumber BigInt,ActiveCheckTime string,ActiveAreaId string,ActiveCountry string,ActiveProvince string,Activecity string,ActiveDistrict string,ActiveStreet string,ActiveOperatorId string,Active_releaseId string,Active_EMUIVersion string,Active_operaSysVersion string,Active_BacVerNumber string,Active_BacFlashVer string,Active_webUIVersion string,Active_webUITypeCarrVer string,Active_webTypeDataVerNumber string,Active_operatorsVersion string,Active_phonePADPartitionedVersions string,Latest_YEAR int,Latest_MONTH int,Latest_DAY Decimal(30,10),Latest_HOUR string,Latest_areaId string,Latest_country string,Latest_province string,Latest_city string,Latest_district string,Latest_street string,Latest_releaseId string,Latest_EMUIVersion string,Latest_operaSysVersion string,Latest_BacVerNumber string,Latest_BacFlashVer string,Latest_webUIVersion string,Latest_webUITypeCarrVer string,Latest_webTypeDataVerNumber string,Latest_operatorsVersion string,Latest_phonePADPartitionedVersions string,Latest_operatorId string,gamePointId double,gamePointDescription string) ROW FORMAT DELIMITED FIELDS TERMINATED BY ','""").collect

http://git-wip-us.apache.org/repos/asf/carbondata/blob/36ceb59f/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesExcludeDictionaryTestCase.scala
----------------------------------------------------------------------
diff --git a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesExcludeDictionaryTestCase.scala b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesExcludeDictionaryTestCase.scala
index 4b434a2..fcd20fd 100644
--- a/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesExcludeDictionaryTestCase.scala
+++ b/integration/spark-common-cluster-test/src/test/scala/org/apache/carbondata/cluster/sdv/generated/QueriesExcludeDictionaryTestCase.scala
@@ -33,7 +33,7 @@ class QueriesExcludeDictionaryTestCase extends QueryTest with BeforeAndAfterAll
 sql(s"""drop table if exists TABLE_DICTIONARY_EXCLUDE""").collect
 sql(s"""drop table if exists TABLE_DICTIONARY_EXCLUDE1_hive""").collect
- sql(s"""create table TABLE_DICTIONARY_EXCLUDE (imei string,deviceInformationId int,MAC string,deviceColor string,device_backColor string,modelId string,marketName string,AMSize string,ROMSize string,CUPAudit string,CPIClocked string,series string,productionDate timestamp,bomCode string,internalModels string, deliveryTime string, channelsId string, channelsName string , deliveryAreaId string, deliveryCountry string, deliveryProvince string, deliveryCity string,deliveryDistrict string, deliveryStreet string, oxSingleNumber string, ActiveCheckTime string, ActiveAreaId string, ActiveCountry string, ActiveProvince string, Activecity string, ActiveDistrict string, ActiveStreet string, ActiveOperatorId string, Active_releaseId string, Active_EMUIVersion string, Active_operaSysVersion string, Active_BacVerNumber string, Active_BacFlashVer string, Active_webUIVersion string, Active_webUITypeCarrVer string,Active_webTypeDataVerNumber string, Active_operatorsVersion string, Active_phonePADPartitionedVersions string, Latest_YEAR int, Latest_MONTH int, Latest_DAY Decimal(30,10), Latest_HOUR string, Latest_areaId string, Latest_country string, Latest_province string, Latest_city string, Latest_district string, Latest_street string, Latest_releaseId string, Latest_EMUIVersion string, Latest_operaSysVersion string, Latest_BacVerNumber string, Latest_BacFlashVer string, Latest_webUIVersion string, Latest_webUITypeCarrVer string, Latest_webTypeDataVerNumber string, Latest_operatorsVersion string, Latest_phonePADPartitionedVersions string, Latest_operatorId string, gamePointDescription string,gamePointId double,contractNumber BigInt) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_EXCLUDE'='imei')""").collect
+ sql(s"""create table TABLE_DICTIONARY_EXCLUDE (imei string,deviceInformationId int,MAC string,deviceColor string,device_backColor string,modelId string,marketName string,AMSize string,ROMSize string,CUPAudit string,CPIClocked string,series string,productionDate timestamp,bomCode string,internalModels string, deliveryTime string, channelsId string, channelsName string , deliveryAreaId string, deliveryCountry string, deliveryProvince string, deliveryCity string,deliveryDistrict string, deliveryStreet string, oxSingleNumber string, ActiveCheckTime string, ActiveAreaId string, ActiveCountry string, ActiveProvince string, Activecity string, ActiveDistrict string, ActiveStreet string, ActiveOperatorId string, Active_releaseId string, Active_EMUIVersion string, Active_operaSysVersion string, Active_BacVerNumber string, Active_BacFlashVer string, Active_webUIVersion string, Active_webUITypeCarrVer string,Active_webTypeDataVerNumber string, Active_operatorsVersion string, Active_phonePADPartitionedVersions string, Latest_YEAR int, Latest_MONTH int, Latest_DAY Decimal(30,10), Latest_HOUR string, Latest_areaId string, Latest_country string, Latest_province string, Latest_city string, Latest_district string, Latest_street string, Latest_releaseId string, Latest_EMUIVersion string, Latest_operaSysVersion string, Latest_BacVerNumber string, Latest_BacFlashVer string, Latest_webUIVersion string, Latest_webUITypeCarrVer string, Latest_webTypeDataVerNumber string, Latest_operatorsVersion string, Latest_phonePADPartitionedVersions string, Latest_operatorId string, gamePointDescription string,gamePointId double,contractNumber BigInt) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_EXCLUDE'='imei', 'DICTIONARY_INCLUDE'='productionDate')""").collect
 sql(s"""create table TABLE_DICTIONARY_EXCLUDE1_hive (imei string,deviceInformationId int,MAC string,deviceColor string,device_backColor string,modelId string,marketName string,AMSize string,ROMSize string,CUPAudit string,CPIClocked string,series string,productionDate timestamp,bomCode string,internalModels string,deliveryTime string,channelsId string,channelsName string,deliveryAreaId string,deliveryCountry string,deliveryProvince string,deliveryCity string,deliveryDistrict string,deliveryStreet string,oxSingleNumber string,contractNumber BigInt,ActiveCheckTime string,ActiveAreaId string,ActiveCountry string,ActiveProvince string,Activecity string,ActiveDistrict string,ActiveStreet string,ActiveOperatorId string,Active_releaseId string,Active_EMUIVersion string,Active_operaSysVersion string,Active_BacVerNumber string,Active_BacFlashVer string,Active_webUIVersion string,Active_webUITypeCarrVer string,Active_webTypeDataVerNumber string,Active_operatorsVersion string,Active_phonePADPartitionedVersions string,Latest_YEAR int,Latest_MONTH int,Latest_DAY Decimal(30,10),Latest_HOUR string,Latest_areaId string,Latest_country string,Latest_province string,Latest_city string,Latest_district string,Latest_street string,Latest_releaseId string,Latest_EMUIVersion string,Latest_operaSysVersion string,Latest_BacVerNumber string,Latest_BacFlashVer string,Latest_webUIVersion string,Latest_webUITypeCarrVer string,Latest_webTypeDataVerNumber string,Latest_operatorsVersion string,Latest_phonePADPartitionedVersions string,Latest_operatorId string,gamePointId double,gamePointDescription string) ROW FORMAT DELIMITED FIELDS TERMINATED BY ','""").collect
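
Both hunks above make the same one-line change: the create table statements for Comp_DICTIONARY_EXCLUDE and TABLE_DICTIONARY_EXCLUDE now list the timestamp column productionDate under DICTIONARY_INCLUDE alongside the existing DICTIONARY_EXCLUDE on imei. A minimal sketch of that TBLPROPERTIES pattern follows; the table name sample_exclude and the trimmed column list are invented for illustration, and the sql helper is assumed to be the same QueryTest helper used by the suites above.

 // Hypothetical, trimmed-down version of the '+' lines above: keep imei out of the
 // dictionary while explicitly dictionary-encoding the timestamp column productionDate.
 sql(s"""create table sample_exclude (imei string, productionDate timestamp, contractNumber bigint) STORED BY 'org.apache.carbondata.format' TBLPROPERTIES('DICTIONARY_EXCLUDE'='imei', 'DICTIONARY_INCLUDE'='productionDate')""").collect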