hive-commits mailing list archives

From: gunt...@apache.org
Subject: svn commit: r1561947 [1/17] - in /hive/branches/tez: ./ ant/ ant/src/org/apache/hadoop/hive/ant/ beeline/ cli/ cli/src/java/org/apache/hadoop/hive/cli/ common/ common/src/ common/src/java/org/apache/hadoop/hive/common/type/ common/src/java/org/apache/h...
Date: Tue, 28 Jan 2014 05:48:10 GMT
Author: gunther
Date: Tue Jan 28 05:48:03 2014
New Revision: 1561947

URL: http://svn.apache.org/r1561947
Log:
Merge latest trunk into branch. (Gunther Hagleitner)

Added:
    hive/branches/tez/data/files/ext_test_space/
      - copied from r1561942, hive/trunk/data/files/ext_test_space/
    hive/branches/tez/data/files/header_footer_table_3/
      - copied from r1561942, hive/trunk/data/files/header_footer_table_3/
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/transfer/impl/ReaderContextImpl.java
      - copied unchanged from r1561942, hive/trunk/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/transfer/impl/ReaderContextImpl.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/transfer/impl/WriterContextImpl.java
      - copied unchanged from r1561942, hive/trunk/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/transfer/impl/WriterContextImpl.java
    hive/branches/tez/metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/AddPartitionsRequest.java
      - copied unchanged from r1561942, hive/trunk/metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/AddPartitionsRequest.java
    hive/branches/tez/metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/AddPartitionsResult.java
      - copied unchanged from r1561942, hive/trunk/metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/AddPartitionsResult.java
    hive/branches/tez/ql/src/gen/vectorization/ExpressionTemplates/ColumnArithmeticColumnDecimal.txt
      - copied unchanged from r1561942, hive/trunk/ql/src/gen/vectorization/ExpressionTemplates/ColumnArithmeticColumnDecimal.txt
    hive/branches/tez/ql/src/gen/vectorization/ExpressionTemplates/ColumnArithmeticScalarDecimal.txt
      - copied unchanged from r1561942, hive/trunk/ql/src/gen/vectorization/ExpressionTemplates/ColumnArithmeticScalarDecimal.txt
    hive/branches/tez/ql/src/gen/vectorization/ExpressionTemplates/FilterDecimalColumnCompareColumn.txt
      - copied unchanged from r1561942, hive/trunk/ql/src/gen/vectorization/ExpressionTemplates/FilterDecimalColumnCompareColumn.txt
    hive/branches/tez/ql/src/gen/vectorization/ExpressionTemplates/FilterDecimalColumnCompareScalar.txt
      - copied unchanged from r1561942, hive/trunk/ql/src/gen/vectorization/ExpressionTemplates/FilterDecimalColumnCompareScalar.txt
    hive/branches/tez/ql/src/gen/vectorization/ExpressionTemplates/FilterDecimalScalarCompareColumn.txt
      - copied unchanged from r1561942, hive/trunk/ql/src/gen/vectorization/ExpressionTemplates/FilterDecimalScalarCompareColumn.txt
    hive/branches/tez/ql/src/gen/vectorization/ExpressionTemplates/ScalarArithmeticColumnDecimal.txt
      - copied unchanged from r1561942, hive/trunk/ql/src/gen/vectorization/ExpressionTemplates/ScalarArithmeticColumnDecimal.txt
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/NodeUtils.java
      - copied unchanged from r1561942, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/exec/NodeUtils.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorSMBMapJoinOperator.java
      - copied unchanged from r1561942, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorSMBMapJoinOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/DecimalUtil.java
      - copied unchanged from r1561942, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/DecimalUtil.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/security/authorization/plugin/
      - copied from r1561942, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/security/authorization/plugin/
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/security/authorization/
      - copied from r1561942, hive/trunk/ql/src/test/org/apache/hadoop/hive/ql/security/authorization/
    hive/branches/tez/ql/src/test/queries/clientnegative/windowing_invalid_udaf.q
      - copied unchanged from r1561942, hive/trunk/ql/src/test/queries/clientnegative/windowing_invalid_udaf.q
    hive/branches/tez/ql/src/test/queries/clientpositive/authorization_view.q
      - copied unchanged from r1561942, hive/trunk/ql/src/test/queries/clientpositive/authorization_view.q
    hive/branches/tez/ql/src/test/queries/clientpositive/external_table_with_space_in_location_path.q
      - copied unchanged from r1561942, hive/trunk/ql/src/test/queries/clientpositive/external_table_with_space_in_location_path.q
    hive/branches/tez/ql/src/test/queries/clientpositive/metadata_only_queries_with_filters.q
      - copied unchanged from r1561942, hive/trunk/ql/src/test/queries/clientpositive/metadata_only_queries_with_filters.q
    hive/branches/tez/ql/src/test/queries/clientpositive/root_dir_external_table.q
      - copied unchanged from r1561942, hive/trunk/ql/src/test/queries/clientpositive/root_dir_external_table.q
    hive/branches/tez/ql/src/test/queries/clientpositive/show_roles.q
      - copied unchanged from r1561942, hive/trunk/ql/src/test/queries/clientpositive/show_roles.q
    hive/branches/tez/ql/src/test/queries/clientpositive/union_top_level.q
      - copied unchanged from r1561942, hive/trunk/ql/src/test/queries/clientpositive/union_top_level.q
    hive/branches/tez/ql/src/test/queries/clientpositive/vectorized_bucketmapjoin1.q
      - copied unchanged from r1561942, hive/trunk/ql/src/test/queries/clientpositive/vectorized_bucketmapjoin1.q
    hive/branches/tez/ql/src/test/queries/clientpositive/windowing_udaf2.q
      - copied unchanged from r1561942, hive/trunk/ql/src/test/queries/clientpositive/windowing_udaf2.q
    hive/branches/tez/ql/src/test/results/clientnegative/windowing_invalid_udaf.q.out
      - copied unchanged from r1561942, hive/trunk/ql/src/test/results/clientnegative/windowing_invalid_udaf.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/authorization_view.q.out
      - copied unchanged from r1561942, hive/trunk/ql/src/test/results/clientpositive/authorization_view.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/external_table_with_space_in_location_path.q.out
      - copied unchanged from r1561942, hive/trunk/ql/src/test/results/clientpositive/external_table_with_space_in_location_path.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/metadata_only_queries_with_filters.q.out
      - copied unchanged from r1561942, hive/trunk/ql/src/test/results/clientpositive/metadata_only_queries_with_filters.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/root_dir_external_table.q.out
      - copied unchanged from r1561942, hive/trunk/ql/src/test/results/clientpositive/root_dir_external_table.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/show_roles.q.out
      - copied unchanged from r1561942, hive/trunk/ql/src/test/results/clientpositive/show_roles.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/union_top_level.q.out
      - copied unchanged from r1561942, hive/trunk/ql/src/test/results/clientpositive/union_top_level.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/vectorized_bucketmapjoin1.q.out
      - copied unchanged from r1561942, hive/trunk/ql/src/test/results/clientpositive/vectorized_bucketmapjoin1.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/windowing_udaf2.q.out
      - copied unchanged from r1561942, hive/trunk/ql/src/test/results/clientpositive/windowing_udaf2.q.out
Removed:
    hive/branches/tez/ql/src/test/queries/clientnegative/union.q
    hive/branches/tez/ql/src/test/results/clientnegative/union.q.out
Modified:
    hive/branches/tez/   (props changed)
    hive/branches/tez/ant/   (props changed)
    hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/GenVectorCode.java
    hive/branches/tez/beeline/   (props changed)
    hive/branches/tez/cli/   (props changed)
    hive/branches/tez/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java
    hive/branches/tez/common/   (props changed)
    hive/branches/tez/common/src/   (props changed)
    hive/branches/tez/common/src/java/org/apache/hadoop/hive/common/type/Decimal128.java
    hive/branches/tez/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
    hive/branches/tez/common/src/test/org/apache/hadoop/hive/common/type/TestDecimal128.java
    hive/branches/tez/conf/hive-default.xml.template
    hive/branches/tez/contrib/   (props changed)
    hive/branches/tez/data/conf/hive-site.xml
    hive/branches/tez/data/files/datatypes.txt
    hive/branches/tez/hbase-handler/   (props changed)
    hive/branches/tez/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseStorageHandler.java
    hive/branches/tez/hbase-handler/src/test/queries/positive/hbase_bulk.m
    hive/branches/tez/hbase-handler/src/test/queries/positive/hbase_queries.q
    hive/branches/tez/hbase-handler/src/test/results/negative/cascade_dbdrop.q.out
    hive/branches/tez/hbase-handler/src/test/results/positive/hbase_bulk.m.out
    hive/branches/tez/hbase-handler/src/test/results/positive/hbase_queries.q.out
    hive/branches/tez/hbase-handler/src/test/templates/TestHBaseCliDriver.vm
    hive/branches/tez/hbase-handler/src/test/templates/TestHBaseNegativeCliDriver.vm
    hive/branches/tez/hcatalog/   (props changed)
    hive/branches/tez/hcatalog/core/   (props changed)
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateTableHook.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/security/HdfsAuthorizationProvider.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/CreateTableHook.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/JsonSerDe.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/transfer/DataTransferFactory.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/transfer/HCatReader.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/transfer/ReaderContext.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/transfer/WriterContext.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/transfer/impl/HCatInputFormatReader.java
    hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/data/transfer/impl/HCatOutputFormatWriter.java
    hive/branches/tez/hcatalog/core/src/test/java/org/apache/hcatalog/mapreduce/TestHCatMultiOutputFormat.java
    hive/branches/tez/hcatalog/core/src/test/java/org/apache/hive/hcatalog/data/TestJsonSerDe.java
    hive/branches/tez/hcatalog/core/src/test/java/org/apache/hive/hcatalog/data/TestReaderWriter.java
    hive/branches/tez/hcatalog/core/src/test/java/org/apache/hive/hcatalog/mapreduce/TestHCatMultiOutputFormat.java
    hive/branches/tez/hcatalog/hcatalog-pig-adapter/   (props changed)
    hive/branches/tez/hcatalog/server-extensions/   (props changed)
    hive/branches/tez/hcatalog/src/test/e2e/templeton/drivers/TestDriverCurl.pm
    hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/jobstatus.conf
    hive/branches/tez/hcatalog/storage-handlers/hbase/   (props changed)
    hive/branches/tez/hcatalog/webhcat/java-client/   (props changed)
    hive/branches/tez/hcatalog/webhcat/svr/   (props changed)
    hive/branches/tez/hwi/   (props changed)
    hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/beeline/TestBeeLineWithArgs.java
    hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/TestJdbcDriver2.java
    hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/TestJdbcWithMiniHS2.java
    hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/miniHS2/TestHiveServer2.java
    hive/branches/tez/itests/qtest/pom.xml
    hive/branches/tez/itests/util/src/main/java/org/apache/hadoop/hive/ql/QTestUtil.java
    hive/branches/tez/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/VerifyOutputTableLocationSchemeIsFileHook.java
    hive/branches/tez/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/VerifyPartitionIsNotSubdirectoryOfTableHook.java
    hive/branches/tez/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/VerifyPartitionIsSubdirectoryOfTableHook.java
    hive/branches/tez/jdbc/   (props changed)
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveBaseResultSet.java
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveConnection.java
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HivePreparedStatement.java
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveResultSetMetaData.java
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/JdbcColumn.java
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/Utils.java
    hive/branches/tez/metastore/   (props changed)
    hive/branches/tez/metastore/if/hive_metastore.thrift
    hive/branches/tez/metastore/pom.xml
    hive/branches/tez/metastore/src/gen/thrift/gen-cpp/ThriftHiveMetastore.cpp
    hive/branches/tez/metastore/src/gen/thrift/gen-cpp/ThriftHiveMetastore.h
    hive/branches/tez/metastore/src/gen/thrift/gen-cpp/ThriftHiveMetastore_server.skeleton.cpp
    hive/branches/tez/metastore/src/gen/thrift/gen-cpp/hive_metastore_types.cpp
    hive/branches/tez/metastore/src/gen/thrift/gen-cpp/hive_metastore_types.h
    hive/branches/tez/metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/PartitionsByExprResult.java
    hive/branches/tez/metastore/src/gen/thrift/gen-javabean/org/apache/hadoop/hive/metastore/api/ThriftHiveMetastore.java
    hive/branches/tez/metastore/src/gen/thrift/gen-php/metastore/ThriftHiveMetastore.php
    hive/branches/tez/metastore/src/gen/thrift/gen-php/metastore/Types.php
    hive/branches/tez/metastore/src/gen/thrift/gen-py/hive_metastore/ThriftHiveMetastore-remote
    hive/branches/tez/metastore/src/gen/thrift/gen-py/hive_metastore/ThriftHiveMetastore.py
    hive/branches/tez/metastore/src/gen/thrift/gen-py/hive_metastore/ttypes.py
    hive/branches/tez/metastore/src/gen/thrift/gen-rb/hive_metastore_types.rb
    hive/branches/tez/metastore/src/gen/thrift/gen-rb/thrift_hive_metastore.rb
    hive/branches/tez/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStore.java
    hive/branches/tez/metastore/src/java/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.java
    hive/branches/tez/metastore/src/java/org/apache/hadoop/hive/metastore/IMetaStoreClient.java
    hive/branches/tez/metastore/src/java/org/apache/hadoop/hive/metastore/MetaStoreDirectSql.java
    hive/branches/tez/metastore/src/java/org/apache/hadoop/hive/metastore/ObjectStore.java
    hive/branches/tez/metastore/src/java/org/apache/hadoop/hive/metastore/RawStore.java
    hive/branches/tez/metastore/src/java/org/apache/hadoop/hive/metastore/parser/ExpressionTree.java
    hive/branches/tez/metastore/src/java/org/apache/hadoop/hive/metastore/parser/Filter.g
    hive/branches/tez/metastore/src/test/org/apache/hadoop/hive/metastore/DummyRawStoreControlledCommit.java
    hive/branches/tez/metastore/src/test/org/apache/hadoop/hive/metastore/DummyRawStoreForJdoConnection.java
    hive/branches/tez/metastore/src/test/org/apache/hadoop/hive/metastore/VerifyingObjectStore.java
    hive/branches/tez/odbc/   (props changed)
    hive/branches/tez/packaging/   (props changed)
    hive/branches/tez/packaging/src/main/assembly/bin.xml
    hive/branches/tez/pom.xml
    hive/branches/tez/ql/   (props changed)
    hive/branches/tez/ql/src/gen/vectorization/ExpressionTemplates/FilterColumnCompareColumn.txt
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/Context.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/Driver.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/DriverContext.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/ArchiveUtils.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/HashTableSinkOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/JoinOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/MoveTask.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/OperatorFactory.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/SMBMapJoinOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/SkewJoinHandler.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/StatsTask.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/TableScanOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/mr/HashTableLoader.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/mr/MapRedTask.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/mr/MapredLocalTask.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/DagUtils.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/tez/TezTask.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/NullUtil.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/hooks/Entity.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/index/IndexMetadataChangeTask.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/index/IndexPredicateAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/index/bitmap/BitmapIndexHandler.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/index/compact/CompactIndexHandler.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/BucketizedHiveInputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/CombineHiveInputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/HiveFileFormatUtils.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/HiveInputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/OrcInputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/RecordReaderImpl.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/merge/BlockMergeTask.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/merge/RCFileMergeMapper.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/stats/PartialScanMapper.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/stats/PartialScanTask.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/truncate/ColumnTruncateMapper.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/truncate/ColumnTruncateTask.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/metadata/Hive.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/metadata/HiveMetaStoreChecker.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/metadata/HiveUtils.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/metadata/Partition.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/metadata/Table.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/JsonMetaDataFormatter.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/metadata/formatting/TextMetaDataFormatter.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/AbstractBucketJoinProc.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/GenMRTableScan1.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/GenMRUnion1.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/GenMapRedUtils.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/IndexUtils.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/SamplePruner.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/SimpleFetchAggregation.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/SimpleFetchOptimizer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/SizeBasedBigTableSelectorForAutoSMJ.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/StatsOptimizer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/BucketingSortingOpProcFactory.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/CommonJoinTaskDispatcher.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/GenMRSkewJoinProcessor.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/MapJoinResolver.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/MetadataOnlyOptimizer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/unionproc/UnionProcFactory.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/BaseSemanticAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/ExplainSemanticAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/ExportSemanticAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/FromClauseParser.g
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/IdentifiersParser.g
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/ImportSemanticAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/LoadSemanticAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/PTFTranslator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzerFactory.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/UnparseTranslator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/AddPartitionDesc.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/ConditionalResolverCommonJoin.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/ConditionalResolverSkewJoin.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/DynamicPartitionCtx.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/ExprNodeDescUtils.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/FileSinkDesc.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/HashTableSinkDesc.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/HiveOperation.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/JoinDesc.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/MapWork.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/MapredLocalWork.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/PlanUtils.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/RoleDDLDesc.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/StatsWork.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/processors/SetProcessor.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/security/authorization/StorageBasedAuthorizationProvider.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/session/SessionState.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/stats/StatsFactory.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/stats/StatsUtils.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDF.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFBridge.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPEqualOrGreaterThan.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPEqualOrLessThan.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPGreaterThan.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPLessThan.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/exec/TestExecDriver.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/exec/TestPlan.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/exec/TestUtilities.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/exec/vector/expressions/TestVectorArithmeticExpressions.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/exec/vector/expressions/TestVectorFilterExpressions.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/io/TestHiveBinarySearchRecordReader.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/io/TestSymlinkTextInputFormat.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/io/orc/TestOrcFile.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/io/orc/TestVectorizedORCReader.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/metadata/TestHiveMetaStoreChecker.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/optimizer/physical/TestVectorizer.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/plan/TestConditionalResolverCommonJoin.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFLTrim.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFLpad.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFRTrim.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFRpad.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFTrim.java
    hive/branches/tez/ql/src/test/queries/clientpositive/metadata_only_queries.q
    hive/branches/tez/ql/src/test/queries/clientpositive/partition_date.q
    hive/branches/tez/ql/src/test/results/clientnegative/addpart1.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/alter_partition_coltype_2columns.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/alter_table_add_partition.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/alter_view_failure4.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/alter_view_failure5.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/alter_view_failure7.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/analyze1.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/dyn_part1.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/truncate_partition_column.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/add_part_exist.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/add_part_multiple.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/alter_partition_coltype.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/auto_join25.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/create_view_partitioned.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/infer_bucket_sort_convert_join.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/mapjoin_hook.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/metadata_only_queries.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/overridden_confs.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/partition_date.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/partitions_json.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/tez/metadata_only_queries.q.out
    hive/branches/tez/ql/src/test/results/compiler/plan/case_sensitivity.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/cast1.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/groupby1.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/groupby2.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/groupby3.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/groupby4.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/groupby5.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/groupby6.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input1.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input2.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input20.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input3.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input4.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input5.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input6.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input7.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input8.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input9.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input_part1.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input_testsequencefile.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input_testxpath.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/input_testxpath2.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/join1.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/join2.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/join3.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/join4.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/join5.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/join6.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/join7.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/join8.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/sample1.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/sample2.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/sample3.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/sample4.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/sample5.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/sample6.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/sample7.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/subq.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/udf1.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/udf4.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/udf6.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/udf_case.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/udf_when.q.xml
    hive/branches/tez/ql/src/test/results/compiler/plan/union.q.xml
    hive/branches/tez/ql/src/test/templates/TestCliDriver.vm
    hive/branches/tez/ql/src/test/templates/TestCompareCliDriver.vm
    hive/branches/tez/ql/src/test/templates/TestNegativeCliDriver.vm
    hive/branches/tez/ql/src/test/templates/TestParse.vm
    hive/branches/tez/ql/src/test/templates/TestParseNegative.vm
    hive/branches/tez/serde/   (props changed)
    hive/branches/tez/service/   (props changed)
    hive/branches/tez/service/if/TCLIService.thrift
    hive/branches/tez/service/src/gen/thrift/gen-cpp/TCLIService_types.cpp
    hive/branches/tez/service/src/gen/thrift/gen-cpp/TCLIService_types.h
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TBinaryColumn.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TBoolColumn.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TByteColumn.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TDoubleColumn.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TExecuteStatementReq.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TGetTablesReq.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TI16Column.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TI32Column.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TI64Column.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TOpenSessionReq.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TOpenSessionResp.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TRow.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TRowSet.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TStatus.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TStringColumn.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TTypeQualifiers.java
    hive/branches/tez/service/src/gen/thrift/gen-py/TCLIService/ttypes.py
    hive/branches/tez/service/src/gen/thrift/gen-rb/t_c_l_i_service_types.rb
    hive/branches/tez/service/src/java/org/apache/hive/service/auth/LdapAuthenticationProviderImpl.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/Type.java
    hive/branches/tez/service/src/java/org/apache/hive/service/server/HiveServer2.java
    hive/branches/tez/shims/   (props changed)
    hive/branches/tez/shims/0.20/   (props changed)
    hive/branches/tez/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java
    hive/branches/tez/shims/0.20S/   (props changed)
    hive/branches/tez/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java
    hive/branches/tez/shims/0.23/   (props changed)
    hive/branches/tez/shims/0.23/pom.xml
    hive/branches/tez/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java
    hive/branches/tez/shims/aggregator/   (props changed)
    hive/branches/tez/shims/common/   (props changed)
    hive/branches/tez/shims/common-secure/   (props changed)
    hive/branches/tez/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java
    hive/branches/tez/shims/common/src/main/java/org/apache/hadoop/hive/shims/HadoopShims.java
    hive/branches/tez/testutils/   (props changed)

Propchange: hive/branches/tez/
------------------------------------------------------------------------------
--- svn:ignore (original)
+++ svn:ignore Tue Jan 28 05:48:03 2014
@@ -1,5 +1,6 @@
-build
-build-eclipse
+datanucleus.log
+eclipse-build
+target
 .arc_jira_lib
 .classpath*
 .externalToolBuilders

Propchange: hive/branches/tez/
------------------------------------------------------------------------------
  Merged /hive/trunk:r1557012-1561942

Propchange: hive/branches/tez/ant/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Tue Jan 28 05:48:03 2014
@@ -0,0 +1 @@
+target

Modified: hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/GenVectorCode.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/GenVectorCode.java?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/GenVectorCode.java (original)
+++ hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/GenVectorCode.java Tue Jan 28 05:48:03 2014
@@ -107,6 +107,18 @@ public class GenVectorCode extends Task 
       {"ColumnDivideColumn", "Modulo", "double", "long", "%"},
       {"ColumnDivideColumn", "Modulo", "double", "double", "%"},
 
+      {"ColumnArithmeticScalarDecimal", "Add"},
+      {"ColumnArithmeticScalarDecimal", "Subtract"},
+      {"ColumnArithmeticScalarDecimal", "Multiply"},
+
+      {"ScalarArithmeticColumnDecimal", "Add"},
+      {"ScalarArithmeticColumnDecimal", "Subtract"},
+      {"ScalarArithmeticColumnDecimal", "Multiply"},
+
+      {"ColumnArithmeticColumnDecimal", "Add"},
+      {"ColumnArithmeticColumnDecimal", "Subtract"},
+      {"ColumnArithmeticColumnDecimal", "Multiply"},
+
       {"ColumnCompareScalar", "Equal", "long", "double", "=="},
       {"ColumnCompareScalar", "Equal", "double", "double", "=="},
       {"ColumnCompareScalar", "NotEqual", "long", "double", "!="},
@@ -235,6 +247,27 @@ public class GenVectorCode extends Task 
       {"FilterStringScalarCompareColumn", "Greater", ">"},
       {"FilterStringScalarCompareColumn", "GreaterEqual", ">="},
 
+      {"FilterDecimalColumnCompareScalar", "Equal", "=="},
+      {"FilterDecimalColumnCompareScalar", "NotEqual", "!="},
+      {"FilterDecimalColumnCompareScalar", "Less", "<"},
+      {"FilterDecimalColumnCompareScalar", "LessEqual", "<="},
+      {"FilterDecimalColumnCompareScalar", "Greater", ">"},
+      {"FilterDecimalColumnCompareScalar", "GreaterEqual", ">="},
+
+      {"FilterDecimalScalarCompareColumn", "Equal", "=="},
+      {"FilterDecimalScalarCompareColumn", "NotEqual", "!="},
+      {"FilterDecimalScalarCompareColumn", "Less", "<"},
+      {"FilterDecimalScalarCompareColumn", "LessEqual", "<="},
+      {"FilterDecimalScalarCompareColumn", "Greater", ">"},
+      {"FilterDecimalScalarCompareColumn", "GreaterEqual", ">="},
+
+      {"FilterDecimalColumnCompareColumn", "Equal", "=="},
+      {"FilterDecimalColumnCompareColumn", "NotEqual", "!="},
+      {"FilterDecimalColumnCompareColumn", "Less", "<"},
+      {"FilterDecimalColumnCompareColumn", "LessEqual", "<="},
+      {"FilterDecimalColumnCompareColumn", "Greater", ">"},
+      {"FilterDecimalColumnCompareColumn", "GreaterEqual", ">="},
+
       {"StringScalarCompareColumn", "Equal", "=="},
       {"StringScalarCompareColumn", "NotEqual", "!="},
       {"StringScalarCompareColumn", "Less", "<"},
@@ -539,6 +572,12 @@ public class GenVectorCode extends Task 
     for (String [] tdesc : templateExpansions) {
       if (tdesc[0].equals("ColumnArithmeticScalar") || tdesc[0].equals("ColumnDivideScalar")) {
         generateColumnArithmeticScalar(tdesc);
+      } else if (tdesc[0].equals("ColumnArithmeticScalarDecimal")) {
+        generateColumnArithmeticScalarDecimal(tdesc);
+      } else if (tdesc[0].equals("ScalarArithmeticColumnDecimal")) {
+        generateScalarArithmeticColumnDecimal(tdesc);
+      } else if (tdesc[0].equals("ColumnArithmeticColumnDecimal")) {
+        generateColumnArithmeticColumnDecimal(tdesc);
       } else if (tdesc[0].equals("ColumnCompareScalar")) {
         generateColumnCompareScalar(tdesc);
       } else if (tdesc[0].equals("ScalarCompareColumn")) {
@@ -593,6 +632,12 @@ public class GenVectorCode extends Task 
         generateIfExprScalarColumn(tdesc);
       } else if (tdesc[0].equals("IfExprScalarScalar")) {
         generateIfExprScalarScalar(tdesc);
+      } else if (tdesc[0].equals("FilterDecimalColumnCompareScalar")) {
+        generateFilterDecimalColumnCompareScalar(tdesc);
+      } else if (tdesc[0].equals("FilterDecimalScalarCompareColumn")) {
+        generateFilterDecimalScalarCompareColumn(tdesc);
+      } else if (tdesc[0].equals("FilterDecimalColumnCompareColumn")) {
+        generateFilterDecimalColumnCompareColumn(tdesc);
       } else {
         continue;
       }
@@ -1116,6 +1161,48 @@ public class GenVectorCode extends Task 
     generateColumnBinaryOperatorScalar(tdesc, returnType, className);
   }
 
+  private void generateColumnArithmeticScalarDecimal(String[] tdesc) throws IOException {
+    String operatorName = tdesc[1];
+    String className = "DecimalCol" + operatorName + "DecimalScalar";
+
+    // Read the template into a string;
+    File templateFile = new File(joinPath(this.expressionTemplateDirectory, tdesc[0] + ".txt"));
+    String templateString = readFile(templateFile);
+    templateString = templateString.replaceAll("<ClassName>", className);
+    templateString = templateString.replaceAll("<Operator>", operatorName.toLowerCase());
+
+    writeFile(templateFile.lastModified(), expressionOutputDirectory, expressionClassesDirectory,
+       className, templateString);
+  }
+
+  private void generateScalarArithmeticColumnDecimal(String[] tdesc) throws IOException {
+    String operatorName = tdesc[1];
+    String className = "DecimalScalar" + operatorName + "DecimalColumn";
+
+    // Read the template into a string;
+    File templateFile = new File(joinPath(this.expressionTemplateDirectory, tdesc[0] + ".txt"));
+    String templateString = readFile(templateFile);
+    templateString = templateString.replaceAll("<ClassName>", className);
+    templateString = templateString.replaceAll("<Operator>", operatorName.toLowerCase());
+
+    writeFile(templateFile.lastModified(), expressionOutputDirectory, expressionClassesDirectory,
+       className, templateString);
+  }
+
+  private void generateColumnArithmeticColumnDecimal(String[] tdesc) throws IOException {
+    String operatorName = tdesc[1];
+    String className = "DecimalCol" + operatorName + "DecimalColumn";
+
+    // Read the template into a string;
+    File templateFile = new File(joinPath(this.expressionTemplateDirectory, tdesc[0] + ".txt"));
+    String templateString = readFile(templateFile);
+    templateString = templateString.replaceAll("<ClassName>", className);
+    templateString = templateString.replaceAll("<Operator>", operatorName.toLowerCase());
+
+    writeFile(templateFile.lastModified(), expressionOutputDirectory, expressionClassesDirectory,
+       className, templateString);
+  }
+
   private void generateScalarArithmeticColumn(String[] tdesc) throws IOException {
     String operatorName = tdesc[1];
     String operandType1 = tdesc[2];
@@ -1126,6 +1213,39 @@ public class GenVectorCode extends Task 
     generateScalarBinaryOperatorColumn(tdesc, returnType, className);
   }
 
+  private void generateFilterDecimalColumnCompareScalar(String[] tdesc) throws IOException {
+    String operatorName = tdesc[1];
+    String className = "FilterDecimalCol" + operatorName + "DecimalScalar";
+    generateDecimalColumnCompare(tdesc, className);
+  }
+
+  private void generateFilterDecimalScalarCompareColumn(String[] tdesc) throws IOException {
+    String operatorName = tdesc[1];
+    String className = "FilterDecimalScalar" + operatorName + "DecimalColumn";
+    generateDecimalColumnCompare(tdesc, className);
+  }
+
+  private void generateFilterDecimalColumnCompareColumn(String[] tdesc) throws IOException {
+    String operatorName = tdesc[1];
+    String className = "FilterDecimalCol" + operatorName + "DecimalColumn";
+    generateDecimalColumnCompare(tdesc, className);
+  }
+
+  private void generateDecimalColumnCompare(String[] tdesc, String className)
+      throws IOException {
+    String operatorSymbol = tdesc[2];
+
+    // Read the template into a string;
+    File templateFile = new File(joinPath(this.expressionTemplateDirectory, tdesc[0] + ".txt"));
+    String templateString = readFile(templateFile);
+
+    // Expand, and write result
+    templateString = templateString.replaceAll("<ClassName>", className);
+    templateString = templateString.replaceAll("<OperatorSymbol>", operatorSymbol);
+    writeFile(templateFile.lastModified(), expressionOutputDirectory, expressionClassesDirectory,
+        className, templateString);
+  }
+
   static void writeFile(long templateTime, String outputDir, String classesDir,
        String className, String str) throws IOException {
     File outputFile = new File(outputDir, className + ".java");
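
For orientation, a minimal, self-contained sketch of the placeholder-substitution pattern the new generate*Decimal methods above follow (read a *.txt template, substitute the <ClassName> and <Operator> markers, write a .java file). The class and method names in this sketch are illustrative only, not part of the commit:

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Hypothetical sketch of the template expansion done by methods such as
// generateColumnArithmeticColumnDecimal and generateDecimalColumnCompare above.
public class TemplateExpansionSketch {

  // Reads a template such as ColumnArithmeticColumnDecimal.txt, substitutes the
  // <ClassName> and <Operator> markers, and writes <className>.java to outputDir.
  static void expandDecimalArithmeticTemplate(Path templateDir, Path outputDir,
      String templateName, String operatorName) throws IOException {
    String className = "DecimalCol" + operatorName + "DecimalColumn";
    String template = new String(
        Files.readAllBytes(templateDir.resolve(templateName + ".txt")), StandardCharsets.UTF_8);
    String expanded = template
        .replaceAll("<ClassName>", className)
        .replaceAll("<Operator>", operatorName.toLowerCase());
    Files.write(outputDir.resolve(className + ".java"),
        expanded.getBytes(StandardCharsets.UTF_8));
  }

  public static void main(String[] args) throws IOException {
    // e.g. expands ColumnArithmeticColumnDecimal.txt into DecimalColAddDecimalColumn.java
    expandDecimalArithmeticTemplate(Paths.get("templates"), Paths.get("gen"),
        "ColumnArithmeticColumnDecimal", "Add");
  }
}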

Propchange: hive/branches/tez/beeline/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Tue Jan 28 05:48:03 2014
@@ -0,0 +1 @@
+target

Propchange: hive/branches/tez/cli/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Tue Jan 28 05:48:03 2014
@@ -0,0 +1 @@
+target

Modified: hive/branches/tez/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java (original)
+++ hive/branches/tez/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java Tue Jan 28 05:48:03 2014
@@ -67,7 +67,6 @@ import org.apache.hadoop.hive.ql.session
 import org.apache.hadoop.hive.ql.session.SessionState.LogHelper;
 import org.apache.hadoop.hive.service.HiveClient;
 import org.apache.hadoop.hive.service.HiveServerException;
-import org.apache.hadoop.hive.shims.ShimLoader;
 import org.apache.hadoop.io.IOUtils;
 import org.apache.thrift.TException;
 
@@ -99,6 +98,7 @@ public class CliDriver {
 
   public int processCmd(String cmd) {
     CliSessionState ss = (CliSessionState) SessionState.get();
+    ss.setLastCommand(cmd);
     // Flush the print stream, so it doesn't include output from the last command
     ss.err.flush();
     String cmd_trimmed = cmd.trim();

Propchange: hive/branches/tez/common/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Tue Jan 28 05:48:03 2014
@@ -0,0 +1 @@
+target

Propchange: hive/branches/tez/common/src/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Tue Jan 28 05:48:03 2014
@@ -0,0 +1 @@
+gen

Modified: hive/branches/tez/common/src/java/org/apache/hadoop/hive/common/type/Decimal128.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/common/src/java/org/apache/hadoop/hive/common/type/Decimal128.java?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/common/src/java/org/apache/hadoop/hive/common/type/Decimal128.java (original)
+++ hive/branches/tez/common/src/java/org/apache/hadoop/hive/common/type/Decimal128.java Tue Jan 28 05:48:03 2014
@@ -16,6 +16,7 @@
 package org.apache.hadoop.hive.common.type;
 
 import java.math.BigDecimal;
+import java.math.MathContext;
 import java.nio.IntBuffer;
 
 /**
@@ -1097,23 +1098,27 @@ public final class Decimal128 extends Nu
    *          right operand
    * @param quotient
    *          result object to receive the calculation result
-   * @param remainder
-   *          result object to receive the calculation result
    * @param scale
    *          scale of the result. must be 0 or positive.
    */
   public static void divide(Decimal128 left, Decimal128 right,
-      Decimal128 quotient, Decimal128 remainder, short scale) {
+      Decimal128 quotient, short scale) {
     if (quotient == left || quotient == right) {
       throw new IllegalArgumentException(
           "result object cannot be left or right operand");
     }
 
     quotient.update(left);
-    quotient.divideDestructive(right, scale, remainder);
+    quotient.divideDestructive(right, scale);
   }
 
   /**
+   * As of 1/20/2014 this has a known bug in division. See
+   * TestDecimal128.testKnownPriorErrors(). Keeping this source
+   * code available since eventually it is better to fix this.
+   * The fix employed now is to replace this code with code that
+   * uses Java BigDecimal divide.
+   *
    * Performs division, changing the scale of this object to the specified
    * value.
    * <p>
@@ -1128,7 +1133,7 @@ public final class Decimal128 extends Nu
    * @param remainder
    *          object to receive remainder
    */
-  public void divideDestructive(Decimal128 right, short newScale,
+  public void divideDestructiveNativeDecimal128(Decimal128 right, short newScale,
       Decimal128 remainder) {
     if (right.signum == 0) {
       SqlMathUtil.throwZeroDivisionException();
@@ -1174,6 +1179,23 @@ public final class Decimal128 extends Nu
   }
 
   /**
+   * Divide the target object by right, and scale the result to newScale.
+   *
+   * This uses HiveDecimal to get a correct answer with the same rounding
+   * behavior as HiveDecimal, but it is expensive.
+   *
+   * In the future, a native implementation could be faster.
+   */
+  public void divideDestructive(Decimal128 right, short newScale) {
+    HiveDecimal rightHD = HiveDecimal.create(right.toBigDecimal());
+    HiveDecimal thisHD = HiveDecimal.create(this.toBigDecimal());
+    HiveDecimal result = thisHD.divide(rightHD);
+    this.update(result.bigDecimalValue().toPlainString(), newScale);
+    this.unscaledValue.throwIfExceedsTenToThirtyEight();
+  }
+
+
+  /**
    * Makes this {@code Decimal128} a positive number. Unlike
    * java.math.BigDecimal, this method is destructive.
    */
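
For orientation, a minimal sketch of the approach the replacement divideDestructive(Decimal128, short) above takes: compute the quotient with exact decimal arithmetic, then rescale with rounding. It uses only java.math.BigDecimal and is an editor illustration, not the committed Hive code; the rounded values agree with the expectations in the updated testDivide later in this message.

import java.math.BigDecimal;
import java.math.MathContext;
import java.math.RoundingMode;

// Hypothetical sketch of the "delegate to BigDecimal" strategy: full-precision
// quotient first, then rescale to the requested result scale.
public class BigDecimalDivideSketch {

  // Divides a by b, then rounds the result to the requested scale.
  // HALF_UP is assumed here; it is consistent with the rounded results
  // the updated testDivide expects (0.8 for 3/4 at scale 1, 0.6667 for 2/3 at scale 4).
  static BigDecimal divideToScale(BigDecimal a, BigDecimal b, int scale) {
    BigDecimal quotient = a.divide(b, MathContext.DECIMAL128);
    return quotient.setScale(scale, RoundingMode.HALF_UP);
  }

  public static void main(String[] args) {
    System.out.println(divideToScale(new BigDecimal(3), new BigDecimal(4), 1)); // 0.8
    System.out.println(divideToScale(new BigDecimal(2), new BigDecimal(3), 4)); // 0.6667
  }
}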

Modified: hive/branches/tez/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java (original)
+++ hive/branches/tez/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java Tue Jan 28 05:48:03 2014
@@ -146,7 +146,6 @@ public class HiveConf extends Configurat
    */
   public static final HiveConf.ConfVars[] dbVars = {
     HiveConf.ConfVars.HADOOPBIN,
-    HiveConf.ConfVars.HADOOPJT,
     HiveConf.ConfVars.METASTOREWAREHOUSE,
     HiveConf.ConfVars.SCRATCHDIR
   };
@@ -231,22 +230,23 @@ public class HiveConf extends Configurat
     // a symbolic name to reference in the Hive source code. Properties with non-null
     // values will override any values set in the underlying Hadoop configuration.
     HADOOPBIN("hadoop.bin.path", findHadoopBinary()),
-    HADOOPFS("fs.default.name", null),
     HIVE_FS_HAR_IMPL("fs.har.impl", "org.apache.hadoop.hive.shims.HiveHarFileSystem"),
-    HADOOPMAPFILENAME("map.input.file", null),
-    HADOOPMAPREDINPUTDIR("mapred.input.dir", null),
-    HADOOPMAPREDINPUTDIRRECURSIVE("mapred.input.dir.recursive", false),
-    HADOOPJT("mapred.job.tracker", null),
-    MAPREDMAXSPLITSIZE("mapred.max.split.size", 256000000L),
-    MAPREDMINSPLITSIZE("mapred.min.split.size", 1L),
-    MAPREDMINSPLITSIZEPERNODE("mapred.min.split.size.per.rack", 1L),
-    MAPREDMINSPLITSIZEPERRACK("mapred.min.split.size.per.node", 1L),
+    HADOOPFS(ShimLoader.getHadoopShims().getHadoopConfNames().get("HADOOPFS"), null),
+    HADOOPMAPFILENAME(ShimLoader.getHadoopShims().getHadoopConfNames().get("HADOOPMAPFILENAME"), null),
+    HADOOPMAPREDINPUTDIR(ShimLoader.getHadoopShims().getHadoopConfNames().get("HADOOPMAPREDINPUTDIR"), null),
+    HADOOPMAPREDINPUTDIRRECURSIVE(ShimLoader.getHadoopShims().getHadoopConfNames().get("HADOOPMAPREDINPUTDIRRECURSIVE"), false),
+    MAPREDMAXSPLITSIZE(ShimLoader.getHadoopShims().getHadoopConfNames().get("MAPREDMAXSPLITSIZE"), 256000000L),
+    MAPREDMINSPLITSIZE(ShimLoader.getHadoopShims().getHadoopConfNames().get("MAPREDMINSPLITSIZE"), 1L),
+    MAPREDMINSPLITSIZEPERNODE(ShimLoader.getHadoopShims().getHadoopConfNames().get("MAPREDMINSPLITSIZEPERNODE"), 1L),
+    MAPREDMINSPLITSIZEPERRACK(ShimLoader.getHadoopShims().getHadoopConfNames().get("MAPREDMINSPLITSIZEPERRACK"), 1L),
     // The number of reduce tasks per job. Hadoop sets this value to 1 by default
     // By setting this property to -1, Hive will automatically determine the correct
     // number of reducers.
-    HADOOPNUMREDUCERS("mapred.reduce.tasks", -1),
-    HADOOPJOBNAME("mapred.job.name", null),
-    HADOOPSPECULATIVEEXECREDUCERS("mapred.reduce.tasks.speculative.execution", true),
+    HADOOPNUMREDUCERS(ShimLoader.getHadoopShims().getHadoopConfNames().get("HADOOPNUMREDUCERS"), -1),
+    HADOOPJOBNAME(ShimLoader.getHadoopShims().getHadoopConfNames().get("HADOOPJOBNAME"), null),
+    HADOOPSPECULATIVEEXECREDUCERS(ShimLoader.getHadoopShims().getHadoopConfNames().get("HADOOPSPECULATIVEEXECREDUCERS"), true),
+    MAPREDSETUPCLEANUPNEEDED(ShimLoader.getHadoopShims().getHadoopConfNames().get("MAPREDSETUPCLEANUPNEEDED"), false),
+    MAPREDTASKCLEANUPNEEDED(ShimLoader.getHadoopShims().getHadoopConfNames().get("MAPREDTASKCLEANUPNEEDED"), false),
 
     // Metastore stuff. Be sure to update HiveConf.metaVars when you add
     // something here!
@@ -642,6 +642,7 @@ public class HiveConf extends Configurat
     // higher compute cost.
     HIVE_STATS_NDV_ERROR("hive.stats.ndv.error", (float)20.0),
     HIVE_STATS_KEY_PREFIX_MAX_LENGTH("hive.stats.key.prefix.max.length", 150),
+    HIVE_STATS_KEY_PREFIX_RESERVE_LENGTH("hive.stats.key.prefix.reserve.length", 24),
     HIVE_STATS_KEY_PREFIX("hive.stats.key.prefix", ""), // internal usage only
     // if length of variable length data type cannot be determined this length will be used.
     HIVE_STATS_MAX_VARIABLE_LENGTH("hive.stats.max.variable.length", 100),
@@ -795,7 +796,7 @@ public class HiveConf extends Configurat
     // Number of async threads
     HIVE_SERVER2_ASYNC_EXEC_THREADS("hive.server2.async.exec.threads", 100),
     // Number of seconds HiveServer2 shutdown will wait for async threads to terminate
-    HIVE_SERVER2_ASYNC_EXEC_SHUTDOWN_TIMEOUT("hive.server2.async.exec.shutdown.timeout", 10L),
+    HIVE_SERVER2_ASYNC_EXEC_SHUTDOWN_TIMEOUT("hive.server2.async.exec.shutdown.timeout", 10),
     // Size of the wait queue for async thread pool in HiveServer2.
     // After hitting this limit, the async thread pool will reject new requests.
     HIVE_SERVER2_ASYNC_EXEC_WAIT_QUEUE_SIZE("hive.server2.async.exec.wait.queue.size", 100),
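
For orientation, a sketch of the indirection the HiveConf hunk above introduces: symbolic ConfVars keys resolve to Hadoop property names through a per-shim lookup (ShimLoader.getHadoopShims().getHadoopConfNames()), so Hadoop-1 and Hadoop-2 deployments can map the same key to different property names. Only the getHadoopConfNames() call and the Hadoop-1 property names appear in the diff; the Hadoop-2 names and the class below are assumptions for illustration.

import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of a shim publishing symbolic-key -> property-name maps.
public class HadoopConfNamesSketch {

  // What a Hadoop-1-style shim might return (names taken from the removed lines above).
  static Map<String, String> hadoop20ConfNames() {
    Map<String, String> names = new HashMap<String, String>();
    names.put("HADOOPFS", "fs.default.name");
    names.put("MAPREDMAXSPLITSIZE", "mapred.max.split.size");
    names.put("HADOOPNUMREDUCERS", "mapred.reduce.tasks");
    return names;
  }

  // What a Hadoop-2-style shim might return (assumed values, for illustration only).
  static Map<String, String> hadoop23ConfNames() {
    Map<String, String> names = new HashMap<String, String>();
    names.put("HADOOPFS", "fs.defaultFS");
    names.put("MAPREDMAXSPLITSIZE", "mapreduce.input.fileinputformat.split.maxsize");
    names.put("HADOOPNUMREDUCERS", "mapreduce.job.reduces");
    return names;
  }

  public static void main(String[] args) {
    // HiveConf-style lookup: the symbolic key stays stable, the property name varies by shim.
    System.out.println(hadoop20ConfNames().get("MAPREDMAXSPLITSIZE"));
    System.out.println(hadoop23ConfNames().get("MAPREDMAXSPLITSIZE"));
  }
}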

Modified: hive/branches/tez/common/src/test/org/apache/hadoop/hive/common/type/TestDecimal128.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/common/src/test/org/apache/hadoop/hive/common/type/TestDecimal128.java?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/common/src/test/org/apache/hadoop/hive/common/type/TestDecimal128.java (original)
+++ hive/branches/tez/common/src/test/org/apache/hadoop/hive/common/type/TestDecimal128.java Tue Jan 28 05:48:03 2014
@@ -17,6 +17,8 @@ package org.apache.hadoop.hive.common.ty
 
 import static org.junit.Assert.*;
 
+import java.util.Random;
+
 import org.junit.After;
 import org.junit.Before;
 import org.junit.Test;
@@ -47,11 +49,38 @@ public class TestDecimal128 {
 
   @Test
   public void testCalculateTenThirtyEight() {
+
+    // find 10^38
     Decimal128 ten = new Decimal128(10, (short) 0);
     Decimal128 val = new Decimal128(1, (short) 0);
     for (int i = 0; i < 38; ++i) {
       val.multiplyDestructive(ten, (short) 0);
     }
+
+    // verify it
+    String s = val.toFormalString();
+    assertEquals("100000000000000000000000000000000000000", s);
+    boolean overflow = false;
+
+    // show that it is is an overflow for precision 38
+    try {
+      val.checkPrecisionOverflow(38);
+    } catch (Exception e) {
+      overflow = true;
+    }
+    assertTrue(overflow);
+
+    // subtract one
+    val.subtractDestructive(one, (short) 0);
+    overflow = false;
+
+    // show that it does not overflow for precision 38
+    try {
+      val.checkPrecisionOverflow(38);
+    } catch (Exception e) {
+      overflow = true;
+    }
+    assertFalse(overflow);
   }
 
   @Test
@@ -247,28 +276,104 @@ public class TestDecimal128 {
   @Test
   public void testDivide() {
     Decimal128 quotient = new Decimal128();
-    Decimal128 remainder = new Decimal128();
-    Decimal128.divide(two, one, quotient, remainder, (short) 2);
+    Decimal128.divide(two, one, quotient, (short) 2);
     assertEquals(0, quotient.compareTo(two));
-    assertTrue(remainder.isZero());
 
-    Decimal128.divide(two, two, quotient, remainder, (short) 2);
+    Decimal128.divide(two, two, quotient, (short) 2);
     assertEquals(0, quotient.compareTo(one));
-    assertTrue(remainder.isZero());
 
     Decimal128 three = new Decimal128(3);
     Decimal128 four = new Decimal128(4);
-    Decimal128.divide(three, four, quotient, remainder, (short) 2);
+    Decimal128.divide(three, four, quotient, (short) 2);
     assertEquals("0.75", quotient.toFormalString());
-    assertEquals("0", remainder.toFormalString());
 
-    Decimal128.divide(three, four, quotient, remainder, (short) 1);
-    assertEquals("0.7", quotient.toFormalString());
-    assertEquals("0.2", remainder.toFormalString());
-
-    Decimal128.divide(three, four, quotient, remainder, (short) 0);
-    assertEquals("0", quotient.toFormalString());
-    assertEquals("3", remainder.toFormalString());
+    Decimal128.divide(three, four, quotient, (short) 1);
+    assertEquals("0.8", quotient.toFormalString());
+
+    Decimal128.divide(three, four, quotient, (short) 0);
+    assertEquals("1", quotient.toFormalString());
+
+    Decimal128 two = new Decimal128(2);
+    Decimal128.divide(two, three, quotient, (short) 4);
+    assertEquals("0.6667", quotient.toFormalString());
+  }
+
+  @Test
+  public void testRandomMultiplyDivideInverse() {
+    final int N = 100000;
+    final long MASK56 = 0x00FFFFFFFFFFFFFFL; // 56 bit mask to generate positive 56 bit longs
+                                           // from random signed longs
+    int seed = 897089790;
+    Random rand = new Random(seed);
+    long l1, l2;
+    for (int i = 1; i <= N; i++) {
+      l1 = rand.nextLong() & MASK56;
+      l2 = rand.nextLong() & MASK56;
+      verifyMultiplyDivideInverse(l1, l2);
+    }
+  }
+
+  /**
+   * Verify that a * b / b == a
+   * for decimal division for scale 0 with integer inputs.
+   *
+   * Not valid if abs(a * b) >= 10**38.
+   */
+  private void verifyMultiplyDivideInverse(long a, long b) {
+    final short scale = 0;
+
+    // ignore zero-divide cases
+    if (b == 0) {
+      return;
+    }
+    Decimal128 decA = new Decimal128(a, scale);
+    Decimal128 decB = new Decimal128(b, scale);
+    decA.multiplyDestructive(decB, scale);
+    decA.checkPrecisionOverflow(38); // caller must make sure product of inputs is not too big
+    decA.divideDestructive(decB, scale);
+    assertEquals("Error for a = " + Long.toString(a) + ", b = " + Long.toString(b),
+        new Decimal128(a, scale), decA);
+  }
+
+
+  @Test
+  public void testRandomAddSubtractInverse() {
+    final int N = 1000000;
+    int seed = 1427480960;
+    Random rand = new Random(seed);
+    long l1, l2;
+    for (int i = 1; i <= N; i++) {
+      l1 = rand.nextLong();
+      l2 = rand.nextLong();
+      verifyAddSubtractInverse(l1, l2);
+    }
+  }
+
+  /**
+   * Verify that (a + b) - b == a
+   * for decimal add and subtract for scale 0 with long integer inputs.
+   */
+  private void verifyAddSubtractInverse(long a, long b) {
+    final short scale = 0;
+    Decimal128 decA = new Decimal128(a, scale);
+    Decimal128 decB = new Decimal128(b, scale);
+    decA.addDestructive(decB, scale);
+
+    decA.subtractDestructive(decB, scale);
+    assertEquals("Error for a = " + Long.toString(a) + ", b = " + Long.toString(b),
+        new Decimal128(a, scale), decA);
+  }
+
+  /**
+   * Errors found during earlier code testing are reproduced here as regression tests.
+   */
+  @Test
+  public void testKnownPriorErrors() {
+
+    // Regression test for defect reported in HIVE-6243
+    long a = 213474114411690L;
+    long b = 5062120663L;
+    verifyMultiplyDivideInverse(a, b);
   }
 
   @Test
@@ -281,13 +386,12 @@ public class TestDecimal128 {
     Decimal128 current = new Decimal128(1, SCALE);
     Decimal128 multiplier = new Decimal128();
     Decimal128 dividor = new Decimal128();
-    Decimal128 remainder = new Decimal128();
     Decimal128 one = new Decimal128(1);
     for (int i = LOOPS; i > 0; --i) {
       multiplier.update(i, SCALE);
       current.multiplyDestructive(multiplier, SCALE);
       dividor.update(1 + 2 * i, SCALE);
-      current.divideDestructive(dividor, SCALE, remainder);
+      current.divideDestructive(dividor, SCALE);
       current.addDestructive(one, SCALE);
     }
     current.multiplyDestructive(new Decimal128(2), SCALE);
@@ -307,17 +411,16 @@ public class TestDecimal128 {
     Decimal128 total = new Decimal128(0);
     Decimal128 multiplier = new Decimal128();
     Decimal128 dividor = new Decimal128();
-    Decimal128 remainder = new Decimal128();
     Decimal128 current = new Decimal128();
     for (int i = 0; i < LOOPS; ++i) {
       current.update(3, SCALE);
       dividor.update(2 * i + 1, SCALE);
-      current.divideDestructive(dividor, SCALE, remainder);
+      current.divideDestructive(dividor, SCALE);
       for (int j = 1; j <= i; ++j) {
         multiplier.update(i + j, SCALE);
         dividor.update(16 * j, SCALE);
         current.multiplyDestructive(multiplier, SCALE);
-        current.divideDestructive(dividor, SCALE, remainder);
+        current.divideDestructive(dividor, SCALE);
       }
 
       total.addDestructive(current, SCALE);
@@ -329,16 +432,15 @@ public class TestDecimal128 {
   @Test
   public void testDoubleValue() {
     Decimal128 quotient = new Decimal128();
-    Decimal128 remainder = new Decimal128();
 
     Decimal128 three = new Decimal128(3);
     Decimal128 four = new Decimal128(9);
-    Decimal128.divide(three, four, quotient, remainder, (short) 38);
+    Decimal128.divide(three, four, quotient, (short) 38);
     assertEquals(0.33333333333333333333333333d, quotient.doubleValue(),
         0.0000000000000000000000001d);
 
     Decimal128 minusThree = new Decimal128(-3);
-    Decimal128.divide(minusThree, four, quotient, remainder, (short) 38);
+    Decimal128.divide(minusThree, four, quotient, (short) 38);
     assertEquals(-0.33333333333333333333333333d, quotient.doubleValue(),
         0.0000000000000000000000001d);
   }
@@ -346,15 +448,14 @@ public class TestDecimal128 {
   @Test
   public void testFloatValue() {
     Decimal128 quotient = new Decimal128();
-    Decimal128 remainder = new Decimal128();
 
     Decimal128 three = new Decimal128(3);
     Decimal128 four = new Decimal128(9);
-    Decimal128.divide(three, four, quotient, remainder, (short) 38);
+    Decimal128.divide(three, four, quotient, (short) 38);
     assertEquals(0.3333333333333333f, quotient.floatValue(), 0.00000000001f);
 
     Decimal128 minusThree = new Decimal128(-3);
-    Decimal128.divide(minusThree, four, quotient, remainder, (short) 38);
+    Decimal128.divide(minusThree, four, quotient, (short) 38);
     assertEquals(-0.333333333333333f, quotient.floatValue(), 0.00000000001f);
   }
 
@@ -410,5 +511,29 @@ public class TestDecimal128 {
       fail();
     } catch (ArithmeticException ex) {
     }
+
+    // Try the extremes of precision and scale.
+
+    // digit  measuring stick:
+    //                12345678901234567890123456789012345678
+    new Decimal128("0.99999999999999999999999999999999999999", (short) 38)
+      .checkPrecisionOverflow(38);
+
+    try {
+      new Decimal128("0.99999999999999999999999999999999999999", (short) 38)
+        .checkPrecisionOverflow(37);
+      fail();
+    } catch (ArithmeticException ex) {
+    }
+
+    new Decimal128("99999999999999999999999999999999999999", (short) 0)
+      .checkPrecisionOverflow(38);
+
+    try {
+      new Decimal128("99999999999999999999999999999999999999", (short) 0)
+        .checkPrecisionOverflow(37);
+      fail();
+    } catch (ArithmeticException ex) {
+    }
   }
 }
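
Editorial sketch (not part of r1561947): the test changes above track a new Decimal128.divide signature with no remainder output; the quotient is rounded to the requested scale (3/4 at scale 1 yields "0.8" in testDivide). A minimal usage example:

    import org.apache.hadoop.hive.common.type.Decimal128;

    public class Decimal128DivideSketch {
      public static void main(String[] args) {
        Decimal128 quotient = new Decimal128();
        // New signature: dividend, divisor, quotient, scale - no remainder argument.
        Decimal128.divide(new Decimal128(3), new Decimal128(4), quotient, (short) 1);
        System.out.println(quotient.toFormalString()); // "0.8", rounded to scale 1
      }
    }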

Modified: hive/branches/tez/conf/hive-default.xml.template
URL: http://svn.apache.org/viewvc/hive/branches/tez/conf/hive-default.xml.template?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/conf/hive-default.xml.template (original)
+++ hive/branches/tez/conf/hive-default.xml.template Tue Jan 28 05:48:03 2014
@@ -177,8 +177,8 @@
 
 <property>
   <name>datanucleus.connectionPoolingType</name>
-  <value>DBCP</value>
-  <description>Uses a DBCP connection pool for JDBC metastore</description>
+  <value>BoneCP</value>
+  <description>Uses a BoneCP connection pool for JDBC metastore</description>
 </property>
 
 <property>
@@ -1307,6 +1307,17 @@
     exceeds a certain length, a hash of the key is used instead.  If the value &lt; 0 then hashing
     is never used, if the value >= 0 then hashing is used only when the key prefixes length
     exceeds that value.  The key prefix is defined as everything preceding the task ID in the key.
+    For counter type stats, the length is also capped by mapreduce.job.counters.group.name.max, which defaults to 128.
+  </description>
+</property>
+
+<property>
+  <name>hive.stats.key.prefix.reserve.length</name>
+  <value>24</value>
+  <description>
+    Reserved length for the postfix of the stats key. Currently only meaningful for the counter type,
+    which should keep the full stats key shorter than the maximum configured by hive.stats.key.prefix.max.length.
+    For the counter type, it should be larger than the length of the LB spec, if one exists.
   </description>
 </property>
 
@@ -2127,7 +2138,7 @@
 
 <property>
   <name>hive.metastore.integral.jdo.pushdown</name>
-  <value>false</false>
+  <value>false</value>
   <description>
    Allow JDO query pushdown for integral partition columns in metastore. Off by default. This
    improves metastore perf for integral columns, especially if there's a large number of partitions.
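
Editorial sketch (not part of r1561947): one plausible reading of how hive.stats.key.prefix.max.length and the new hive.stats.key.prefix.reserve.length (described above) interact for counter-type stats is that the reserve leaves room for the key postfix so the full key stays under the configured maximum. The helper name and exact arithmetic below are assumptions for illustration only, not Hive's implementation:

    public class StatsKeyBudgetSketch {
      // Hypothetical helper: prefix budget left once the postfix reservation is
      // subtracted from the configured maximum (assumed semantics).
      static int effectivePrefixBudget(int maxLength, int reserveLength) {
        return Math.max(0, maxLength - reserveLength);
      }

      public static void main(String[] args) {
        // Template defaults above: max length 150, reserve length 24.
        System.out.println(effectivePrefixBudget(150, 24)); // 126
      }
    }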

Propchange: hive/branches/tez/contrib/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Tue Jan 28 05:48:03 2014
@@ -0,0 +1 @@
+target

Modified: hive/branches/tez/data/conf/hive-site.xml
URL: http://svn.apache.org/viewvc/hive/branches/tez/data/conf/hive-site.xml?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/data/conf/hive-site.xml (original)
+++ hive/branches/tez/data/conf/hive-site.xml Tue Jan 28 05:48:03 2014
@@ -183,6 +183,12 @@
 <property>
   <name>hive.stats.dbclass</name>
   <value>jdbc:derby</value>
-  <description>The default storatge that stores temporary hive statistics. Currently, jdbc, hbase and counter type is supported</description>
+  <description>The storage for temporary stats generated by tasks. Currently, jdbc, hbase and counter types are supported</description>
 </property>
+
+<property>
+  <name>hive.stats.key.prefix.reserve.length</name>
+  <value>0</value>
+</property>
+
 </configuration>

Modified: hive/branches/tez/data/files/datatypes.txt
URL: http://svn.apache.org/viewvc/hive/branches/tez/data/files/datatypes.txt?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/data/files/datatypes.txt (original)
+++ hive/branches/tez/data/files/datatypes.txt Tue Jan 28 05:48:03 2014
@@ -1,3 +1,3 @@
 \N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N\N
 -1false-1.1\N\N\N-1-1-1.0-1\N\N\N\N\N\N\N\N
-1true1.11121x2ykva92.2111.01abcd1111213142212212x1abcd22012-04-22 09:00:00.123456789123456789.0123456YWJjZA==2013-01-01abc123abc123
+1true1.11121x2ykva92.2111.01abcd1111213142212212x1abcd22012-04-22 09:00:00.123456789123456789.0123456YWJjZA==2013-01-01abc123abc123X'01FF'

Propchange: hive/branches/tez/hbase-handler/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Tue Jan 28 05:48:03 2014
@@ -0,0 +1 @@
+target

Modified: hive/branches/tez/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseStorageHandler.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseStorageHandler.java?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseStorageHandler.java (original)
+++ hive/branches/tez/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HBaseStorageHandler.java Tue Jan 28 05:48:03 2014
@@ -333,8 +333,10 @@ public class HBaseStorageHandler extends
     // do this for reconciling HBaseStorageHandler for use in HCatalog
     // check to see if this is an input job or an output job
     if (this.configureInputJobProps) {
+      for (String k : jobProperties.keySet()) {
+        jobConf.set(k, jobProperties.get(k));
+      }
       try {
-        HBaseConfiguration.addHbaseResources(jobConf);
         addHBaseDelegationToken(jobConf);
       }//try
       catch (IOException e) {
@@ -342,8 +344,6 @@ public class HBaseStorageHandler extends
       } //input job properties
     }
     else {
-      Configuration copyOfConf = new Configuration(jobConf);
-      HBaseConfiguration.addHbaseResources(copyOfConf);
       jobProperties.put(TableOutputFormat.OUTPUT_TABLE, tableName);
     } // output job properties
   }
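
Editorial sketch (not part of r1561947): for input jobs, HBaseStorageHandler above now copies the computed job properties directly into the JobConf (rather than calling HBaseConfiguration.addHbaseResources) before adding the HBase delegation token. A minimal illustration of the copy pattern; the property name and value here are placeholders:

    import java.util.HashMap;
    import java.util.Map;

    import org.apache.hadoop.mapred.JobConf;

    public class JobPropsCopySketch {
      public static void main(String[] args) {
        Map<String, String> jobProperties = new HashMap<String, String>();
        jobProperties.put("hbase.zookeeper.quorum", "localhost"); // illustrative only
        JobConf jobConf = new JobConf();
        // Same copy loop as the input-job branch above.
        for (String k : jobProperties.keySet()) {
          jobConf.set(k, jobProperties.get(k));
        }
        System.out.println(jobConf.get("hbase.zookeeper.quorum"));
      }
    }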

Modified: hive/branches/tez/hbase-handler/src/test/queries/positive/hbase_bulk.m
URL: http://svn.apache.org/viewvc/hive/branches/tez/hbase-handler/src/test/queries/positive/hbase_bulk.m?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hbase-handler/src/test/queries/positive/hbase_bulk.m (original)
+++ hive/branches/tez/hbase-handler/src/test/queries/positive/hbase_bulk.m Tue Jan 28 05:48:03 2014
@@ -20,7 +20,8 @@ stored as
 inputformat
 'org.apache.hadoop.mapred.TextInputFormat'
 outputformat
-'org.apache.hadoop.hive.ql.io.HiveNullValueSequenceFileOutputFormat';
+'org.apache.hadoop.hive.ql.io.HiveNullValueSequenceFileOutputFormat'
+location '/tmp/data/hbpartition';
 
 -- this should produce one file, but we do not
 -- know what it will be called, so we will copy it to a well known
@@ -30,13 +31,15 @@ select distinct value
 from src
 where value='val_100' or value='val_200';
 
-dfs -count /build/ql/test/data/warehouse/hbpartition;
-dfs -cp /build/ql/test/data/warehouse/hbpartition/* /tmp/hbpartition.lst;
+dfs -count /tmp/data/hbpartition;
+dfs -cp /tmp/data/hbpartition/* /tmp/hbpartition.lst;
 
 set mapred.reduce.tasks=3;
 set hive.mapred.partitioner=org.apache.hadoop.mapred.lib.TotalOrderPartitioner;
 set total.order.partitioner.natural.order=false;
 set total.order.partitioner.path=/tmp/hbpartition.lst;
+set mapreduce.totalorderpartitioner.naturalorder=false;
+set mapreduce.totalorderpartitioner.path=/tmp/hbpartition.lst;
 
 -- this should produce three files in /tmp/hbsort/cf
 -- include some trailing blanks and nulls to make sure we handle them correctly

Modified: hive/branches/tez/hbase-handler/src/test/queries/positive/hbase_queries.q
URL: http://svn.apache.org/viewvc/hive/branches/tez/hbase-handler/src/test/queries/positive/hbase_queries.q?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hbase-handler/src/test/queries/positive/hbase_queries.q (original)
+++ hive/branches/tez/hbase-handler/src/test/queries/positive/hbase_queries.q Tue Jan 28 05:48:03 2014
@@ -37,7 +37,7 @@ ORDER BY key, value LIMIT 20;
 EXPLAIN 
 SELECT Y.*
 FROM 
-(SELECT hbase_table_1.* FROM hbase_table_1 WHERE hbase_table_1.key > 100) x
+(SELECT hbase_table_1.* FROM hbase_table_1 WHERE 100 < hbase_table_1.key) x
 JOIN 
 (SELECT hbase_table_2.* FROM hbase_table_2 WHERE hbase_table_2.key < 120) Y
 ON (x.key = Y.key)
@@ -45,7 +45,7 @@ ORDER BY key, value;
 
 SELECT Y.*
 FROM 
-(SELECT hbase_table_1.* FROM hbase_table_1 WHERE hbase_table_1.key > 100) x
+(SELECT hbase_table_1.* FROM hbase_table_1 WHERE 100 < hbase_table_1.key) x
 JOIN 
 (SELECT hbase_table_2.* FROM hbase_table_2 WHERE hbase_table_2.key < 120) Y
 ON (x.key = Y.key)

Modified: hive/branches/tez/hbase-handler/src/test/results/negative/cascade_dbdrop.q.out
URL: http://svn.apache.org/viewvc/hive/branches/tez/hbase-handler/src/test/results/negative/cascade_dbdrop.q.out?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hbase-handler/src/test/results/negative/cascade_dbdrop.q.out (original)
+++ hive/branches/tez/hbase-handler/src/test/results/negative/cascade_dbdrop.q.out Tue Jan 28 05:48:03 2014
@@ -37,7 +37,11 @@ Found 3 items
 #### A masked pattern was here ####
 PREHOOK: query: DROP DATABASE IF EXISTS hbaseDB CASCADE
 PREHOOK: type: DROPDATABASE
+PREHOOK: Input: database:hbasedb
+PREHOOK: Output: database:hbasedb
 POSTHOOK: query: DROP DATABASE IF EXISTS hbaseDB CASCADE
 POSTHOOK: type: DROPDATABASE
+POSTHOOK: Input: database:hbasedb
+POSTHOOK: Output: database:hbasedb
 Command failed with exit code = 1
 Query returned non-zero code: 1, cause: null

Modified: hive/branches/tez/hbase-handler/src/test/results/positive/hbase_bulk.m.out
URL: http://svn.apache.org/viewvc/hive/branches/tez/hbase-handler/src/test/results/positive/hbase_bulk.m.out?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hbase-handler/src/test/results/positive/hbase_bulk.m.out (original)
+++ hive/branches/tez/hbase-handler/src/test/results/positive/hbase_bulk.m.out Tue Jan 28 05:48:03 2014
@@ -33,6 +33,7 @@ inputformat
 'org.apache.hadoop.mapred.TextInputFormat'
 outputformat
 'org.apache.hadoop.hive.ql.io.HiveNullValueSequenceFileOutputFormat'
+#### A masked pattern was here ####
 PREHOOK: type: CREATETABLE
 POSTHOOK: query: -- this is a dummy table used for controlling how the input file
 -- for TotalOrderPartitioner is created
@@ -44,6 +45,7 @@ inputformat
 'org.apache.hadoop.mapred.TextInputFormat'
 outputformat
 'org.apache.hadoop.hive.ql.io.HiveNullValueSequenceFileOutputFormat'
+#### A masked pattern was here ####
 POSTHOOK: type: CREATETABLE
 POSTHOOK: Output: default@hbpartition
 PREHOOK: query: -- this should produce one file, but we do not

Modified: hive/branches/tez/hbase-handler/src/test/results/positive/hbase_queries.q.out
URL: http://svn.apache.org/viewvc/hive/branches/tez/hbase-handler/src/test/results/positive/hbase_queries.q.out?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hbase-handler/src/test/results/positive/hbase_queries.q.out (original)
+++ hive/branches/tez/hbase-handler/src/test/results/positive/hbase_queries.q.out Tue Jan 28 05:48:03 2014
@@ -262,7 +262,7 @@ POSTHOOK: Input: default@src
 PREHOOK: query: EXPLAIN 
 SELECT Y.*
 FROM 
-(SELECT hbase_table_1.* FROM hbase_table_1 WHERE hbase_table_1.key > 100) x
+(SELECT hbase_table_1.* FROM hbase_table_1 WHERE 100 < hbase_table_1.key) x
 JOIN 
 (SELECT hbase_table_2.* FROM hbase_table_2 WHERE hbase_table_2.key < 120) Y
 ON (x.key = Y.key)
@@ -271,14 +271,14 @@ PREHOOK: type: QUERY
 POSTHOOK: query: EXPLAIN 
 SELECT Y.*
 FROM 
-(SELECT hbase_table_1.* FROM hbase_table_1 WHERE hbase_table_1.key > 100) x
+(SELECT hbase_table_1.* FROM hbase_table_1 WHERE 100 < hbase_table_1.key) x
 JOIN 
 (SELECT hbase_table_2.* FROM hbase_table_2 WHERE hbase_table_2.key < 120) Y
 ON (x.key = Y.key)
 ORDER BY key, value
 POSTHOOK: type: QUERY
 ABSTRACT SYNTAX TREE:
-  (TOK_QUERY (TOK_FROM (TOK_JOIN (TOK_SUBQUERY (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME hbase_table_1))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_ALLCOLREF (TOK_TABNAME hbase_table_1)))) (TOK_WHERE (> (. (TOK_TABLE_OR_COL hbase_table_1) key) 100)))) x) (TOK_SUBQUERY (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME hbase_table_2))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_ALLCOLREF (TOK_TABNAME hbase_table_2)))) (TOK_WHERE (< (. (TOK_TABLE_OR_COL hbase_table_2) key) 120)))) Y) (= (. (TOK_TABLE_OR_COL x) key) (. (TOK_TABLE_OR_COL Y) key)))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_ALLCOLREF (TOK_TABNAME Y)))) (TOK_ORDERBY (TOK_TABSORTCOLNAMEASC (TOK_TABLE_OR_COL key)) (TOK_TABSORTCOLNAMEASC (TOK_TABLE_OR_COL value)))))
+  (TOK_QUERY (TOK_FROM (TOK_JOIN (TOK_SUBQUERY (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME hbase_table_1))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_ALLCOLREF (TOK_TABNAME hbase_table_1)))) (TOK_WHERE (< 100 (. (TOK_TABLE_OR_COL hbase_table_1) key))))) x) (TOK_SUBQUERY (TOK_QUERY (TOK_FROM (TOK_TABREF (TOK_TABNAME hbase_table_2))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_ALLCOLREF (TOK_TABNAME hbase_table_2)))) (TOK_WHERE (< (. (TOK_TABLE_OR_COL hbase_table_2) key) 120)))) Y) (= (. (TOK_TABLE_OR_COL x) key) (. (TOK_TABLE_OR_COL Y) key)))) (TOK_INSERT (TOK_DESTINATION (TOK_DIR TOK_TMP_FILE)) (TOK_SELECT (TOK_SELEXPR (TOK_ALLCOLREF (TOK_TABNAME Y)))) (TOK_ORDERBY (TOK_TABSORTCOLNAMEASC (TOK_TABLE_OR_COL key)) (TOK_TABSORTCOLNAMEASC (TOK_TABLE_OR_COL value)))))
 
 STAGE DEPENDENCIES:
   Stage-1 is a root stage
@@ -294,7 +294,7 @@ STAGE PLANS:
             alias: hbase_table_1
             Filter Operator
               predicate:
-                  expr: (key > 100)
+                  expr: (100 < key)
                   type: boolean
               Select Operator
                 expressions:
@@ -396,7 +396,7 @@ STAGE PLANS:
 
 PREHOOK: query: SELECT Y.*
 FROM 
-(SELECT hbase_table_1.* FROM hbase_table_1 WHERE hbase_table_1.key > 100) x
+(SELECT hbase_table_1.* FROM hbase_table_1 WHERE 100 < hbase_table_1.key) x
 JOIN 
 (SELECT hbase_table_2.* FROM hbase_table_2 WHERE hbase_table_2.key < 120) Y
 ON (x.key = Y.key)
@@ -407,7 +407,7 @@ PREHOOK: Input: default@hbase_table_2
 #### A masked pattern was here ####
 POSTHOOK: query: SELECT Y.*
 FROM 
-(SELECT hbase_table_1.* FROM hbase_table_1 WHERE hbase_table_1.key > 100) x
+(SELECT hbase_table_1.* FROM hbase_table_1 WHERE 100 < hbase_table_1.key) x
 JOIN 
 (SELECT hbase_table_2.* FROM hbase_table_2 WHERE hbase_table_2.key < 120) Y
 ON (x.key = Y.key)

Modified: hive/branches/tez/hbase-handler/src/test/templates/TestHBaseCliDriver.vm
URL: http://svn.apache.org/viewvc/hive/branches/tez/hbase-handler/src/test/templates/TestHBaseCliDriver.vm?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hbase-handler/src/test/templates/TestHBaseCliDriver.vm (original)
+++ hive/branches/tez/hbase-handler/src/test/templates/TestHBaseCliDriver.vm Tue Jan 28 05:48:03 2014
@@ -120,21 +120,17 @@ public class $className extends TestCase
       qt.clearTestSideEffects();
       int ecode = qt.executeClient(fname);
       if (ecode != 0) {
-        fail("Client Execution failed with error code = " + ecode);
+        qt.failed(ecode, fname, null);
       }
 
       ecode = qt.checkCliDriverResults(fname);
       if (ecode != 0) {
-        fail("Client execution results failed with error code = " + ecode);
+        qt.failedDiff(ecode, fname, null);
       }
       qt.clearPostTestEffects();
 
     } catch (Throwable e) {
-      System.err.println("Exception: " + e.getMessage());
-      e.printStackTrace();
-      System.err.println("Failed query: " + fname);
-      System.err.flush();
-      fail("Unexpected exception");
+      qt.failed(e, fname, null);
     }
 
     long elapsedTime = System.currentTimeMillis() - startTime;

Modified: hive/branches/tez/hbase-handler/src/test/templates/TestHBaseNegativeCliDriver.vm
URL: http://svn.apache.org/viewvc/hive/branches/tez/hbase-handler/src/test/templates/TestHBaseNegativeCliDriver.vm?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hbase-handler/src/test/templates/TestHBaseNegativeCliDriver.vm (original)
+++ hive/branches/tez/hbase-handler/src/test/templates/TestHBaseNegativeCliDriver.vm Tue Jan 28 05:48:03 2014
@@ -120,21 +120,17 @@ public class $className extends TestCase
       qt.clearTestSideEffects();
       int ecode = qt.executeClient(fname);
       if (ecode == 0) {
-        fail("Client Execution failed with error code = " + ecode);
+        qt.failed(fname, null);
       }
 
       ecode = qt.checkCliDriverResults(fname);
       if (ecode != 0) {
-        fail("Client execution results failed with error code = " + ecode);
+        qt.failedDiff(ecode, fname, null);
       }
       qt.clearPostTestEffects();
 
     } catch (Throwable e) {
-      System.err.println("Exception: " + e.getMessage());
-      e.printStackTrace();
-      System.err.println("Failed query: " + fname);
-      System.err.flush();
-      fail("Unexpected exception");
+      qt.failed(e, fname, null);
     }
 
     long elapsedTime = System.currentTimeMillis() - startTime;

Propchange: hive/branches/tez/hcatalog/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Tue Jan 28 05:48:03 2014
@@ -0,0 +1 @@
+target

Propchange: hive/branches/tez/hcatalog/core/
------------------------------------------------------------------------------
--- svn:ignore (added)
+++ svn:ignore Tue Jan 28 05:48:03 2014
@@ -0,0 +1 @@
+target

Modified: hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateTableHook.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateTableHook.java?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateTableHook.java (original)
+++ hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/cli/SemanticAnalysis/CreateTableHook.java Tue Jan 28 05:48:03 2014
@@ -49,6 +49,7 @@ import org.apache.hcatalog.mapreduce.HCa
 /**
  * @deprecated Use/modify {@link org.apache.hive.hcatalog.cli.SemanticAnalysis.CreateTableHook} instead
  */
+@Deprecated
 final class CreateTableHook extends HCatSemanticAnalyzerBase {
 
   private String tableName;
@@ -216,7 +217,7 @@ final class CreateTableHook extends HCat
       try {
         Table table = context.getHive().newTable(desc.getTableName());
         if (desc.getLocation() != null) {
-          table.setDataLocation(new Path(desc.getLocation()).toUri());
+          table.setDataLocation(new Path(desc.getLocation()));
         }
         if (desc.getStorageHandler() != null) {
           table.setProperty(

Modified: hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java (original)
+++ hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java Tue Jan 28 05:48:03 2014
@@ -78,7 +78,7 @@ public class HCatSemanticAnalyzer extend
     case HiveParser.TOK_ALTERTABLE_PARTITION:
       if (((ASTNode) ast.getChild(1)).getToken().getType() == HiveParser.TOK_ALTERTABLE_FILEFORMAT) {
         return ast;
-      } else if (((ASTNode) ast.getChild(1)).getToken().getType() == HiveParser.TOK_ALTERTABLE_ALTERPARTS_MERGEFILES) {
+      } else if (((ASTNode) ast.getChild(1)).getToken().getType() == HiveParser.TOK_ALTERTABLE_MERGEFILES) {
         // unsupported
         throw new SemanticException("Operation not supported.");
       } else {

Modified: hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/security/HdfsAuthorizationProvider.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/security/HdfsAuthorizationProvider.java?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/security/HdfsAuthorizationProvider.java (original)
+++ hive/branches/tez/hcatalog/core/src/main/java/org/apache/hcatalog/security/HdfsAuthorizationProvider.java Tue Jan 28 05:48:03 2014
@@ -208,7 +208,7 @@ public class HdfsAuthorizationProvider e
     if (part == null || part.getLocation() == null) {
       authorize(table, readRequiredPriv, writeRequiredPriv);
     } else {
-      authorize(part.getPartitionPath(), readRequiredPriv, writeRequiredPriv);
+      authorize(part.getDataLocation(), readRequiredPriv, writeRequiredPriv);
     }
   }
 

Modified: hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/CreateTableHook.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/CreateTableHook.java?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/CreateTableHook.java (original)
+++ hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/CreateTableHook.java Tue Jan 28 05:48:03 2014
@@ -202,7 +202,7 @@ final class CreateTableHook extends HCat
             desc.getSerName(),
             desc.getInputFormat(),
             desc.getOutputFormat());
-        //Authorization checks are performed by the storageHandler.getAuthorizationProvider(), if  
+        //Authorization checks are performed by the storageHandler.getAuthorizationProvider(), if
         //StorageDelegationAuthorizationProvider is used.
       } catch (IOException e) {
         throw new SemanticException(e);
@@ -213,7 +213,7 @@ final class CreateTableHook extends HCat
       try {
         Table table = context.getHive().newTable(desc.getTableName());
         if (desc.getLocation() != null) {
-          table.setDataLocation(new Path(desc.getLocation()).toUri());
+          table.setDataLocation(new Path(desc.getLocation()));
         }
         if (desc.getStorageHandler() != null) {
           table.setProperty(

Modified: hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java?rev=1561947&r1=1561946&r2=1561947&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java (original)
+++ hive/branches/tez/hcatalog/core/src/main/java/org/apache/hive/hcatalog/cli/SemanticAnalysis/HCatSemanticAnalyzer.java Tue Jan 28 05:48:03 2014
@@ -75,7 +75,7 @@ public class HCatSemanticAnalyzer extend
     case HiveParser.TOK_ALTERTABLE_PARTITION:
       if (((ASTNode) ast.getChild(1)).getToken().getType() == HiveParser.TOK_ALTERTABLE_FILEFORMAT) {
         return ast;
-      } else if (((ASTNode) ast.getChild(1)).getToken().getType() == HiveParser.TOK_ALTERTABLE_ALTERPARTS_MERGEFILES) {
+      } else if (((ASTNode) ast.getChild(1)).getToken().getType() == HiveParser.TOK_ALTERTABLE_MERGEFILES) {
         // unsupported
         throw new SemanticException("Operation not supported.");
       } else {


