hive-commits mailing list archives

From: gunt...@apache.org
Subject: svn commit: r1527883 [1/6] - in /hive/branches/tez: ./ ant/src/org/apache/hadoop/hive/ant/ beeline/src/java/org/apache/hive/beeline/ bin/ common/src/java/org/apache/hadoop/hive/conf/ conf/ contrib/src/java/org/apache/hadoop/hive/contrib/fileformat/base...
Date: Tue, 01 Oct 2013 04:48:48 GMT
Author: gunther
Date: Tue Oct  1 04:48:44 2013
New Revision: 1527883

URL: http://svn.apache.org/r1527883
Log:
Merge latest trunk into branch (Gunther Hagleitner)

Added:
    hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/GenVectorCode.java
      - copied unchanged from r1527867, hive/trunk/ant/src/org/apache/hadoop/hive/ant/GenVectorCode.java
    hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/GenVectorTestCode.java
      - copied unchanged from r1527867, hive/trunk/ant/src/org/apache/hadoop/hive/ant/GenVectorTestCode.java
    hive/branches/tez/data/files/alltypesorc
      - copied unchanged from r1527867, hive/trunk/data/files/alltypesorc
    hive/branches/tez/data/files/exported_table/
      - copied from r1527867, hive/trunk/data/files/exported_table/
    hive/branches/tez/hcatalog/bin/templeton.cmd
      - copied unchanged from r1527867, hive/trunk/hcatalog/bin/templeton.cmd
    hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/jobstatus.conf
      - copied unchanged from r1527867, hive/trunk/hcatalog/src/test/e2e/templeton/tests/jobstatus.conf
    hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/jobsubmission_streaming.conf
      - copied unchanged from r1527867, hive/trunk/hcatalog/src/test/e2e/templeton/tests/jobsubmission_streaming.conf
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HttpBasicAuthInterceptor.java
      - copied unchanged from r1527867, hive/trunk/jdbc/src/java/org/apache/hive/jdbc/HttpBasicAuthInterceptor.java
    hive/branches/tez/ql/src/gen/vectorization/
      - copied from r1527867, hive/trunk/ql/src/gen/vectorization/
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/
      - copied from r1527867, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/CommonRCFileInputFormat.java
      - copied unchanged from r1527867, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/io/CommonRCFileInputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/FSRecordWriter.java
      - copied unchanged from r1527867, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/io/FSRecordWriter.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/VectorizedRCFileInputFormat.java
      - copied unchanged from r1527867, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/io/VectorizedRCFileInputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/VectorizedRCFileRecordReader.java
      - copied unchanged from r1527867, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/io/VectorizedRCFileRecordReader.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/VectorizedOrcInputFormat.java
      - copied unchanged from r1527867, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/io/orc/VectorizedOrcInputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/VectorizedOrcSerde.java
      - copied unchanged from r1527867, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/io/orc/VectorizedOrcSerde.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java
      - copied unchanged from r1527867, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/LeadLagInfo.java
      - copied unchanged from r1527867, hive/trunk/ql/src/java/org/apache/hadoop/hive/ql/parse/LeadLagInfo.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/exec/vector/
      - copied from r1527867, hive/trunk/ql/src/test/org/apache/hadoop/hive/ql/exec/vector/
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/io/orc/TestVectorizedORCReader.java
      - copied unchanged from r1527867, hive/trunk/ql/src/test/org/apache/hadoop/hive/ql/io/orc/TestVectorizedORCReader.java
    hive/branches/tez/ql/src/test/queries/clientnegative/illegal_partition_type.q
      - copied unchanged from r1527867, hive/trunk/ql/src/test/queries/clientnegative/illegal_partition_type.q
    hive/branches/tez/ql/src/test/queries/clientnegative/illegal_partition_type2.q
      - copied unchanged from r1527867, hive/trunk/ql/src/test/queries/clientnegative/illegal_partition_type2.q
    hive/branches/tez/ql/src/test/queries/clientpositive/cast_to_int.q
      - copied unchanged from r1527867, hive/trunk/ql/src/test/queries/clientpositive/cast_to_int.q
    hive/branches/tez/ql/src/test/queries/clientpositive/import_exported_table.q
      - copied unchanged from r1527867, hive/trunk/ql/src/test/queries/clientpositive/import_exported_table.q
    hive/branches/tez/ql/src/test/queries/clientpositive/partition_type_check.q
      - copied unchanged from r1527867, hive/trunk/ql/src/test/queries/clientpositive/partition_type_check.q
    hive/branches/tez/ql/src/test/queries/clientpositive/vectorization_short_regress.q
      - copied unchanged from r1527867, hive/trunk/ql/src/test/queries/clientpositive/vectorization_short_regress.q
    hive/branches/tez/ql/src/test/queries/clientpositive/vectorized_rcfile_columnar.q
      - copied unchanged from r1527867, hive/trunk/ql/src/test/queries/clientpositive/vectorized_rcfile_columnar.q
    hive/branches/tez/ql/src/test/results/clientnegative/illegal_partition_type.q.out
      - copied unchanged from r1527867, hive/trunk/ql/src/test/results/clientnegative/illegal_partition_type.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/illegal_partition_type2.q.out
      - copied unchanged from r1527867, hive/trunk/ql/src/test/results/clientnegative/illegal_partition_type2.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/cast_to_int.q.out
      - copied unchanged from r1527867, hive/trunk/ql/src/test/results/clientpositive/cast_to_int.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/import_exported_table.q.out
      - copied unchanged from r1527867, hive/trunk/ql/src/test/results/clientpositive/import_exported_table.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/partition_type_check.q.out
      - copied unchanged from r1527867, hive/trunk/ql/src/test/results/clientpositive/partition_type_check.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/vectorization_short_regress.q.out
      - copied unchanged from r1527867, hive/trunk/ql/src/test/results/clientpositive/vectorization_short_regress.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/vectorized_rcfile_columnar.q.out
      - copied unchanged from r1527867, hive/trunk/ql/src/test/results/clientpositive/vectorized_rcfile_columnar.q.out
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/thrift/EmbeddedThriftBinaryCLIService.java
      - copied unchanged from r1527867, hive/trunk/service/src/java/org/apache/hive/service/cli/thrift/EmbeddedThriftBinaryCLIService.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/thrift/ThriftBinaryCLIService.java
      - copied unchanged from r1527867, hive/trunk/service/src/java/org/apache/hive/service/cli/thrift/ThriftBinaryCLIService.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/thrift/ThriftHttpCLIService.java
      - copied unchanged from r1527867, hive/trunk/service/src/java/org/apache/hive/service/cli/thrift/ThriftHttpCLIService.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/thrift/ThriftHttpServlet.java
      - copied unchanged from r1527867, hive/trunk/service/src/java/org/apache/hive/service/cli/thrift/ThriftHttpServlet.java
    hive/branches/tez/service/src/test/org/apache/hive/service/cli/TestEmbeddedThriftBinaryCLIService.java
      - copied unchanged from r1527867, hive/trunk/service/src/test/org/apache/hive/service/cli/TestEmbeddedThriftBinaryCLIService.java
    hive/branches/tez/service/src/test/org/apache/hive/service/cli/thrift/TestThriftBinaryCLIService.java
      - copied unchanged from r1527867, hive/trunk/service/src/test/org/apache/hive/service/cli/thrift/TestThriftBinaryCLIService.java
    hive/branches/tez/service/src/test/org/apache/hive/service/cli/thrift/TestThriftHttpCLIService.java
      - copied unchanged from r1527867, hive/trunk/service/src/test/org/apache/hive/service/cli/thrift/TestThriftHttpCLIService.java
    hive/branches/tez/service/src/test/org/apache/hive/service/cli/thrift/ThriftCLIServiceTest.java
      - copied unchanged from r1527867, hive/trunk/service/src/test/org/apache/hive/service/cli/thrift/ThriftCLIServiceTest.java
    hive/branches/tez/testutils/ptest2/src/test/java/org/apache/hive/ptest/execution/TestScripts.testAlternativeTestJVM.approved.txt
      - copied unchanged from r1527867, hive/trunk/testutils/ptest2/src/test/java/org/apache/hive/ptest/execution/TestScripts.testAlternativeTestJVM.approved.txt
Removed:
    hive/branches/tez/data/files/TestSerDe.jar
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/thrift/EmbeddedThriftCLIService.java
    hive/branches/tez/service/src/test/org/apache/hive/service/cli/TestEmbeddedThriftCLIService.java
    hive/branches/tez/service/src/test/org/apache/hive/service/cli/thrift/TestThriftCLIService.java
    hive/branches/tez/testutils/ptest2/src/main/java/org/apache/hive/ptest/execution/CleanupPhase.java
    hive/branches/tez/testutils/ptest2/src/test/java/org/apache/hive/ptest/execution/TestCleanupPhase.java
Modified:
    hive/branches/tez/   (props changed)
    hive/branches/tez/beeline/src/java/org/apache/hive/beeline/BeeLine.java
    hive/branches/tez/bin/hive
    hive/branches/tez/build-common.xml
    hive/branches/tez/build.xml
    hive/branches/tez/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
    hive/branches/tez/conf/hive-default.xml.template
    hive/branches/tez/contrib/src/java/org/apache/hadoop/hive/contrib/fileformat/base64/Base64TextOutputFormat.java
    hive/branches/tez/eclipse-templates/.classpath
    hive/branches/tez/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HiveHFileOutputFormat.java
    hive/branches/tez/hcatalog/build-support/checkstyle/apache_header.txt
    hive/branches/tez/hcatalog/build.xml
    hive/branches/tez/hcatalog/core/src/test/java/org/apache/hcatalog/cli/DummyStorageHandler.java
    hive/branches/tez/hcatalog/src/docs/src/documentation/content/xdocs/queue.xml
    hive/branches/tez/hcatalog/src/test/e2e/templeton/build.xml
    hive/branches/tez/hcatalog/src/test/e2e/templeton/conf/default.conf
    hive/branches/tez/hcatalog/src/test/e2e/templeton/drivers/TestDriverCurl.pm
    hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/ddl.conf
    hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/jobsubmission.conf
    hive/branches/tez/hcatalog/storage-handlers/hbase/src/java/org/apache/hcatalog/hbase/HBaseBaseOutputFormat.java
    hive/branches/tez/hcatalog/storage-handlers/hbase/src/test/org/apache/hcatalog/hbase/TestHCatHBaseInputFormat.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/config/webhcat-default.xml
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/AppConfig.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecServiceImpl.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HcatDelegator.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HiveDelegator.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JarDelegator.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/LauncherDelegator.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/PigDelegator.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/QueueStatusBean.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Server.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StreamingDelegator.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSCleanup.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSStorage.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobState.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonUtils.java
    hive/branches/tez/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TrivialExecService.java
    hive/branches/tez/hcatalog/webhcat/svr/src/test/java/org/apache/hive/hcatalog/templeton/TestServer.java
    hive/branches/tez/hcatalog/webhcat/svr/src/test/java/org/apache/hive/hcatalog/templeton/tool/TestTempletonUtils.java
    hive/branches/tez/ivy/libraries.properties
    hive/branches/tez/jdbc/ivy.xml
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveConnection.java
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveDriver.java
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/Utils.java
    hive/branches/tez/jdbc/src/test/org/apache/hive/jdbc/TestJdbcDriver2.java
    hive/branches/tez/metastore/scripts/upgrade/postgres/014-HIVE-3764.postgres.sql
    hive/branches/tez/metastore/scripts/upgrade/postgres/hive-schema-0.12.0.postgres.sql
    hive/branches/tez/metastore/scripts/upgrade/postgres/hive-schema-0.13.0.postgres.sql
    hive/branches/tez/metastore/scripts/upgrade/postgres/upgrade-0.11.0-to-0.12.0.postgres.sql
    hive/branches/tez/metastore/scripts/upgrade/postgres/upgrade-0.12.0-to-0.13.0.postgres.sql
    hive/branches/tez/metastore/src/java/org/apache/hadoop/hive/metastore/MetaStoreDirectSql.java
    hive/branches/tez/metastore/src/java/org/apache/hadoop/hive/metastore/ObjectStore.java
    hive/branches/tez/ql/build.xml
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/FetchOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/FilterOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/GroupByOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/KeyWrapper.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/OperatorFactory.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/UnionOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/persistence/PTFRowContainer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/persistence/RowContainer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/HiveBinaryOutputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/HiveFileFormatUtils.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/HiveIgnoreKeyTextOutputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/HiveNullValueSequenceFileOutputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/HiveOutputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/HivePassThroughOutputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/HivePassThroughRecordWriter.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/HiveSequenceFileOutputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/RCFileOutputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/avro/AvroContainerOutputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/avro/AvroGenericRecordWriter.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/BitFieldReader.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/DynamicByteArray.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/IntegerReader.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/OrcInputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/OrcOutputFormat.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/OrcSerde.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/Reader.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/ReaderImpl.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/RecordReader.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/RecordReaderImpl.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/RunLengthByteReader.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/RunLengthIntegerReader.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/RunLengthIntegerReaderV2.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/Writer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/io/orc/WriterImpl.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/correlation/CorrelationUtilities.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/PhysicalOptimizer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/BaseSemanticAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/ColumnStatsSemanticAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/PTFTranslator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/TypeCheckProcFactory.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/WindowingExprNodeEvaluatorFactory.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/AbstractOperatorDesc.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/CreateTableDesc.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/ExprNodeGenericFuncDesc.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/MapWork.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/PTFDesc.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/PTFDeserializer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/session/SessionState.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/udf/UDFHex.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/util/JavaDataModel.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/QTestUtil.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/io/orc/TestInputOutputFormat.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/io/udf/Rot13OutputFormat.java
    hive/branches/tez/ql/src/test/org/apache/hadoop/hive/ql/udf/TestToInteger.java
    hive/branches/tez/ql/src/test/queries/clientnegative/deletejar.q
    hive/branches/tez/ql/src/test/queries/clientnegative/invalid_columns.q
    hive/branches/tez/ql/src/test/queries/clientpositive/alter1.q
    hive/branches/tez/ql/src/test/queries/clientpositive/alter_partition_coltype.q
    hive/branches/tez/ql/src/test/queries/clientpositive/input16.q
    hive/branches/tez/ql/src/test/queries/clientpositive/input16_cc.q
    hive/branches/tez/ql/src/test/queries/clientpositive/reduce_deduplicate_extended.q
    hive/branches/tez/ql/src/test/queries/clientpositive/union_null.q
    hive/branches/tez/ql/src/test/results/beelinepositive/union_null.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/alter_table_add_partition.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/alter_view_failure5.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/columnstats_tbllvl.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/columnstats_tbllvl_incorrect_column.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/add_part_exist.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/alter1.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/alter2.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/alter3.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/alter4.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/alter5.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/alter_index.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/alter_partition_coltype.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/alter_rename_partition.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/describe_table_json.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/index_creation.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/input2.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/input3.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/input4.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/plan_json.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/reduce_deduplicate_extended.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/rename_column.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/show_tables.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/union_null.q.out
    hive/branches/tez/serde/src/java/org/apache/hadoop/hive/serde2/SerDeStats.java
    hive/branches/tez/serde/src/java/org/apache/hadoop/hive/serde2/columnar/ColumnarSerDe.java
    hive/branches/tez/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyInteger.java
    hive/branches/tez/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyLong.java
    hive/branches/tez/serde/src/java/org/apache/hadoop/hive/serde2/lazy/LazyUtils.java
    hive/branches/tez/serde/src/java/org/apache/hadoop/hive/serde2/lazy/objectinspector/primitive/LazyHiveVarcharObjectInspector.java
    hive/branches/tez/serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/primitive/AbstractPrimitiveObjectInspector.java
    hive/branches/tez/serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/primitive/JavaHiveVarcharObjectInspector.java
    hive/branches/tez/serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/primitive/WritableHiveVarcharObjectInspector.java
    hive/branches/tez/serde/src/test/org/apache/hadoop/hive/serde2/lazy/TestLazySimpleSerDe.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/thrift/ThriftCLIService.java
    hive/branches/tez/service/src/java/org/apache/hive/service/server/HiveServer2.java
    hive/branches/tez/service/src/test/org/apache/hive/service/auth/TestPlainSaslHelper.java
    hive/branches/tez/service/src/test/org/apache/hive/service/cli/CLIServiceTest.java
    hive/branches/tez/service/src/test/org/apache/hive/service/cli/session/TestSessionHooks.java
    hive/branches/tez/testutils/ptest2/src/main/java/org/apache/hive/ptest/execution/PTest.java
    hive/branches/tez/testutils/ptest2/src/main/java/org/apache/hive/ptest/execution/conf/TestConfiguration.java
    hive/branches/tez/testutils/ptest2/src/main/java/org/apache/hive/ptest/execution/context/CloudExecutionContextProvider.java
    hive/branches/tez/testutils/ptest2/src/main/resources/batch-exec.vm
    hive/branches/tez/testutils/ptest2/src/test/java/org/apache/hive/ptest/execution/TestScripts.java
    hive/branches/tez/testutils/ptest2/src/test/java/org/apache/hive/ptest/execution/TestScripts.testBatch.approved.txt

Propchange: hive/branches/tez/
------------------------------------------------------------------------------
  Merged /hive/branches/vectorization:r1466908-1527856
  Merged /hive/trunk:r1526306-1527867

Modified: hive/branches/tez/beeline/src/java/org/apache/hive/beeline/BeeLine.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/beeline/src/java/org/apache/hive/beeline/BeeLine.java?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/beeline/src/java/org/apache/hive/beeline/BeeLine.java (original)
+++ hive/branches/tez/beeline/src/java/org/apache/hive/beeline/BeeLine.java Tue Oct  1 04:48:44 2013
@@ -502,7 +502,7 @@ public class BeeLine {
 
     for (int i = 0; i < args.length; i++) {
       if (args[i].equals("--help") || args[i].equals("-h")) {
-        usage();
+        // Return false here, so usage will be printed.
         return false;
       }
 

Modified: hive/branches/tez/bin/hive
URL: http://svn.apache.org/viewvc/hive/branches/tez/bin/hive?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/bin/hive (original)
+++ hive/branches/tez/bin/hive Tue Oct  1 04:48:44 2013
@@ -117,7 +117,7 @@ elif [ "${HIVE_AUX_JARS_PATH}" != "" ]; 
   fi
   AUX_CLASSPATH=${HIVE_AUX_JARS_PATH}
   AUX_PARAM=file://${HIVE_AUX_JARS_PATH}
-  AUX_PARAM=`echo $AUX_PARAM | sed 's/,/,file:\/\//g'`
+  AUX_PARAM=`echo $AUX_PARAM | sed 's/:/,file:\/\//g'`
 fi
 
 # adding jars from auxlib directory

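The one-character sed fix in the hunk above changes which separator gets rewritten: HIVE_AUX_JARS_PATH is colon-separated like a classpath, while AUX_PARAM must be a comma-separated list of file:// URIs, so the substitution now targets ':' rather than ','. A minimal standalone sketch of that transformation (the jar paths are made up; note the sketch applies the substitution before prefixing the scheme, so the colon inside "file://" itself is never touched):

```shell
# Hypothetical colon-separated jar list, in classpath style.
LIST="/opt/a.jar:/opt/b.jar"
# Turn each ':' separator into ',file://', then prefix the first entry.
URIS="file://$(echo "$LIST" | sed 's/:/,file:\/\//g')"
echo "$URIS"   # file:///opt/a.jar,file:///opt/b.jar
```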
Modified: hive/branches/tez/build-common.xml
URL: http://svn.apache.org/viewvc/hive/branches/tez/build-common.xml?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/build-common.xml (original)
+++ hive/branches/tez/build-common.xml Tue Oct  1 04:48:44 2013
@@ -59,7 +59,7 @@
   <property name="test.output" value="true"/>
   <property name="test.junit.output.format" value="xml"/>
   <property name="test.junit.output.usefile" value="true"/>
-  <property name="minimr.query.files" value="list_bucket_dml_10.q,input16_cc.q,scriptfile1.q,scriptfile1_win.q,bucket4.q,bucketmapjoin6.q,disable_merge_for_bucketing.q,reduce_deduplicate.q,smb_mapjoin_8.q,join1.q,groupby2.q,bucketizedhiveinputformat.q,bucketmapjoin7.q,optrstat_groupby.q,bucket_num_reducers.q,bucket5.q,load_fs2.q,bucket_num_reducers2.q,infer_bucket_sort_merge.q,infer_bucket_sort_reducers_power_two.q,infer_bucket_sort_dyn_part.q,infer_bucket_sort_bucketed_table.q,infer_bucket_sort_map_operators.q,infer_bucket_sort_num_buckets.q,leftsemijoin_mr.q,schemeAuthority.q,schemeAuthority2.q,truncate_column_buckets.q,remote_script.q,,load_hdfs_file_with_space_in_the_name.q,parallel_orderby.q"/>
+  <property name="minimr.query.files" value="list_bucket_dml_10.q,input16_cc.q,scriptfile1.q,scriptfile1_win.q,bucket4.q,bucketmapjoin6.q,disable_merge_for_bucketing.q,reduce_deduplicate.q,smb_mapjoin_8.q,join1.q,groupby2.q,bucketizedhiveinputformat.q,bucketmapjoin7.q,optrstat_groupby.q,bucket_num_reducers.q,bucket5.q,load_fs2.q,bucket_num_reducers2.q,infer_bucket_sort_merge.q,infer_bucket_sort_reducers_power_two.q,infer_bucket_sort_dyn_part.q,infer_bucket_sort_bucketed_table.q,infer_bucket_sort_map_operators.q,infer_bucket_sort_num_buckets.q,leftsemijoin_mr.q,schemeAuthority.q,schemeAuthority2.q,truncate_column_buckets.q,remote_script.q,,load_hdfs_file_with_space_in_the_name.q,parallel_orderby.q,import_exported_table.q"/>
   <property name="minimr.query.negative.files" value="cluster_tasklog_retrieval.q,minimr_broken_pipe.q,mapreduce_stack_trace.q,mapreduce_stack_trace_turnoff.q,mapreduce_stack_trace_hadoop20.q,mapreduce_stack_trace_turnoff_hadoop20.q" />
   <property name="test.silent" value="true"/>
   <property name="hadoopVersion" value="${hadoop.version.ant-internal}"/>
@@ -304,7 +304,6 @@
      encoding="${build.encoding}"
      srcdir="${test.src.dir}"
      includes="org/apache/**/hive/**/*.java"
-     excludes="**/TestSerDe.java"
      destdir="${test.build.classes}"
      debug="${javac.debug}"
      optimize="${javac.optimize}"
@@ -339,8 +338,13 @@
     </jar>
     <delete file="${test.build.dir}/test-serdes.jar"/>
     <jar jarfile="${test.build.dir}/test-serdes.jar">
-        <fileset dir="${test.build.classes}" includes="**/serde2/*.class"/>
+        <fileset dir="${test.build.classes}" includes="**/serde2/*.class" excludes="**/serde2/TestSerDe.class"/>
     </jar>  	
+    <delete file="${test.build.dir}/TestSerDe.jar"/>
+    <jar jarfile="${test.build.dir}/TestSerDe.jar">
+        <fileset dir="${test.build.classes}" includes="**/serde2/TestSerDe.class"/>
+    </jar>
+    <delete file="${test.build.classes}/org/apache/hadoop/hive/serde2/TestSerDe.class"/> 
   </target>
 
   <target name="test-conditions">
@@ -491,7 +495,7 @@
       <batchtest todir="${test.build.dir}" unless="testcase">
         <fileset dir="${test.build.classes}"
                  includes="**/${test.include}.class"
-                 excludes="**/TestSerDe.class,**/TestHiveMetaStore.class,**/TestBeeLineDriver.class,**/TestHiveServer2Concurrency.class,**/*$*.class,${test.junit.exclude}" />
+		 excludes="**/ql/exec/vector/util/*.class,**/ql/exec/vector/udf/legacy/*.class,**/ql/exec/vector/udf/generic/*.class,**/TestSerDe.class,**/TestHiveMetaStore.class,**/TestBeeLineDriver.class,**/TestHiveServer2Concurrency.class,**/*$*.class,${test.junit.exclude}" />
       </batchtest>
       <batchtest todir="${test.build.dir}" if="testcase">
         <fileset dir="${test.build.classes}" includes="**/${testcase}.class"/>

Modified: hive/branches/tez/build.xml
URL: http://svn.apache.org/viewvc/hive/branches/tez/build.xml?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/build.xml (original)
+++ hive/branches/tez/build.xml Tue Oct  1 04:48:44 2013
@@ -263,6 +263,12 @@
   <target name="init" depends="ivy-init-antlib,deploy-ant-tasks">
     <echo message="Project: ${ant.project.name}"/>
     <iterate target="init" iterate="${iterate.hive.all}"/>
+
+    <mkdir dir="${build.dir.hive}/ql/gen/vector/org/apache/hadoop/hive/ql/exec/vector/expressions/gen"/>
+    <mkdir dir="${build.dir.hive}/ql/gen/vector/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen"/>
+    <mkdir dir="${build.dir.hive}/ql/test/src/org/apache/hadoop/hive/ql/exec/vector/expressions/gen"/>
+    <vectorcodegen templateBaseDir="${hive.root}/ql/src/gen/vectorization/" buildDir="${build.dir.hive}" />
+
   </target>
 
   <target name="test-init">
@@ -283,8 +289,13 @@
     <subant target="jar">
       <fileset dir="." includes="ant/build.xml"/>
     </subant>
+
     <taskdef name="getversionpref" classname="org.apache.hadoop.hive.ant.GetVersionPref"
              classpath="${build.dir.hive}/anttasks/hive-anttasks-${version}.jar"/>
+
+    <taskdef name="vectorcodegen" classname="org.apache.hadoop.hive.ant.GenVectorCode"
+        classpath="${build.dir.hive}/anttasks/hive-anttasks-${version}.jar"/>
+
   </target>
 
   
@@ -744,6 +755,7 @@
       <packageset dir="ql/src/test"/>
       <packageset dir="ql/src/gen/thrift/gen-javabean"/>
       <packageset dir="${build.dir.hive}/ql/gen/antlr/gen-java"/>
+      <packageset dir="${build.dir.hive}/ql/gen/vector"/>
       <packageset dir="shims/src/common/java"/>
 
       <link href="${javadoc.link.java}"/>

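The build.xml hunks above register a custom Ant task (`vectorcodegen`, backed by the newly added `GenVectorCode`) and run it during `init` to emit generated vectorized-expression sources under the build tree. As an illustration only — the template tokens and class names below are hypothetical, not Hive's actual template format — template-driven source generation of this kind boils down to substituting type names into a template and writing one Java source per concrete type:

```python
import os
import tempfile

# Hypothetical template; the real templates live under
# ql/src/gen/vectorization/ and are consumed by the GenVectorCode Ant task.
TEMPLATE = """public class <ClassName> {
  public <OperandType> evaluate(<OperandType> a, <OperandType> b) {
    return a + b;
  }
}
"""

def expand(template, class_name, operand_type):
    # Straight token substitution; the real task additionally handles
    # per-template output directories and generated test sources.
    return (template
            .replace("<ClassName>", class_name)
            .replace("<OperandType>", operand_type))

def generate(build_dir, types):
    """Write one generated .java file per concrete operand type."""
    paths = []
    for t in types:
        cls = "ColAdd%sCol" % t.capitalize()  # hypothetical naming scheme
        path = os.path.join(build_dir, cls + ".java")
        with open(path, "w") as f:
            f.write(expand(TEMPLATE, cls, t))
        paths.append(path)
    return paths

if __name__ == "__main__":
    out = tempfile.mkdtemp()
    for p in generate(out, ["long", "double"]):
        print(os.path.basename(p))
```

Generating the sources at build time (into `${build.dir.hive}/ql/gen/vector`, which the javadoc packageset above picks up) avoids hand-maintaining one near-identical class per primitive type.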
Modified: hive/branches/tez/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java (original)
+++ hive/branches/tez/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java Tue Oct  1 04:48:44 2013
@@ -739,6 +739,19 @@ public class HiveConf extends Configurat
     HIVE_DDL_OUTPUT_FORMAT("hive.ddl.output.format", null),
     HIVE_ENTITY_SEPARATOR("hive.entity.separator", "@"),
 
+    // binary or http
+    HIVE_SERVER2_TRANSPORT_MODE("hive.server2.transport.mode", "binary"),
+
+    // http (over thrift) transport settings
+    HIVE_SERVER2_THRIFT_HTTP_PORT("hive.server2.thrift.http.port", 10001),
+    HIVE_SERVER2_THRIFT_HTTP_PATH("hive.server2.thrift.http.path", "cliservice"),
+    HIVE_SERVER2_THRIFT_HTTP_MIN_WORKER_THREADS("hive.server2.thrift.http.min.worker.threads", 5),
+    HIVE_SERVER2_THRIFT_HTTP_MAX_WORKER_THREADS("hive.server2.thrift.http.max.worker.threads", 500),
+
+    // binary transport settings
+    HIVE_SERVER2_THRIFT_PORT("hive.server2.thrift.port", 10000),
+    HIVE_SERVER2_THRIFT_BIND_HOST("hive.server2.thrift.bind.host", ""),
+    HIVE_SERVER2_THRIFT_SASL_QOP("hive.server2.thrift.sasl.qop", "auth"),
     HIVE_SERVER2_THRIFT_MIN_WORKER_THREADS("hive.server2.thrift.min.worker.threads", 5),
     HIVE_SERVER2_THRIFT_MAX_WORKER_THREADS("hive.server2.thrift.max.worker.threads", 500),
 
@@ -748,10 +761,6 @@ public class HiveConf extends Configurat
     // Number of seconds HiveServer2 shutdown will wait for async threads to terminate
     HIVE_SERVER2_ASYNC_EXEC_SHUTDOWN_TIMEOUT("hive.server2.async.exec.shutdown.timeout", 10),
 
-    HIVE_SERVER2_THRIFT_PORT("hive.server2.thrift.port", 10000),
-    HIVE_SERVER2_THRIFT_BIND_HOST("hive.server2.thrift.bind.host", ""),
-    HIVE_SERVER2_THRIFT_SASL_QOP("hive.server2.thrift.sasl.qop", "auth"),
-
 
     // HiveServer2 auth configuration
     HIVE_SERVER2_AUTHENTICATION("hive.server2.authentication", "NONE"),
@@ -808,6 +817,11 @@ public class HiveConf extends Configurat
     HIVE_OPTIMIZE_TEZ("hive.optimize.tez", false),
     HIVE_JAR_DIRECTORY("hive.jar.directory", "hdfs:///user/hive/"),
     HIVE_USER_INSTALL_DIR("hive.user.install.directory", "hdfs:///user/"),
+
+    //Vectorization enabled
+    HIVE_VECTORIZATION_ENABLED("hive.vectorized.execution.enabled", false),
+
+    HIVE_TYPE_CHECK_ON_INSERT("hive.typecheck.on.insert", true),
     ;
 
     public final String varname;
@@ -1274,6 +1288,6 @@ public class HiveConf extends Configurat
     } else {
       return Integer.parseInt(m.group(1));
     }
-
   }
+
 }

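The HiveConf hunk adds `hive.server2.transport.mode` plus the HTTP worker-thread settings, and moves the existing binary-transport entries next to them. Each `ConfVars` entry is a (name, default) pair with typed accessors. A minimal sketch of that registry pattern — this is illustrative Python, not Hive's API — seeded with the defaults added in this hunk:

```python
# Defaults taken from the ConfVars entries added/moved in the hunk above.
DEFAULTS = {
    "hive.server2.transport.mode": "binary",        # "binary" or "http"
    "hive.server2.thrift.http.port": 10001,
    "hive.server2.thrift.http.path": "cliservice",
    "hive.server2.thrift.http.min.worker.threads": 5,
    "hive.server2.thrift.http.max.worker.threads": 500,
    "hive.server2.thrift.port": 10000,
    "hive.vectorized.execution.enabled": False,
}

def get_var(conf, name):
    """Return the configured value, falling back to the registered default."""
    if name not in DEFAULTS:
        raise KeyError("undeclared config variable: " + name)
    value = conf.get(name, DEFAULTS[name])
    default = DEFAULTS[name]
    # Coerce string-valued overrides to the default's type, in the spirit of
    # HiveConf's typed getters (getBoolVar, getIntVar, ...).
    if isinstance(default, bool) and isinstance(value, str):
        return value.lower() == "true"
    elif isinstance(default, int) and isinstance(value, str):
        return int(value)
    return value

print(get_var({}, "hive.server2.thrift.http.port"))          # → 10001
print(get_var({"hive.server2.transport.mode": "http"},
              "hive.server2.transport.mode"))                # → http
```

Centralizing defaults this way keeps `hive-default.xml.template` (changed below in this commit) and the code agreeing on a single value per setting.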
Modified: hive/branches/tez/conf/hive-default.xml.template
URL: http://svn.apache.org/viewvc/hive/branches/tez/conf/hive-default.xml.template?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/conf/hive-default.xml.template (original)
+++ hive/branches/tez/conf/hive-default.xml.template Tue Oct  1 04:48:44 2013
@@ -877,7 +877,36 @@
   <description>Read from a binary stream and treat each hive.binary.record.max.length bytes as a record.
   The last record before the end of stream can have less than hive.binary.record.max.length bytes</description>
 </property>
+    
+<property>
+  <name>hive.server2.transport.mode</name>
+  <value>binary</value>
+  <description>Server transport mode. "binary" or "http".</description>
+</property>    
+
+<property>
+  <name>hive.server2.thrift.http.port</name>
+  <value>10001</value>
+  <description>Port number when in HTTP mode.</description>
+</property> 
+
+<property>
+  <name>hive.server2.thrift.http.path</name>
+  <value>cliservice</value>
+  <description>Path component of URL endpoint when in HTTP mode.</description>
+</property> 
 
+<property>
+  <name>hive.server2.thrift.http.min.worker.threads</name>
+  <value>5</value>
+  <description>Minimum number of worker threads when in HTTP mode.</description>
+</property> 
+
+<property>
+  <name>hive.server2.thrift.http.max.worker.threads</name>
+  <value>500</value>
+  <description>Maximum number of worker threads when in HTTP mode.</description>
+</property> 
 
 <property>
   <name>hive.script.recordreader</name>
@@ -1755,6 +1784,8 @@
   </description>
 </property>
 
+
+
 <property>
   <name>hive.hmshandler.retry.attempts</name>
   <value>1</value>
@@ -1989,6 +2020,15 @@
 </property>
 
 <property>
+  <name>hive.vectorized.execution.enabled</name>
+  <value>false</value>
+  <description>
+  This flag should be set to true to enable vectorized mode of query execution.
+  The default value is false.
+  </description>
+</property>
+
+<property>
   <name>hive.metastore.schema.verification</name>
   <value>false</value>
    <description>
@@ -2000,5 +2040,4 @@
    </description>
 </property>
 
-
 </configuration>

Modified: hive/branches/tez/contrib/src/java/org/apache/hadoop/hive/contrib/fileformat/base64/Base64TextOutputFormat.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/contrib/src/java/org/apache/hadoop/hive/contrib/fileformat/base64/Base64TextOutputFormat.java?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/contrib/src/java/org/apache/hadoop/hive/contrib/fileformat/base64/Base64TextOutputFormat.java (original)
+++ hive/branches/tez/contrib/src/java/org/apache/hadoop/hive/contrib/fileformat/base64/Base64TextOutputFormat.java Tue Oct  1 04:48:44 2013
@@ -24,7 +24,7 @@ import java.util.Properties;
 
 import org.apache.commons.codec.binary.Base64;
 import org.apache.hadoop.fs.Path;
-import org.apache.hadoop.hive.ql.exec.FileSinkOperator.RecordWriter;
+import org.apache.hadoop.hive.ql.io.FSRecordWriter;
 import org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat;
 import org.apache.hadoop.io.BytesWritable;
 import org.apache.hadoop.io.Text;
@@ -53,13 +53,13 @@ public class Base64TextOutputFormat<K ex
    * Base64RecordWriter.
    *
    */
-  public static class Base64RecordWriter implements RecordWriter,
+  public static class Base64RecordWriter implements FSRecordWriter,
       JobConfigurable {
 
-    RecordWriter writer;
+    FSRecordWriter writer;
     BytesWritable bytesWritable;
 
-    public Base64RecordWriter(RecordWriter writer) {
+    public Base64RecordWriter(FSRecordWriter writer) {
       this.writer = writer;
       bytesWritable = new BytesWritable();
     }
@@ -119,7 +119,7 @@ public class Base64TextOutputFormat<K ex
   }
 
   @Override
-  public RecordWriter getHiveRecordWriter(JobConf jc, Path finalOutPath,
+  public FSRecordWriter getHiveRecordWriter(JobConf jc, Path finalOutPath,
       Class<? extends Writable> valueClass, boolean isCompressed,
       Properties tableProperties, Progressable progress) throws IOException {
 

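This hunk (and the HiveHFileOutputFormat one below) retypes writers from the nested `FileSinkOperator.RecordWriter` to the extracted top-level `FSRecordWriter` interface; `Base64RecordWriter` itself is a delegating wrapper that base64-encodes each record before forwarding it. A self-contained sketch of that wrapper pattern — the writer interface here is a stand-in, not Hive's `FSRecordWriter` signature:

```python
import base64

class ListWriter:
    """Stand-in for the wrapped file writer; collects records in memory."""
    def __init__(self):
        self.records = []
    def write(self, record: bytes):
        self.records.append(record)
    def close(self, abort: bool = False):
        pass

class Base64RecordWriter:
    """Delegating wrapper: encode, then forward to the inner writer."""
    def __init__(self, writer):
        self.writer = writer
    def write(self, record: bytes):
        # The real class reuses a single BytesWritable buffer instead of
        # allocating per record.
        self.writer.write(base64.b64encode(record))
    def close(self, abort: bool = False):
        self.writer.close(abort)

inner = ListWriter()
w = Base64RecordWriter(inner)
w.write(b"hello")
w.close()
print(inner.records[0])  # → b'aGVsbG8='
```

Because the wrapper only depends on the writer interface, the interface rename in this commit is a one-line type change rather than a behavioral one.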
Modified: hive/branches/tez/eclipse-templates/.classpath
URL: http://svn.apache.org/viewvc/hive/branches/tez/eclipse-templates/.classpath?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/eclipse-templates/.classpath (original)
+++ hive/branches/tez/eclipse-templates/.classpath Tue Oct  1 04:48:44 2013
@@ -76,6 +76,8 @@
   <classpathentry kind="lib" path="build/ivy/lib/default/bonecp-@BoneCP.version@.jar"/>
   <classpathentry kind="lib" path="build/ivy/lib/default/commons-pool-@commons-pool.version@.jar"/>
   <classpathentry kind="lib" path="build/ivy/lib/default/commons-lang3-@commons-lang3.version@.jar"/>
+  <classpathentry kind="lib" path="build/ivy/lib/default/httpcore-@httpcore.version@.jar"/>
+  <classpathentry kind="lib" path="build/ivy/lib/default/httpclient-@httpclient.version@.jar"/>
   <classpathentry kind="lib" path="build/ivy/lib/default/slf4j-api-@slf4j-api.version@.jar"/>
   <classpathentry kind="lib" path="build/ivy/lib/default/slf4j-log4j12-@slf4j-log4j12.version@.jar"/>
   <classpathentry kind="lib" path="build/ivy/lib/default/JavaEWAH-@javaewah.version@.jar"/>

Modified: hive/branches/tez/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HiveHFileOutputFormat.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HiveHFileOutputFormat.java?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HiveHFileOutputFormat.java (original)
+++ hive/branches/tez/hbase-handler/src/java/org/apache/hadoop/hive/hbase/HiveHFileOutputFormat.java Tue Oct  1 04:48:44 2013
@@ -34,7 +34,7 @@ import org.apache.hadoop.hbase.KeyValue;
 import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
 import org.apache.hadoop.hbase.mapreduce.HFileOutputFormat;
 import org.apache.hadoop.hbase.util.Bytes;
-import org.apache.hadoop.hive.ql.exec.FileSinkOperator.RecordWriter;
+import org.apache.hadoop.hive.ql.io.FSRecordWriter;
 import org.apache.hadoop.hive.ql.io.HiveOutputFormat;
 import org.apache.hadoop.hive.shims.ShimLoader;
 import org.apache.hadoop.io.Text;
@@ -71,7 +71,7 @@ public class HiveHFileOutputFormat exten
   }
 
   @Override
-  public RecordWriter getHiveRecordWriter(
+  public FSRecordWriter getHiveRecordWriter(
     final JobConf jc,
     final Path finalOutPath,
     Class<? extends Writable> valueClass,
@@ -120,7 +120,7 @@ public class HiveHFileOutputFormat exten
       ++i;
     }
 
-    return new RecordWriter() {
+    return new FSRecordWriter() {
 
       @Override
       public void close(boolean abort) throws IOException {

Modified: hive/branches/tez/hcatalog/build-support/checkstyle/apache_header.txt
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/build-support/checkstyle/apache_header.txt?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/build-support/checkstyle/apache_header.txt (original)
+++ hive/branches/tez/hcatalog/build-support/checkstyle/apache_header.txt Tue Oct  1 04:48:44 2013
@@ -1,19 +1,19 @@
 ^#!
 ^<\?(xml|xml-stylesheet).*>$
 ^\W*$
-\W*Licensed to the Apache Software Foundation \(ASF\) under one$
-\W*or more contributor license agreements.  See the NOTICE file$
-\W*distributed with this work for additional information$
-\W*regarding copyright ownership.  The ASF licenses this file$
-\W*to you under the Apache License, Version 2.0 \(the$
-\W*"License"\); you may not use this file except in compliance$
-\W*with the License.  You may obtain a copy of the License at$
-\W*$
-\W*http://www.apache.org/licenses/LICENSE-2.0$
-\W*$
-\W*Unless required by applicable law or agreed to in writing,$
-\W*software distributed under the License is distributed on an$
-\W*"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY$
-\W*KIND, either express or implied.  See the License for the$
-\W*specific language governing permissions and limitations$
-\W*under the License.$
+.*? Licensed to the Apache Software Foundation \(ASF\) under one$
+.*? or more contributor license agreements.  See the NOTICE file$
+.*? distributed with this work for additional information$
+.*? regarding copyright ownership.  The ASF licenses this file$
+.*? to you under the Apache License, Version 2.0 \(the$
+.*? "License"\); you may not use this file except in compliance$
+.*? with the License.  You may obtain a copy of the License at$
+.*?$
+.*? http://www.apache.org/licenses/LICENSE-2.0$
+.*?$
+.*? Unless required by applicable law or agreed to in writing,$
+.*? software distributed under the License is distributed on an$
+.*? "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY$
+.*? KIND, either express or implied.  See the License for the$
+.*? specific language governing permissions and limitations$
+.*? under the License.$

Modified: hive/branches/tez/hcatalog/build.xml
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/build.xml?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/build.xml (original)
+++ hive/branches/tez/hcatalog/build.xml Tue Oct  1 04:48:44 2013
@@ -363,6 +363,7 @@
                 <include name="hcat"/>
                 <include name="hcat.py"/>
                 <include name="hcatcfg.py"/>
+                <include name="templeton.cmd"/>
             </fileset>
 
         </copy>

Modified: hive/branches/tez/hcatalog/core/src/test/java/org/apache/hcatalog/cli/DummyStorageHandler.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/core/src/test/java/org/apache/hcatalog/cli/DummyStorageHandler.java?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/core/src/test/java/org/apache/hcatalog/cli/DummyStorageHandler.java (original)
+++ hive/branches/tez/hcatalog/core/src/test/java/org/apache/hcatalog/cli/DummyStorageHandler.java Tue Oct  1 04:48:44 2013
@@ -28,6 +28,7 @@ import org.apache.hadoop.fs.FileSystem;
 import org.apache.hadoop.fs.Path;
 import org.apache.hadoop.hive.metastore.HiveMetaHook;
 import org.apache.hadoop.hive.metastore.api.Database;
+import org.apache.hadoop.hive.ql.io.FSRecordWriter;
 import org.apache.hadoop.hive.ql.io.HiveOutputFormat;
 import org.apache.hadoop.hive.ql.metadata.AuthorizationException;
 import org.apache.hadoop.hive.ql.metadata.HiveException;
@@ -285,7 +286,7 @@ class DummyStorageHandler extends HCatSt
      * org.apache.hadoop.util.Progressable)
      */
     @Override
-    public org.apache.hadoop.hive.ql.exec.FileSinkOperator.RecordWriter getHiveRecordWriter(
+    public FSRecordWriter getHiveRecordWriter(
       JobConf jc, Path finalOutPath,
       Class<? extends Writable> valueClass, boolean isCompressed,
       Properties tableProperties, Progressable progress)

Modified: hive/branches/tez/hcatalog/src/docs/src/documentation/content/xdocs/queue.xml
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/src/docs/src/documentation/content/xdocs/queue.xml?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/src/docs/src/documentation/content/xdocs/queue.xml (original)
+++ hive/branches/tez/hcatalog/src/docs/src/documentation/content/xdocs/queue.xml Tue Oct  1 04:48:44 2013
@@ -143,7 +143,17 @@
  "exitValue": 0,
  "user": "ctdean",
  "callback": null,
- "completed": "done"
+ "completed": "done",
+ "userargs" => {
+   "callback"  => null,
+   "define"    => [],
+   "enablelog" => "false",
+   "execute"   => "select a,rand(b) from mynums",
+   "file"      => null,
+   "files"     => [],
+   "statusdir" => null,
+   "user.name" => "hadoopqa",
+ },
 }
 </source>
   </section>

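The queue.xml hunk above splices the new `userargs` field into a JSON response example using Perl-style `=>` fat commas and a trailing comma, which strict JSON does not allow. The same payload, values taken verbatim from the hunk, parses cleanly when written as plain JSON:

```python
import json

# Same fields as the documentation example above, as strict JSON.
body = """{
  "exitValue": 0,
  "user": "ctdean",
  "callback": null,
  "completed": "done",
  "userargs": {
    "callback": null,
    "define": [],
    "enablelog": "false",
    "execute": "select a,rand(b) from mynums",
    "file": null,
    "files": [],
    "statusdir": null,
    "user.name": "hadoopqa"
  }
}"""

status = json.loads(body)
print(status["userargs"]["execute"])  # → select a,rand(b) from mynums
```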
Modified: hive/branches/tez/hcatalog/src/test/e2e/templeton/build.xml
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/src/test/e2e/templeton/build.xml?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/src/test/e2e/templeton/build.xml (original)
+++ hive/branches/tez/hcatalog/src/test/e2e/templeton/build.xml Tue Oct  1 04:48:44 2013
@@ -94,6 +94,7 @@
 
         <exec executable="./test_harness.pl" dir="${test.location}" failonerror="true">
             <env key="HARNESS_ROOT" value="."/>
+            <env key="DRIVER_ROOT" value="${basedir}/drivers"/>
             <env key="TH_WORKING_DIR" value="${test.location}"/>
             <env key="TH_INPDIR_LOCAL" value="${inpdir.local}"/>
             <env key="TH_INPDIR_HDFS" value="${inpdir.hdfs}"/>
@@ -109,9 +110,10 @@
             <env key="SECURE_MODE" value="${secure.mode}"/>
             <arg line="${tests.to.run}"/>
             <arg value="${basedir}/tests/serverstatus.conf"/>
+            <arg value="${basedir}/tests/jobsubmission_streaming.conf"/>
             <arg value="${basedir}/tests/ddl.conf"/>
+            <arg value="${basedir}/tests/jobstatus.conf"/>
             <arg value="${basedir}/tests/jobsubmission.conf"/>
-            <arg value="${basedir}/tests/jobsubmission2.conf"/>
         </exec>
     </target>
 
@@ -124,6 +126,7 @@
         <property name="tests.to.run" value=""/>
         <exec executable="${harness.dir}/test_harness.pl" dir="${test.location}" failonerror="true">
             <env key="HARNESS_ROOT" value="${harness.dir}"/>
+            <env key="DRIVER_ROOT" value="${basedir}/drivers"/>
             <env key="TH_WORKING_DIR" value="${test.location}"/>
             <env key="TH_INPDIR_LOCAL" value="${inpdir.local}"/>
             <env key="TH_INPDIR_HDFS" value="${inpdir.hdfs}"/>

Modified: hive/branches/tez/hcatalog/src/test/e2e/templeton/conf/default.conf
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/src/test/e2e/templeton/conf/default.conf?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/src/test/e2e/templeton/conf/default.conf (original)
+++ hive/branches/tez/hcatalog/src/test/e2e/templeton/conf/default.conf Tue Oct  1 04:48:44 2013
@@ -1,3 +1,4 @@
+############################################################################           
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -14,7 +15,7 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-
+                                                                                       
 my $me = `whoami`;
 chomp $me;
 

Modified: hive/branches/tez/hcatalog/src/test/e2e/templeton/drivers/TestDriverCurl.pm
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/src/test/e2e/templeton/drivers/TestDriverCurl.pm?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/src/test/e2e/templeton/drivers/TestDriverCurl.pm (original)
+++ hive/branches/tez/hcatalog/src/test/e2e/templeton/drivers/TestDriverCurl.pm Tue Oct  1 04:48:44 2013
@@ -1,3 +1,4 @@
+############################################################################           
 # Licensed to the Apache Software Foundation (ASF) under one
 # or more contributor license agreements.  See the NOTICE file
 # distributed with this work for additional information
@@ -14,7 +15,7 @@
 # KIND, either express or implied.  See the License for the
 # specific language governing permissions and limitations
 # under the License.
-
+                                                                                       
 package TestDriverCurl;
 
 ###########################################################################
@@ -35,6 +36,7 @@ use strict;
 use English;
 use Storable qw(dclone);
 use File::Glob ':glob';
+use JSON::Path;
 
 my $passedStr = 'passed';
 my $failedStr = 'failed';
@@ -150,13 +152,16 @@ sub new
 sub globalSetup
   {
     my ($self, $globalHash, $log) = @_;
+    my $subName = (caller(0))[3];
+
 
     # Setup the output path
     my $me = `whoami`;
     chomp $me;
-    my $jobId = $globalHash->{'job-id'};
-    my $timeId = time;
-    $globalHash->{'runid'} = $me . "-" . $timeId . "-" . $jobId;
+    #usernames on windows can be "domain\username" change the "\"
+    # as runid is used in file names
+    $me =~ s/\\/_/;
+    $globalHash->{'runid'} = $me . "." . time;
 
     # if "-ignore false" was provided on the command line,
     # it means do run tests even when marked as 'ignore'
@@ -170,7 +175,6 @@ sub globalSetup
 
     $globalHash->{'outpath'} = $globalHash->{'outpathbase'} . "/" . $globalHash->{'runid'} . "/";
     $globalHash->{'localpath'} = $globalHash->{'localpathbase'} . "/" . $globalHash->{'runid'} . "/";
-    $globalHash->{'tmpPath'} = $globalHash->{'tmpPath'} . "/" . $globalHash->{'runid'} . "/";
     $globalHash->{'webhdfs_url'} = $ENV{'WEBHDFS_URL'};
     $globalHash->{'templeton_url'} = $ENV{'TEMPLETON_URL'};
     $globalHash->{'current_user'} = $ENV{'USER_NAME'};
@@ -186,11 +190,6 @@ sub globalSetup
     $globalHash->{'inpdir_hdfs'} = $ENV{'TH_INPDIR_HDFS'};
 
     $globalHash->{'is_secure_mode'} = $ENV{'SECURE_MODE'};
-  }
-
-sub globalSetupConditional
-  {
-    my ($self, $globalHash, $log) = @_;
 
     # add libexec location to the path
     if (defined($ENV{'PATH'})) {
@@ -225,12 +224,6 @@ sub globalSetupConditional
 # None
 sub globalCleanup
   {
-    # noop there because the removal of temp directories, which are created in #globalSetupConditional(), is to be
-    # performed in method #globalCleanupConditional().
-  }
-
-sub globalCleanupConditional
-  {
     my ($self, $globalHash, $log) = @_;
 
     IPC::Run::run(['rm', '-rf', $globalHash->{'tmpPath'}], \undef, $log, $log) or 
@@ -292,6 +285,17 @@ sub replaceParameters
       my @new_options = ();
       foreach my $option (@options) {
         $option = $self->replaceParametersInArg($option, $testCmd, $log);
+        if (isWindows()) {
+          my $equal_pos = index($option, '=');
+          if ($equal_pos != -1) {
+            my $left = substr($option, 0, $equal_pos);
+            my $right = substr($option, $equal_pos+1);
+            if ($right =~ /=/) {
+              $right = '"'.$right.'"';
+              $option = $left . "=" . $right;
+            }
+          }
+        }
         push @new_options, ($option);
       }
       $testCmd->{$aPfix . 'post_options'} = \@new_options;
@@ -306,6 +310,15 @@ sub replaceParameters
       }
     }    
 
+    if (defined $testCmd->{$aPfix . 'json_path'}) {
+      my $json_path_matches = $testCmd->{$aPfix . 'json_path'};
+      my @keys = keys %{$json_path_matches};
+
+      foreach my $key (@keys) {
+        my $new_value = $self->replaceParametersInArg($json_path_matches->{$key}, $testCmd, $log);
+        $json_path_matches->{$key} = $new_value;
+      }
+    }
 
   }
 
@@ -498,7 +511,7 @@ sub execCurlCmd(){
     $testCmd->{'http_daemon'} = $d;
     $testCmd->{'callback_url'} = $d->url . 'templeton/$jobId';
     push @curl_cmd, ('-d', 'callback=' . $testCmd->{'callback_url'});
-    #	push ${testCmd->{'post_options'}}, ('callback=' . $testCmd->{'callback_url'});
+    push @{$testCmd->{$argPrefix . 'post_options'}}, ('callback=' . $testCmd->{'callback_url'});
     #	#my @options = @{$testCmd->{'post_options'}};
     #	print $log "post options  @options\n";
   }
@@ -510,7 +523,7 @@ sub execCurlCmd(){
   push @curl_cmd, ("-X", $method, "-o", $res_body, "-D", $res_header);  
   push @curl_cmd, ($url);
 
-  print $log "$0:$subName Going to run command : " .  join (' ', @curl_cmd);
+  print $log "$0:$subName Going to run command : " .  join (' , ', @curl_cmd);
   print $log "\n";
 
 
@@ -604,6 +617,37 @@ sub compare
 
     my $json_hash;
     my %json_info;
+    # for information on JSONPath, check http://goessner.net/articles/JsonPath/
+    if (defined $testCmd->{'json_path'}) {
+      my $json_matches = $testCmd->{'json_path'};
+      foreach my $key (keys %$json_matches) {
+        my $regex_expected_value = $json_matches->{$key};
+        my $path = JSON::Path->new($key);
+        my $value; 
+        # when filter_job_status is defined 
+        if (defined $testCmd->{'filter_job_status'}) {
+	        # decode $testResult->{'body'} to an array of hash
+	        my $body = decode_json $testResult->{'body'};
+	        # in the tests, we run this case with jobName = "PigLatin:loadstore.pig"
+	        # filter $body to leave only records with this jobName
+	        my @filtered_body = grep {($_->{detail}{profile}{jobName} eq "PigLatin:loadstore.pig")}  @$body;
+			my @sorted_filtered_body = sort { $a->{id} <=> $b->{id} } @filtered_body;
+        	$value = $path->value(\@sorted_filtered_body);
+        } else {
+        	$value = $path->value($testResult->{'body'});
+        }
+        
+        if ($value !~ /$regex_expected_value/s) {
+          print $log "$0::$subName INFO check failed:"
+            . " json pattern check failed. For field "
+              . "$key, regex <" . $regex_expected_value
+                . "> did not match the result <" . $value
+                  . ">\n";
+          $result = 0;
+          last;
+        }
+      }
+    } 
     if (defined $testCmd->{'json_field_substr_match'} || $testCmd->{'json_field_match_object'}) {
       my $json = new JSON;
       $json_hash = $json->utf8->decode($testResult->{'body'});
@@ -639,7 +683,7 @@ sub compare
         print $log "Comparing $key: $json_field_val with regex /$regex_expected_value/\n";
 
         if ($json_field_val !~ /$regex_expected_value/s) {
-          print $log "$0::$subName INFO check failed:" 
+          print $log "$0::$subName WARN check failed:" 
             . " json pattern check failed. For field "
               . "$key, regex <" . $regex_expected_value 
                 . "> did not match the result <" . $json_field_val
@@ -654,7 +698,7 @@ sub compare
         print $log "Comparing $key: " . dump($json_field_val) . ",expected value:  " . dump($regex_expected_obj);
 
         if (!Compare($json_field_val, $regex_expected_obj)) {
-          print $log "$0::$subName INFO check failed:" 
+          print $log "$0::$subName WARN check failed:" 
             . " json compare failed. For field "
               . "$key, regex <" . dump($regex_expected_obj)
                 . "> did not match the result <" . dump($json_field_val)
@@ -671,7 +715,7 @@ sub compare
       sleep $testCmd->{'kill_job_timeout'};
       my $jobid = $json_hash->{'id'};
       if (!defined $jobid) {
-        print $log "$0::$subName INFO check failed: " 
+        print $log "$0::$subName WARN check failed: " 
           . "no jobid (id field)found in result";
         $result = 0;
       } else {
@@ -682,13 +726,14 @@ sub compare
 
     #try to get the call back url request until timeout
     if ($result == 1 && defined $testCmd->{'check_call_back'}) {
-      my $d = $testCmd->{'http_daemon'};
-      if (defined $testCmd->{'timeout_seconds'}) {
-        $d->timeout($testCmd->{'timeout_seconds'})
-      }
-      else {      
-        $d->timeout(300);         #wait for 5 mins by default
+
+      my $timeout = 300; #wait for 5 mins for callback
+      if(defined $testCmd->{'timeout'}){
+        $timeout = $testCmd->{'timeout'};
       }
+
+      my $d = $testCmd->{'http_daemon'};
+      $d->timeout($timeout);
       my $url_requested;
       $testCmd->{'callback_url'} =~ s/\$jobId/$json_hash->{'id'}/g;
       print $log "Expanded callback url : <" . $testCmd->{'callback_url'} . ">\n";
@@ -717,13 +762,12 @@ sub compare
 
     }
 
-    
     if ( (defined $testCmd->{'check_job_created'})
          || (defined $testCmd->{'check_job_complete'})
-         || (defined $testCmd->{'check_job_exit_value'}) ) {
+         || (defined $testCmd->{'check_job_exit_value'}) ) {    
       my $jobid = $json_hash->{'id'};
       if (!defined $jobid) {
-        print $log "$0::$subName INFO check failed: " 
+        print $log "$0::$subName WARN check failed: " 
           . "no jobid (id field)found in result";
         $result = 0;
       } else {
@@ -731,7 +775,7 @@ sub compare
         my $json = new JSON;
         my $res_hash = $json->utf8->decode($jobResult->{'body'});
         if (! defined $res_hash->{'status'}) {
-          print $log "$0::$subName INFO check failed: " 
+          print $log "$0::$subName WARN check failed: " 
             . "jobresult not defined ";
           $result = 0;
         }
@@ -739,10 +783,6 @@ sub compare
           my $jobComplete;
           my $NUM_RETRIES = 60;
           my $SLEEP_BETWEEN_RETRIES = 5;
-          if (defined $testCmd->{'timeout_seconds'} && $testCmd->{'timeout_seconds'} > 0) {
-            $SLEEP_BETWEEN_RETRIES = ($testCmd->{'timeout_seconds'} / $NUM_RETRIES);
-            print $log "found timeout_seconds & set SLEEP_BETWEEN_RETRIES=$SLEEP_BETWEEN_RETRIES";
-          }
 
           #first wait for job completion
           while ($NUM_RETRIES-- > 0) {
@@ -756,7 +796,7 @@ sub compare
             $res_hash = $json->utf8->decode($jobResult->{'body'});
           }
           if ( (!defined $jobComplete) || lc($jobComplete) ne "true") {
-            print $log "$0::$subName INFO check failed: " 
+            print $log "$0::$subName WARN check failed: " 
               . " timeout on wait for job completion ";
             $result = 0;
           } else { 
@@ -772,12 +812,140 @@ sub compare
             if (defined($testCmd->{'check_job_exit_value'})) {
               my $exitValue = $res_hash->{'exitValue'};
               my $expectedExitValue = $testCmd->{'check_job_exit_value'};
-              if ( (!defined $exitValue) || $exitValue ne $expectedExitValue) {
+              if ( (!defined $exitValue) || $exitValue % 128 ne $expectedExitValue) {
                 print $log "check_job_exit_value failed. got exitValue $exitValue,  expected  $expectedExitValue";
                 $result = 0;
               }
             }
           }
+
+	  #Check userargs
+	  print $log "$0::$subName INFO Checking userargs";
+          my @options = @{$testCmd->{'post_options'}};
+          if( !defined $res_hash->{'userargs'}){
+            print $log "$0::$subName INFO expected userargs" 
+                . " but userargs not defined\n";
+            $result = 0;
+          }
+
+	  #create exp_userargs hash from @options
+          my %exp_userargs = ();
+          foreach my $opt ( @options ){
+            print $log "opt $opt";
+            my ($key, $val) = split q:=:, $opt, 2;   
+            if(defined $exp_userargs{$key}){
+
+              #if we have already seen this value
+              #then make the value an array and push new value in
+              if(ref($exp_userargs{$key}) eq ""){
+                my @ar = ($exp_userargs{$key});
+                $exp_userargs{$key} = \@ar;
+              }
+              my $ar = $exp_userargs{$key}; 
+              push @$ar, ($val); 
+            }
+            else{
+              $exp_userargs{$key} = $val;	
+            }
+          }
+
+          my %r_userargs = %{$res_hash->{'userargs'}};
+          foreach my $key( keys %exp_userargs){
+            if( !defined $r_userargs{$key}){
+              print $log "$0::$subName INFO $key not found in userargs \n";
+              $result = 0;
+              next;
+            }
+              
+            print $log "$0::$subName DEBUG comparing expected " 
+                . " $key ->" . dump($exp_userargs{$key})
+                . " With result $key ->" . dump($r_userargs{$key}) . "\n";
+
+            if (!Compare($exp_userargs{$key}, $r_userargs{$key})) {
+              print $log "$0::$subName WARN check failed:" 
+                  . " json compare failed. For field "
+                  . "$key, regex <" . dump($r_userargs{$key})
+                  . "> did not match the result <" . dump($exp_userargs{$key})
+                  . ">\n";
+              $result = 0;
+            }
+          }
+		  if ($result != 0 && $testCmd->{'check_logs'}) {
+            my $testCmdBasics = $self->copyTestBasicConfig($testCmd);
+            $testCmdBasics->{'method'} = 'GET';
+            $testCmdBasics->{'url'} = ':WEBHDFS_URL:/webhdfs/v1:OUTDIR:' . '/status/logs?op=LISTSTATUS';
+            my $curl_result = $self->execCurlCmd($testCmdBasics, "", $log);
+            my $path = JSON::Path->new("FileStatuses.FileStatus[*].pathSuffix");
+            my @value = $path->values($curl_result->{'body'});
+            if ($testCmd->{'check_logs'}->{'job_num'} && $testCmd->{'check_logs'}->{'job_num'} ne (scalar @value)-1) {
+              print $log "$0::$subName INFO check failed: "
+                . " Expected " . $testCmd->{'check_logs'}->{'job_num'} . " jobs in logs, but got " . ((scalar @value) - 1) . "\n";
+              $result = 0;
+              return $result;
+            }
+            foreach my $jobid (@value) {
+              if ($jobid eq 'list.txt') {
+                next;
+              }
+              my $testCmdBasics = $self->copyTestBasicConfig($testCmd);
+              $testCmdBasics->{'method'} = 'GET';
+              $testCmdBasics->{'url'} = ':WEBHDFS_URL:/webhdfs/v1:OUTDIR:' . '/status/logs/' . $jobid . '?op=LISTSTATUS';
+              my $curl_result = $self->execCurlCmd($testCmdBasics, "", $log);
+
+              my $path = JSON::Path->new("FileStatuses.FileStatus[*]");
+              my @value = $path->values($curl_result->{'body'});
+
+              my $foundjobconf = 0;
+              foreach my $elem (@value) {
+                if ($elem->{'pathSuffix'} eq "job.xml.html") {
+                  $foundjobconf = 1;
+                  if ($elem->{'length'} eq "0") {
+                    print $log "$0::$subName INFO check failed: "
+                      . " job.xml.html for " . $jobid . " is empty\n";
+                    $result = 0;
+                    return $result;
+                  }
+                  next;
+                }
+                my $attempt = $elem->{'pathSuffix'};
+                my $testCmdBasics = $self->copyTestBasicConfig($testCmd);
+                $testCmdBasics->{'method'} = 'GET';
+                $testCmdBasics->{'url'} = ':WEBHDFS_URL:/webhdfs/v1:OUTDIR:' . '/status/logs/' . $jobid . '/' . $attempt . '?op=LISTSTATUS';
+                my $curl_result = $self->execCurlCmd($testCmdBasics, "", $log);
+                my $path = JSON::Path->new("FileStatuses.FileStatus[*].pathSuffix");
+                my @value = $path->values($curl_result->{'body'});
+                my @files = ('stderr', 'stdout', 'syslog');
+                foreach my $file (@files) {
+                  if ( !grep( /$file/, @value ) ) {
+                    print $log "$0::$subName INFO check failed: "
+                      . " Cannot find " . $file . " in logs/" . $attempt . "\n";
+                    $result = 0;
+                    return $result;
+                  }
+                }
+                $path = JSON::Path->new("FileStatuses.FileStatus[*].length");
+                @value = $path->values($curl_result->{'body'});
+                my $foundnonzerofile = 0;
+                foreach my $length (@value) {
+                  if ($length ne "0") {
+                    $foundnonzerofile = 1;
+                  }
+                }
+                if (!$foundnonzerofile) {
+                  print $log "$0::$subName INFO check failed: "
+                    . " All files in logs/" . $attempt . " are empty\n";
+                  $result = 0;
+                  return $result;
+                }
+              }
+              if (!$foundjobconf) {
+                print $log "$0::$subName INFO check failed: "
+                  . " Cannot find job.xml.html for " . $jobid . "\n";
+                $result = 0;
+                return $result;
+              }
+            }
+          }
         }
       }
     }
@@ -1357,5 +1525,13 @@ sub tmpIPCRunJoinStdoe {
   return ( $? );
 }
 
-
+sub isWindows
+{
+    if($^O =~ /mswin/i) {
+        return 1;
+    }
+    else {
+        return 0;
+    }
+}
 1;
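The check_logs branch added above walks the WebHDFS LISTSTATUS output under statusdir/logs: it counts job directories (ignoring list.txt), then verifies each attempt directory contains stderr, stdout, and syslog, with at least one non-empty file. A minimal Python sketch of that validation logic — the response shape is the standard WebHDFS FileStatuses/FileStatus JSON, and all function names here are illustrative, not part of the harness:

```python
# Illustrative only: mirrors the check_logs walk from the Perl harness above.
# A WebHDFS LISTSTATUS response is a dict shaped like
# {"FileStatuses": {"FileStatus": [{"pathSuffix": ..., "length": ...}, ...]}}.

REQUIRED_FILES = ("stderr", "stdout", "syslog")

def entries(liststatus):
    """Extract the FileStatus list from a LISTSTATUS response."""
    return liststatus["FileStatuses"]["FileStatus"]

def count_jobs(status_logs):
    """Count job directories under status/logs, ignoring list.txt."""
    names = [e["pathSuffix"] for e in entries(status_logs)]
    return len([n for n in names if n != "list.txt"])

def check_attempt(attempt_listing):
    """An attempt directory passes if stderr/stdout/syslog all exist
    and at least one of them is non-empty."""
    by_name = {e["pathSuffix"]: int(e["length"]) for e in entries(attempt_listing)}
    if any(f not in by_name for f in REQUIRED_FILES):
        return False
    return any(length > 0 for length in by_name.values())
```

The Perl version additionally requires a non-empty job.xml.html per job; the sketch omits that step for brevity.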

Modified: hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/ddl.conf
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/ddl.conf?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/ddl.conf (original)
+++ hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/ddl.conf Tue Oct  1 04:48:44 2013
@@ -284,7 +284,7 @@ $cfg = 
                                 #	  'json_field_substr_match' => {'table-name' => 'templeton_testtab1'}, 
      'json_field_match_object' => { 'columns' => '[
                  { "name" : "i", "type" : "int", "comment" : "from deserializer" },
-                 { "name" : "j", "type" : "bigint", "comment" : "from deserializer"  }
+                 { "name" : "j", "type" : "bigint", "comment" : "from deserializer" }
            ]' },
     },
     {                           #drop table
@@ -354,7 +354,7 @@ $cfg = 
                       "fieldsTerminatedBy" : "\u0001",
                       "collectionItemsTerminatedBy" : "\u0002",
                       "mapKeysTerminatedBy" : "\u0003",
-                      "linesTerminatedBy" : "\n",
+                      "linesTerminatedBy" : "\\\n",
 
                       "serde" : {
                         "name" : "org.apache.hadoop.hive.serde2.columnar.ColumnarSerDe",
@@ -406,6 +406,7 @@ $cfg = 
                   "type" : "bigint", "comment" : "from deserializer"
               },
 	     {
+                  "comment" : "IP Address of the User",
                   "name" : "ip",
                   "type" : "string", "comment" : "from deserializer"
               }
@@ -516,15 +517,7 @@ $cfg = 
      'url' => ':TEMPLETON_URL:/templeton/v1/ddl',
      'status_code' => 200,
      'post_options' => ['user.name=:UNAME:',
-                        'exec=create table if not exists templetontest_parts (i int, j bigint, ip STRING COMMENT "IP Address of the User")
-COMMENT "This is the page view table"
- PARTITIONED BY(dt STRING, country STRING)
-ROW FORMAT DELIMITED
-  FIELDS TERMINATED BY "\001"
-  COLLECTION ITEMS TERMINATED BY "\002"
-  MAP KEYS TERMINATED BY "\003"
-STORED AS rcfile
---LOCATION "table1_location" '],
+                        'exec=create table if not exists templetontest_parts (i int, j bigint, ip STRING COMMENT \'IP Address of the User\') COMMENT \'This is the page view table\'  PARTITIONED BY(dt STRING, country STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY \'\001\'  COLLECTION ITEMS TERMINATED BY \'\002\'  MAP KEYS TERMINATED BY \'\003\' STORED AS rcfile --LOCATION \'table1_location\' '],
      'json_field_substr_match' => {'stderr' => 'OK'},
     },
     {
@@ -632,6 +625,7 @@ STORED AS rcfile
 		   "type" : "bigint", "comment" : "from deserializer"
 	       },
 	       {
+		   "comment" : "IP Address of the User",
 		   "name" : "ip",
 		   "type" : "string", "comment" : "from deserializer"
 	      }
@@ -684,6 +678,7 @@ STORED AS rcfile
 		   "type" : "bigint", "comment" : "from deserializer"
 	       },
 	       {
+		   "comment" : "IP Address of the User",
 		   "name" : "ip",
 		   "type" : "string", "comment" : "from deserializer"
 	      }
@@ -732,7 +727,7 @@ STORED AS rcfile
      'status_code' => 404,
      'json_field_substr_match' => 
        {
-        'error' => 'FAILED: SemanticException \[Error 10006\]: Partition not found \{dt=20120101\, country=IN\}'
+        'error' => 'Partition not found {dt=20120101, country=IN}'
        },
     },
 
@@ -761,7 +756,7 @@ STORED AS rcfile
      'method' => 'POST',
      'url' => ':TEMPLETON_URL:/templeton/v1/ddl?user.name=:UNAME:',
      'status_code' => 200,
-     'post_options' => ['user.name=:UNAME:','exec=create table if not exists templeton_testcol_tab (i int comment "column with comment", j bigint) STORED AS rcfile;'],
+     'post_options' => ['user.name=:UNAME:','exec=create table if not exists templeton_testcol_tab (i int comment \'column with comment\', j bigint) STORED AS rcfile;'],
      'json_field_substr_match' => {'stderr' => 'OK'},
     },
     {
@@ -779,7 +774,7 @@ STORED AS rcfile
      'json_field_match_object' => 
      {
       'columns' => '[
-                 { "name" : "i", "type" : "int", "comment" : "from deserializer"},
+                 { "name" : "i", "type" : "int", "comment" : "from deserializer" },
                  { "name" : "j", "type" : "bigint", "comment" : "from deserializer" }
            ]' 
      },
@@ -1088,9 +1083,7 @@ STORED AS rcfile
      'status_code' => 200,
      'post_options' => ['user.name=:UNAME:',
                         'permissions=---------',
-                        'exec=create table templetontest_hcatgp(i int, j bigint)  
-                         PARTITIONED BY(dt STRING, country STRING)
-                         STORED AS rcfile;'
+                        'exec=create table templetontest_hcatgp(i int, j bigint) PARTITIONED BY(dt STRING, country STRING) STORED AS rcfile;'
                        ],
      'json_field_substr_match' => {'stderr' => 'OK', 'exitcode' => '^0$'}
     },

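Several hunks above rewrite multi-line DDL into a single line with escaped single quotes; the exec statement is delivered to the /templeton/v1/ddl endpoint as a form-encoded POST field, where embedded newlines and unescaped double quotes are fragile. A rough Python illustration of how such a statement survives form encoding — the user name is a made-up placeholder standing in for :UNAME::

```python
from urllib.parse import urlencode

# Illustrative only: flatten the DDL to one line with single quotes, as the
# ddl.conf changes above do, then form-encode it as the 'exec' POST field.
ddl = ("create table if not exists templeton_testcol_tab "
       "(i int comment 'column with comment', j bigint) STORED AS rcfile;")
body = urlencode({"user.name": "hadoopqa", "exec": ddl})
# Single quotes become %27 and spaces become '+', so the whole statement
# travels intact as a single field value.
```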
Modified: hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/jobsubmission.conf
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/jobsubmission.conf?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/jobsubmission.conf (original)
+++ hive/branches/tez/hcatalog/src/test/e2e/templeton/tests/jobsubmission.conf Tue Oct  1 04:48:44 2013
@@ -35,79 +35,76 @@ $cfg = 
  [
 ##=============================================================================================================
   {
-   'name' => 'TestStreaming',
+   'name' => 'TestKillJob',
    'tests' => 
    [
     {
      'num' => 1,
      'method' => 'POST',
      'url' => ':TEMPLETON_URL:/templeton/v1/mapreduce/streaming',
-     'post_options' => ['user.name=:UNAME:','input=:INPDIR_HDFS:/nums.txt','output=:OUTDIR:/mycounts', 
-                        'mapper=/bin/cat', 'reducer=/usr/bin/wc'],
-     'json_field_substr_match' => { 'id' => '\d+'},
-                                #results
-     'status_code' => 200,
-     'check_job_created' => 1,
-     'check_job_complete' => 'SUCCESS',
-     'check_job_exit_value' => 0,
-     'check_call_back' => 1,
-    },
-    {
-     #-ve test - no input file
-     'num' => 2,
-     'ignore' => 'wait for fix in hadoop 1.0.3',
-     'method' => 'POST',
-     'url' => ':TEMPLETON_URL:/templeton/v1/mapreduce/streaming',
-     'post_options' => ['user.name=:UNAME:','input=:INPDIR_HDFS:/nums.txt','output=:OUTDIR:/mycounts', 
-                        'mapper=/bin/ls no_such-file-12e3', 'reducer=/usr/bin/wc'],
+     'post_options' => ['user.name=:UNAME:','input=:INPDIR_HDFS:/nums.txt',
+                        'input=:INPDIR_HDFS:/nums.txt',
+                        'output=:OUTDIR:/mycounts', 
+                        'mapper=sleep 100', 'reducer=wc'],
      'json_field_substr_match' => { 'id' => '\d+'},
                                 #results
      'status_code' => 200,
      'check_job_created' => 1,
-     'check_job_complete' => 'FAILURE',
-     'check_call_back' => 1,
+     'check_job_complete' => 'KILLED',
+#     'check_call_back' => 1, #TODO - enable call back check after fix
+     'kill_job_timeout' => 10,
     },
-
    ]
   },
 ##=============================================================================================================
   {
-   'name' => 'TestKillJob',
+   'name' => 'TestMapReduce',
    'tests' => 
    [
     {
+         
      'num' => 1,
      'method' => 'POST',
-     'url' => ':TEMPLETON_URL:/templeton/v1/mapreduce/streaming',
-     'post_options' => ['user.name=:UNAME:','input=:INPDIR_HDFS:/nums.txt','output=:OUTDIR:/mycounts', 
-                        'mapper=/bin/sleep 100', 'reducer=/usr/bin/wc'],
+     'url' => ':TEMPLETON_URL:/templeton/v1/mapreduce/jar',
+     'post_options' => ['user.name=:UNAME:','arg=:INPDIR_HDFS:/nums.txt', 'arg= :OUTDIR:/wc.txt', 
+                        'jar=:INPDIR_HDFS:/hexamples.jar', 'class=wordcount', ],
      'json_field_substr_match' => { 'id' => '\d+'},
                                 #results
      'status_code' => 200,
      'check_job_created' => 1,
-     'check_job_complete' => 'KILLED',
-#     'check_call_back' => 1, #TODO - enable call back check after fix
-     'kill_job_timeout' => 10,
+     'check_job_complete' => 'SUCCESS',
+     'check_job_exit_value' => 0,
+     'check_call_back' => 1,
     },
-   ]
-  },
-##=============================================================================================================
-  {
-   'name' => 'TestMapReduce',
-   'tests' => 
-   [
     {
          
-     'num' => 1,
+     'num' => 2,
+     'method' => 'POST',
+     'url' => ':TEMPLETON_URL:/templeton/v1/mapreduce/jar',
+     'post_options' => ['user.name=:UNAME:','arg=-mt', 'arg=660000', 
+                        'jar=:INPDIR_HDFS:/hexamples.jar', 'class=sleep', ],
+     'json_field_substr_match' => { 'id' => '\d+'},
+                                #results
+     'status_code' => 200,
+     'check_job_created' => 1,
+     'check_job_complete' => 'SUCCESS',
+     'check_job_exit_value' => 0,
+     'check_call_back' => 1,
+     'timeout' => 840, #increase timeout as this test takes long
+    },
+    {
+     # with log enabled 
+     'num' => 3,
      'method' => 'POST',
      'url' => ':TEMPLETON_URL:/templeton/v1/mapreduce/jar',
      'post_options' => ['user.name=:UNAME:','arg=:INPDIR_HDFS:/nums.txt', 'arg= :OUTDIR:/wc.txt', 
-                        'jar=:INPDIR_HDFS:/hexamples.jar', 'class=wordcount', ],
+                        'jar=:INPDIR_HDFS:/hexamples.jar', 'class=wordcount', 'statusdir=:OUTDIR:/status', 'enablelog=true' ],
      'json_field_substr_match' => { 'id' => '\d+'},
                                 #results
      'status_code' => 200,
      'check_job_created' => 1,
-     'check_job_complete' => 'SUCCESS', 
+     'check_job_complete' => 'SUCCESS',
+     'check_logs' => { 'job_num' => '1' },
      'check_job_exit_value' => 0,
      'check_call_back' => 1,
     },
@@ -120,7 +117,6 @@ $cfg = 
    [
     {
                                 #test syntax error
-     'ignore' => 'fails in current version',
      'num' => 1,
      'method' => 'POST',
      'url' => ':TEMPLETON_URL:/templeton/v1/pig',
@@ -129,7 +125,6 @@ $cfg = 
                                 #results
      'status_code' => 200,
      'check_job_created' => 1,
-     'check_job_complete' => 'FAILURE', 
      'check_job_exit_value' => 8,
      'check_call_back' => 1,
     },
@@ -143,8 +138,7 @@ $cfg = 
                                 #results
      'status_code' => 200,
      'check_job_created' => 1,
-     'check_job_exit_value' => 0,
-     'check_job_complete' => 'SUCCESS', 
+     'check_job_complete' => 'SUCCESS',
      'check_job_exit_value' => 0,
      'check_call_back' => 1,
     },
@@ -158,7 +152,7 @@ $cfg = 
                                 #results
      'status_code' => 200,
      'check_job_created' => 1,
-     'check_job_complete' => 'SUCCESS', 
+     'check_job_complete' => 'SUCCESS',
      'check_job_exit_value' => 0,
      'check_call_back' => 1,
     },
@@ -228,21 +222,50 @@ $cfg = 
     {
                                 #no file to be copied, should result in launcher job error 
      'num' => 8,
-     ignore => 'check is disabled for now in templeton',
      'method' => 'POST',
      'url' => ':TEMPLETON_URL:/templeton/v1/pig',
-     'post_options' => ['user.name=:UNAME:', 'arg=-p', 'arg=INPDIR=:INPDIR_HDFS:','arg=-p', 'arg= OUTDIR=:OUTDIR:', 'file=:INPDIR_HDFS:/no_such_file.pig',
+     'post_options' => ['user.name=:UNAME:', 'arg=-p', 'arg=INPDIR=:INPDIR_HDFS:','arg=-p', 'arg=OUTDIR=:OUTDIR:', 'file=:INPDIR_HDFS:/no_such_file.pig',
                         'files=:INPDIR_HDFS:/rowcountmacro.pig' ],
      'json_field_substr_match' => { 'error' => 'does not exist'},
                                 #results
-     'status_code' => 200,
-     'check_job_complete' => 'FAILURE', 
+     'status_code' => 400,
 
     },
 	
+	{
+                                #Auto add quote around args
+     'ignore' => 'MS9 feature, will reenable later',
+     'num' => 9,
+       'method' => 'POST',
+     'url' => ':TEMPLETON_URL:/templeton/v1/pig',
+     'post_options' => ['user.name=:UNAME:','arg=-check', 'file=:INPDIR_HDFS:/loadstore.pig', 'arg=-p', 'arg=INPDIR=:INPDIR_HDFS:','arg=-p', 'arg=OUTDIR=:OUTDIR:', ],
+     'json_field_substr_match' => { 'id' => '\d+'},
+                                #results
+     'status_code' => 200,
+     'check_job_created' => 1,
+     'check_job_complete' => 'SUCCESS',
+     'check_job_exit_value' => 0,
+     'check_call_back' => 1,
+    },
 
+    {
+                                #a simple load store script with log enabled
+     'num' => 9,
+     'method' => 'POST',
+     'url' => ':TEMPLETON_URL:/templeton/v1/pig',
+     'post_options' => ['user.name=:UNAME:', 'arg=-p', 'arg=INPDIR=:INPDIR_HDFS:','arg=-p', 'arg=OUTDIR=:OUTDIR:', 'file=:INPDIR_HDFS:/loadstore.pig',
+                    'statusdir=:OUTDIR:/status', 'enablelog=true'],
+     'json_field_substr_match' => { 'id' => '\d+'},
+                                #results
+     'status_code' => 200,
+     'check_job_created' => 1,
+     'check_job_complete' => 'SUCCESS',
+     'check_logs' => { 'job_num' => '1' },
+     'check_job_exit_value' => 0,
+     'check_call_back' => 1,
+    },
 
-    #test 9
+    #test 10
     #TODO jython test
 
 
@@ -257,7 +280,6 @@ $cfg = 
    [
     {
                                 #test syntax error
-     'ignore' => 'fails in current version',
      'num' => 1,
      'method' => 'POST',
      'url' => ':TEMPLETON_URL:/templeton/v1/hive',
@@ -266,8 +288,7 @@ $cfg = 
                                 #results
      'status_code' => 200,
      'check_job_created' => 1,
-     'check_job_complete' => 'FAILURE', 
-     'check_job_exit_value' => 11,
+     'check_job_exit_value' => 64,
 
     },
  
@@ -305,7 +326,7 @@ $cfg = 
      'num' => 4,
      'method' => 'POST',
      'url' => ':TEMPLETON_URL:/templeton/v1/hive',
-     'post_options' => ['user.name=:UNAME:','execute=create external table mynums(a int, b int) location ":INPDIR_HDFS:/numstable/";', ],
+     'post_options' => ['user.name=:UNAME:','execute=create external table mynums(a int, b int) location \':INPDIR_HDFS:/numstable/\';', ],
      'json_field_substr_match' => { 'id' => '\d+'},
                                 #results
      'status_code' => 200,
@@ -412,7 +433,49 @@ $cfg = 
    ]
   },
 
-
+    {
+                                #test add jar
+     'num' => 9,
+     'method' => 'POST',
+     'url' => ':TEMPLETON_URL:/templeton/v1/hive',
+     'post_options' => ['user.name=:UNAME:','execute=add jar piggybank.jar', 'files=:INPDIR_HDFS:/piggybank.jar',],
+     'json_field_substr_match' => { 'id' => '\d+'},
+                                #results
+     'status_code' => 200,
+     'check_job_created' => 1,
+     'check_job_complete' => 'SUCCESS', 
+     'check_job_exit_value' => 0,
+     'check_call_back' => 1,
+    },
+    {
+                                #test add jar when the jar is not shipped
+     'num' => 10,
+     'method' => 'POST',
+     'url' => ':TEMPLETON_URL:/templeton/v1/hive',
+     'post_options' => ['user.name=:UNAME:','execute=add jar piggybank.jar',],
+     'json_field_substr_match' => { 'id' => '\d+'},
+                                #results
+     'status_code' => 200,
+     'check_job_created' => 1,
+     'check_job_complete' => 'SUCCESS', 
+     'check_job_exit_value' => 1,
+     'check_call_back' => 1,
+    }, 
+    {
+                                #enable logs
+     'num' => 11,
+     'method' => 'POST',
+     'url' => ':TEMPLETON_URL:/templeton/v1/hive',	
+     'post_options' => ['user.name=:UNAME:','execute=select a,b from mynums', 'statusdir=:OUTDIR:/status', 'enablelog=true'],
+     'json_field_substr_match' => { 'id' => '\d+'},
+                                #results
+     'status_code' => 200,
+     'check_job_created' => 1,
+     'check_job_complete' => 'SUCCESS',
+     'check_logs' => { 'job_num' => '1' },
+     'check_job_exit_value' => 0,
+     'check_call_back' => 1,
+    },
 
 
 

Modified: hive/branches/tez/hcatalog/storage-handlers/hbase/src/java/org/apache/hcatalog/hbase/HBaseBaseOutputFormat.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/storage-handlers/hbase/src/java/org/apache/hcatalog/hbase/HBaseBaseOutputFormat.java?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/storage-handlers/hbase/src/java/org/apache/hcatalog/hbase/HBaseBaseOutputFormat.java (original)
+++ hive/branches/tez/hcatalog/storage-handlers/hbase/src/java/org/apache/hcatalog/hbase/HBaseBaseOutputFormat.java Tue Oct  1 04:48:44 2013
@@ -25,6 +25,7 @@ import java.util.Properties;
 import org.apache.hadoop.fs.FileSystem;
 import org.apache.hadoop.fs.Path;
 import org.apache.hadoop.hbase.client.Put;
+import org.apache.hadoop.hive.ql.io.FSRecordWriter;
 import org.apache.hadoop.hive.ql.io.HiveOutputFormat;
 import org.apache.hadoop.io.Writable;
 import org.apache.hadoop.io.WritableComparable;
@@ -40,7 +41,7 @@ public class HBaseBaseOutputFormat imple
   HiveOutputFormat<WritableComparable<?>, Put> {
 
   @Override
-  public org.apache.hadoop.hive.ql.exec.FileSinkOperator.RecordWriter getHiveRecordWriter(
+  public FSRecordWriter getHiveRecordWriter(
     JobConf jc, Path finalOutPath,
     Class<? extends Writable> valueClass, boolean isCompressed,
     Properties tableProperties, Progressable progress)

Modified: hive/branches/tez/hcatalog/storage-handlers/hbase/src/test/org/apache/hcatalog/hbase/TestHCatHBaseInputFormat.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/storage-handlers/hbase/src/test/org/apache/hcatalog/hbase/TestHCatHBaseInputFormat.java?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/storage-handlers/hbase/src/test/org/apache/hcatalog/hbase/TestHCatHBaseInputFormat.java (original)
+++ hive/branches/tez/hcatalog/storage-handlers/hbase/src/test/org/apache/hcatalog/hbase/TestHCatHBaseInputFormat.java Tue Oct  1 04:48:44 2013
@@ -229,7 +229,7 @@ public class TestHCatHBaseInputFormat ex
     // Note: These asserts only works in case of LocalJobRunner as they run in same jvm.
     // If using MiniMRCluster, the tests will have to be modified.
     assertFalse(MapReadHTable.error);
-    assertEquals(MapReadHTable.count, 1);
+    assertEquals(1, MapReadHTable.count);
 
     String dropTableQuery = "DROP TABLE " + hbaseTableName;
     CommandProcessorResponse responseThree = hcatDriver.run(dropTableQuery);
@@ -291,7 +291,7 @@ public class TestHCatHBaseInputFormat ex
     job.setNumReduceTasks(0);
     assertTrue(job.waitForCompletion(true));
     assertFalse(MapReadProjHTable.error);
-    assertEquals(MapReadProjHTable.count, 1);
+    assertEquals(1, MapReadProjHTable.count);
 
     String dropTableQuery = "DROP TABLE " + tableName;
     CommandProcessorResponse responseThree = hcatDriver.run(dropTableQuery);
@@ -325,7 +325,7 @@ public class TestHCatHBaseInputFormat ex
         HCatUtil.serialize(getHiveConf().getAllProperties()));
 
     // output settings
-    Path outputDir = new Path(getTestDir(), "mapred/testHBaseTableProjectionReadMR");
+    Path outputDir = new Path(getTestDir(), "mapred/testHBaseInputFormatProjectionReadMR");
     FileSystem fs = getFileSystem();
     if (fs.exists(outputDir)) {
       fs.delete(outputDir, true);
@@ -361,8 +361,8 @@ public class TestHCatHBaseInputFormat ex
     RunningJob runJob = JobClient.runJob(job);
     runJob.waitForCompletion();
     assertTrue(runJob.isSuccessful());
-    assertFalse(MapReadProjHTable.error);
-    assertEquals(MapReadProjHTable.count, 1);
+    assertFalse(MapReadProjectionHTable.error);
+    assertEquals(1, MapReadProjectionHTable.count);
 
     String dropTableQuery = "DROP TABLE " + tableName;
     CommandProcessorResponse responseThree = hcatDriver.run(dropTableQuery);

Modified: hive/branches/tez/hcatalog/webhcat/svr/src/main/config/webhcat-default.xml
URL: http://svn.apache.org/viewvc/hive/branches/tez/hcatalog/webhcat/svr/src/main/config/webhcat-default.xml?rev=1527883&r1=1527882&r2=1527883&view=diff
==============================================================================
--- hive/branches/tez/hcatalog/webhcat/svr/src/main/config/webhcat-default.xml (original)
+++ hive/branches/tez/hcatalog/webhcat/svr/src/main/config/webhcat-default.xml Tue Oct  1 04:48:44 2013
@@ -80,32 +80,38 @@
   </property>
 
   <property>
+    <name>templeton.python</name>
+    <value>${env.PYTHON_CMD}</value>
+    <description>The path to the python executable.</description>
+  </property>
+
+  <property>
     <name>templeton.pig.archive</name>
-    <value>hdfs:///apps/templeton/pig-0.10.1.tar.gz</value>
+    <value></value>
     <description>The path to the Pig archive.</description>
   </property>
 
   <property>
     <name>templeton.pig.path</name>
-    <value>pig-0.10.1.tar.gz/pig-0.10.1/bin/pig</value>
+    <value>pig-0.11.1.tar.gz/pig-0.11.1/bin/pig</value>
     <description>The path to the Pig executable.</description>
   </property>
 
   <property>
     <name>templeton.hcat</name>
-    <value>${env.HCAT_PREFIX}/bin/hcat</value>
+    <value>${env.HCAT_PREFIX}/bin/hcat.py</value>
     <description>The path to the hcatalog executable.</description>
   </property>
 
   <property>
     <name>templeton.hive.archive</name>
-    <value>hdfs:///apps/templeton/hive-0.10.0.tar.gz</value>
+    <value></value>
     <description>The path to the Hive archive.</description>
   </property>
 
   <property>
     <name>templeton.hive.path</name>
-    <value>hive-0.10.0.tar.gz/hive-0.10.0/bin/hive</value>
+    <value>hive-0.11.0.tar.gz/hive-0.11.0/bin/hive</value>
     <description>The path to the Hive executable.</description>
   </property>
 
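The values above use ${env.NAME} placeholders (e.g. ${env.PYTHON_CMD}, ${env.HCAT_PREFIX}) that the server's configuration layer expands from the process environment at startup. A speculative Python sketch of that substitution pattern, purely for illustration — the real expansion is done by the configuration machinery, not by code like this:

```python
import os
import re

# Illustrative only: expand '${env.NAME}' references, as seen in the
# webhcat-default.xml values above, against an environment mapping.
_ENV_REF = re.compile(r"\$\{env\.([A-Za-z_][A-Za-z0-9_]*)\}")

def expand_env(value, env=os.environ):
    """Replace each ${env.NAME} with env[NAME]; unknown refs are left as-is."""
    def repl(match):
        return env.get(match.group(1), match.group(0))
    return _ENV_REF.sub(repl, value)
```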


