hive-commits mailing list archives

From gunt...@apache.org
Subject svn commit: r1555193 [1/6] - in /hive/branches/tez: ./ ant/src/org/apache/hadoop/hive/ant/ common/src/java/org/apache/hadoop/hive/common/type/ common/src/test/org/apache/hadoop/hive/common/type/ itests/hive-unit/src/test/java/org/apache/hadoop/hive/jdb...
Date Fri, 03 Jan 2014 18:37:36 GMT
Author: gunther
Date: Fri Jan  3 18:37:34 2014
New Revision: 1555193

URL: http://svn.apache.org/r1555193
Log:
Merge latest trunk into branch. (Gunther Hagleitner)

Added:
    hive/branches/tez/common/src/java/org/apache/hadoop/hive/common/type/Decimal128.java
      - copied unchanged from r1555192, hive/trunk/common/src/java/org/apache/hadoop/hive/common/type/Decimal128.java
    hive/branches/tez/common/src/java/org/apache/hadoop/hive/common/type/SignedInt128.java
      - copied unchanged from r1555192, hive/trunk/common/src/java/org/apache/hadoop/hive/common/type/SignedInt128.java
    hive/branches/tez/common/src/java/org/apache/hadoop/hive/common/type/SqlMathUtil.java
      - copied unchanged from r1555192, hive/trunk/common/src/java/org/apache/hadoop/hive/common/type/SqlMathUtil.java
    hive/branches/tez/common/src/java/org/apache/hadoop/hive/common/type/UnsignedInt128.java
      - copied unchanged from r1555192, hive/trunk/common/src/java/org/apache/hadoop/hive/common/type/UnsignedInt128.java
    hive/branches/tez/common/src/test/org/apache/hadoop/hive/common/type/TestDecimal128.java
      - copied unchanged from r1555192, hive/trunk/common/src/test/org/apache/hadoop/hive/common/type/TestDecimal128.java
    hive/branches/tez/common/src/test/org/apache/hadoop/hive/common/type/TestSignedInt128.java
      - copied unchanged from r1555192, hive/trunk/common/src/test/org/apache/hadoop/hive/common/type/TestSignedInt128.java
    hive/branches/tez/common/src/test/org/apache/hadoop/hive/common/type/TestSqlMathUtil.java
      - copied unchanged from r1555192, hive/trunk/common/src/test/org/apache/hadoop/hive/common/type/TestSqlMathUtil.java
    hive/branches/tez/common/src/test/org/apache/hadoop/hive/common/type/TestUnsignedInt128.java
      - copied unchanged from r1555192, hive/trunk/common/src/test/org/apache/hadoop/hive/common/type/TestUnsignedInt128.java
    hive/branches/tez/metastore/scripts/upgrade/mysql/015-HIVE-5700.mysql.sql
      - copied unchanged from r1555192, hive/trunk/metastore/scripts/upgrade/mysql/015-HIVE-5700.mysql.sql
    hive/branches/tez/metastore/scripts/upgrade/oracle/015-HIVE-5700.oracle.sql
      - copied unchanged from r1555192, hive/trunk/metastore/scripts/upgrade/oracle/015-HIVE-5700.oracle.sql
    hive/branches/tez/metastore/scripts/upgrade/postgres/015-HIVE-5700.postgres.sql
      - copied unchanged from r1555192, hive/trunk/metastore/scripts/upgrade/postgres/015-HIVE-5700.postgres.sql
    hive/branches/tez/ql/src/test/queries/clientcompare/
      - copied from r1555192, hive/trunk/ql/src/test/queries/clientcompare/
    hive/branches/tez/ql/src/test/queries/clientpositive/vectorized_case.q
      - copied unchanged from r1555192, hive/trunk/ql/src/test/queries/clientpositive/vectorized_case.q
    hive/branches/tez/ql/src/test/results/clientpositive/vectorized_case.q.out
      - copied unchanged from r1555192, hive/trunk/ql/src/test/results/clientpositive/vectorized_case.q.out
    hive/branches/tez/ql/src/test/templates/TestCompareCliDriver.vm
      - copied unchanged from r1555192, hive/trunk/ql/src/test/templates/TestCompareCliDriver.vm
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TBinaryColumn.java
      - copied unchanged from r1555192, hive/trunk/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TBinaryColumn.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TBoolColumn.java
      - copied unchanged from r1555192, hive/trunk/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TBoolColumn.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TByteColumn.java
      - copied unchanged from r1555192, hive/trunk/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TByteColumn.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TDoubleColumn.java
      - copied unchanged from r1555192, hive/trunk/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TDoubleColumn.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TI16Column.java
      - copied unchanged from r1555192, hive/trunk/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TI16Column.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TI32Column.java
      - copied unchanged from r1555192, hive/trunk/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TI32Column.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TI64Column.java
      - copied unchanged from r1555192, hive/trunk/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TI64Column.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TStringColumn.java
      - copied unchanged from r1555192, hive/trunk/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TStringColumn.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/Column.java
      - copied unchanged from r1555192, hive/trunk/service/src/java/org/apache/hive/service/cli/Column.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/ColumnBasedSet.java
      - copied unchanged from r1555192, hive/trunk/service/src/java/org/apache/hive/service/cli/ColumnBasedSet.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/RowBasedSet.java
      - copied unchanged from r1555192, hive/trunk/service/src/java/org/apache/hive/service/cli/RowBasedSet.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/RowSetFactory.java
      - copied unchanged from r1555192, hive/trunk/service/src/java/org/apache/hive/service/cli/RowSetFactory.java
Removed:
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/Row.java
Modified:
    hive/branches/tez/   (props changed)
    hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java
    hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hadoop/hive/jdbc/TestJdbcDriver.java
    hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/TestJdbcDriver2.java
    hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/miniHS2/TestHiveServer2.java
    hive/branches/tez/itests/qtest/pom.xml
    hive/branches/tez/itests/util/src/main/java/org/apache/hadoop/hive/ql/QTestUtil.java
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveBaseResultSet.java
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveConnection.java
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveDatabaseMetaData.java
    hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveQueryResultSet.java
    hive/branches/tez/metastore/scripts/upgrade/mysql/014-HIVE-3764.mysql.sql
    hive/branches/tez/metastore/scripts/upgrade/mysql/upgrade-0.12.0-to-0.13.0.mysql.sql
    hive/branches/tez/metastore/scripts/upgrade/oracle/upgrade-0.12.0-to-0.13.0.oracle.sql
    hive/branches/tez/metastore/scripts/upgrade/postgres/011-HIVE-3649.postgres.sql
    hive/branches/tez/metastore/scripts/upgrade/postgres/014-HIVE-3764.postgres.sql
    hive/branches/tez/metastore/scripts/upgrade/postgres/upgrade-0.12.0-to-0.13.0.postgres.sql
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/FetchFormatter.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/ListSinkOperator.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/RoleDDLDesc.java
    hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/ShowGrantDesc.java
    hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_3.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_4.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_5.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_7.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/authorization_part.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/unset_table_property.q.out
    hive/branches/tez/ql/src/test/results/clientnegative/unset_view_property.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/alter_rename_partition_authorization.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/authorization_1.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/authorization_2.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/authorization_3.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/authorization_4.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/authorization_5.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/authorization_6.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/keyword_1.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/show_tblproperties.q.out
    hive/branches/tez/ql/src/test/results/clientpositive/unset_table_view_property.q.out
    hive/branches/tez/serde/src/java/org/apache/hadoop/hive/serde2/SerDeUtils.java
    hive/branches/tez/service/if/TCLIService.thrift
    hive/branches/tez/service/src/gen/thrift/gen-cpp/TCLIService_types.cpp
    hive/branches/tez/service/src/gen/thrift/gen-cpp/TCLIService_types.h
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TColumn.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TExecuteStatementReq.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TGetTablesReq.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TOpenSessionReq.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TOpenSessionResp.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TProtocolVersion.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TRowSet.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TStatus.java
    hive/branches/tez/service/src/gen/thrift/gen-javabean/org/apache/hive/service/cli/thrift/TTypeQualifiers.java
    hive/branches/tez/service/src/gen/thrift/gen-py/TCLIService/ttypes.py
    hive/branches/tez/service/src/gen/thrift/gen-rb/t_c_l_i_service_types.rb
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/CLIService.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/ColumnValue.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/OperationHandle.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/RowSet.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/SessionHandle.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/TableSchema.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/operation/GetCatalogsOperation.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/operation/GetColumnsOperation.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/operation/GetFunctionsOperation.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/operation/GetSchemasOperation.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/operation/GetTableTypesOperation.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/operation/GetTablesOperation.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/operation/GetTypeInfoOperation.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/operation/HiveCommandOperation.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/operation/Operation.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/operation/SQLOperation.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/session/HiveSession.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/session/HiveSessionImpl.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/session/HiveSessionImplwithUGI.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/session/SessionManager.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/thrift/EmbeddedThriftBinaryCLIService.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/thrift/ThriftCLIService.java
    hive/branches/tez/service/src/java/org/apache/hive/service/cli/thrift/ThriftCLIServiceClient.java
    hive/branches/tez/service/src/test/org/apache/hive/service/cli/thrift/ThriftCLIServiceTest.java

Propchange: hive/branches/tez/
------------------------------------------------------------------------------
  Merged /hive/trunk:r1554720-1555192

Modified: hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java (original)
+++ hive/branches/tez/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java Fri Jan  3 18:37:34 2014
@@ -290,7 +290,6 @@ public class QTestGenTask extends Task {
   }
 
   public void execute() throws BuildException {
-
     if (getTemplatePath().equals("")) {
       throw new BuildException("No templatePath attribute specified");
     }
@@ -311,10 +310,6 @@ public class QTestGenTask extends Task {
       throw new BuildException("No logDirectory specified");
     }
 
-    if (resultsDirectory == null) {
-      throw new BuildException("No resultsDirectory specified");
-    }
-
     if (className == null) {
       throw new BuildException("No className specified");
     }
@@ -384,8 +379,7 @@ public class QTestGenTask extends Task {
         qFilesMap.put(qFile.getName(), relativePath(hiveRootDir, qFile));
       }
 
-      // Make sure the output directory exists, if it doesn't
-      // then create it.
+      // Make sure the output directory exists, if it doesn't then create it.
       outDir = new File(outputDirectory);
       if (!outDir.exists()) {
         outDir.mkdirs();
@@ -395,15 +389,19 @@ public class QTestGenTask extends Task {
       if (!logDir.exists()) {
         throw new BuildException("Log Directory " + logDir.getCanonicalPath() + " does not exist");
       }
-      
-      resultsDir = new File(resultsDirectory);
-      if (!resultsDir.exists()) {
-        throw new BuildException("Results Directory " + resultsDir.getCanonicalPath() + " does not exist");
+
+      if (resultsDirectory != null) {
+        resultsDir = new File(resultsDirectory);
+        if (!resultsDir.exists()) {
+          throw new BuildException("Results Directory "
+              + resultsDir.getCanonicalPath() + " does not exist");
+        }
       }
     } catch (Exception e) {
+      e.printStackTrace();
       throw new BuildException(e);
     }
-    
+
     VelocityEngine ve = new VelocityEngine();
 
     try {
@@ -439,7 +437,9 @@ public class QTestGenTask extends Task {
       ctx.put("queryDir", relativePath(hiveRootDir, queryDir));
       ctx.put("qfiles", qFiles);
       ctx.put("qfilesMap", qFilesMap);
-      ctx.put("resultsDir", relativePath(hiveRootDir, resultsDir));
+      if (resultsDir != null) {
+        ctx.put("resultsDir", relativePath(hiveRootDir, resultsDir));
+      }
       ctx.put("logDir", relativePath(hiveRootDir, logDir));
       ctx.put("clusterMode", clusterMode);
       ctx.put("hiveConfDir", hiveConfDir);
@@ -462,6 +462,7 @@ public class QTestGenTask extends Task {
     } catch(ResourceNotFoundException e) {
       throw new BuildException("Resource not found", e);
     } catch(Exception e) {
+      e.printStackTrace();
       throw new BuildException("Generation failed", e);
     }
   }

Modified: hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hadoop/hive/jdbc/TestJdbcDriver.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hadoop/hive/jdbc/TestJdbcDriver.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hadoop/hive/jdbc/TestJdbcDriver.java (original)
+++ hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hadoop/hive/jdbc/TestJdbcDriver.java Fri Jan  3 18:37:34 2014
@@ -1140,4 +1140,45 @@ public class TestJdbcDriver extends Test
     stmt.close();
   }
 
+  public void testShowGrant() throws SQLException {
+    Statement stmt = con.createStatement();
+    stmt.execute("grant select on table " + dataTypeTableName + " to user hive_test_user");
+    stmt.execute("show grant user hive_test_user on table " + dataTypeTableName);
+
+    ResultSet res = stmt.getResultSet();
+    assertTrue(res.next());
+    assertEquals("database", res.getString(1));
+    assertEquals("default", res.getString(2));
+    assertTrue(res.next());
+    assertEquals("table", res.getString(1));
+    assertEquals(dataTypeTableName, res.getString(2));
+    assertTrue(res.next());
+    assertEquals("principalName", res.getString(1));
+    assertEquals("hive_test_user", res.getString(2));
+    assertTrue(res.next());
+    assertEquals("principalType", res.getString(1));
+    assertEquals("USER", res.getString(2));
+    assertTrue(res.next());
+    assertEquals("privilege", res.getString(1));
+    assertEquals("Select", res.getString(2));
+    assertTrue(res.next());
+    assertEquals("grantTime", res.getString(1));
+    assertTrue(res.next());
+    assertEquals("grantor", res.getString(1));
+    assertFalse(res.next());
+    res.close();
+  }
+
+  public void testShowRoleGrant() throws SQLException {
+    Statement stmt = con.createStatement();
+    stmt.execute("create role role1");
+    stmt.execute("grant role role1 to user hive_test_user");
+    stmt.execute("show role grant user hive_test_user");
+
+    ResultSet res = stmt.getResultSet();
+    assertTrue(res.next());
+    assertEquals("role1", res.getString(1));
+    assertFalse(res.next());
+    res.close();
+  }
 }

Modified: hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/TestJdbcDriver2.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/TestJdbcDriver2.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/TestJdbcDriver2.java (original)
+++ hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/TestJdbcDriver2.java Fri Jan  3 18:37:34 2014
@@ -593,8 +593,8 @@ public class TestJdbcDriver2 {
   /**
    * Execute "set x" and extract value from key=val format result
    * Verify the extracted value
-   * @param stmt
-   * @return
+   * @param key
+   * @param expectedVal
    * @throws Exception
    */
   private void verifyConfValue(String key, String expectedVal) throws Exception {
@@ -1928,4 +1928,50 @@ public class TestJdbcDriver2 {
     }
   }
 
+  public void testShowGrant() throws SQLException {
+    Statement stmt = con.createStatement();
+    stmt.execute("grant select on table " + dataTypeTableName + " to user hive_test_user");
+    stmt.execute("show grant user hive_test_user on table " + dataTypeTableName);
+
+    ResultSet res = stmt.getResultSet();
+    ResultSetMetaData metaData = res.getMetaData();
+
+    assertEquals("property", metaData.getColumnName(1));
+    assertEquals("value", metaData.getColumnName(2));
+
+    assertTrue(res.next());
+    assertEquals("database", res.getString(1));
+    assertEquals("default", res.getString(2));
+    assertTrue(res.next());
+    assertEquals("table", res.getString(1));
+    assertEquals(dataTypeTableName, res.getString(2));
+    assertTrue(res.next());
+    assertEquals("principalName", res.getString(1));
+    assertEquals("hive_test_user", res.getString(2));
+    assertTrue(res.next());
+    assertEquals("principalType", res.getString(1));
+    assertEquals("USER", res.getString(2));
+    assertTrue(res.next());
+    assertEquals("privilege", res.getString(1));
+    assertEquals("Select", res.getString(2));
+    assertTrue(res.next());
+    assertEquals("grantTime", res.getString(1));
+    assertTrue(res.next());
+    assertEquals("grantor", res.getString(1));
+    assertFalse(res.next());
+    res.close();
+  }
+
+  public void testShowRoleGrant() throws SQLException {
+    Statement stmt = con.createStatement();
+    stmt.execute("create role role1");
+    stmt.execute("grant role role1 to user hive_test_user");
+    stmt.execute("show role grant user hive_test_user");
+
+    ResultSet res = stmt.getResultSet();
+    assertTrue(res.next());
+    assertEquals("role1", res.getString(1));
+    assertFalse(res.next());
+    res.close();
+  }
 }

Modified: hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/miniHS2/TestHiveServer2.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/miniHS2/TestHiveServer2.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/miniHS2/TestHiveServer2.java (original)
+++ hive/branches/tez/itests/hive-unit/src/test/java/org/apache/hive/jdbc/miniHS2/TestHiveServer2.java Fri Jan  3 18:37:34 2014
@@ -64,6 +64,6 @@ public class TestHiveServer2 {
     serviceClient.executeStatement(sessHandle, "CREATE TABLE " + tabName + " (id INT)", confOverlay);
     OperationHandle opHandle = serviceClient.executeStatement(sessHandle, "SHOW TABLES", confOverlay);
     RowSet rowSet = serviceClient.fetchResults(opHandle);
-    assertFalse(rowSet.getSize() == 0);
+    assertFalse(rowSet.numRows() == 0);
   }
 }
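
The one-line change above tracks the RowSet refactoring in this merge (Row.java removed; Column, ColumnBasedSet, RowBasedSet, and RowSetFactory added): row counts now come from numRows() rather than the removed getSize(). As a minimal sketch of the calling pattern, with RowSet reduced to a hypothetical stand-in interface rather than the full Hive API:

```java
// Sketch only: RowSet here is a stand-in with just the accessor this
// commit renames; the real org.apache.hive.service.cli.RowSet has more.
interface RowSet {
  int numRows();
}

public class RowSetCheck {
  // Mirrors the updated assertion in TestHiveServer2: non-empty means
  // numRows() != 0 (getSize() no longer exists after this merge).
  static boolean hasRows(RowSet rowSet) {
    return rowSet.numRows() > 0;
  }

  public static void main(String[] args) {
    RowSet empty = new RowSet() { public int numRows() { return 0; } };
    System.out.println(hasRows(empty)); // false
  }
}
```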

Modified: hive/branches/tez/itests/qtest/pom.xml
URL: http://svn.apache.org/viewvc/hive/branches/tez/itests/qtest/pom.xml?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/itests/qtest/pom.xml (original)
+++ hive/branches/tez/itests/qtest/pom.xml Fri Jan  3 18:37:34 2014
@@ -389,6 +389,7 @@
                   classpath="${test.classpath}" />
                 <mkdir dir="${project.build.directory}/qfile-results/clientpositive/" />
                 <mkdir dir="${project.build.directory}/qfile-results/clientnegative/" />
+                <mkdir dir="${project.build.directory}/qfile-results/clientcompare"/>
                 <mkdir dir="${project.build.directory}/qfile-results/positive/" />
                 <mkdir dir="${project.build.directory}/qfile-results/negative/" />
                 <mkdir dir="${project.build.directory}/qfile-results/hbase-handler/positive/" />
@@ -457,6 +458,20 @@
                   logDirectory="${project.build.directory}/qfile-results/clientnegative/"
                   hadoopVersion="${active.hadoop.version}"/>
 
+                <!-- Compare Cli -->
+                <qtestgen hiveRootDirectory="${basedir}/${hive.path.to.root}/"
+                  outputDirectory="${project.build.directory}/generated-test-sources/java/org/apache/hadoop/hive/cli/"
+                  templatePath="${basedir}/${hive.path.to.root}/ql/src/test/templates/" template="TestCompareCliDriver.vm"
+                  queryDirectory="${basedir}/${hive.path.to.root}/ql/src/test/queries/clientcompare/"
+                  queryFile="${qfile}"
+                  queryFileRegex="${qfile_regex}"
+                  clusterMode="${clustermode}"
+                  runDisabled="${run_disabled}"
+                  className="TestCompareCliDriver"
+                  logFile="${project.build.directory}/testcompareclidrivergen.log"
+                  logDirectory="${project.build.directory}/qfile-results/clientcompare/"
+                  hadoopVersion="${active.hadoop.version}"/>
+
                 <!-- Minimr -->
                 <qtestgen hiveRootDirectory="${basedir}/${hive.path.to.root}/"
                   outputDirectory="${project.build.directory}/generated-test-sources/java/org/apache/hadoop/hive/cli/"

Modified: hive/branches/tez/itests/util/src/main/java/org/apache/hadoop/hive/ql/QTestUtil.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/itests/util/src/main/java/org/apache/hadoop/hive/ql/QTestUtil.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/itests/util/src/main/java/org/apache/hadoop/hive/ql/QTestUtil.java (original)
+++ hive/branches/tez/itests/util/src/main/java/org/apache/hadoop/hive/ql/QTestUtil.java Fri Jan  3 18:37:34 2014
@@ -29,6 +29,7 @@ import java.io.FileNotFoundException;
 import java.io.FileOutputStream;
 import java.io.FileReader;
 import java.io.FileWriter;
+import java.io.FilenameFilter;
 import java.io.IOException;
 import java.io.InputStreamReader;
 import java.io.OutputStreamWriter;
@@ -38,6 +39,7 @@ import java.io.StringWriter;
 import java.io.UnsupportedEncodingException;
 import java.net.URL;
 import java.util.ArrayList;
+import java.util.Arrays;
 import java.util.Collection;
 import java.util.Deque;
 import java.util.HashMap;
@@ -91,11 +93,16 @@ import org.apache.hadoop.mapred.Sequence
 import org.apache.hadoop.mapred.TextInputFormat;
 import org.apache.hadoop.util.Shell;
 import org.apache.thrift.protocol.TBinaryProtocol;
+import org.apache.tools.ant.BuildException;
 import org.apache.zookeeper.WatchedEvent;
 import org.apache.zookeeper.Watcher;
 import org.apache.zookeeper.ZooKeeper;
 import org.junit.Assume;
 
+import com.google.common.collect.Collections2;
+import com.google.common.collect.ImmutableList;
+import com.google.common.collect.Ordering;
+
 /**
  * QTestUtil.
  *
@@ -414,12 +421,21 @@ public class QTestUtil {
   }
 
   public void addFile(String queryFile) throws IOException {
+    addFile(queryFile, false);
+  }
+
+  public void addFile(String queryFile, boolean partial) throws IOException {
     addFile(new File(queryFile));
   }
 
-  public void addFile(File qf) throws IOException  {
+  public void addFile(File qf) throws IOException {
+    addFile(qf, false);
+  }
+
+  public void addFile(File qf, boolean partial) throws IOException  {
     String query = readEntireFileIntoString(qf);
     qMap.put(qf.getName(), query);
+    if (partial) return;
 
     if(checkHadoopVersionExclude(qf.getName(), query)
       || checkOSExclude(qf.getName(), query)) {
@@ -794,7 +810,7 @@ public class QTestUtil {
     cliInit(tname, true);
   }
 
-  public void cliInit(String tname, boolean recreate) throws Exception {
+  public String cliInit(String tname, boolean recreate) throws Exception {
     if (recreate) {
       cleanUp();
       createSources();
@@ -806,10 +822,17 @@ public class QTestUtil {
     assert ss != null;
     ss.in = System.in;
 
-    File qf = new File(outDir, tname);
-    File outf = null;
-    outf = new File(logDir);
-    outf = new File(outf, qf.getName().concat(".out"));
+    String stdoutName = null;
+    if (outDir != null) {
+      // TODO: why is this needed?
+      File qf = new File(outDir, tname);
+      stdoutName = qf.getName().concat(".out");
+    } else {
+      stdoutName = tname + ".out";
+    }
+
+    File outf = new File(logDir);
+    outf = new File(outf, stdoutName);
     FileOutputStream fo = new FileOutputStream(outf);
     ss.out = new PrintStream(fo, true, "UTF-8");
     ss.err = new CachingPrintStream(fo, true, "UTF-8");
@@ -830,6 +853,7 @@ public class QTestUtil {
       ss.initFiles.add("../../data/scripts/test_init_file.sql");
     }
     cliDriver.processInitFiles(ss);
+    return outf.getAbsolutePath();
   }
 
   private CliSessionState startSessionState()
@@ -870,7 +894,17 @@ public class QTestUtil {
     }
   }
 
+  private static final String CRLF = System.getProperty("line.separator");
+  public int executeClient(String tname1, String tname2) {
+    String commands = getCommands(tname1) + CRLF + getCommands(tname2);
+    return cliDriver.processLine(commands);
+  }
+
   public int executeClient(String tname) {
+    return cliDriver.processLine(getCommands(tname));
+  }
+
+  private String getCommands(String tname) {
     String commands = qMap.get(tname);
     StringBuilder newCommands = new StringBuilder(commands.length());
     int lastMatchEnd = 0;
@@ -882,7 +916,7 @@ public class QTestUtil {
     }
     newCommands.append(commands.substring(lastMatchEnd, commands.length()));
     commands = newCommands.toString();
-    return cliDriver.processLine(commands);
+    return commands;
   }
 
   public boolean shouldBeSkipped(String tname) {
@@ -1233,6 +1267,22 @@ public class QTestUtil {
     return exitVal;
   }
 
+
+  public int checkCompareCliDriverResults(String tname, List<String> outputs) throws Exception {
+    assert outputs.size() > 1;
+    maskPatterns(planMask, outputs.get(0));
+    for (int i = 1; i < outputs.size(); ++i) {
+      maskPatterns(planMask, outputs.get(i));
+      int ecode = executeDiffCommand(
+          outputs.get(i - 1), outputs.get(i), false, qSortSet.contains(tname));
+      if (ecode != 0) {
+        System.out.println("Files don't match: " + outputs.get(i - 1) + " and " + outputs.get(i));
+        return ecode;
+      }
+    }
+    return 0;
+  }
+
   private static int overwriteResults(String inFileName, String outFileName) throws Exception {
     // This method can be replaced with Files.copy(source, target, REPLACE_EXISTING)
     // once Hive uses JAVA 7.
@@ -1605,4 +1655,58 @@ public class QTestUtil {
       return path + File.separator;
     }
   }
+
+  private static String[] cachedQvFileList = null;
+  private static ImmutableList<String> cachedDefaultQvFileList = null;
+  private static Pattern qvSuffix = Pattern.compile("_[0-9]+.qv$", Pattern.CASE_INSENSITIVE);
+
+  public static List<String> getVersionFiles(String queryDir, String tname) {
+    ensureQvFileList(queryDir);
+    List<String> result = getVersionFilesInternal(tname);
+    if (result == null) {
+      result = cachedDefaultQvFileList;
+    }
+    return result;
+  }
+
+  private static void ensureQvFileList(String queryDir) {
+    if (cachedQvFileList != null) return;
+    // Not thread-safe.
+    System.out.println("Getting versions from " + queryDir);
+    cachedQvFileList = (new File(queryDir)).list(new FilenameFilter() {
+      @Override
+      public boolean accept(File dir, String name) {
+        return name.toLowerCase().endsWith(".qv");
+      }
+    });
+    if (cachedQvFileList == null) return; // no files at all
+    Arrays.sort(cachedQvFileList, String.CASE_INSENSITIVE_ORDER);
+    List<String> defaults = getVersionFilesInternal("default");
+    cachedDefaultQvFileList = (defaults != null)
+        ? ImmutableList.copyOf(defaults) : ImmutableList.<String>of();
+  }
+
+  private static List<String> getVersionFilesInternal(String tname) {
+    if (cachedQvFileList == null) {
+      return new ArrayList<String>();
+    }
+    int pos = Arrays.binarySearch(cachedQvFileList, tname, String.CASE_INSENSITIVE_ORDER);
+    if (pos >= 0) {
+      throw new BuildException("Unexpected file list element: " + cachedQvFileList[pos]);
+    }
+    List<String> result = null;
+    for (pos = (-pos - 1); pos < cachedQvFileList.length; ++pos) {
+      String candidate = cachedQvFileList[pos];
+      if (candidate.length() <= tname.length()
+          || !tname.equalsIgnoreCase(candidate.substring(0, tname.length()))
+          || !qvSuffix.matcher(candidate.substring(tname.length())).matches()) {
+        break;
+      }
+      if (result == null) {
+        result = new ArrayList<String>();
+      }
+      result.add(candidate);
+    }
+    return result;
+  }
 }
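
The `getVersionFilesInternal` hunk above finds every file named `<tname>_<N>.qv` by binary-searching the case-insensitively sorted file list for the insertion point of `tname`, then scanning forward while the prefix and numeric `.qv` suffix still match. A standalone sketch of that prefix scan (file names are invented for illustration, and `IllegalStateException` stands in for ant's `BuildException`):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;

public class QvPrefixScan {
    private static final Pattern QV_SUFFIX =
        Pattern.compile("_[0-9]+\\.qv$", Pattern.CASE_INSENSITIVE);

    // Files must be sorted with String.CASE_INSENSITIVE_ORDER so that the
    // binary search's insertion point lands at the start of the matching block.
    static List<String> versionFiles(String[] sortedFiles, String tname) {
        int pos = Arrays.binarySearch(sortedFiles, tname, String.CASE_INSENSITIVE_ORDER);
        if (pos >= 0) {
            // A bare test name should never exist as a file in the list.
            throw new IllegalStateException("Unexpected file list element: " + sortedFiles[pos]);
        }
        List<String> result = new ArrayList<String>();
        // -pos - 1 recovers the insertion point from binarySearch's encoding.
        for (pos = -pos - 1; pos < sortedFiles.length; ++pos) {
            String candidate = sortedFiles[pos];
            if (candidate.length() <= tname.length()
                || !tname.equalsIgnoreCase(candidate.substring(0, tname.length()))
                || !QV_SUFFIX.matcher(candidate.substring(tname.length())).matches()) {
                break; // past the block of "<tname>_<N>.qv" entries
            }
            result.add(candidate);
        }
        return result;
    }

    public static void main(String[] args) {
        String[] files = {"join_1.qv", "join_2.qv", "union_1.qv"};
        Arrays.sort(files, String.CASE_INSENSITIVE_ORDER);
        System.out.println(versionFiles(files, "join"));   // [join_1.qv, join_2.qv]
        System.out.println(versionFiles(files, "order"));  // []
    }
}
```

One design note: throwing when the search hits an exact match doubles as a sanity check, since a positive hit would mean the directory contains a file whose whole name equals the test name, which the suffix convention rules out.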

Modified: hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveBaseResultSet.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveBaseResultSet.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveBaseResultSet.java (original)
+++ hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveBaseResultSet.java Fri Jan  3 18:37:34 2014
@@ -44,25 +44,17 @@ import java.util.Map;
 
 import org.apache.hive.service.cli.TableSchema;
 import org.apache.hive.service.cli.Type;
-import org.apache.hive.service.cli.thrift.TBoolValue;
-import org.apache.hive.service.cli.thrift.TByteValue;
-import org.apache.hive.service.cli.thrift.TColumnValue;
-import org.apache.hive.service.cli.thrift.TDoubleValue;
-import org.apache.hive.service.cli.thrift.TI16Value;
-import org.apache.hive.service.cli.thrift.TI32Value;
-import org.apache.hive.service.cli.thrift.TI64Value;
-import org.apache.hive.service.cli.thrift.TRow;
-import org.apache.hive.service.cli.thrift.TStringValue;
 
 /**
  * Data independent base class which implements the common part of
  * all Hive result sets.
  */
 public abstract class HiveBaseResultSet implements ResultSet {
+
   protected Statement statement = null;
   protected SQLWarning warningChain = null;
   protected boolean wasNull = false;
-  protected TRow row;
+  protected Object[] row;
   protected List<String> columnNames;
   protected List<String> columnTypes;
   protected List<JdbcColumnAttributes> columnAttributes;
@@ -380,176 +372,47 @@ public abstract class HiveBaseResultSet 
     throw new SQLException("Method not supported");
   }
 
-  private Boolean getBooleanValue(TBoolValue tBoolValue) {
-    if (tBoolValue.isSetValue()) {
-      wasNull = false;
-      return tBoolValue.isValue();
-    }
-    wasNull = true;
-    return null;
-  }
-
-  private Byte getByteValue(TByteValue tByteValue) {
-    if (tByteValue.isSetValue()) {
-      wasNull = false;
-      return tByteValue.getValue();
-    }
-    wasNull = true;
-    return null;
-  }
-
-  private Short getShortValue(TI16Value tI16Value) {
-    if (tI16Value.isSetValue()) {
-      wasNull = false;
-      return tI16Value.getValue();
-    }
-    wasNull = true;
-    return null;
-  }
-
-  private Integer getIntegerValue(TI32Value tI32Value) {
-    if (tI32Value.isSetValue()) {
-      wasNull = false;
-      return tI32Value.getValue();
-    }
-    wasNull = true;
-    return null;
-  }
-
-  private Long getLongValue(TI64Value tI64Value) {
-    if (tI64Value.isSetValue()) {
-      wasNull = false;
-      return tI64Value.getValue();
-    }
-    wasNull = true;
-    return null;
-  }
-
-  private Double getDoubleValue(TDoubleValue tDoubleValue) {
-    if (tDoubleValue.isSetValue()) {
-      wasNull = false;
-      return tDoubleValue.getValue();
-    }
-    wasNull = true;
-    return null;
-  }
-
-  private String getStringValue(TStringValue tStringValue) {
-    if (tStringValue.isSetValue()) {
-      wasNull = false;
-      return tStringValue.getValue();
-    }
-    wasNull = true;
-    return null;
-  }
-
-  private Date getDateValue(TStringValue tStringValue) {
-    if (tStringValue.isSetValue()) {
-      wasNull = false;
-      return Date.valueOf(tStringValue.getValue());
-    }
-    wasNull = true;
-    return null;
-  }
-
-  private Timestamp getTimestampValue(TStringValue tStringValue) {
-    if (tStringValue.isSetValue()) {
-      wasNull = false;
-      return Timestamp.valueOf(tStringValue.getValue());
-    }
-    wasNull = true;
-    return null;
-  }
-
-  private byte[] getBinaryValue(TStringValue tString) {
-    if (tString.isSetValue()) {
-      wasNull = false;
-      return tString.getValue().getBytes();
-    }
-    wasNull = true;
-    return null;
-  }
-
-  private BigDecimal getBigDecimalValue(TStringValue tStringValue) {
-    if (tStringValue.isSetValue()) {
-      wasNull = false;
-      return new BigDecimal(tStringValue.getValue());
-    }
-    wasNull = true;
-    return null;
-  }
-
   private Object getColumnValue(int columnIndex) throws SQLException {
     if (row == null) {
       throw new SQLException("No row found.");
     }
-    List<TColumnValue> colVals = row.getColVals();
-    if (colVals == null) {
+    if (row.length == 0) {
       throw new SQLException("RowSet does not contain any columns!");
     }
-    if (columnIndex > colVals.size()) {
+    if (columnIndex > row.length) {
       throw new SQLException("Invalid columnIndex: " + columnIndex);
     }
-
-    TColumnValue tColumnValue = colVals.get(columnIndex - 1);
     Type columnType = getSchema().getColumnDescriptorAt(columnIndex - 1).getType();
 
-    switch (columnType) {
-    case BOOLEAN_TYPE:
-      return getBooleanValue(tColumnValue.getBoolVal());
-    case TINYINT_TYPE:
-      return getByteValue(tColumnValue.getByteVal());
-    case SMALLINT_TYPE:
-      return getShortValue(tColumnValue.getI16Val());
-    case INT_TYPE:
-      return getIntegerValue(tColumnValue.getI32Val());
-    case BIGINT_TYPE:
-      return getLongValue(tColumnValue.getI64Val());
-    case FLOAT_TYPE:
-      return getDoubleValue(tColumnValue.getDoubleVal());
-    case DOUBLE_TYPE:
-      return getDoubleValue(tColumnValue.getDoubleVal());
-    case STRING_TYPE:
-      return getStringValue(tColumnValue.getStringVal());
-    case CHAR_TYPE:
-      return getStringValue(tColumnValue.getStringVal());
-    case VARCHAR_TYPE:
-      return getStringValue(tColumnValue.getStringVal());
-    case BINARY_TYPE:
-      return getBinaryValue(tColumnValue.getStringVal());
-    case DATE_TYPE:
-      return getDateValue(tColumnValue.getStringVal());
-    case TIMESTAMP_TYPE:
-      return getTimestampValue(tColumnValue.getStringVal());
-    case DECIMAL_TYPE:
-      return getBigDecimalValue(tColumnValue.getStringVal());
-    case NULL_TYPE:
-      wasNull = true;
-      return null;
-    default:
-      throw new SQLException("Unrecognized column type:" + columnType);
+    try {
+      Object evaluated = evaluate(columnType, row[columnIndex - 1]);
+      wasNull = evaluated == null;
+      return evaluated;
+    } catch (Exception e) {
+      e.printStackTrace();
+      throw new SQLException("Unrecognized column type:" + columnType, e);
     }
+  }
 
-    /*
-    switch (tColumnValue.getSetField()) {
-    case BOOL_VAL:
-      return getBooleanValue(tColumnValue.getBoolVal());
-    case BYTE_VAL:
-      return getByteValue(tColumnValue.getByteVal());
-    case I16_VAL:
-      return getShortValue(tColumnValue.getI16Val());
-    case I32_VAL:
-      return getIntegerValue(tColumnValue.getI32Val());
-    case I64_VAL:
-      return getLongValue(tColumnValue.getI64Val());
-    case DOUBLE_VAL:
-      return getDoubleValue(tColumnValue.getDoubleVal());
-    case STRING_VAL:
-      return getStringValue(tColumnValue.getStringVal());
-    default:
-      throw new SQLException("Unrecognized column type:" + tColumnValue.getSetField());
+  private Object evaluate(Type type, Object value) {
+    if (value == null) {
+      return null;
+    }
+    switch (type) {
+      case BINARY_TYPE:
+        if (value instanceof String) {
+          return ((String) value).getBytes();
+        }
+        return value;
+      case TIMESTAMP_TYPE:
+        return Timestamp.valueOf((String) value);
+      case DECIMAL_TYPE:
+        return new BigDecimal((String)value);
+      case DATE_TYPE:
+        return Date.valueOf((String) value);
+      default:
+        return value;
     }
-    */
   }
 
   public Object getObject(int columnIndex) throws SQLException {

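The `HiveBaseResultSet` hunk above replaces the per-Thrift-type accessors with a generic `Object[]` row plus a single `evaluate` step: most values pass through unchanged, and only the types the wire format ships as strings (date, timestamp, decimal, binary) are parsed on the client. A minimal standalone sketch of that conversion, where the local `Type` enum is an invented stand-in for `org.apache.hive.service.cli.Type`:

```java
import java.math.BigDecimal;
import java.sql.Date;
import java.sql.Timestamp;

public class ColumnEval {
    // Stand-in for the subset of Hive column types that need post-processing.
    enum Type { INT_TYPE, STRING_TYPE, DATE_TYPE, TIMESTAMP_TYPE, DECIMAL_TYPE, BINARY_TYPE }

    // Mirrors the evaluate() hunk: null means SQL NULL (the caller derives
    // wasNull from a null result); string-encoded types are parsed here.
    static Object evaluate(Type type, Object value) {
        if (value == null) {
            return null;
        }
        switch (type) {
            case BINARY_TYPE:
                return (value instanceof String) ? ((String) value).getBytes() : value;
            case TIMESTAMP_TYPE:
                return Timestamp.valueOf((String) value);  // "yyyy-mm-dd hh:mm:ss[.f...]"
            case DATE_TYPE:
                return Date.valueOf((String) value);       // "yyyy-mm-dd"
            case DECIMAL_TYPE:
                return new BigDecimal((String) value);
            default:
                return value;                              // already the right Java type
        }
    }

    public static void main(String[] args) {
        System.out.println(evaluate(Type.DECIMAL_TYPE, "12.50"));    // 12.50
        System.out.println(evaluate(Type.DATE_TYPE, "2014-01-03"));  // 2014-01-03
        System.out.println(evaluate(Type.INT_TYPE, 42));             // 42
    }
}
```

Centralizing the parsing also lets the surrounding `getColumnValue` wrap all conversion failures in one `SQLException` instead of duplicating null-handling in a dozen typed getters.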
Modified: hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveConnection.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveConnection.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveConnection.java (original)
+++ hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveConnection.java Fri Jan  3 18:37:34 2014
@@ -93,12 +93,13 @@ public class HiveConnection implements j
   private final Map<String, String> hiveVarMap;
   private final boolean isEmbeddedMode;
   private TTransport transport;
-  private TCLIService.Iface client;
+  private TCLIService.Iface client;   // todo should be replaced by CliServiceClient
   private boolean isClosed = true;
   private SQLWarning warningChain = null;
   private TSessionHandle sessHandle = null;
   private final List<TProtocolVersion> supportedProtocols = new LinkedList<TProtocolVersion>();
   private int loginTimeout = 0;
+  private TProtocolVersion protocol;
 
   public HiveConnection(String uri, Properties info) throws SQLException {
     setupLoginTimeout();
@@ -138,6 +139,7 @@ public class HiveConnection implements j
     supportedProtocols.add(TProtocolVersion.HIVE_CLI_SERVICE_PROTOCOL_V3);
     supportedProtocols.add(TProtocolVersion.HIVE_CLI_SERVICE_PROTOCOL_V4);
     supportedProtocols.add(TProtocolVersion.HIVE_CLI_SERVICE_PROTOCOL_V5);
+    supportedProtocols.add(TProtocolVersion.HIVE_CLI_SERVICE_PROTOCOL_V6);
 
     // open client session
     openSession();
@@ -268,8 +270,10 @@ public class HiveConnection implements j
       if (!supportedProtocols.contains(openResp.getServerProtocolVersion())) {
         throw new TException("Unsupported Hive2 protocol");
       }
+      protocol = openResp.getServerProtocolVersion();
       sessHandle = openResp.getSessionHandle();
     } catch (TException e) {
+      e.printStackTrace();
       throw new SQLException("Could not establish connection to "
           + jdbcURI + ": " + e.getMessage(), " 08S01", e);
     }
@@ -932,4 +936,7 @@ public class HiveConnection implements j
     throw new SQLException("Method not supported");
   }
 
+  public TProtocolVersion getProtocol() {
+    return protocol;
+  }
 }

Modified: hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveDatabaseMetaData.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveDatabaseMetaData.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveDatabaseMetaData.java (original)
+++ hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveDatabaseMetaData.java Fri Jan  3 18:37:34 2014
@@ -135,7 +135,7 @@ public class HiveDatabaseMetaData implem
     }
     Utils.verifySuccess(catalogResp.getStatus());
 
-    return new HiveQueryResultSet.Builder(null)
+    return new HiveQueryResultSet.Builder(connection)
     .setClient(client)
     .setSessionHandle(sessHandle)
     .setStmtHandle(catalogResp.getOperationHandle())
@@ -219,7 +219,7 @@ public class HiveDatabaseMetaData implem
     }
     Utils.verifySuccess(colResp.getStatus());
     // build the resultset from response
-    return new HiveQueryResultSet.Builder(null)
+    return new HiveQueryResultSet.Builder(connection)
     .setClient(client)
     .setSessionHandle(sessHandle)
     .setStmtHandle(colResp.getOperationHandle())
@@ -331,7 +331,7 @@ public class HiveDatabaseMetaData implem
     }
     Utils.verifySuccess(funcResp.getStatus());
 
-    return new HiveQueryResultSet.Builder(null)
+    return new HiveQueryResultSet.Builder(connection)
     .setClient(client)
     .setSessionHandle(sessHandle)
     .setStmtHandle(funcResp.getOperationHandle())
@@ -344,7 +344,7 @@ public class HiveDatabaseMetaData implem
 
   public ResultSet getImportedKeys(String catalog, String schema, String table)
       throws SQLException {
-    return new HiveQueryResultSet.Builder(null)
+    return new HiveQueryResultSet.Builder(connection)
     .setClient(client)
     .setEmptyResultSet(true)
     .setSchema(
@@ -486,7 +486,7 @@ public class HiveDatabaseMetaData implem
       throws SQLException {
     // Hive doesn't support primary keys
     // using local schema with empty resultset
-    return new HiveQueryResultSet.Builder(null).setClient(client).setEmptyResultSet(true).
+    return new HiveQueryResultSet.Builder(connection).setClient(client).setEmptyResultSet(true).
         setSchema(Arrays.asList("TABLE_CAT", "TABLE_SCHEM", "TABLE_NAME", "COLUMN_NAME", "KEY_SEQ", "PK_NAME" ),
             Arrays.asList("STRING",    "STRING",      "STRING",     "STRING",       "INT",  "STRING"))
             .build();
@@ -497,7 +497,7 @@ public class HiveDatabaseMetaData implem
       throws SQLException {
     // Hive doesn't support primary keys
     // using local schema with empty resultset
-    return new HiveQueryResultSet.Builder(null).setClient(client).setEmptyResultSet(true).
+    return new HiveQueryResultSet.Builder(connection).setClient(client).setEmptyResultSet(true).
                   setSchema(
                     Arrays.asList("PROCEDURE_CAT", "PROCEDURE_SCHEM", "PROCEDURE_NAME", "COLUMN_NAME", "COLUMN_TYPE",
                               "DATA_TYPE", "TYPE_NAME", "PRECISION", "LENGTH", "SCALE", "RADIX", "NULLABLE", "REMARKS",
@@ -518,7 +518,7 @@ public class HiveDatabaseMetaData implem
       String procedureNamePattern) throws SQLException {
     // Hive doesn't support primary keys
     // using local schema with empty resultset
-    return new HiveQueryResultSet.Builder(null).setClient(client).setEmptyResultSet(true).
+    return new HiveQueryResultSet.Builder(connection).setClient(client).setEmptyResultSet(true).
                   setSchema(
                     Arrays.asList("PROCEDURE_CAT", "PROCEDURE_SCHEM", "PROCEDURE_NAME", "RESERVERD", "RESERVERD",
                                   "RESERVERD", "REMARKS", "PROCEDURE_TYPE", "SPECIFIC_NAME"),
@@ -572,7 +572,7 @@ public class HiveDatabaseMetaData implem
     }
     Utils.verifySuccess(schemaResp.getStatus());
 
-    return new HiveQueryResultSet.Builder(null)
+    return new HiveQueryResultSet.Builder(connection)
     .setClient(client)
     .setSessionHandle(sessHandle)
     .setStmtHandle(schemaResp.getOperationHandle())
@@ -616,7 +616,7 @@ public class HiveDatabaseMetaData implem
     }
     Utils.verifySuccess(tableTypeResp.getStatus());
 
-    return new HiveQueryResultSet.Builder(null)
+    return new HiveQueryResultSet.Builder(connection)
     .setClient(client)
     .setSessionHandle(sessHandle)
     .setStmtHandle(tableTypeResp.getOperationHandle())
@@ -649,7 +649,7 @@ public class HiveDatabaseMetaData implem
     }
     Utils.verifySuccess(getTableResp.getStatus());
 
-    return new HiveQueryResultSet.Builder(null)
+    return new HiveQueryResultSet.Builder(connection)
     .setClient(client)
     .setSessionHandle(sessHandle)
     .setStmtHandle(getTableResp.getOperationHandle())
@@ -705,7 +705,7 @@ public class HiveDatabaseMetaData implem
       throw new SQLException(e.getMessage(), "08S01", e);
     }
     Utils.verifySuccess(getTypeInfoResp.getStatus());
-    return new HiveQueryResultSet.Builder(null)
+    return new HiveQueryResultSet.Builder(connection)
     .setClient(client)
     .setSessionHandle(sessHandle)
     .setStmtHandle(getTypeInfoResp.getOperationHandle())

Modified: hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveQueryResultSet.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveQueryResultSet.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveQueryResultSet.java (original)
+++ hive/branches/tez/jdbc/src/java/org/apache/hive/jdbc/HiveQueryResultSet.java Fri Jan  3 18:37:34 2014
@@ -20,6 +20,7 @@ package org.apache.hive.jdbc;
 
 import static org.apache.hive.service.cli.thrift.TCLIServiceConstants.TYPE_NAMES;
 
+import java.sql.Connection;
 import java.sql.ResultSet;
 import java.sql.ResultSetMetaData;
 import java.sql.Statement;
@@ -31,6 +32,8 @@ import java.util.List;
 import org.apache.commons.logging.Log;
 import org.apache.commons.logging.LogFactory;
 import org.apache.hadoop.hive.common.type.HiveDecimal;
+import org.apache.hive.service.cli.RowSet;
+import org.apache.hive.service.cli.RowSetFactory;
 import org.apache.hive.service.cli.TableSchema;
 import org.apache.hive.service.cli.thrift.TCLIService;
 import org.apache.hive.service.cli.thrift.TCLIServiceConstants;
@@ -42,7 +45,8 @@ import org.apache.hive.service.cli.thrif
 import org.apache.hive.service.cli.thrift.TGetResultSetMetadataResp;
 import org.apache.hive.service.cli.thrift.TOperationHandle;
 import org.apache.hive.service.cli.thrift.TPrimitiveTypeEntry;
-import org.apache.hive.service.cli.thrift.TRow;
+import org.apache.hive.service.cli.thrift.TProtocolVersion;
+import org.apache.hive.service.cli.thrift.TRowSet;
 import org.apache.hive.service.cli.thrift.TSessionHandle;
 import org.apache.hive.service.cli.thrift.TTableSchema;
 import org.apache.hive.service.cli.thrift.TTypeQualifierValue;
@@ -63,15 +67,19 @@ public class HiveQueryResultSet extends 
   private int fetchSize;
   private int rowsFetched = 0;
 
-  private List<TRow> fetchedRows;
-  private Iterator<TRow> fetchedRowsItr;
+  private RowSet fetchedRows;
+  private Iterator<Object[]> fetchedRowsItr;
   private boolean isClosed = false;
   private boolean emptyResultSet = false;
   private boolean isScrollable = false;
   private boolean fetchFirst = false;
 
+  private final TProtocolVersion protocol;
+
+
   public static class Builder {
 
+    private final Connection connection;
     private final Statement statement;
     private TCLIService.Iface client = null;
     private TOperationHandle stmtHandle = null;
@@ -91,8 +99,14 @@ public class HiveQueryResultSet extends 
     private boolean emptyResultSet = false;
     private boolean isScrollable = false;
 
-    public Builder(Statement statement) {
+    public Builder(Statement statement) throws SQLException {
       this.statement = statement;
+      this.connection = statement.getConnection();
+    }
+
+    public Builder(Connection connection) {
+      this.statement = null;
+      this.connection = connection;
     }
 
     public Builder setClient(TCLIService.Iface client) {
@@ -155,6 +169,10 @@ public class HiveQueryResultSet extends 
     public HiveQueryResultSet build() throws SQLException {
       return new HiveQueryResultSet(this);
     }
+
+    public TProtocolVersion getProtocolVersion() throws SQLException {
+      return ((HiveConnection)connection).getProtocol();
+    }
   }
 
   protected HiveQueryResultSet(Builder builder) throws SQLException {
@@ -178,6 +196,7 @@ public class HiveQueryResultSet extends 
       this.maxRows = builder.maxRows;
     }
     this.isScrollable = builder.isScrollable;
+    this.protocol = builder.getProtocolVersion();
   }
 
   /**
@@ -217,6 +236,7 @@ public class HiveQueryResultSet extends 
    * Retrieve schema from the server
    */
   private void retrieveSchema() throws SQLException {
+    System.err.println("[HiveQueryResultSet/next] 0");
     try {
       TGetResultSetMetadataReq metadataReq = new TGetResultSetMetadataReq(stmtHandle);
       // TODO need session handle
@@ -304,13 +324,14 @@ public class HiveQueryResultSet extends 
         fetchedRowsItr = null;
         fetchFirst = false;
       }
-
       if (fetchedRows == null || !fetchedRowsItr.hasNext()) {
         TFetchResultsReq fetchReq = new TFetchResultsReq(stmtHandle,
             orientation, fetchSize);
         TFetchResultsResp fetchResp = client.FetchResults(fetchReq);
         Utils.verifySuccessWithInfo(fetchResp.getStatus());
-        fetchedRows = fetchResp.getResults().getRows();
+
+        TRowSet results = fetchResp.getResults();
+        fetchedRows = RowSetFactory.create(results, protocol);
         fetchedRowsItr = fetchedRows.iterator();
       }
 

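The `HiveQueryResultSet` hunks above swap the raw `List<TRow>` for a `RowSet` built by `RowSetFactory.create(results, protocol)`, so decoding depends on the negotiated protocol version. An assumption here, not stated in the diff itself: the newer protocol ships results column-oriented, and the JDBC layer transposes them back into `Object[]` rows. A minimal sketch of that transposition, with invented names (the real types live in `org.apache.hive.service.cli`):

```java
import java.util.Arrays;
import java.util.List;

public class ColumnarRows {
    // A column-oriented result set carries one value list per column;
    // re-assemble them into per-row Object[] for the ResultSet API.
    static List<Object[]> toRows(List<List<Object>> columns) {
        int numRows = columns.isEmpty() ? 0 : columns.get(0).size();
        Object[][] rows = new Object[numRows][columns.size()];
        for (int c = 0; c < columns.size(); ++c) {
            for (int r = 0; r < numRows; ++r) {
                rows[r][c] = columns.get(c).get(r);
            }
        }
        return Arrays.asList(rows);
    }

    public static void main(String[] args) {
        List<List<Object>> cols = Arrays.asList(
            Arrays.<Object>asList(1, 2, 3),          // an INT column
            Arrays.<Object>asList("a", "b", "c"));   // a STRING column
        for (Object[] row : toRows(cols)) {
            System.out.println(Arrays.toString(row));
        }
        // [1, a]
        // [2, b]
        // [3, c]
    }
}
```

Hiding the wire layout behind an `Iterator<Object[]>` is what lets the rest of `HiveBaseResultSet` stay unchanged whether the server sent rows or columns.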
Modified: hive/branches/tez/metastore/scripts/upgrade/mysql/014-HIVE-3764.mysql.sql
URL: http://svn.apache.org/viewvc/hive/branches/tez/metastore/scripts/upgrade/mysql/014-HIVE-3764.mysql.sql?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/metastore/scripts/upgrade/mysql/014-HIVE-3764.mysql.sql (original)
+++ hive/branches/tez/metastore/scripts/upgrade/mysql/014-HIVE-3764.mysql.sql Fri Jan  3 18:37:34 2014
@@ -1,3 +1,5 @@
+SELECT '< HIVE-3764 Support metastore version consistency check >' AS ' ';
+
 -- Table structure for VERSION
 CREATE TABLE IF NOT EXISTS `VERSION` (
   `VER_ID` BIGINT NOT NULL,

Modified: hive/branches/tez/metastore/scripts/upgrade/mysql/upgrade-0.12.0-to-0.13.0.mysql.sql
URL: http://svn.apache.org/viewvc/hive/branches/tez/metastore/scripts/upgrade/mysql/upgrade-0.12.0-to-0.13.0.mysql.sql?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/metastore/scripts/upgrade/mysql/upgrade-0.12.0-to-0.13.0.mysql.sql (original)
+++ hive/branches/tez/metastore/scripts/upgrade/mysql/upgrade-0.12.0-to-0.13.0.mysql.sql Fri Jan  3 18:37:34 2014
@@ -1,11 +1,6 @@
 SELECT 'Upgrading MetaStore schema from 0.12.0 to 0.13.0' AS ' ';
 
-UPDATE PARTITION_KEY_VALS
-  INNER JOIN PARTITIONS ON PARTITION_KEY_VALS.PART_ID = PARTITIONS.PART_ID
-  INNER JOIN PARTITION_KEYS ON PARTITION_KEYS.TBL_ID = PARTITIONS.TBL_ID
-    AND PARTITION_KEYS.INTEGER_IDX = PARTITION_KEY_VALS.INTEGER_IDX
-    AND PARTITION_KEYS.PKEY_TYPE = 'date'
-SET PART_KEY_VAL = IFNULL(DATE_FORMAT(cast(PART_KEY_VAL as date),'%Y-%m-%d'), PART_KEY_VAL);
+SOURCE 015-HIVE-5700.mysql.sql;
 
 UPDATE VERSION SET SCHEMA_VERSION='0.13.0', VERSION_COMMENT='Hive release version 0.13.0' where VER_ID=1;
 SELECT 'Finished upgrading MetaStore schema from 0.12.0 to 0.13.0' AS ' ';

Modified: hive/branches/tez/metastore/scripts/upgrade/oracle/upgrade-0.12.0-to-0.13.0.oracle.sql
URL: http://svn.apache.org/viewvc/hive/branches/tez/metastore/scripts/upgrade/oracle/upgrade-0.12.0-to-0.13.0.oracle.sql?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/metastore/scripts/upgrade/oracle/upgrade-0.12.0-to-0.13.0.oracle.sql (original)
+++ hive/branches/tez/metastore/scripts/upgrade/oracle/upgrade-0.12.0-to-0.13.0.oracle.sql Fri Jan  3 18:37:34 2014
@@ -1,28 +1,6 @@
 SELECT 'Upgrading MetaStore schema from 0.12.0 to 0.13.0' AS Status from dual;
 
-CREATE FUNCTION hive13_to_date(date_str IN VARCHAR2)
-RETURN DATE
-IS dt DATE;
-BEGIN
-  dt := TO_DATE(date_str, 'YYYY-MM-DD');
-  RETURN dt;
-EXCEPTION
-  WHEN others THEN RETURN null;
-END;
-/
-
-MERGE INTO PARTITION_KEY_VALS
-USING (
-  SELECT SRC.PART_ID as IPART_ID, SRC.INTEGER_IDX as IINTEGER_IDX, 
-     NVL(TO_CHAR(hive13_to_date(PART_KEY_VAL),'YYYY-MM-DD'), PART_KEY_VAL) as NORM
-  FROM PARTITION_KEY_VALS SRC
-    INNER JOIN PARTITIONS ON SRC.PART_ID = PARTITIONS.PART_ID
-    INNER JOIN PARTITION_KEYS ON PARTITION_KEYS.TBL_ID = PARTITIONS.TBL_ID
-      AND PARTITION_KEYS.INTEGER_IDX = SRC.INTEGER_IDX AND PARTITION_KEYS.PKEY_TYPE = 'date'
-) ON (IPART_ID = PARTITION_KEY_VALS.PART_ID AND IINTEGER_IDX = PARTITION_KEY_VALS.INTEGER_IDX)
-WHEN MATCHED THEN UPDATE SET PART_KEY_VAL = NORM;
-
-DROP FUNCTION hive13_to_date;
+@015-HIVE-5700.oracle.sql;
 
 UPDATE VERSION SET SCHEMA_VERSION='0.13.0', VERSION_COMMENT='Hive release version 0.13.0' where VER_ID=1;
 SELECT 'Finished upgrading MetaStore schema from 0.12.0 to 0.13.0' AS Status from dual;

Modified: hive/branches/tez/metastore/scripts/upgrade/postgres/011-HIVE-3649.postgres.sql
URL: http://svn.apache.org/viewvc/hive/branches/tez/metastore/scripts/upgrade/postgres/011-HIVE-3649.postgres.sql?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/metastore/scripts/upgrade/postgres/011-HIVE-3649.postgres.sql (original)
+++ hive/branches/tez/metastore/scripts/upgrade/postgres/011-HIVE-3649.postgres.sql Fri Jan  3 18:37:34 2014
@@ -1,3 +1,5 @@
+SELECT '< HIVE-3649 Hive List Bucketing - enhance DDL to specify list bucketing table >';
+
 -- Add new not null column into SDS table in three steps
 
 -- Step 1: Add the column allowing null

Modified: hive/branches/tez/metastore/scripts/upgrade/postgres/014-HIVE-3764.postgres.sql
URL: http://svn.apache.org/viewvc/hive/branches/tez/metastore/scripts/upgrade/postgres/014-HIVE-3764.postgres.sql?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/metastore/scripts/upgrade/postgres/014-HIVE-3764.postgres.sql (original)
+++ hive/branches/tez/metastore/scripts/upgrade/postgres/014-HIVE-3764.postgres.sql Fri Jan  3 18:37:34 2014
@@ -1,3 +1,5 @@
+SELECT '< HIVE-3764 Support metastore version consistency check >';
+
 --
 -- Table structure for VERSION
 --

Modified: hive/branches/tez/metastore/scripts/upgrade/postgres/upgrade-0.12.0-to-0.13.0.postgres.sql
URL: http://svn.apache.org/viewvc/hive/branches/tez/metastore/scripts/upgrade/postgres/upgrade-0.12.0-to-0.13.0.postgres.sql?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/metastore/scripts/upgrade/postgres/upgrade-0.12.0-to-0.13.0.postgres.sql (original)
+++ hive/branches/tez/metastore/scripts/upgrade/postgres/upgrade-0.12.0-to-0.13.0.postgres.sql Fri Jan  3 18:37:34 2014
@@ -1,24 +1,6 @@
 SELECT 'Upgrading MetaStore schema from 0.12.0 to 0.13.0';
 
-CREATE FUNCTION hive13_to_date(date_str text) RETURNS DATE AS $$
-DECLARE dt DATE;
-BEGIN
-  dt := date_str::DATE;
-  RETURN dt;
-EXCEPTION
-  WHEN others THEN RETURN null;
-END;
-$$ LANGUAGE plpgsql;
-
-UPDATE "PARTITION_KEY_VALS"
-SET "PART_KEY_VAL" = COALESCE(TO_CHAR(hive13_to_date(src."PART_KEY_VAL"),'YYYY-MM-DD'), src."PART_KEY_VAL")
-FROM "PARTITION_KEY_VALS" src
-  INNER JOIN "PARTITIONS" ON src."PART_ID" = "PARTITIONS"."PART_ID"
-  INNER JOIN "PARTITION_KEYS" ON "PARTITION_KEYS"."TBL_ID" = "PARTITIONS"."TBL_ID"
-    AND "PARTITION_KEYS"."INTEGER_IDX" = src."INTEGER_IDX"
-    AND "PARTITION_KEYS"."PKEY_TYPE" = 'date';
-
-DROP FUNCTION hive13_to_date(date_str text);
+\i 015-HIVE-5700.postgres.sql;
 
 UPDATE "VERSION" SET "SCHEMA_VERSION"='0.13.0', "VERSION_COMMENT"='Hive release version 0.13.0' where "VER_ID"=1;
 SELECT 'Finished upgrading MetaStore schema from 0.12.0 to 0.13.0';

Modified: hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java (original)
+++ hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/DDLTask.java Fri Jan  3 18:37:34 2014
@@ -508,11 +508,8 @@ public class DDLTask extends Task<DDLWor
   }
 
   private int showGrants(ShowGrantDesc showGrantDesc) throws HiveException {
-    DataOutput outStream = null;
+    StringBuilder builder = new StringBuilder();
     try {
-      Path resFile = new Path(showGrantDesc.getResFile());
-      FileSystem fs = resFile.getFileSystem(conf);
-      outStream = fs.create(resFile);
       PrincipalDesc principalDesc = showGrantDesc.getPrincipalDesc();
       PrivilegeObjectDesc hiveObjectDesc = showGrantDesc.getHiveObj();
       String principalName = principalDesc.getName();
@@ -520,21 +517,8 @@ public class DDLTask extends Task<DDLWor
         List<HiveObjectPrivilege> users = db.showPrivilegeGrant(
             HiveObjectType.GLOBAL, principalName, principalDesc.getType(),
             null, null, null, null);
-        if (users != null && users.size() > 0) {
-          boolean first = true;
-          sortPrivileges(users);
-          for (HiveObjectPrivilege usr : users) {
-            if (!first) {
-              outStream.write(terminator);
-            } else {
-              first = false;
-            }
-
-            writeGrantInfo(outStream, principalDesc.getType(), principalName,
-                null, null, null, null, usr.getGrantInfo());
-
-          }
-        }
+        writeGrantInfo(builder, principalDesc.getType(), principalName,
+            null, null, null, null, users);
       } else {
         String obj = hiveObjectDesc.getObject();
         boolean notFound = true;
@@ -576,22 +560,8 @@ public class DDLTask extends Task<DDLWor
           // show database level privileges
           List<HiveObjectPrivilege> dbs = db.showPrivilegeGrant(HiveObjectType.DATABASE, principalName,
               principalDesc.getType(), dbName, null, null, null);
-          if (dbs != null && dbs.size() > 0) {
-            boolean first = true;
-            sortPrivileges(dbs);
-            for (HiveObjectPrivilege db : dbs) {
-              if (!first) {
-                outStream.write(terminator);
-              } else {
-                first = false;
-              }
-
-              writeGrantInfo(outStream, principalDesc.getType(), principalName,
-                  dbName, null, null, null, db.getGrantInfo());
-
-            }
-          }
-
+          writeGrantInfo(builder, principalDesc.getType(), principalName,
+              dbName, null, null, null, dbs);
         } else {
           if (showGrantDesc.getColumns() != null) {
             // show column level privileges
@@ -600,67 +570,28 @@ public class DDLTask extends Task<DDLWor
                   HiveObjectType.COLUMN, principalName,
                   principalDesc.getType(), dbName, tableName, partValues,
                   columnName);
-              if (columnss != null && columnss.size() > 0) {
-                boolean first = true;
-                sortPrivileges(columnss);
-                for (HiveObjectPrivilege col : columnss) {
-                  if (!first) {
-                    outStream.write(terminator);
-                  } else {
-                    first = false;
-                  }
-
-                  writeGrantInfo(outStream, principalDesc.getType(),
-                      principalName, dbName, tableName, partName, columnName,
-                      col.getGrantInfo());
-                }
-              }
+              writeGrantInfo(builder, principalDesc.getType(),
+                  principalName, dbName, tableName, partName, columnName,
+                  columnss);
             }
           } else if (hiveObjectDesc.getPartSpec() != null) {
             // show partition level privileges
             List<HiveObjectPrivilege> parts = db.showPrivilegeGrant(
                 HiveObjectType.PARTITION, principalName, principalDesc
                     .getType(), dbName, tableName, partValues, null);
-            if (parts != null && parts.size() > 0) {
-              boolean first = true;
-              sortPrivileges(parts);
-              for (HiveObjectPrivilege part : parts) {
-                if (!first) {
-                  outStream.write(terminator);
-                } else {
-                  first = false;
-                }
-
-                writeGrantInfo(outStream, principalDesc.getType(),
-                    principalName, dbName, tableName, partName, null, part.getGrantInfo());
-
-              }
-            }
+            writeGrantInfo(builder, principalDesc.getType(),
+                principalName, dbName, tableName, partName, null, parts);
           } else {
             // show table level privileges
             List<HiveObjectPrivilege> tbls = db.showPrivilegeGrant(
                 HiveObjectType.TABLE, principalName, principalDesc.getType(),
                 dbName, tableName, null, null);
-            if (tbls != null && tbls.size() > 0) {
-              boolean first = true;
-              sortPrivileges(tbls);
-              for (HiveObjectPrivilege tbl : tbls) {
-                if (!first) {
-                  outStream.write(terminator);
-                } else {
-                  first = false;
-                }
-
-                writeGrantInfo(outStream, principalDesc.getType(),
-                    principalName, dbName, tableName, null, null, tbl.getGrantInfo());
-
-              }
-            }
+            writeGrantInfo(builder, principalDesc.getType(),
+                principalName, dbName, tableName, null, null, tbls);
           }
         }
       }
-      ((FSDataOutputStream) outStream).close();
-      outStream = null;
+      writeToFile(builder.toString(), showGrantDesc.getResFile());
     } catch (FileNotFoundException e) {
       LOG.info("show table status: " + stringifyException(e));
       return 1;
@@ -670,8 +601,6 @@ public class DDLTask extends Task<DDLWor
     } catch (Exception e) {
       e.printStackTrace();
       throw new HiveException(e);
-    } finally {
-      IOUtils.closeStream((FSDataOutputStream) outStream);
     }
     return 0;
   }
@@ -840,7 +769,7 @@ public class DDLTask extends Task<DDLWor
           FileSystem fs = resFile.getFileSystem(conf);
           outStream = fs.create(resFile);
           for (Role role : roles) {
-            outStream.writeBytes("role name:" + role.getRoleName());
+            outStream.writeBytes(role.getRoleName());
             outStream.write(terminator);
           }
           ((FSDataOutputStream) outStream).close();
@@ -2813,43 +2742,36 @@ public class DDLTask extends Task<DDLWor
 
     // show table properties - populate the output stream
     Table tbl = db.getTable(tableName, false);
-    DataOutput outStream = null;
     try {
-      Path resFile = new Path(showTblPrpt.getResFile());
-      FileSystem fs = resFile.getFileSystem(conf);
-      outStream = fs.create(resFile);
-
       if (tbl == null) {
         String errMsg = "Table " + tableName + " does not exist";
-        outStream.write(errMsg.getBytes("UTF-8"));
-        ((FSDataOutputStream) outStream).close();
-        outStream = null;
+        writeToFile(errMsg, showTblPrpt.getResFile());
         return 0;
       }
 
       LOG.info("DDLTask: show properties for " + tbl.getTableName());
 
+      StringBuilder builder = new StringBuilder();
       String propertyName = showTblPrpt.getPropertyName();
       if (propertyName != null) {
         String propertyValue = tbl.getProperty(propertyName);
         if (propertyValue == null) {
           String errMsg = "Table " + tableName + " does not have property: " + propertyName;
-          outStream.write(errMsg.getBytes("UTF-8"));
+          builder.append(errMsg);
         }
         else {
-          outStream.writeBytes(propertyValue);
+          builder.append(propertyValue);
         }
       }
       else {
         Map<String, String> properties = tbl.getParameters();
         for (String key : properties.keySet()) {
-          writeKeyValuePair(outStream, key, properties.get(key));
+          writeKeyValuePair(builder, key, properties.get(key));
         }
       }
 
       LOG.info("DDLTask: written data for showing properties of " + tbl.getTableName());
-      ((FSDataOutputStream) outStream).close();
-      outStream = null;
+      writeToFile(builder.toString(), showTblPrpt.getResFile());
 
     } catch (FileNotFoundException e) {
       LOG.info("show table properties: " + stringifyException(e));
@@ -2859,12 +2781,27 @@ public class DDLTask extends Task<DDLWor
       return 1;
     } catch (Exception e) {
       throw new HiveException(e);
-    } finally {
-      IOUtils.closeStream((FSDataOutputStream) outStream);
     }
 
     return 0;
   }
+
+  private void writeToFile(String data, String file) throws IOException {
+    Path resFile = new Path(file);
+    FileSystem fs = resFile.getFileSystem(conf);
+    FSDataOutputStream out = fs.create(resFile);
+    try {
+      if (data != null && !data.isEmpty()) {
+        OutputStreamWriter writer = new OutputStreamWriter(out, "UTF-8");
+        writer.write(data);
+        writer.write((char) terminator);
+        writer.flush();
+      }
+    } finally {
+      IOUtils.closeStream(out);
+    }
+  }
+
   /**
    * Write the description of a table to a file.
    *
@@ -2965,45 +2902,54 @@ public class DDLTask extends Task<DDLWor
     }
   }
 
-  public static void writeGrantInfo(DataOutput outStream,
+  public static void writeGrantInfo(StringBuilder builder,
       PrincipalType principalType, String principalName, String dbName,
       String tableName, String partName, String columnName,
-      PrivilegeGrantInfo grantInfo) throws IOException {
+      List<HiveObjectPrivilege> privileges) throws IOException {
+    if (privileges == null || privileges.isEmpty()) {
+      return;
+    }
 
-    String privilege = grantInfo.getPrivilege();
-    long unixTimestamp = grantInfo.getCreateTime() * 1000L;
-    Date createTime = new Date(unixTimestamp);
-    String grantor = grantInfo.getGrantor();
+    sortPrivileges(privileges);
 
-    if (dbName != null) {
-      writeKeyValuePair(outStream, "database", dbName);
-    }
-    if (tableName != null) {
-      writeKeyValuePair(outStream, "table", tableName);
-    }
-    if (partName != null) {
-      writeKeyValuePair(outStream, "partition", partName);
-    }
-    if (columnName != null) {
-      writeKeyValuePair(outStream, "columnName", columnName);
-    }
+    for (HiveObjectPrivilege privilege : privileges) {
+      PrivilegeGrantInfo grantInfo = privilege.getGrantInfo();
+      String privName = grantInfo.getPrivilege();
+      long unixTimestamp = grantInfo.getCreateTime() * 1000L;
+      Date createTime = new Date(unixTimestamp);
+      String grantor = grantInfo.getGrantor();
 
-    writeKeyValuePair(outStream, "principalName", principalName);
-    writeKeyValuePair(outStream, "principalType", "" + principalType);
-    writeKeyValuePair(outStream, "privilege", privilege);
-    writeKeyValuePair(outStream, "grantTime", "" + createTime);
-    if (grantor != null) {
-      writeKeyValuePair(outStream, "grantor", grantor);
+      if (dbName != null) {
+        writeKeyValuePair(builder, "database", dbName);
+      }
+      if (tableName != null) {
+        writeKeyValuePair(builder, "table", tableName);
+      }
+      if (partName != null) {
+        writeKeyValuePair(builder, "partition", partName);
+      }
+      if (columnName != null) {
+        writeKeyValuePair(builder, "columnName", columnName);
+      }
+
+      writeKeyValuePair(builder, "principalName", principalName);
+      writeKeyValuePair(builder, "principalType", "" + principalType);
+      writeKeyValuePair(builder, "privilege", privName);
+      writeKeyValuePair(builder, "grantTime", "" + createTime);
+      if (grantor != null) {
+        writeKeyValuePair(builder, "grantor", grantor);
+      }
     }
   }
 
-  private static void writeKeyValuePair(DataOutput outStream, String key,
+  private static void writeKeyValuePair(StringBuilder builder, String key,
       String value) throws IOException {
-    outStream.write(terminator);
-    outStream.writeBytes(key);
-    outStream.write(separator);
-    outStream.writeBytes(value);
-    outStream.write(separator);
+    if (builder.length() > 0) {
+      builder.append((char)terminator);
+    }
+    builder.append(key);
+    builder.append((char)separator);
+    builder.append(value);
   }
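
The rewritten `writeKeyValuePair` above emits the terminator *between* pairs rather than before every pair, which is why the `.q.out` files later in this commit lose their leading blank line and trailing tabs. A minimal standalone sketch of that formatting, assuming newline and tab for the `terminator`/`separator` constants (the real values come from the enclosing class, not shown in this hunk):

```java
// Sketch of the StringBuilder-based key/value formatting introduced above.
// TERMINATOR and SEPARATOR are assumed values for illustration; the actual
// constants live in DDLTask and are not part of this hunk.
public class GrantOutputSketch {
    static final char TERMINATOR = '\n'; // assumed
    static final char SEPARATOR = '\t';  // assumed

    static void writeKeyValuePair(StringBuilder builder, String key, String value) {
        if (builder.length() > 0) {
            builder.append(TERMINATOR); // separator between pairs, none after the last
        }
        builder.append(key).append(SEPARATOR).append(value);
    }

    public static void main(String[] args) {
        StringBuilder b = new StringBuilder();
        writeKeyValuePair(b, "database", "default");
        writeKeyValuePair(b, "table", "authorization_fail_3");
        System.out.println(b);
    }
}
```

Note the old `DataOutput` version wrote a separator after the value as well, producing the trailing whitespace visible in the removed lines of the test output below.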
 
   private void setAlterProtectMode(boolean protectModeEnable,
@@ -3510,8 +3456,8 @@ public class DDLTask extends Task<DDLWor
    *
    * @param params
    *          Parameters.
-   * @param user
-   *          user that is doing the updating.
+   * @param conf
+   *          HiveConf of session
    */
   private boolean updateModifiedParameters(Map<String, String> params, HiveConf conf) throws HiveException {
     String user = null;

Modified: hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/FetchFormatter.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/FetchFormatter.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/FetchFormatter.java (original)
+++ hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/FetchFormatter.java Fri Jan  3 18:37:34 2014
@@ -42,8 +42,11 @@ public interface FetchFormatter<T> exten
 
   public static class ThriftFormatter implements FetchFormatter<Object> {
 
+    int protocol;
+
     @Override
     public void initialize(Configuration hconf, Properties props) throws Exception {
+      protocol = hconf.getInt(ListSinkOperator.OUTPUT_PROTOCOL, 0);
     }
 
     @Override
@@ -56,7 +59,7 @@ public interface FetchFormatter<T> exten
         StructField fieldRef = fields.get(i);
         Object field = structOI.getStructFieldData(row, fieldRef);
         converted[i] = field == null ? null :
-            SerDeUtils.toThriftPayload(field, fieldRef.getFieldObjectInspector());
+            SerDeUtils.toThriftPayload(field, fieldRef.getFieldObjectInspector(), protocol);
       }
       return converted;
     }

Modified: hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/ListSinkOperator.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/ListSinkOperator.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/ListSinkOperator.java (original)
+++ hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/ListSinkOperator.java Fri Jan  3 18:37:34 2014
@@ -36,6 +36,7 @@ import org.apache.hadoop.util.Reflection
 public class ListSinkOperator extends Operator<ListSinkDesc> {
 
   public static final String OUTPUT_FORMATTER = "output.formatter";
+  public static final String OUTPUT_PROTOCOL = "output.protocol";
 
   private transient List res;
   private transient FetchFormatter fetcher;

Modified: hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java (original)
+++ hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorizationContext.java Fri Jan  3 18:37:34 2014
@@ -293,7 +293,20 @@ public class VectorizationContext {
                    || arg0Type(expr).equals("float"))) {
         return true;
       }
-    } else if (gudf instanceof GenericUDFTimestamp && arg0Type(expr).equals("string")) {
+    } else if ((gudf instanceof GenericUDFTimestamp && arg0Type(expr).equals("string"))
+
+            /* GenericUDFCase and GenericUDFWhen are implemented with the UDF Adaptor because
+             * of their complexity and generality. In the future, variations of these
+             * can be optimized to run faster for the vectorized code path. For example,
+             * CASE col WHEN 1 then "one" WHEN 2 THEN "two" ELSE "other" END
+             * is an example of a GenericUDFCase that has all constant arguments
+             * except for the first argument. This is probably a common case and a
+             * good candidate for a fast, special-purpose VectorExpression. Then
+             * the UDF Adaptor code path could be used as a catch-all for
+             * non-optimized general cases.
+             */
+            || gudf instanceof GenericUDFCase
+            || gudf instanceof GenericUDFWhen) {
       return true;
     }
     return false;

Modified: hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java (original)
+++ hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java Fri Jan  3 18:37:34 2014
@@ -126,6 +126,7 @@ import org.apache.hadoop.hive.ql.udf.gen
 import org.apache.hadoop.hive.ql.udf.generic.GenericUDFAbs;
 import org.apache.hadoop.hive.ql.udf.generic.GenericUDFBetween;
 import org.apache.hadoop.hive.ql.udf.generic.GenericUDFBridge;
+import org.apache.hadoop.hive.ql.udf.generic.GenericUDFCase;
 import org.apache.hadoop.hive.ql.udf.generic.GenericUDFCeil;
 import org.apache.hadoop.hive.ql.udf.generic.GenericUDFConcat;
 import org.apache.hadoop.hive.ql.udf.generic.GenericUDFFloor;
@@ -156,6 +157,7 @@ import org.apache.hadoop.hive.ql.udf.gen
 import org.apache.hadoop.hive.ql.udf.generic.GenericUDFTimestamp;
 import org.apache.hadoop.hive.ql.udf.generic.GenericUDFToUnixTimeStamp;
 import org.apache.hadoop.hive.ql.udf.generic.GenericUDFUpper;
+import org.apache.hadoop.hive.ql.udf.generic.GenericUDFWhen;
 
 public class Vectorizer implements PhysicalPlanResolver {
 
@@ -253,6 +255,8 @@ public class Vectorizer implements Physi
     supportedGenericUDFs.add(GenericUDFAbs.class);
     supportedGenericUDFs.add(GenericUDFBetween.class);
     supportedGenericUDFs.add(GenericUDFIn.class);
+    supportedGenericUDFs.add(GenericUDFCase.class);
+    supportedGenericUDFs.add(GenericUDFWhen.class);
 
     // For type casts
     supportedGenericUDFs.add(UDFToLong.class);

Modified: hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java (original)
+++ hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java Fri Jan  3 18:37:34 2014
@@ -130,9 +130,6 @@ import org.apache.hadoop.hive.ql.securit
 import org.apache.hadoop.hive.ql.session.SessionState;
 import org.apache.hadoop.hive.serde.serdeConstants;
 import org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe;
-import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector.PrimitiveCategory;
-import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils;
-import org.apache.hadoop.hive.serde2.typeinfo.BaseCharTypeInfo;
 import org.apache.hadoop.hive.serde2.typeinfo.CharTypeInfo;
 import org.apache.hadoop.hive.serde2.typeinfo.DecimalTypeInfo;
 import org.apache.hadoop.hive.serde2.typeinfo.VarcharTypeInfo;
@@ -523,6 +520,7 @@ public class DDLSemanticAnalyzer extends
         principalDesc, privHiveObj, cols);
     rootTasks.add(TaskFactory.get(new DDLWork(getInputs(), getOutputs(),
         showGrant), conf));
+    setFetchTask(createFetchTask(ShowGrantDesc.getSchema()));
   }
 
   private void analyzeGrant(ASTNode ast) throws SemanticException {
@@ -683,6 +681,7 @@ public class DDLSemanticAnalyzer extends
     createRoleDesc.setResFile(ctx.getResFile().toString());
     rootTasks.add(TaskFactory.get(new DDLWork(getInputs(), getOutputs(),
         createRoleDesc), conf));
+    setFetchTask(createFetchTask(RoleDDLDesc.getSchema()));
   }
 
   private void analyzeAlterDatabase(ASTNode ast) throws SemanticException {
@@ -1932,10 +1931,8 @@ public class DDLSemanticAnalyzer extends
   }
 
   /**
-   * Create a FetchTask for a given table and thrift ddl schema.
+   * Create a FetchTask for a given thrift ddl schema.
    *
-   * @param tablename
-   *          tablename
    * @param schema
    *          thrift ddl
    */

Modified: hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/RoleDDLDesc.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/RoleDDLDesc.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/RoleDDLDesc.java (original)
+++ hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/RoleDDLDesc.java Fri Jan  3 18:37:34 2014
@@ -39,6 +39,15 @@ public class RoleDDLDesc extends DDLDesc
   
   private String roleOwnerName;
 
+  /**
+   * thrift ddl for the result of show role.
+   */
+  private static String schema = "role#string";
+
+  public static String getSchema() {
+    return schema;
+  }
+
   public static enum RoleOperation {
     DROP_ROLE("drop_role"), CREATE_ROLE("create_role"), SHOW_ROLE_GRANT("show_roles");
     private String operationName;

Modified: hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/ShowGrantDesc.java
URL: http://svn.apache.org/viewvc/hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/ShowGrantDesc.java?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/ShowGrantDesc.java (original)
+++ hive/branches/tez/ql/src/java/org/apache/hadoop/hive/ql/plan/ShowGrantDesc.java Fri Jan  3 18:37:34 2014
@@ -30,6 +30,11 @@ public class ShowGrantDesc {
 
   private String resFile;
 
+  /**
+   * thrift ddl for the result of show grant.
+   */
+  private static final String schema = "property,value#string:string";
+
   public ShowGrantDesc(){
   }
   
@@ -40,7 +45,11 @@ public class ShowGrantDesc {
     this.hiveObj = subjectObj;
     this.columns = columns;
   }
-  
+
+  public static String getSchema() {
+    return schema;
+  }
+
   @Explain(displayName="principal desc")
   public PrincipalDesc getPrincipalDesc() {
     return principalDesc;

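The schema strings added in `RoleDDLDesc` and `ShowGrantDesc` (`"role#string"`, `"property,value#string:string"`) use Hive's compact "columns#types" thrift ddl convention that `createFetchTask` consumes. A hypothetical parser, purely to illustrate how the string decomposes (Hive's own FetchTask plumbing does this internally, not with these method names):

```java
// Illustrative decomposition of a "col1,col2#type1:type2" schema string.
// The method names and split logic here are assumptions for clarity, not
// Hive API; only the string format itself comes from the patch above.
public class SchemaStringSketch {
    public static String[] columns(String schema) {
        // part before '#' is the comma-separated column list
        return schema.substring(0, schema.indexOf('#')).split(",");
    }

    public static String[] types(String schema) {
        // part after '#' is the colon-separated type list
        return schema.substring(schema.indexOf('#') + 1).split(":");
    }

    public static void main(String[] args) {
        String schema = "property,value#string:string";
        System.out.println(java.util.Arrays.toString(columns(schema)));
        System.out.println(java.util.Arrays.toString(types(schema)));
    }
}
```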
Modified: hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_3.q.out
URL: http://svn.apache.org/viewvc/hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_3.q.out?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_3.q.out (original)
+++ hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_3.q.out Fri Jan  3 18:37:34 2014
@@ -24,14 +24,13 @@ PREHOOK: query: show grant user hive_tes
 PREHOOK: type: SHOW_GRANT
 POSTHOOK: query: show grant user hive_test_user on table authorization_fail_3
 POSTHOOK: type: SHOW_GRANT
-
-database	default	
-table	authorization_fail_3	
-principalName	hive_test_user	
-principalType	USER	
-privilege	Create	
+database	default
+table	authorization_fail_3
+principalName	hive_test_user
+principalType	USER
+privilege	Create
 #### A masked pattern was here ####
-grantor	hive_test_user	
+grantor	hive_test_user
 PREHOOK: query: show grant user hive_test_user on table authorization_fail_3 partition (ds='2010')
 PREHOOK: type: SHOW_GRANT
 POSTHOOK: query: show grant user hive_test_user on table authorization_fail_3 partition (ds='2010')

Modified: hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_4.q.out
URL: http://svn.apache.org/viewvc/hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_4.q.out?rev=1555193&r1=1555192&r2=1555193&view=diff
==============================================================================
--- hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_4.q.out (original)
+++ hive/branches/tez/ql/src/test/results/clientnegative/authorization_fail_4.q.out Fri Jan  3 18:37:34 2014
@@ -38,42 +38,38 @@ PREHOOK: query: show grant user hive_tes
 PREHOOK: type: SHOW_GRANT
 POSTHOOK: query: show grant user hive_test_user on table authorization_fail_4
 POSTHOOK: type: SHOW_GRANT
-
-database	default	
-table	authorization_fail_4	
-principalName	hive_test_user	
-principalType	USER	
-privilege	Alter	
+database	default
+table	authorization_fail_4
+principalName	hive_test_user
+principalType	USER
+privilege	Alter
 #### A masked pattern was here ####
-grantor	hive_test_user	
-
-database	default	
-table	authorization_fail_4	
-principalName	hive_test_user	
-principalType	USER	
-privilege	Create	
+grantor	hive_test_user
+database	default
+table	authorization_fail_4
+principalName	hive_test_user
+principalType	USER
+privilege	Create
 #### A masked pattern was here ####
-grantor	hive_test_user	
+grantor	hive_test_user
 PREHOOK: query: show grant user hive_test_user on table authorization_fail_4 partition (ds='2010')
 PREHOOK: type: SHOW_GRANT
 POSTHOOK: query: show grant user hive_test_user on table authorization_fail_4 partition (ds='2010')
 POSTHOOK: type: SHOW_GRANT
-
-database	default	
-table	authorization_fail_4	
-partition	ds=2010	
-principalName	hive_test_user	
-principalType	USER	
-privilege	Alter	
+database	default
+table	authorization_fail_4
+partition	ds=2010
+principalName	hive_test_user
+principalType	USER
+privilege	Alter
 #### A masked pattern was here ####
-grantor	hive_test_user	
-
-database	default	
-table	authorization_fail_4	
-partition	ds=2010	
-principalName	hive_test_user	
-principalType	USER	
-privilege	Create	
+grantor	hive_test_user
+database	default
+table	authorization_fail_4
+partition	ds=2010
+principalName	hive_test_user
+principalType	USER
+privilege	Create
 #### A masked pattern was here ####
-grantor	hive_test_user	
+grantor	hive_test_user
 Authorization failed:No privilege 'Select' found for inputs { database:default, table:authorization_fail_4, partitionName:ds=2010, columnName:key}. Use SHOW GRANT to get more details.


