hive-dev mailing list archives

From "Hive QA (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-4518) Counter Strike: Operation Operator
Date Wed, 13 Nov 2013 19:23:22 GMT

    [ https://issues.apache.org/jira/browse/HIVE-4518?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13821708#comment-13821708 ] 

Hive QA commented on HIVE-4518:
-------------------------------



{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12613468/HIVE-4518.8.patch

Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/263/testReport
Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/263/console
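
For reference, here is a minimal sketch of how one might approximate the source-prep sequence locally, assembled only from the commands visible in the log below; the working-directory path and patch file name are placeholders, not part of the actual ptest harness.

{noformat}
# Hypothetical local approximation of the ptest source-prep step (see log below).
# WORKDIR and PATCH are assumptions; the real harness uses /data/hive-ptest/working.
WORKDIR=$HOME/hive-ptest
PATCH=$WORKDIR/scratch/HIVE-4518.8.patch

cd "$WORKDIR/apache-svn-trunk-source"
svn revert -R .                                    # discard edits left by earlier runs
svn status --no-ignore \
  | egrep -v '^X|^Performing status on external' \
  | awk '{print $2}' | xargs -r rm -rf             # delete untracked build output
svn update                                         # sync the trunk checkout
patch -p0 < "$PATCH"                               # apply the attached patch
mvn -B clean install -DskipTests \
  -Dmaven.repo.local="$WORKDIR/maven"              # build without running tests
{noformat}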

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]]
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-Build-263/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ svn = \s\v\n ]]
+ [[ -n '' ]]
+ [[ -d apache-svn-trunk-source ]]
+ [[ ! -d apache-svn-trunk-source/.svn ]]
+ [[ ! -d apache-svn-trunk-source ]]
+ cd apache-svn-trunk-source
+ svn revert -R .
Reverted 'common/src/test/org/apache/hadoop/hive/common/type/TestHiveDecimal.java'
Reverted 'serde/src/java/org/apache/hadoop/hive/serde2/typeinfo/TypeInfoUtils.java'
Reverted 'serde/src/java/org/apache/hadoop/hive/serde2/typeinfo/HiveDecimalUtils.java'
Reverted 'serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/PrimitiveObjectInspector.java'
Reverted 'serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/primitive/AbstractPrimitiveObjectInspector.java'
Reverted 'serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/primitive/WritableConstantShortObjectInspector.java'
Reverted 'serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/primitive/WritableConstantLongObjectInspector.java'
Reverted 'serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/primitive/WritableConstantByteObjectInspector.java'
Reverted 'serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/primitive/WritableConstantHiveDecimalObjectInspector.java'
Reverted 'serde/src/java/org/apache/hadoop/hive/serde2/objectinspector/primitive/WritableConstantIntObjectInspector.java'
Reverted 'ql/src/test/results/clientnegative/udf_assert_true2.q.out'
Reverted 'ql/src/test/results/clientnegative/invalid_arithmetic_type.q.out'
Reverted 'ql/src/test/results/clientpositive/auto_join13.q.out'
Reverted 'ql/src/test/results/clientpositive/decimal_udf.q.out'
Reverted 'ql/src/test/results/clientpositive/rcfile_createas1.q.out'
Reverted 'ql/src/test/results/clientpositive/udf_pmod.q.out'
Reverted 'ql/src/test/results/clientpositive/orc_createas1.q.out'
Reverted 'ql/src/test/results/clientpositive/rcfile_merge2.q.out'
Reverted 'ql/src/test/results/clientpositive/ql_rewrite_gbtoidx.q.out'
Reverted 'ql/src/test/results/clientpositive/input8.q.out'
Reverted 'ql/src/test/results/clientpositive/vectorization_short_regress.q.out'
Reverted 'ql/src/test/results/clientpositive/decimal_6.q.out'
Reverted 'ql/src/test/results/clientpositive/rcfile_merge1.q.out'
Reverted 'ql/src/test/results/clientpositive/bucketmapjoin_negative3.q.out'
Reverted 'ql/src/test/results/clientpositive/windowing_expressions.q.out'
Reverted 'ql/src/test/results/clientpositive/vectorization_5.q.out'
Reverted 'ql/src/test/results/clientpositive/num_op_type_conv.q.out'
Reverted 'ql/src/test/results/clientpositive/skewjoin.q.out'
Reverted 'ql/src/test/results/clientpositive/vectorization_15.q.out'
Reverted 'ql/src/test/results/clientpositive/ppd_constant_expr.q.out'
Reverted 'ql/src/test/results/clientpositive/vectorized_math_funcs.q.out'
Reverted 'ql/src/test/results/clientpositive/auto_join2.q.out'
Reverted 'ql/src/test/results/compiler/plan/join2.q.xml'
Reverted 'ql/src/test/results/compiler/plan/input8.q.xml'
Reverted 'ql/src/test/results/compiler/plan/udf4.q.xml'
Reverted 'ql/src/test/results/compiler/plan/input20.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample1.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample2.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample3.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample4.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample5.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample6.q.xml'
Reverted 'ql/src/test/results/compiler/plan/sample7.q.xml'
Reverted 'ql/src/test/results/compiler/plan/cast1.q.xml'
Reverted 'ql/src/test/org/apache/hadoop/hive/ql/exec/vector/TestVectorizationContext.java'
Reverted 'ql/src/test/org/apache/hadoop/hive/ql/exec/vector/TestVectorSelectOperator.java'
Reverted 'ql/src/test/org/apache/hadoop/hive/ql/udf/TestUDFPosMod.java'
Reverted 'ql/src/test/org/apache/hadoop/hive/ql/udf/TestUDFOPDivide.java'
Reverted 'ql/src/test/org/apache/hadoop/hive/ql/udf/TestUDFOPMod.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/parse/TypeCheckProcFactory.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPMultiply.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/udf/UDFBaseNumericOp.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/udf/UDFPosMod.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPDivide.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPMod.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPPlus.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/udf/UDFOPMinus.java'
++ egrep -v '^X|^Performing status on external'
++ awk '{print $2}'
++ svn status --no-ignore
+ rm -rf target datanucleus.log ant/target shims/target shims/0.20/target shims/assembly/target shims/0.20S/target shims/0.23/target shims/common/target shims/common-secure/target packaging/target hbase-handler/target testutils/target jdbc/target metastore/target itests/target itests/hcatalog-unit/target itests/test-serde/target itests/qtest/target itests/hive-unit/target itests/custom-serde/target itests/util/target hcatalog/target hcatalog/storage-handlers/hbase/target hcatalog/server-extensions/target hcatalog/core/target hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target hcatalog/hcatalog-pig-adapter/target hwi/target common/target common/src/gen service/target contrib/target serde/target beeline/target odbc/target cli/target ql/dependency-reduced-pom.xml ql/target ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFOPDivide.java ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFOPMinus.java ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFOPMultiply.java ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFOPMod.java ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFPosMod.java ql/src/test/org/apache/hadoop/hive/ql/udf/generic/TestGenericUDFOPPlus.java ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPPlus.java ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFBaseNumeric.java ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPMod.java ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPMinus.java ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFPosMod.java ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPDivide.java ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFOPMultiply.java
+ svn update

Fetching external item into 'hcatalog/src/test/e2e/harness'
External at revision 1541663.

At revision 1541663.
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
Going to apply patch with: patch -p0
patching file common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
patching file conf/hive-default.xml.template
patching file data/conf/hive-site.xml
patching file itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java
patching file ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java
patching file ql/src/java/org/apache/hadoop/hive/ql/QueryPlan.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/AbstractMapJoinOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/CommonJoinOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/DemuxOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/FetchOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/GroupByOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/MapJoinOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/MuxOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/OperatorFactory.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/SMBMapJoinOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecReducer.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/HadoopJobExecHelper.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/HadoopJobExecHook.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/MapredLocalTask.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorFileSinkOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/merge/BlockMergeTask.java
patching file ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/stats/PartialScanTask.java
patching file ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/truncate/ColumnTruncateTask.java
patching file ql/src/java/org/apache/hadoop/hive/ql/parse/MapReduceCompiler.java
patching file ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
Hunk #1 succeeded at 5229 (offset 63 lines).
patching file ql/src/test/org/apache/hadoop/hive/ql/exec/TestOperators.java
patching file ql/src/test/org/apache/hadoop/hive/ql/exec/vector/TestVectorGroupByOperator.java
patching file ql/src/test/org/apache/hadoop/hive/ql/testutil/OperatorTestUtils.java
patching file ql/src/test/queries/clientpositive/insert_into3.q
patching file ql/src/test/queries/clientpositive/optrstat_groupby.q
patching file ql/src/test/results/clientpositive/insert_into3.q.out
patching file ql/src/test/results/clientpositive/optrstat_groupby.q.out
patching file ql/src/test/results/compiler/plan/case_sensitivity.q.xml
patching file ql/src/test/results/compiler/plan/cast1.q.xml
patching file ql/src/test/results/compiler/plan/groupby1.q.xml
patching file ql/src/test/results/compiler/plan/groupby2.q.xml
patching file ql/src/test/results/compiler/plan/groupby3.q.xml
patching file ql/src/test/results/compiler/plan/groupby4.q.xml
patching file ql/src/test/results/compiler/plan/groupby5.q.xml
patching file ql/src/test/results/compiler/plan/groupby6.q.xml
patching file ql/src/test/results/compiler/plan/input1.q.xml
patching file ql/src/test/results/compiler/plan/input2.q.xml
patching file ql/src/test/results/compiler/plan/input20.q.xml
patching file ql/src/test/results/compiler/plan/input3.q.xml
patching file ql/src/test/results/compiler/plan/input4.q.xml
patching file ql/src/test/results/compiler/plan/input5.q.xml
patching file ql/src/test/results/compiler/plan/input6.q.xml
patching file ql/src/test/results/compiler/plan/input7.q.xml
patching file ql/src/test/results/compiler/plan/input8.q.xml
patching file ql/src/test/results/compiler/plan/input9.q.xml
patching file ql/src/test/results/compiler/plan/input_part1.q.xml
patching file ql/src/test/results/compiler/plan/input_testsequencefile.q.xml
patching file ql/src/test/results/compiler/plan/input_testxpath.q.xml
patching file ql/src/test/results/compiler/plan/input_testxpath2.q.xml
patching file ql/src/test/results/compiler/plan/join1.q.xml
patching file ql/src/test/results/compiler/plan/join2.q.xml
patching file ql/src/test/results/compiler/plan/join3.q.xml
patching file ql/src/test/results/compiler/plan/join4.q.xml
patching file ql/src/test/results/compiler/plan/join5.q.xml
patching file ql/src/test/results/compiler/plan/join6.q.xml
patching file ql/src/test/results/compiler/plan/join7.q.xml
patching file ql/src/test/results/compiler/plan/join8.q.xml
patching file ql/src/test/results/compiler/plan/sample1.q.xml
patching file ql/src/test/results/compiler/plan/sample2.q.xml
patching file ql/src/test/results/compiler/plan/sample3.q.xml
patching file ql/src/test/results/compiler/plan/sample4.q.xml
patching file ql/src/test/results/compiler/plan/sample5.q.xml
patching file ql/src/test/results/compiler/plan/sample6.q.xml
patching file ql/src/test/results/compiler/plan/sample7.q.xml
patching file ql/src/test/results/compiler/plan/subq.q.xml
patching file ql/src/test/results/compiler/plan/udf1.q.xml
patching file ql/src/test/results/compiler/plan/udf4.q.xml
patching file ql/src/test/results/compiler/plan/udf6.q.xml
patching file ql/src/test/results/compiler/plan/udf_case.q.xml
patching file ql/src/test/results/compiler/plan/udf_when.q.xml
patching file ql/src/test/results/compiler/plan/union.q.xml
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hive-ptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] Hive
[INFO] Hive Ant Utilities
[INFO] Hive Shims Common
[INFO] Hive Shims 0.20
[INFO] Hive Shims Secure Common
[INFO] Hive Shims 0.20S
[INFO] Hive Shims 0.23
[INFO] Hive Shims
[INFO] Hive Common
[INFO] Hive Serde
[INFO] Hive Metastore
[INFO] Hive Query Language
[INFO] Hive Service
[INFO] Hive JDBC
[INFO] Hive Beeline
[INFO] Hive CLI
[INFO] Hive Contrib
[INFO] Hive HBase Handler
[INFO] Hive HCatalog
[INFO] Hive HCatalog Core
[INFO] Hive HCatalog Pig Adapter
[INFO] Hive HCatalog Server Extensions
[INFO] Hive HCatalog Webhcat Java Client
[INFO] Hive HCatalog Webhcat
[INFO] Hive HCatalog HBase Storage Handler
[INFO] Hive HWI
[INFO] Hive ODBC
[INFO] Hive Shims Aggregator
[INFO] Hive TestUtils
[INFO] Hive Packaging
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant ---
[INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common ---
[INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20 (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 ---
[INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20 ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common-secure ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure ---
[INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secure ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common-secure ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S ---
[INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 ---
[INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims ---
[INFO] Reading assembly descriptor: src/assemble/uberjar.xml
[WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing.
Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact.
NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
[WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar
with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
[INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
[INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Serde 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
[INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
[INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Metastore 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
[INFO] 
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
ANTLR Parser Generator  Version 3.4
org/apache/hadoop/hive/metastore/parser/Filter.g
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
[INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
[INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
DataNucleus Enhancer : Classpath
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar
>>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
>>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
>>  /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
>>  /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
>>  /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
>>  /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
>>  /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
>>  /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
>>  /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
>>  /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
>>  /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.1/avro-1.7.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar
>>  /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
>>  /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.4.1/snappy-java-1.0.4.1.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/maven/org/apache/zookeeper/zookeeper/3.4.3/zookeeper-3.4.3.jar
>>  /data/hive-ptest/working/maven/jline/jline/0.9.94/jline-0.9.94.jar
>>  /data/hive-ptest/working/maven/org/jboss/netty/netty/3.2.2.Final/netty-3.2.2.Final.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
>>  /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
>>  /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
>>  /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
>>  /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
>>  /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar
>>  /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
>>  /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
>>  /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
>>  /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
>>  /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
>>  /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
>>  /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
>>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.1.3/httpclient-4.1.3.jar
>>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.1.3/httpcore-4.1.3.jar
>>  /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
>>  /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.8/jersey-core-1.8.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.8/jersey-json-1.8.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
>>  /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
>>  /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
>>  /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
>>  /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
>>  /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.7.1/jackson-jaxrs-1.7.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.7.1/jackson-xc-1.7.1.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.8/jersey-server-1.8.jar
>>  /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
>>  /data/hive-ptest/working/maven/commons-io/commons-io/2.1/commons-io-2.1.jar
>>  /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
>>  /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
>>  /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
>>  /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
>>  /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
>>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
>>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
>>  /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
>>  /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
>>  /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
>>  /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
>>  /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
>>  /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
>>  /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
>>  /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
>>  /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar
>>  /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.6.1/slf4j-api-1.6.1.jar
>>  /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar
>>  /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
DataNucleus Enhancer completed with success for 25 classes. Timings : input=577 ms, enhance=922 ms, total=1499 ms. Consult the log for full details

[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
[INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Query Language 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-exec ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ql (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
Generating vector expression code
Generating vector expression test code
[INFO] Executed tasks
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added.
[INFO] 
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java
ANTLR Parser Generator  Version 3.4
org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:872:5: 
Decision can match input such as "Identifier KW_RENAME KW_TO" using multiple alternatives: 1, 10

As a result, alternative(s) 10 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
Decision can match input such as "KW_TEXTFILE" using multiple alternatives: 2, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
Decision can match input such as "KW_SEQUENCEFILE" using multiple alternatives: 1, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
Decision can match input such as "KW_ORCFILE" using multiple alternatives: 4, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
Decision can match input such as "KW_RCFILE" using multiple alternatives: 3, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_PRETTY {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 3, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_FORMATTED {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_PRETTY Identifier" using multiple alternatives: 3, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_FORMATTED Identifier" using multiple alternatives: 1, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_PRETTY KW_PARTITION" using multiple alternatives: 3, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_FORMATTED KW_PARTITION" using multiple alternatives: 1, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1486:116: 
Decision can match input such as "KW_STORED KW_AS KW_DIRECTORIES" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
Decision can match input such as "KW_STORED KW_AS KW_RCFILE" using multiple alternatives: 3, 7

As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
Decision can match input such as "KW_STORED KW_AS KW_SEQUENCEFILE" using multiple alternatives: 1, 7

As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
Decision can match input such as "KW_STORED KW_AS KW_ORCFILE" using multiple alternatives: 4, 7

As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
Decision can match input such as "KW_STORED KW_AS KW_TEXTFILE" using multiple alternatives: 2, 7

As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
Decision can match input such as "KW_STORED KW_AS KW_INPUTFORMAT" using multiple alternatives: 5, 7

As a result, alternative(s) 7 were disabled for that input
warning(200): SelectClauseParser.g:149:5: 
Decision can match input such as "KW_NULL DOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): SelectClauseParser.g:149:5: 
Decision can match input such as "KW_NULL DOT Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:127:2: 
Decision can match input such as "KW_LATERAL KW_VIEW KW_OUTER" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:25: 
Decision can match input such as "LPAREN StringLiteral EQUAL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:25: 
Decision can match input such as "LPAREN StringLiteral COMMA" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:25: 
Decision can match input such as "LPAREN StringLiteral RPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT CharSetName" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE CharSetName" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN CharSetName" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT Number" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_UNIONTYPE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL {KW_LIKE, KW_REGEXP, KW_RLIKE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_DATE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL AMPERSAND" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN SmallintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN TinyintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN BigintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_FALSE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_STRUCT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_TRUE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_ARRAY" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN Number" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_IS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_MAP" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_MAP" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_MAP" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN DecimalLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL RPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN StringLiteral StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_IF" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_TRUE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_IF" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_IF" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL BITWISEOR" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL DOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_EXISTS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_FALSE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL {DIV..DIVIDE, MOD, STAR}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL NOTEQUAL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_EXISTS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL EQUAL_NS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_EXISTS LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL LESSTHAN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL LESSTHANOREQUALTO" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_ARRAY" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_NULL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_OR" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_DATE StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_DATE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_STRUCT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL EQUAL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CAST LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_STRUCT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_UNIONTYPE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL GREATERTHAN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_ARRAY" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_DATE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL GREATERTHANOREQUALTO" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_NULL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_EXISTS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE TinyintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_BETWEEN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE SmallintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL {MINUS, PLUS}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE BigintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL LSQUARE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_FALSE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_NULL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_TRUE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE Number" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL BITWISEXOR" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT SmallintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN CharSetName CharSetLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_CAST" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_CAST" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_CAST" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT BigintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_AND" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_CASE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_IN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_CASE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_CASE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT DecimalLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT TinyintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_WHEN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE DecimalLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:108:5: 
Decision can match input such as "KW_ORDER KW_BY LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:121:5: 
Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:133:5: 
Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:144:5: 
Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:155:5: 
Decision can match input such as "KW_SORT KW_BY LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:172:7: 
Decision can match input such as "STAR" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:185:5: 
Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): IdentifiersParser.g:185:5: 
Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): IdentifiersParser.g:185:5: 
Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): IdentifiersParser.g:267:5: 
Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8

As a result, alternative(s) 8 were disabled for that input
warning(200): IdentifiersParser.g:267:5: 
Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3

As a result, alternative(s) 3 were disabled for that input
warning(200): IdentifiersParser.g:267:5: 
Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8

As a result, alternative(s) 8 were disabled for that input
warning(200): IdentifiersParser.g:267:5: 
Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8

As a result, alternative(s) 8 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:524:5: 
Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3

As a result, alternative(s) 3 were disabled for that input
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
[INFO] Compiling 1387 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING : 
[INFO] -------------------------------------------------------------
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 4 warnings 
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java:[41,42] cannot find symbol
symbol  : class HiveFatalException
location: package org.apache.hadoop.hive.ql.metadata
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java:[770,21] cannot find symbol
symbol  : class HiveFatalException
location: class org.apache.hadoop.hive.ql.exec.FileSinkOperator
[INFO] 2 errors 
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Hive .............................................. SUCCESS [3.505s]
[INFO] Hive Ant Utilities ................................ SUCCESS [6.215s]
[INFO] Hive Shims Common ................................. SUCCESS [2.636s]
[INFO] Hive Shims 0.20 ................................... SUCCESS [1.883s]
[INFO] Hive Shims Secure Common .......................... SUCCESS [2.979s]
[INFO] Hive Shims 0.20S .................................. SUCCESS [1.333s]
[INFO] Hive Shims 0.23 ................................... SUCCESS [3.126s]
[INFO] Hive Shims ........................................ SUCCESS [3.527s]
[INFO] Hive Common ....................................... SUCCESS [5.317s]
[INFO] Hive Serde ........................................ SUCCESS [12.190s]
[INFO] Hive Metastore .................................... SUCCESS [24.201s]
[INFO] Hive Query Language ............................... FAILURE [29.831s]
[INFO] Hive Service ...................................... SKIPPED
[INFO] Hive JDBC ......................................... SKIPPED
[INFO] Hive Beeline ...................................... SKIPPED
[INFO] Hive CLI .......................................... SKIPPED
[INFO] Hive Contrib ...................................... SKIPPED
[INFO] Hive HBase Handler ................................ SKIPPED
[INFO] Hive HCatalog ..................................... SKIPPED
[INFO] Hive HCatalog Core ................................ SKIPPED
[INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
[INFO] Hive HCatalog Server Extensions ................... SKIPPED
[INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
[INFO] Hive HCatalog Webhcat ............................. SKIPPED
[INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
[INFO] Hive HWI .......................................... SKIPPED
[INFO] Hive ODBC ......................................... SKIPPED
[INFO] Hive Shims Aggregator ............................. SKIPPED
[INFO] Hive TestUtils .................................... SKIPPED
[INFO] Hive Packaging .................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:39.289s
[INFO] Finished at: Wed Nov 13 14:21:23 EST 2013
[INFO] Final Memory: 40M/436M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure: Compilation failure:
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java:[41,42] cannot find symbol
[ERROR] symbol  : class HiveFatalException
[ERROR] location: package org.apache.hadoop.hive.ql.metadata
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java:[770,21] cannot find symbol
[ERROR] symbol  : class HiveFatalException
[ERROR] location: class org.apache.hadoop.hive.ql.exec.FileSinkOperator
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-exec
+ exit 1
'
{noformat}

This message is automatically generated.

ATTACHMENT ID: 12613468

> Counter Strike: Operation Operator
> ----------------------------------
>
>                 Key: HIVE-4518
>                 URL: https://issues.apache.org/jira/browse/HIVE-4518
>             Project: Hive
>          Issue Type: Improvement
>            Reporter: Gunther Hagleitner
>            Assignee: Gunther Hagleitner
>         Attachments: HIVE-4518.1.patch, HIVE-4518.2.patch, HIVE-4518.3.patch, HIVE-4518.4.patch, HIVE-4518.5.patch, HIVE-4518.6.patch.txt, HIVE-4518.7.patch, HIVE-4518.8.patch
>
>
> Queries of the form:
> from foo
> insert overwrite table bar partition (p) select ...
> insert overwrite table bar partition (p) select ...
> insert overwrite table bar partition (p) select ...
> Generate a huge number of counters. The reason is that task.progress is turned on for dynamic partitioning queries.
> Not only do the counters make queries slower than necessary (by up to 50%), you will also eventually run out of them. That's because we're wrapping them in enum values to comply with Hadoop 0.17.
> The real reason we turn task.progress on is that we need CREATED_FILES and FATAL counters to ensure dynamic partitioning queries don't go haywire.
> The counters have counter-intuitive names like C1 through C1000 and don't seem particularly useful on their own.
> With Hadoop 0.20+ you no longer need to wrap the counters; each operator can simply create and increment counters directly. That should simplify the code a lot.
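
For illustration, a minimal sketch of the two counter styles mentioned above, assuming the old-API {{org.apache.hadoop.mapred.Reporter}} is available to the operator; the group name is hypothetical, only CREATED_FILES comes from the description:

{code:java}
import org.apache.hadoop.mapred.Reporter;

public class CounterStyleSketch {

  // Hadoop 0.17 style: every counter has to be a pre-declared enum constant,
  // which is why Hive ended up with opaque wrapper names like C1 .. C1000.
  enum WrappedCounters { C1, C2 }

  static void incrementWrapped(Reporter reporter) {
    reporter.incrCounter(WrappedCounters.C1, 1L);
  }

  // Hadoop 0.20+ style: counters are created on demand with readable
  // group/counter names, so an operator can simply increment what it needs
  // (e.g. a CREATED_FILES counter) without any enum indirection.
  static void incrementNamed(Reporter reporter) {
    // "HiveOperatorCounters" is an illustrative group name, not Hive's actual one.
    reporter.incrCounter("HiveOperatorCounters", "CREATED_FILES", 1L);
  }
}
{code}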



--
This message was sent by Atlassian JIRA
(v6.1#6144)
