hive-dev mailing list archives

From "Hive QA (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (HIVE-5581) Implement vectorized year/month/day... etc. for string arguments
Date Wed, 20 Nov 2013 22:15:36 GMT

    [ https://issues.apache.org/jira/browse/HIVE-5581?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13828201#comment-13828201 ] 

Hive QA commented on HIVE-5581:
-------------------------------



{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12614861/HIVE-5581.5.patch

Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/377/testReport
Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/377/console

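For reference, a minimal sketch of reproducing the harness's patch-application step locally. The SVN trunk URL, checkout directory name, and /tmp download path are assumptions for illustration; the `patch -p0` invocation and the Maven goal mirror what the prep log below records.

{noformat}
# Sketch only: check out trunk, fetch the attachment under test, apply it,
# and build -- roughly what source-prep.sh / smart-apply-patch.sh do below.
svn checkout http://svn.apache.org/repos/asf/hive/trunk apache-svn-trunk-source   # trunk URL assumed
cd apache-svn-trunk-source
wget -O /tmp/HIVE-5581.5.patch \
  https://issues.apache.org/jira/secure/attachment/12614861/HIVE-5581.5.patch
patch -p0 < /tmp/HIVE-5581.5.patch      # same invocation the harness logs ("patch -p0")
mvn -B clean install -DskipTests        # same build goal the harness runs (local repo path omitted)
{noformat}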
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]]
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-Build-377/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ svn = \s\v\n ]]
+ [[ -n '' ]]
+ [[ -d apache-svn-trunk-source ]]
+ [[ ! -d apache-svn-trunk-source/.svn ]]
+ [[ ! -d apache-svn-trunk-source ]]
+ cd apache-svn-trunk-source
+ svn revert -R .
Reverted 'ql/src/test/results/clientnegative/alter_view_failure6.q.out'
Reverted 'ql/src/test/results/clientnegative/alter_view_failure.q.out'
Reverted 'ql/src/test/results/clientnegative/create_view_failure2.q.out'
Reverted 'ql/src/test/results/clientnegative/load_view_failure.q.out'
Reverted 'ql/src/test/results/clientnegative/create_or_replace_view4.q.out'
Reverted 'ql/src/test/results/clientnegative/alter_view_failure7.q.out'
Reverted 'ql/src/test/results/clientnegative/unset_view_property.q.out'
Reverted 'ql/src/test/results/clientnegative/alter_view_failure2.q.out'
Reverted 'ql/src/test/results/clientnegative/create_or_replace_view5.q.out'
Reverted 'ql/src/test/results/clientnegative/invalidate_view1.q.out'
Reverted 'ql/src/test/results/clientnegative/analyze_view.q.out'
Reverted 'ql/src/test/results/clientnegative/deletejar.q.out'
Reverted 'ql/src/test/results/clientnegative/create_view_failure4.q.out'
Reverted 'ql/src/test/results/clientnegative/create_or_replace_view6.q.out'
Reverted 'ql/src/test/results/clientnegative/invalid_columns.q.out'
Reverted 'ql/src/test/results/clientnegative/insert_view_failure.q.out'
Reverted 'ql/src/test/results/clientnegative/alter_view_failure9.q.out'
Reverted 'ql/src/test/results/clientnegative/create_or_replace_view1.q.out'
Reverted 'ql/src/test/results/clientnegative/alter_view_failure4.q.out'
Reverted 'ql/src/test/results/clientnegative/alter_view_as_select_with_partition.q.out'
Reverted 'ql/src/test/results/clientnegative/drop_table_failure2.q.out'
Reverted 'ql/src/test/results/clientnegative/create_or_replace_view7.q.out'
Reverted 'ql/src/test/results/clientnegative/recursive_view.q.out'
Reverted 'ql/src/test/results/clientnegative/create_or_replace_view2.q.out'
Reverted 'ql/src/test/results/clientnegative/alter_view_failure5.q.out'
Reverted 'ql/src/test/results/clientnegative/create_view_failure1.q.out'
Reverted 'ql/src/test/results/clientnegative/create_or_replace_view8.q.out'
Reverted 'ql/src/test/results/clientpositive/view.q.out'
Reverted 'ql/src/test/results/clientpositive/database_drop.q.out'
Reverted 'ql/src/test/results/clientpositive/explain_dependency.q.out'
Reverted 'ql/src/test/results/clientpositive/create_or_replace_view.q.out'
Reverted 'ql/src/test/results/clientpositive/create_like_view.q.out'
Reverted 'ql/src/test/results/clientpositive/create_view.q.out'
Reverted 'ql/src/test/results/clientpositive/show_create_table_view.q.out'
Reverted 'ql/src/test/results/clientpositive/alter_view_rename.q.out'
Reverted 'ql/src/test/results/clientpositive/create_view_partitioned.q.out'
Reverted 'ql/src/test/results/clientpositive/alter_view_as_select.q.out'
Reverted 'ql/src/test/results/clientpositive/ptf.q.out'
Reverted 'ql/src/test/results/clientpositive/unset_table_view_property.q.out'
Reverted 'ql/src/test/results/clientpositive/lateral_view_noalias.q.out'
Reverted 'ql/src/test/results/clientpositive/explain_logical.q.out'
Reverted 'ql/src/test/results/clientpositive/ctas_char.q.out'
Reverted 'ql/src/test/results/clientpositive/create_big_view.q.out'
Reverted 'ql/src/test/results/clientpositive/create_view_translate.q.out'
Reverted 'ql/src/test/results/clientpositive/authorization_8.q.out'
Reverted 'ql/src/test/results/clientpositive/ctas_date.q.out'
Reverted 'ql/src/test/results/clientpositive/view_cast.q.out'
Reverted 'ql/src/test/results/clientpositive/view_inputs.q.out'
Reverted 'ql/src/test/results/clientpositive/ctas_varchar.q.out'
Reverted 'ql/src/test/results/clientpositive/describe_formatted_view_partitioned_json.q.out'
Reverted 'ql/src/test/results/clientpositive/describe_formatted_view_partitioned.q.out'
Reverted 'ql/src/test/results/clientpositive/subquery_exists.q.out'
Reverted 'ql/src/test/results/clientpositive/windowing.q.out'
Reverted 'ql/src/test/results/clientpositive/join_view.q.out'
Reverted 'ql/src/test/results/clientpositive/ppd_union_view.q.out'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/optimizer/SimpleFetchOptimizer.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/optimizer/GenMapRedUtils.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/plan/PlanUtils.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/plan/HiveOperation.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java'
++ egrep -v '^X|^Performing status on external'
++ awk '{print $2}'
++ svn status --no-ignore
+ rm -rf target datanucleus.log ant/target shims/target shims/0.20/target shims/assembly/target shims/0.20S/target shims/0.23/target shims/common/target shims/common-secure/target packaging/target hbase-handler/target testutils/target jdbc/target metastore/target itests/target itests/hcatalog-unit/target itests/test-serde/target itests/qtest/target itests/hive-unit/target itests/custom-serde/target itests/util/target hcatalog/target hcatalog/storage-handlers/hbase/target hcatalog/server-extensions/target hcatalog/core/target hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target hcatalog/hcatalog-pig-adapter/target hwi/target common/target common/src/gen service/target contrib/target serde/target beeline/target odbc/target cli/target ql/dependency-reduced-pom.xml ql/target
+ svn update
U    ql/src/java/org/apache/hadoop/hive/ql/io/orc/InStream.java
U    ql/src/java/org/apache/hadoop/hive/ql/io/orc/RecordReaderImpl.java

Fetching external item into 'hcatalog/src/test/e2e/harness'
Updated external to revision 1543968.

Updated to revision 1543968.
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
Going to apply patch with: patch -p0
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/VectorUDFDayOfMonthString.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/VectorUDFHourString.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/VectorUDFMinuteString.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/VectorUDFMonthString.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/VectorUDFSecondString.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/VectorUDFTimestampFieldString.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/VectorUDFUnixTimeStampString.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/VectorUDFWeekOfYearString.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/expressions/VectorUDFYearString.java
patching file ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java
Hunk #1 succeeded at 196 (offset 70 lines).
patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFDayOfMonth.java
patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFHour.java
patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFMinute.java
patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFMonth.java
patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFSecond.java
patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFWeekOfYear.java
patching file ql/src/java/org/apache/hadoop/hive/ql/udf/UDFYear.java
patching file ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDFToUnixTimeStamp.java
patching file ql/src/test/org/apache/hadoop/hive/ql/exec/vector/expressions/TestVectorTimestampExpressions.java
patching file ql/src/test/queries/clientpositive/vectorized_timestamp_funcs.q
patching file ql/src/test/results/clientpositive/vectorized_timestamp_funcs.q.out
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hive-ptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] Hive
[INFO] Hive Ant Utilities
[INFO] Hive Shims Common
[INFO] Hive Shims 0.20
[INFO] Hive Shims Secure Common
[INFO] Hive Shims 0.20S
[INFO] Hive Shims 0.23
[INFO] Hive Shims
[INFO] Hive Common
[INFO] Hive Serde
[INFO] Hive Metastore
[INFO] Hive Query Language
[INFO] Hive Service
[INFO] Hive JDBC
[INFO] Hive Beeline
[INFO] Hive CLI
[INFO] Hive Contrib
[INFO] Hive HBase Handler
[INFO] Hive HCatalog
[INFO] Hive HCatalog Core
[INFO] Hive HCatalog Pig Adapter
[INFO] Hive HCatalog Server Extensions
[INFO] Hive HCatalog Webhcat Java Client
[INFO] Hive HCatalog Webhcat
[INFO] Hive HCatalog HBase Storage Handler
[INFO] Hive HWI
[INFO] Hive ODBC
[INFO] Hive Shims Aggregator
[INFO] Hive TestUtils
[INFO] Hive Packaging
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant ---
[INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common ---
[INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20 (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 ---
[INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20 ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common-secure ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure ---
[INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secure ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-common-secure ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S ---
[INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.20S ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.23 ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23 (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 ---
[INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims ---
[INFO] Reading assembly descriptor: src/assemble/uberjar.xml
[WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing.
Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact.
NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
[WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar
with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
[INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
[INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Serde 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
[INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
[INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Metastore 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
[INFO] 
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
ANTLR Parser Generator  Version 3.4
org/apache/hadoop/hive/metastore/parser/Filter.g
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
[INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
[INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
DataNucleus Enhancer : Classpath
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar
>>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
>>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
>>  /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
>>  /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
>>  /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
>>  /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
>>  /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
>>  /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
>>  /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
>>  /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
>>  /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.5/avro-1.7.5.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar
>>  /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
>>  /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
>>  /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
>>  /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
>>  /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
>>  /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
>>  /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar
>>  /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
>>  /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
>>  /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
>>  /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
>>  /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
>>  /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
>>  /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
>>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.2.5/httpclient-4.2.5.jar
>>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.2.4/httpcore-4.2.4.jar
>>  /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
>>  /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
>>  /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
>>  /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
>>  /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
>>  /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
>>  /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar
>>  /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
>>  /data/hive-ptest/working/maven/commons-io/commons-io/2.4/commons-io-2.4.jar
>>  /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
>>  /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
>>  /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
>>  /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
>>  /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
>>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
>>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
>>  /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
>>  /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
>>  /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
>>  /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
>>  /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
>>  /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
>>  /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
>>  /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
>>  /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar
>>  /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.7.5/slf4j-api-1.7.5.jar
>>  /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar
>>  /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
DataNucleus Enhancer completed with success for 25 classes. Timings : input=593 ms, enhance=920 ms, total=1513 ms. Consult the log for full details

[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
[INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Query Language 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-exec ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ql (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
Generating vector expression code
Generating vector expression test code
[INFO] Executed tasks
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added.
[INFO] 
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java
ANTLR Parser Generator  Version 3.4
org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:872:5: 
Decision can match input such as "Identifier KW_RENAME KW_TO" using multiple alternatives: 1, 10

As a result, alternative(s) 10 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
Decision can match input such as "KW_TEXTFILE" using multiple alternatives: 2, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
Decision can match input such as "KW_SEQUENCEFILE" using multiple alternatives: 1, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
Decision can match input such as "KW_ORCFILE" using multiple alternatives: 4, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5: 
Decision can match input such as "KW_RCFILE" using multiple alternatives: 3, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23: 
Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23: 
Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_PRETTY {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 3, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_FORMATTED {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_PRETTY Identifier" using multiple alternatives: 3, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_FORMATTED Identifier" using multiple alternatives: 1, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_PRETTY KW_PARTITION" using multiple alternatives: 3, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29: 
Decision can match input such as "KW_FORMATTED KW_PARTITION" using multiple alternatives: 1, 4

As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1486:116: 
Decision can match input such as "KW_STORED KW_AS KW_DIRECTORIES" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
Decision can match input such as "KW_STORED KW_AS KW_RCFILE" using multiple alternatives: 3, 7

As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
Decision can match input such as "KW_STORED KW_AS KW_SEQUENCEFILE" using multiple alternatives: 1, 7

As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
Decision can match input such as "KW_STORED KW_AS KW_ORCFILE" using multiple alternatives: 4, 7

As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
Decision can match input such as "KW_STORED KW_AS KW_TEXTFILE" using multiple alternatives: 2, 7

As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5: 
Decision can match input such as "KW_STORED KW_AS KW_INPUTFORMAT" using multiple alternatives: 5, 7

As a result, alternative(s) 7 were disabled for that input
warning(200): SelectClauseParser.g:149:5: 
Decision can match input such as "KW_NULL DOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): SelectClauseParser.g:149:5: 
Decision can match input such as "KW_NULL DOT Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:127:2: 
Decision can match input such as "KW_LATERAL KW_VIEW KW_OUTER" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:25: 
Decision can match input such as "LPAREN StringLiteral EQUAL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:25: 
Decision can match input such as "LPAREN StringLiteral COMMA" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:25: 
Decision can match input such as "LPAREN StringLiteral RPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:179:68: 
Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN BigintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_CAST" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_IF" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN CharSetName" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_EXISTS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_CASE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN TinyintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_NULL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN SmallintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_FALSE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN Number" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN DecimalLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_STRUCT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_DATE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_MAP" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): FromClauseParser.g:237:16: 
Decision can match input such as "Identifier LPAREN KW_ARRAY" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT CharSetName" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE CharSetName" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN CharSetName" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT Number" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_UNIONTYPE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL {KW_LIKE, KW_REGEXP, KW_RLIKE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_DATE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL AMPERSAND" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN SmallintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN TinyintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN BigintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_FALSE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_STRUCT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_TRUE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_ARRAY" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN Number" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_IS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_MAP" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_MAP" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_MAP" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN DecimalLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL RPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN StringLiteral StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_IF" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_TRUE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_IF" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_IF" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL BITWISEOR" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL DOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_EXISTS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_FALSE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL {DIV..DIVIDE, MOD, STAR}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL NOTEQUAL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_EXISTS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL EQUAL_NS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_EXISTS LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL LESSTHAN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL LESSTHANOREQUALTO" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_ARRAY" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_NULL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_UNIONTYPE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_OR" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_DATE StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_DATE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_STRUCT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL EQUAL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CAST LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_STRUCT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_UNIONTYPE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL GREATERTHAN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_ARRAY" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_DATE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL GREATERTHANOREQUALTO" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_NULL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_EXISTS" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE TinyintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_BETWEEN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE SmallintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL {MINUS, PLUS}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_NOT" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE BigintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL LSQUARE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_FALSE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_NULL" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_TRUE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE Number" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL BITWISEXOR" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT SmallintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN CharSetName CharSetLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_CAST" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_CAST" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_CAST" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT BigintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_AND" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN KW_CASE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NULL KW_IN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT KW_CASE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_CASE" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT DecimalLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_NOT TinyintLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE KW_WHEN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN StringLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN LPAREN Identifier" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4: 
Decision can match input such as "LPAREN KW_CASE DecimalLiteral" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:108:5: 
Decision can match input such as "KW_ORDER KW_BY LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:121:5: 
Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:133:5: 
Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:144:5: 
Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:155:5: 
Decision can match input such as "KW_SORT KW_BY LPAREN" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:172:7: 
Decision can match input such as "STAR" using multiple alternatives: 1, 2

As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:185:5: 
Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): IdentifiersParser.g:185:5: 
Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): IdentifiersParser.g:185:5: 
Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6

As a result, alternative(s) 6 were disabled for that input
warning(200): IdentifiersParser.g:267:5: 
Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8

As a result, alternative(s) 8 were disabled for that input
warning(200): IdentifiersParser.g:267:5: 
Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3

As a result, alternative(s) 3 were disabled for that input
warning(200): IdentifiersParser.g:267:5: 
Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8

As a result, alternative(s) 8 were disabled for that input
warning(200): IdentifiersParser.g:267:5: 
Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8

As a result, alternative(s) 8 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5: 
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9

As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:524:5: 
Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3

As a result, alternative(s) 3 were disabled for that input
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
[INFO] Compiling 1404 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING : 
[INFO] -------------------------------------------------------------
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 4 warnings 
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[202,30] cannot find symbol
symbol  : class UDFMonth
location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Hive .............................................. SUCCESS [4.732s]
[INFO] Hive Ant Utilities ................................ SUCCESS [7.958s]
[INFO] Hive Shims Common ................................. SUCCESS [3.574s]
[INFO] Hive Shims 0.20 ................................... SUCCESS [2.203s]
[INFO] Hive Shims Secure Common .......................... SUCCESS [2.653s]
[INFO] Hive Shims 0.20S .................................. SUCCESS [1.487s]
[INFO] Hive Shims 0.23 ................................... SUCCESS [3.666s]
[INFO] Hive Shims ........................................ SUCCESS [3.199s]
[INFO] Hive Common ....................................... SUCCESS [13.563s]
[INFO] Hive Serde ........................................ SUCCESS [12.105s]
[INFO] Hive Metastore .................................... SUCCESS [25.250s]
[INFO] Hive Query Language ............................... FAILURE [31.289s]
[INFO] Hive Service ...................................... SKIPPED
[INFO] Hive JDBC ......................................... SKIPPED
[INFO] Hive Beeline ...................................... SKIPPED
[INFO] Hive CLI .......................................... SKIPPED
[INFO] Hive Contrib ...................................... SKIPPED
[INFO] Hive HBase Handler ................................ SKIPPED
[INFO] Hive HCatalog ..................................... SKIPPED
[INFO] Hive HCatalog Core ................................ SKIPPED
[INFO] Hive HCatalog Pig Adapter ......................... SKIPPED
[INFO] Hive HCatalog Server Extensions ................... SKIPPED
[INFO] Hive HCatalog Webhcat Java Client ................. SKIPPED
[INFO] Hive HCatalog Webhcat ............................. SKIPPED
[INFO] Hive HCatalog HBase Storage Handler ............... SKIPPED
[INFO] Hive HWI .......................................... SKIPPED
[INFO] Hive ODBC ......................................... SKIPPED
[INFO] Hive Shims Aggregator ............................. SKIPPED
[INFO] Hive TestUtils .................................... SKIPPED
[INFO] Hive Packaging .................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:54.711s
[INFO] Finished at: Wed Nov 20 17:15:18 EST 2013
[INFO] Final Memory: 58M/373M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-exec: Compilation failure
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/optimizer/physical/Vectorizer.java:[202,30] cannot find symbol
[ERROR] symbol  : class UDFMonth
[ERROR] location: class org.apache.hadoop.hive.ql.optimizer.physical.Vectorizer
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hive-exec
+ exit 1
'
{noformat}
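
For what it is worth, the failure in the log above reduces to a single unresolved symbol: Vectorizer.java line 202 references UDFMonth, which javac cannot resolve against this trunk checkout. That usually means the patch uses the class without the matching import, or the class was moved/renamed on trunk after the patch was generated. A minimal, self-contained sketch of that failure mode (hypothetical names, not the actual Hive source):

{code:java}
// Illustration only (hypothetical stand-ins, not the real Hive classes):
// a "cannot find symbol" error appears when a class literal is referenced but
// the compiler cannot resolve the name, e.g. a missing import or a class that
// was removed/renamed on trunk. With the stand-in class present this compiles;
// remove it (or, in the real code, its import) and javac reports exactly the
// error shown in the log above.
class UDFMonth { }  // stand-in for the Hive UDFMonth class named in the error

public class VectorizerSketch {
    // the real Vectorizer registers the UDF classes it knows how to vectorize
    // in a similar way; an unresolvable entry fails the whole hive-exec compile
    static final Class<?>[] VECTORIZABLE_UDFS = { UDFMonth.class };
}
{code}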

This message is automatically generated.

ATTACHMENT ID: 12614861

> Implement vectorized year/month/day... etc. for string arguments
> ----------------------------------------------------------------
>
>                 Key: HIVE-5581
>                 URL: https://issues.apache.org/jira/browse/HIVE-5581
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Query Processor
>    Affects Versions: 0.13.0
>            Reporter: Eric Hanson
>            Assignee: Teddy Choi
>         Attachments: HIVE-5581.1.patch.txt, HIVE-5581.2.patch, HIVE-5581.3.patch, HIVE-5581.4.patch, HIVE-5581.5.patch, HIVE-5581.5.patch
>
>
> Functions year(), month(), day(), weekofyear(), hour(), minute(), second() need to be implemented for string arguments in vectorized mode. 
> They already work for timestamp arguments.
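
As a rough illustration of what this sub-task asks for (not the actual patch, and not Hive's VectorExpression API), a vectorized year() over string input boils down to parsing each string cell of a batch and writing the extracted field into a long output column in one tight loop. The sketch below models that column-at-a-time pattern with plain arrays standing in for Hive's BytesColumnVector / LongColumnVector; all names are hypothetical.

{code:java}
// Minimal self-contained sketch (hypothetical names, NOT Hive's vectorization API):
// the column-at-a-time shape a vectorized year()-over-strings would follow.
import java.nio.charset.StandardCharsets;
import java.text.ParsePosition;
import java.text.SimpleDateFormat;
import java.util.Calendar;

public class VectorYearFromStringSketch {

    // Extracts year() from UTF-8 date strings for every row in the batch.
    // Unparsable strings are mapped to NULL (the row-mode UDFs also return
    // NULL for bad input).
    public static void evaluate(byte[][] input, boolean[] isNull, long[] output, int batchSize) {
        SimpleDateFormat parser = new SimpleDateFormat("yyyy-MM-dd");
        Calendar calendar = Calendar.getInstance();
        for (int i = 0; i < batchSize; i++) {      // one tight loop over the whole batch
            if (isNull[i]) {
                continue;                          // propagate incoming nulls
            }
            String s = new String(input[i], StandardCharsets.UTF_8);
            java.util.Date d = parser.parse(s, new ParsePosition(0));
            if (d == null) {
                isNull[i] = true;                  // unparsable date string -> NULL
                continue;
            }
            calendar.setTime(d);
            output[i] = calendar.get(Calendar.YEAR);
        }
    }

    public static void main(String[] args) {
        byte[][] in = {
            "2013-11-20".getBytes(StandardCharsets.UTF_8),
            "not-a-date".getBytes(StandardCharsets.UTF_8)
        };
        boolean[] isNull = new boolean[2];
        long[] out = new long[2];
        evaluate(in, isNull, out, 2);
        System.out.println(out[0] + " / null=" + isNull[1]);  // prints: 2013 / null=true
    }
}
{code}

The real implementation would plug into Hive's vectorized row batch machinery and would likely be produced by the code-generation templates seen earlier in the log ("Generating vector expression code"), but the per-row logic is essentially the loop above.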



--
This message was sent by Atlassian JIRA
(v6.1#6144)
