Date: Mon, 11 Nov 2013 21:30:20 +0000 (UTC)
From: "Hive QA (JIRA)"
To: hive-dev@hadoop.apache.org
Reply-To: dev@hive.apache.org
Subject: [jira] [Commented] (HIVE-4518) Counter Strike: Operation Operator

[ https://issues.apache.org/jira/browse/HIVE-4518?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13819418#comment-13819418 ]

Hive QA commented on HIVE-4518:
-------------------------------

{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12613111/HIVE-4518.6.patch.txt

Test results: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/242/testReport
Console output: http://bigtop01.cloudera.org:8080/job/PreCommit-HIVE-Build/242/console
Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]]
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ export 'M2_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ M2_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-Build-242/source-prep.txt
+ [[ false == \t\r\u\e ]]
+ mkdir -p maven ivy
+ [[ svn = \s\v\n ]]
+ [[ -n '' ]]
+ [[ -d apache-svn-trunk-source ]]
+ [[ ! -d apache-svn-trunk-source/.svn ]]
+ [[ ! -d apache-svn-trunk-source ]]
+ cd apache-svn-trunk-source
+ svn revert -R .
Reverted 'ql/src/test/results/clientpositive/show_functions.q.out'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/exec/RowSchema.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/io/OneNullRowInputFormat.java'
++ egrep -v '^X|^Performing status on external'
++ awk '{print $2}'
++ svn status --no-ignore
+ rm -rf target datanucleus.log ant/target shims/target shims/0.20/target shims/assembly/target shims/0.20S/target shims/0.23/target shims/common/target shims/common-secure/target packaging/target hbase-handler/target testutils/target jdbc/target metastore/target itests/target itests/hcatalog-unit/target itests/test-serde/target itests/qtest/target itests/hive-unit/target itests/custom-serde/target itests/util/target hcatalog/target hcatalog/storage-handlers/hbase/target hcatalog/server-extensions/target hcatalog/core/target hcatalog/webhcat/svr/target hcatalog/webhcat/java-client/target hcatalog/hcatalog-pig-adapter/target hwi/target common/target common/src/gen service/target contrib/target serde/target beeline/target odbc/target cli/target ql/dependency-reduced-pom.xml ql/target ql/src/test/results/clientpositive/select_dummy_source.q.out ql/src/test/results/clientpositive/udf_current_database.q.out ql/src/test/queries/clientpositive/udf_current_database.q ql/src/test/queries/clientpositive/select_dummy_source.q ql/src/java/org/apache/hadoop/hive/ql/io/NullRowsInputFormat.java ql/src/java/org/apache/hadoop/hive/ql/udf/generic/UDFCurrentDB.java
+ svn update
Fetching external item into 'hcatalog/src/test/e2e/harness'
External at revision 1540846.
At revision 1540846.
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
Going to apply patch with: patch -p0
patching file common/src/java/org/apache/hadoop/hive/conf/HiveConf.java
patching file conf/hive-default.xml.template
patching file data/conf/hive-site.xml
patching file ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java
patching file ql/src/java/org/apache/hadoop/hive/ql/QueryPlan.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/AbstractMapJoinOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/CommonJoinOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/DemuxOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/FetchOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/FileSinkOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/GroupByOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/MapJoinOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/MuxOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/Operator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/OperatorFactory.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/ReduceSinkOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/SMBMapJoinOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecDriver.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/ExecReducer.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/HadoopJobExecHelper.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/HadoopJobExecHook.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/mr/MapredLocalTask.java
patching file ql/src/java/org/apache/hadoop/hive/ql/exec/vector/VectorFileSinkOperator.java
patching file ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/merge/BlockMergeTask.java
patching file ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/stats/PartialScanTask.java
patching file ql/src/java/org/apache/hadoop/hive/ql/io/rcfile/truncate/ColumnTruncateTask.java
patching file ql/src/java/org/apache/hadoop/hive/ql/metadata/HiveFatalException.java
patching file ql/src/java/org/apache/hadoop/hive/ql/parse/MapReduceCompiler.java
patching file ql/src/java/org/apache/hadoop/hive/ql/parse/SemanticAnalyzer.java
patching file ql/src/test/org/apache/hadoop/hive/ql/exec/TestOperators.java
patching file ql/src/test/org/apache/hadoop/hive/ql/exec/vector/TestVectorGroupByOperator.java
patching file ql/src/test/org/apache/hadoop/hive/ql/testutil/OperatorTestUtils.java
patching file ql/src/test/queries/clientpositive/insert_into3.q
patching file ql/src/test/queries/clientpositive/optrstat_groupby.q
patching file ql/src/test/results/clientpositive/insert_into3.q.out
patching file ql/src/test/results/clientpositive/optrstat_groupby.q.out
patching file ql/src/test/results/compiler/plan/case_sensitivity.q.xml
patching file ql/src/test/results/compiler/plan/cast1.q.xml
patching file ql/src/test/results/compiler/plan/groupby1.q.xml
patching file ql/src/test/results/compiler/plan/groupby2.q.xml
patching file ql/src/test/results/compiler/plan/groupby3.q.xml
patching file ql/src/test/results/compiler/plan/groupby4.q.xml
patching file ql/src/test/results/compiler/plan/groupby5.q.xml
patching file ql/src/test/results/compiler/plan/groupby6.q.xml
patching file ql/src/test/results/compiler/plan/input1.q.xml
patching file ql/src/test/results/compiler/plan/input2.q.xml
patching file ql/src/test/results/compiler/plan/input20.q.xml
patching file ql/src/test/results/compiler/plan/input3.q.xml
patching file ql/src/test/results/compiler/plan/input4.q.xml
patching file ql/src/test/results/compiler/plan/input5.q.xml
patching file ql/src/test/results/compiler/plan/input6.q.xml
patching file ql/src/test/results/compiler/plan/input7.q.xml
patching file ql/src/test/results/compiler/plan/input8.q.xml
patching file ql/src/test/results/compiler/plan/input9.q.xml
patching file ql/src/test/results/compiler/plan/input_part1.q.xml
patching file ql/src/test/results/compiler/plan/input_testsequencefile.q.xml
patching file ql/src/test/results/compiler/plan/input_testxpath.q.xml
patching file ql/src/test/results/compiler/plan/input_testxpath2.q.xml
patching file ql/src/test/results/compiler/plan/join1.q.xml
patching file ql/src/test/results/compiler/plan/join2.q.xml
patching file ql/src/test/results/compiler/plan/join3.q.xml
patching file ql/src/test/results/compiler/plan/join4.q.xml
patching file ql/src/test/results/compiler/plan/join5.q.xml
patching file ql/src/test/results/compiler/plan/join6.q.xml
patching file ql/src/test/results/compiler/plan/join7.q.xml
patching file ql/src/test/results/compiler/plan/join8.q.xml
patching file ql/src/test/results/compiler/plan/sample1.q.xml
patching file ql/src/test/results/compiler/plan/sample2.q.xml
patching file ql/src/test/results/compiler/plan/sample3.q.xml
patching file ql/src/test/results/compiler/plan/sample4.q.xml
patching file ql/src/test/results/compiler/plan/sample5.q.xml
patching file ql/src/test/results/compiler/plan/sample6.q.xml
patching file ql/src/test/results/compiler/plan/sample7.q.xml
patching file ql/src/test/results/compiler/plan/subq.q.xml
patching file ql/src/test/results/compiler/plan/udf1.q.xml
patching file ql/src/test/results/compiler/plan/udf4.q.xml
patching file ql/src/test/results/compiler/plan/udf6.q.xml
patching file ql/src/test/results/compiler/plan/udf_case.q.xml
patching file ql/src/test/results/compiler/plan/udf_when.q.xml
patching file ql/src/test/results/compiler/plan/union.q.xml
+ [[ maven == \m\a\v\e\n ]]
+ rm -rf /data/hive-ptest/working/maven/org/apache/hive
+ mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven
[INFO] Scanning for projects...
[INFO] --------------------------------------------------------------------= ---- [INFO] Reactor Build Order: [INFO]=20 [INFO] Hive [INFO] Hive Ant Utilities [INFO] Hive Shims Common [INFO] Hive Shims 0.20 [INFO] Hive Shims Secure Common [INFO] Hive Shims 0.20S [INFO] Hive Shims 0.23 [INFO] Hive Shims [INFO] Hive Common [INFO] Hive Serde [INFO] Hive Metastore [INFO] Hive Query Language [INFO] Hive Service [INFO] Hive JDBC [INFO] Hive Beeline [INFO] Hive CLI [INFO] Hive Contrib [INFO] Hive HBase Handler [INFO] Hive HCatalog [INFO] Hive HCatalog Core [INFO] Hive HCatalog Pig Adapter [INFO] Hive HCatalog Server Extensions [INFO] Hive HCatalog Webhcat Java Client [INFO] Hive HCatalog Webhcat [INFO] Hive HCatalog HBase Storage Handler [INFO] Hive HWI [INFO] Hive ODBC [INFO] Hive Shims Aggregator [INFO] Hive TestUtils [INFO] Hive Packaging [INFO] = =20 [INFO] --------------------------------------------------------------------= ---- [INFO] Building Hive 0.13.0-SNAPSHOT [INFO] --------------------------------------------------------------------= ---- [INFO]=20 [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source (includes = =3D [datanucleus.log, derby.log], excludes =3D []) [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/t= arget/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/t= arget/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/t= arget/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-so= urce/target/tmp/conf [INFO] Executed tasks [INFO]=20 [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive --- [INFO] Installing 
/data/hive-ptest/working/apache-svn-trunk-source/pom.xml = to /data/hive-ptest/working/maven/org/apache/hive/hive/0.13.0-SNAPSHOT/hive= -0.13.0-SNAPSHOT.pom [INFO] = =20 [INFO] --------------------------------------------------------------------= ---- [INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT [INFO] --------------------------------------------------------------------= ---- [INFO]=20 [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-ant --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ant (inclu= des =3D [datanucleus.log, derby.log], excludes =3D []) [INFO]=20 [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-= ant --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-= svn-trunk-source/ant/src/main/resources [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO]=20 [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant -= -- [INFO] Compiling 5 source files to /data/hive-ptest/working/apache-svn-trun= k-source/ant/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/or= g/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated A= PI. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/or= g/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or u= nsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO]=20 [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources)= @ hive-ant --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-= svn-trunk-source/ant/src/test/resources [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/a= nt/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/a= nt/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/a= nt/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-so= urce/ant/target/tmp/conf [INFO] Executed tasks [INFO]=20 [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hi= ve-ant --- [INFO] No sources to compile [INFO]=20 [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant --- [INFO] Tests are skipped. [INFO]=20 [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-ant --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ant/t= arget/hive-ant-0.13.0-SNAPSHOT.jar [INFO]=20 [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-ant --= - [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/targ= et/hive-ant-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apach= e/hive/hive-ant/0.13.0-SNAPSHOT/hive-ant-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ant/pom.= xml to /data/hive-ptest/working/maven/org/apache/hive/hive-ant/0.13.0-SNAPS= HOT/hive-ant-0.13.0-SNAPSHOT.pom [INFO] = =20 [INFO] --------------------------------------------------------------------= ---- [INFO] Building Hive Shims Common 0.13.0-SNAPSHOT [INFO] --------------------------------------------------------------------= ---- [INFO]=20 [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-common= --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/comm= on (includes =3D [datanucleus.log, derby.log], excludes =3D []) 
[INFO]=20 [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-= shims-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-= svn-trunk-source/shims/common/src/main/resources [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-comm= on --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO]=20 [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims= -common --- [INFO] Compiling 15 source files to /data/hive-ptest/working/apache-svn-tru= nk-source/shims/common/target/classes [WARNING] Note: Some input files use or override a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [INFO]=20 [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources)= @ hive-shims-common --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-= svn-trunk-source/shims/common/src/test/resources [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-commo= n --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/common/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/common/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/common/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-so= urce/shims/common/target/tmp/conf [INFO] Executed tasks [INFO]=20 [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hi= ve-shims-common --- [INFO] No sources to compile [INFO]=20 [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-comm= on --- [INFO] Tests are skipped. 
[INFO]=20 [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims= /common/target/hive-shims-common-0.13.0-SNAPSHOT.jar [INFO]=20 [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-= common --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/co= mmon/target/hive-shims-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/worki= ng/maven/org/apache/hive/shims/hive-shims-common/0.13.0-SNAPSHOT/hive-shims= -common-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/co= mmon/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-s= hims-common/0.13.0-SNAPSHOT/hive-shims-common-0.13.0-SNAPSHOT.pom [INFO] = =20 [INFO] --------------------------------------------------------------------= ---- [INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT [INFO] --------------------------------------------------------------------= ---- [INFO]=20 [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20 -= -- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20= (includes =3D [datanucleus.log, derby.log], excludes =3D []) [INFO]=20 [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-= shims-0.20 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-= svn-trunk-source/shims/0.20/src/main/resources [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20= --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO]=20 [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims= -0.20 --- [INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trun= k-source/shims/0.20/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20= /src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses or over= rides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20= /src/main/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses uncheck= ed or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO]=20 [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources)= @ hive-shims-0.20 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-= svn-trunk-source/shims/0.20/src/test/resources [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 = --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/0.20/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/0.20/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/0.20/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-so= urce/shims/0.20/target/tmp/conf [INFO] Executed tasks [INFO]=20 [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hi= ve-shims-0.20 --- [INFO] No sources to compile [INFO]=20 [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20= --- [INFO] Tests are skipped. [INFO]=20 [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20 --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims= /0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar [INFO]=20 [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-= 0.20 --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.= 20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/m= aven/org/apache/hive/shims/hive-shims-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-= 0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.= 20/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shi= ms-0.20/0.13.0-SNAPSHOT/hive-shims-0.20-0.13.0-SNAPSHOT.pom [INFO] = =20 [INFO] --------------------------------------------------------------------= ---- [INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT [INFO] --------------------------------------------------------------------= ---- [INFO]=20 [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ 
hive-shims-common= -secure --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/comm= on-secure (includes =3D [datanucleus.log, derby.log], excludes =3D []) [INFO]=20 [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-= shims-common-secure --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-= svn-trunk-source/shims/common-secure/src/main/resources [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-comm= on-secure --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO]=20 [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims= -common-secure --- [INFO] Compiling 12 source files to /data/hive-ptest/working/apache-svn-tru= nk-source/shims/common-secure/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/comm= on-secure/src/main/java/org/apache/hadoop/hive/shims/HadoopShimsSecure.java= uses or overrides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [WARNING] Note: Some input files use unchecked or unsafe operations. [WARNING] Note: Recompile with -Xlint:unchecked for details. [INFO]=20 [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources)= @ hive-shims-common-secure --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-= svn-trunk-source/shims/common-secure/src/test/resources [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-commo= n-secure --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/common-secure/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/common-secure/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/common-secure/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-so= urce/shims/common-secure/target/tmp/conf [INFO] Executed tasks [INFO]=20 [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hi= ve-shims-common-secure --- [INFO] No sources to compile [INFO]=20 [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-comm= on-secure --- [INFO] Tests are skipped. [INFO]=20 [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-common-secur= e --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims= /common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar [INFO]=20 [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-= common-secure --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/co= mmon-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar to /data/hi= ve-ptest/working/maven/org/apache/hive/shims/hive-shims-common-secure/0.13.= 0-SNAPSHOT/hive-shims-common-secure-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/co= mmon-secure/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims= /hive-shims-common-secure/0.13.0-SNAPSHOT/hive-shims-common-secure-0.13.0-S= NAPSHOT.pom [INFO] = =20 [INFO] --------------------------------------------------------------------= ---- [INFO] Building Hive Shims 0.20S 
0.13.0-SNAPSHOT [INFO] --------------------------------------------------------------------= ---- [INFO]=20 [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-0.20S = --- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20= S (includes =3D [datanucleus.log, derby.log], excludes =3D []) [INFO]=20 [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-= shims-0.20S --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-= svn-trunk-source/shims/0.20S/src/main/resources [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20= S --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO]=20 [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims= -0.20S --- [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trun= k-source/shims/0.20S/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20= S/src/main/java/org/apache/hadoop/hive/shims/Hadoop20SShims.java uses or ov= errides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [INFO]=20 [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources)= @ hive-shims-0.20S --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-= svn-trunk-source/shims/0.20S/src/test/resources [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S= --- [INFO] Executing tasks main: [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/0.20S/target/tmp [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/0.20S/target/warehouse [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/s= hims/0.20S/target/tmp/conf [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-so= urce/shims/0.20S/target/tmp/conf [INFO] Executed tasks [INFO]=20 [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hi= ve-shims-0.20S --- [INFO] No sources to compile [INFO]=20 [INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20= S --- [INFO] Tests are skipped. [INFO]=20 [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.20S --- [INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims= /0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar [INFO]=20 [INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-= 0.20S --- [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.= 20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working= /maven/org/apache/hive/shims/hive-shims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.= 20S-0.13.0-SNAPSHOT.jar [INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.= 20S/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-sh= ims-0.20S/0.13.0-SNAPSHOT/hive-shims-0.20S-0.13.0-SNAPSHOT.pom [INFO] = =20 [INFO] --------------------------------------------------------------------= ---- [INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT [INFO] --------------------------------------------------------------------= ---- [INFO]=20 [INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ 
hive-shims-0.23 -= -- [INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23= (includes =3D [datanucleus.log, derby.log], excludes =3D []) [INFO]=20 [INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-= shims-0.23 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. [INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-= svn-trunk-source/shims/0.23/src/main/resources [INFO]=20 [INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23= --- [INFO] Executing tasks main: [INFO] Executed tasks [INFO]=20 [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims= -0.23 --- [INFO] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trun= k-source/shims/0.23/target/classes [WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23= /src/main/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or over= rides a deprecated API. [WARNING] Note: Recompile with -Xlint:deprecation for details. [INFO]=20 [INFO] --- maven-resources-plugin:2.5:testResources (default-testResources)= @ hive-shims-0.23 --- [debug] execute contextualize [INFO] Using 'UTF-8' encoding to copy filtered resources. 
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims-0.23 ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-0.23 ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/shims/hive-shims-0.23/0.13.0-SNAPSHOT/hive-shims-0.23-0.13.0-SNAPSHOT.pom
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-shims ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-assembly-plugin:2.3:single (uberjar) @ hive-shims ---
[INFO] Reading assembly descriptor: src/assemble/uberjar.xml
[WARNING] Artifact: org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[INFO] META-INF/MANIFEST.MF already added, skipping
[WARNING] Configuration options: 'appendAssemblyId' is set to false, and 'classifier' is missing.
Instead of attaching the assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar, it will become the file for main project artifact.
NOTE: If multiple descriptors or descriptor-formats are provided for this project, the value of this file will be non-deterministic!
[WARNING] Replacing pre-existing project main-artifact file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/archive-tmp/hive-shims-0.13.0-SNAPSHOT.jar
with assembly file: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims/0.13.0-SNAPSHOT/hive-shims-0.13.0-SNAPSHOT.pom
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-common ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/common (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
[INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
[INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-common ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-common ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/common/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-common/0.13.0-SNAPSHOT/hive-common-0.13.0-SNAPSHOT.pom
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Serde 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-serde ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/serde (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
[INFO] Compiling 351 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
[INFO] Compiling 41 source files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-serde ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-serde ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-serde/0.13.0-SNAPSHOT/hive-serde-0.13.0-SNAPSHOT.pom
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Metastore 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-metastore ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/metastore (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
[INFO] 
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
ANTLR Parser Generator  Version 3.4
org/apache/hadoop/hive/metastore/parser/Filter.g
[INFO] 
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
[INFO] Compiling 132 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO] 
[INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
[INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
DataNucleus Enhancer : Classpath
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar
>>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
>>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
>>  /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
>>  /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
>>  /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
>>  /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
>>  /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
>>  /data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
>>  /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
>>  /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
>>  /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.1/avro-1.7.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar
>>  /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
>>  /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.4.1/snappy-java-1.0.4.1.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/maven/org/apache/zookeeper/zookeeper/3.4.3/zookeeper-3.4.3.jar
>>  /data/hive-ptest/working/maven/jline/jline/0.9.94/jline-0.9.94.jar
>>  /data/hive-ptest/working/maven/org/jboss/netty/netty/3.2.2.Final/netty-3.2.2.Final.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar
>>  /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
>>  /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
>>  /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
>>  /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
>>  /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
>>  /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar
>>  /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
>>  /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
>>  /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
>>  /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
>>  /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
>>  /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
>>  /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
>>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.1.3/httpclient-4.1.3.jar
>>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.1.3/httpcore-4.1.3.jar
>>  /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
>>  /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.8/jersey-core-1.8.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.8/jersey-json-1.8.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
>>  /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
>>  /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
>>  /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
>>  /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
>>  /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.7.1/jackson-jaxrs-1.7.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.7.1/jackson-xc-1.7.1.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.8/jersey-server-1.8.jar
>>  /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
>>  /data/hive-ptest/working/maven/commons-io/commons-io/2.1/commons-io-2.1.jar
>>  /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
>>  /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
>>  /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
>>  /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
>>  /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
>>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
>>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
>>  /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
>>  /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
>>  /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
>>  /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
>>  /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
>>  /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
>>  /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
>>  /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
>>  /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar
>>  /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.6.1/slf4j-api-1.6.1.jar
>>  /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar
>>  /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
DataNucleus Enhancer completed with success for 25 classes. Timings : input=610 ms, enhance=920 ms, total=1530 ms. Consult the log for full details
[INFO] 
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
[INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-metastore ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar
[INFO] 
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-metastore ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar
[INFO] 
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-metastore ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT.pom
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-metastore/0.13.0-SNAPSHOT/hive-metastore-0.13.0-SNAPSHOT-tests.jar
[INFO] 
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Query Language 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-exec ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/ql (includes = [datanucleus.log, derby.log], excludes = [])
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/aggregates/gen
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java/org/apache/hadoop/hive/ql/exec/vector/expressions/gen
Generating vector expression code
Generating vector expression test code
[INFO] Executed tasks
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added.
[INFO] 
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java
ANTLR Parser Generator  Version 3.4
org/apache/hadoop/hive/ql/parse/HiveLexer.g
org/apache/hadoop/hive/ql/parse/HiveParser.g
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:872:5:
Decision can match input such as "Identifier KW_RENAME KW_TO" using multiple alternatives: 1, 10
As a result, alternative(s) 10 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5:
Decision can match input such as "KW_TEXTFILE" using multiple alternatives: 2, 6
As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5:
Decision can match input such as "KW_SEQUENCEFILE" using multiple alternatives: 1, 6
As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5:
Decision can match input such as "KW_ORCFILE" using multiple alternatives: 4, 6
As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1177:5:
Decision can match input such as "KW_RCFILE" using multiple alternatives: 3, 6
As a result, alternative(s) 6 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23:
Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23:
Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1190:23:
Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23:
Decision can match input such as "KW_ELEM_TYPE" using multiple alternatives: 1, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23:
Decision can match input such as "KW_VALUE_TYPE" using multiple alternatives: 3, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1197:23:
Decision can match input such as "KW_KEY_TYPE" using multiple alternatives: 2, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29:
Decision can match input such as "KW_PRETTY {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 3, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29:
Decision can match input such as "KW_FORMATTED {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29:
Decision can match input such as "KW_PRETTY Identifier" using multiple alternatives: 3, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29:
Decision can match input such as "KW_FORMATTED Identifier" using multiple alternatives: 1, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29:
Decision can match input such as "KW_PRETTY KW_PARTITION" using multiple alternatives: 3, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1215:29:
Decision can match input such as "KW_FORMATTED KW_PARTITION" using multiple alternatives: 1, 4
As a result, alternative(s) 4 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1486:116:
Decision can match input such as "KW_STORED KW_AS KW_DIRECTORIES" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5:
Decision can match input such as "KW_STORED KW_AS KW_RCFILE" using multiple alternatives: 3, 7
As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5:
Decision can match input such as "KW_STORED KW_AS KW_SEQUENCEFILE" using multiple alternatives: 1, 7
As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5:
Decision can match input such as "KW_STORED KW_AS KW_ORCFILE" using multiple alternatives: 4, 7
As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5:
Decision can match input such as "KW_STORED KW_AS KW_TEXTFILE" using multiple alternatives: 2, 7
As a result, alternative(s) 7 were disabled for that input
warning(200): org/apache/hadoop/hive/ql/parse/HiveParser.g:1609:5:
Decision can match input such as "KW_STORED KW_AS KW_INPUTFORMAT" using multiple alternatives: 5, 7
As a result, alternative(s) 7 were disabled for that input
warning(200): SelectClauseParser.g:149:5:
Decision can match input such as "KW_NULL DOT {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
As a
result, alternative(s) 2 were disabled for that input warning(200): SelectClauseParser.g:149:5:=20 Decision can match input such as "KW_NULL DOT Identifier" using multiple al= ternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:127:2:=20 Decision can match input such as "KW_LATERAL KW_VIEW KW_OUTER" using multip= le alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:25:=20 Decision can match input such as "LPAREN StringLiteral EQUAL" using multipl= e alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:25:=20 Decision can match input such as "LPAREN StringLiteral COMMA" using multipl= e alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:25:=20 Decision can match input such as "LPAREN StringLiteral RPAREN" using multip= le alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN BigintLiteral" using mu= ltiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_CAST" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_IF" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN CharSetName" using mult= iple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN LPAREN" using multiple = alternatives: 
1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_EXISTS" using multip= le alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_CASE" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN TinyintLiteral" using m= ultiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_NULL" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN SmallintLiteral" using = multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_FALSE" using multipl= e alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using mul= tiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN Number" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN Identifier" using multi= ple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN StringLiteral" 
using mu= ltiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN DecimalLiteral" using m= ultiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_A= LTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_= COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASE= S, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCA= PED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, = KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..= KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, = KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW= _OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, = KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_T= ABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_= TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_= TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_STRUCT" using multip= le alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_DATE" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_NOT" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input 
warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" u= sing multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_MAP" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:179:68:=20 Decision can match input such as "Identifier LPAREN KW_ARRAY" using multipl= e alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN BigintLiteral" using mu= ltiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_CAST" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_IF" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN CharSetName" using mult= iple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN LPAREN" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_EXISTS" using multip= le alternatives: 1, 2 As a result, alternative(s) 2 were 
disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_CASE" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN TinyintLiteral" using m= ultiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_NULL" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN SmallintLiteral" using = multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_FALSE" using multipl= e alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_UNIONTYPE" using mul= tiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN Number" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN Identifier" using multi= ple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN StringLiteral" using mu= ltiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN DecimalLiteral" using m= ultiple alternatives: 1, 2 
As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN {KW_ADD..KW_AFTER, KW_A= LTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_= COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASE= S, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCA= PED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, = KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..= KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, = KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW= _OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, = KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_T= ABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_= TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_= TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_STRUCT" using multip= le alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_DATE" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_NOT" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN {MINUS, PLUS, TILDE}" u= sing multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): 
FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_MAP" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_TRUE" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): FromClauseParser.g:237:16:=20 Decision can match input such as "Identifier LPAREN KW_ARRAY" using multipl= e alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT CharSetName" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE CharSetName" using multipl= e alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN CharSetName" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT Number" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT KW_UNIONTYPE" using multipl= e alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL {KW_LIKE, KW_REGEXP, KW_RL= IKE}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT KW_DATE" using multiple alt= ernatives: 1, 2 As a result, alternative(s) 2 were disabled for that input 
warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT KW_NOT" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE KW_NOT" using multiple alt= ernatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN KW_NOT" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE Identifier" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL AMPERSAND" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN SmallintLiteral" using mult= iple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN TinyintLiteral" using multi= ple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN BigintLiteral" using multip= le alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT KW_FALSE" using multiple al= ternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT KW_STRUCT" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): 
IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT KW_TRUE" using multiple alt= ernatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT KW_ARRAY" using multiple al= ternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN Number" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL KW_IS" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT StringLiteral" using multip= le alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT {MINUS, PLUS, TILDE}" using= multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE {MINUS, PLUS, TILDE}" usin= g multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN {MINUS, PLUS, TILDE}" using= multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT KW_MAP" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE KW_MAP" using multiple alt= ernatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): 
IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN KW_MAP" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL LPAREN" using multiple alt= ernatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN DecimalLiteral" using multi= ple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL RPAREN" using multiple alt= ernatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN StringLiteral StringLiteral" using= multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT KW_IF" using multiple alter= natives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE KW_TRUE" using multiple al= ternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE KW_IF" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN KW_IF" using multiple alter= natives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL BITWISEOR" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 
Decision can match input such as "LPAREN KW_NULL DOT" using multiple altern= atives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN KW_EXISTS" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE StringLiteral" using multi= ple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE KW_FALSE" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT Identifier" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL {DIV..DIVIDE, MOD, STAR}" = using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE {KW_ADD..KW_AFTER, KW_ALTE= R..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COL= LECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, = KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED= , KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_= FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_= IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_= MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OP= TION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_= PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABL= E..KW_TABLES, 
KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRI= GGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYP= E, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL NOTEQUAL" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT KW_EXISTS" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL EQUAL_NS" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_EXISTS LPAREN" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL LESSTHAN" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL LESSTHANOREQUALTO" using m= ultiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN KW_ARRAY" using multiple al= ternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN KW_NULL" using multiple alt= ernatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN KW_UNIONTYPE" using multipl= e alternatives: 1, 2 As a 
result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL KW_OR" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_DATE StringLiteral" using multi= ple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE KW_DATE" using multiple al= ternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN KW_STRUCT" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL EQUAL" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CAST LPAREN" using multiple alt= ernatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN LPAREN" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT LPAREN" using multiple alte= rnatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE LPAREN" using multiple alt= ernatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE KW_STRUCT" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled 
for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE KW_UNIONTYPE" using multip= le alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL GREATERTHAN" using multipl= e alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE KW_ARRAY" using multiple a= lternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN KW_DATE" using multiple alt= ernatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL GREATERTHANOREQUALTO" usin= g multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE KW_NULL" using multiple al= ternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE KW_EXISTS" using multiple = alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE TinyintLiteral" using mult= iple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL KW_BETWEEN" using multiple= alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE SmallintLiteral" using mul= tiple alternatives: 1, 2 As a result, alternative(s) 2 were 
disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL {MINUS, PLUS}" using multi= ple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL KW_NOT" using multiple alt= ernatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_CASE BigintLiteral" using multi= ple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NULL LSQUARE" using multiple al= ternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN KW_NOT {KW_ADD..KW_AFTER, KW_ALTER= ..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLL= ECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, K= W_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED,= KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_F= OR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_I= DXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_M= APJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPT= ION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_P= RETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE= ..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIG= GER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE= , KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 As a result, alternative(s) 2 were disabled for that input warning(200): IdentifiersParser.g:68:4:=20 Decision can match input such as "LPAREN LPAREN 
KW_FALSE" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_NOT KW_NULL" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN LPAREN {KW_ADD..KW_AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE, KW_CLUSTER..KW_COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE, KW_EXPLAIN..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN LPAREN KW_TRUE" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_CASE Number" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_NULL BITWISEXOR" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200):
IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_NOT SmallintLiteral" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN CharSetName CharSetLiteral" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN LPAREN KW_CAST" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_NOT KW_CAST" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_CASE KW_CAST" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_NOT BigintLiteral" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_NULL KW_AND" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN LPAREN KW_CASE" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_NULL KW_IN" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_NOT KW_CASE" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200):
IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_CASE KW_CASE" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_NOT DecimalLiteral" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_NOT TinyintLiteral" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_CASE KW_WHEN" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN LPAREN StringLiteral" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN LPAREN Identifier" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:68:4:
Decision can match input such as "LPAREN KW_CASE DecimalLiteral" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:108:5:
Decision can match input such as "KW_ORDER KW_BY LPAREN" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:121:5:
Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:133:5:
Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200):
IdentifiersParser.g:144:5:
Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:155:5:
Decision can match input such as "KW_SORT KW_BY LPAREN" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:172:7:
Decision can match input such as "STAR" using multiple alternatives: 1, 2
As a result, alternative(s) 2 were disabled for that input
warning(200): IdentifiersParser.g:185:5:
Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6
As a result, alternative(s) 6 were disabled for that input
warning(200): IdentifiersParser.g:185:5:
Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6
As a result, alternative(s) 6 were disabled for that input
warning(200): IdentifiersParser.g:185:5:
Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6
As a result, alternative(s) 6 were disabled for that input
warning(200): IdentifiersParser.g:267:5:
Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8
As a result, alternative(s) 8 were disabled for that input
warning(200): IdentifiersParser.g:267:5:
Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3
As a result, alternative(s) 3 were disabled for that input
warning(200): IdentifiersParser.g:267:5:
Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8
As a result, alternative(s) 8 were disabled for that input
warning(200): IdentifiersParser.g:267:5:
Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8
As a result, alternative(s) 8 were disabled for that input
warning(200): IdentifiersParser.g:399:5:
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using
multiple alternatives: 2, 9
As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5:
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 9
As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5:
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 9
As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5:
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 9
As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5:
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 9
As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5:
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 9
As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5:
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 9
As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5:
Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 8, 9
As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5:
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 9
As a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:399:5:
Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 9
As
a result, alternative(s) 9 were disabled for that input
warning(200): IdentifiersParser.g:524:5:
Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3
As a result, alternative(s) 3 were disabled for that input
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
[INFO] Compiling 1388 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-test-source (add-test-sources) @ hive-exec ---
[INFO] Test Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-exec ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-exec ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-exec ---
[INFO] Compiling 134 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-exec ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-exec ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-exec ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] --- maven-shade-plugin:2.1:shade (build-exec-bundle) @ hive-exec ---
[INFO] Excluding org.apache.hive:hive-ant:jar:0.13.0-SNAPSHOT from the shaded jar.
[INFO] Excluding org.apache.velocity:velocity:jar:1.5 from the shaded jar.
[INFO] Excluding commons-collections:commons-collections:jar:3.1 from the shaded jar.
[INFO] Including org.apache.hive:hive-common:jar:0.13.0-SNAPSHOT in the shaded jar.
[INFO] Excluding commons-cli:commons-cli:jar:1.2 from the shaded jar.
[INFO] Excluding org.apache.hive:hive-metastore:jar:0.13.0-SNAPSHOT from the shaded jar.
[INFO] Excluding com.jolbox:bonecp:jar:0.7.1.RELEASE from the shaded jar.
[INFO] Excluding org.apache.derby:derby:jar:10.4.2.0 from the shaded jar.
[INFO] Excluding org.datanucleus:datanucleus-api-jdo:jar:3.2.1 from the shaded jar.
[INFO] Excluding org.datanucleus:datanucleus-rdbms:jar:3.2.1 from the shaded jar.
[INFO] Excluding javax.jdo:jdo-api:jar:3.0.1 from the shaded jar.
[INFO] Excluding javax.transaction:jta:jar:1.1 from the shaded jar.
[INFO] Including org.apache.hive:hive-serde:jar:0.13.0-SNAPSHOT in the shaded jar.
[INFO] Including org.apache.hive:hive-shims:jar:0.13.0-SNAPSHOT in the shaded jar.
[INFO] Excluding org.apache.hive.shims:hive-shims-common:jar:0.13.0-SNAPSHOT from the shaded jar.
[INFO] Excluding org.apache.hive.shims:hive-shims-0.20:jar:0.13.0-SNAPSHOT from the shaded jar.
[INFO] Excluding org.apache.hive.shims:hive-shims-common-secure:jar:0.13.0-SNAPSHOT from the shaded jar.
[INFO] Excluding org.apache.hive.shims:hive-shims-0.20S:jar:0.13.0-SNAPSHOT from the shaded jar.
[INFO] Excluding org.apache.hive.shims:hive-shims-0.23:jar:0.13.0-SNAPSHOT from the shaded jar.
[INFO] Including com.esotericsoftware.kryo:kryo:jar:2.22 in the shaded jar.
[INFO] Excluding commons-codec:commons-codec:jar:1.4 from the shaded jar.
[INFO] Excluding commons-httpclient:commons-httpclient:jar:3.0.1 from the shaded jar.
[INFO] Excluding commons-io:commons-io:jar:2.4 from the shaded jar.
[INFO] Including commons-lang:commons-lang:jar:2.4 in the shaded jar.
[INFO] Excluding commons-logging:commons-logging:jar:1.1.3 from the shaded jar.
[INFO] Including javolution:javolution:jar:5.5.1 in the shaded jar.
[INFO] Excluding log4j:log4j:jar:1.2.16 from the shaded jar.
[INFO] Excluding org.antlr:antlr-runtime:jar:3.4 from the shaded jar.
[INFO] Excluding org.antlr:stringtemplate:jar:3.2.1 from the shaded jar.
[INFO] Excluding antlr:antlr:jar:2.7.7 from the shaded jar.
[INFO] Excluding org.antlr:ST4:jar:4.0.4 from the shaded jar.
[INFO] Excluding org.apache.avro:avro:jar:1.7.1 from the shaded jar.
[INFO] Excluding com.thoughtworks.paranamer:paranamer:jar:2.3 from the shaded jar.
[INFO] Excluding org.xerial.snappy:snappy-java:jar:1.0.4.1 from the shaded jar.
[INFO] Excluding org.apache.avro:avro-mapred:jar:1.7.1 from the shaded jar.
[INFO] Excluding org.apache.avro:avro-ipc:jar:1.7.1 from the shaded jar.
[INFO] Excluding io.netty:netty:jar:3.4.0.Final from the shaded jar.
[INFO] Excluding org.mortbay.jetty:servlet-api:jar:2.5-20081211 from the shaded jar.
[INFO] Excluding org.apache.ant:ant:jar:1.9.1 from the shaded jar.
[INFO] Excluding org.apache.ant:ant-launcher:jar:1.9.1 from the shaded jar.
[INFO] Excluding org.apache.commons:commons-compress:jar:1.4.1 from the shaded jar.
[INFO] Excluding org.tukaani:xz:jar:1.0 from the shaded jar.
[INFO] Excluding org.apache.thrift:libfb303:jar:0.9.0 from the shaded jar.
[INFO] Including org.apache.thrift:libthrift:jar:0.9.0 in the shaded jar.
[INFO] Excluding org.apache.httpcomponents:httpclient:jar:4.1.3 from the shaded jar.
[INFO] Excluding org.apache.httpcomponents:httpcore:jar:4.1.3 from the shaded jar.
[INFO] Excluding org.apache.zookeeper:zookeeper:jar:3.4.3 from the shaded jar.
[INFO] Excluding jline:jline:jar:0.9.94 from the shaded jar.
[INFO] Excluding org.jboss.netty:netty:jar:3.2.2.Final from the shaded jar.
[INFO] Excluding org.codehaus.groovy:groovy-all:jar:2.1.6 from the shaded jar.
[INFO] Including org.codehaus.jackson:jackson-core-asl:jar:1.9.2 in the shaded jar.
[INFO] Including org.codehaus.jackson:jackson-mapper-asl:jar:1.9.2 in the shaded jar.
[INFO] Excluding org.datanucleus:datanucleus-core:jar:3.2.2 from the shaded jar.
[INFO] Including com.google.guava:guava:jar:11.0.2 in the shaded jar.
[INFO] Excluding com.google.code.findbugs:jsr305:jar:1.3.9 from the shaded jar.
[INFO] Including com.google.protobuf:protobuf-java:jar:2.5.0 in the shaded jar.
[INFO] Including com.googlecode.javaewah:JavaEWAH:jar:0.3.2 in the shaded jar.
[INFO] Including org.iq80.snappy:snappy:jar:0.2 in the shaded jar.
[INFO] Including org.json:json:jar:20090211 in the shaded jar.
[INFO] Excluding stax:stax-api:jar:1.0.1 from the shaded jar.
[INFO] Excluding org.apache.hadoop:hadoop-core:jar:1.2.1 from the shaded jar.
[INFO] Excluding xmlenc:xmlenc:jar:0.52 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-core:jar:1.8 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-json:jar:1.8 from the shaded jar.
[INFO] Excluding org.codehaus.jettison:jettison:jar:1.1 from the shaded jar.
[INFO] Excluding com.sun.xml.bind:jaxb-impl:jar:2.2.3-1 from the shaded jar.
[INFO] Excluding javax.xml.bind:jaxb-api:jar:2.2.2 from the shaded jar.
[INFO] Excluding javax.xml.stream:stax-api:jar:1.0-2 from the shaded jar.
[INFO] Excluding javax.activation:activation:jar:1.1 from the shaded jar.
[INFO] Excluding org.codehaus.jackson:jackson-jaxrs:jar:1.7.1 from the shaded jar.
[INFO] Excluding org.codehaus.jackson:jackson-xc:jar:1.7.1 from the shaded jar.
[INFO] Excluding com.sun.jersey:jersey-server:jar:1.8 from the shaded jar.
[INFO] Excluding asm:asm:jar:3.1 from the shaded jar.
[INFO] Excluding org.apache.commons:commons-math:jar:2.1 from the shaded jar.
[INFO] Excluding commons-configuration:commons-configuration:jar:1.6 from the shaded jar.
[INFO] Excluding commons-digester:commons-digester:jar:1.8 from the shaded jar.
[INFO] Excluding commons-beanutils:commons-beanutils:jar:1.7.0 from the shaded jar.
[INFO] Excluding commons-beanutils:commons-beanutils-core:jar:1.8.0 from the shaded jar.
[INFO] Excluding commons-net:commons-net:jar:1.4.1 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty:jar:6.1.26 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jetty-util:jar:6.1.26 from the shaded jar.
[INFO] Excluding tomcat:jasper-runtime:jar:5.5.12 from the shaded jar.
[INFO] Excluding tomcat:jasper-compiler:jar:5.5.12 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jsp-api-2.1:jar:6.1.14 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:servlet-api-2.5:jar:6.1.14 from the shaded jar.
[INFO] Excluding org.mortbay.jetty:jsp-2.1:jar:6.1.14 from the shaded jar.
[INFO] Excluding ant:ant:jar:1.6.5 from the shaded jar.
[INFO] Excluding commons-el:commons-el:jar:1.0 from the shaded jar.
[INFO] Excluding net.java.dev.jets3t:jets3t:jar:0.6.1 from the shaded jar.
[INFO] Excluding hsqldb:hsqldb:jar:1.8.0.10 from the shaded jar.
[INFO] Excluding oro:oro:jar:2.0.8 from the shaded jar.
[INFO] Excluding org.eclipse.jdt:core:jar:3.1.1 from the shaded jar.
[INFO] Excluding org.slf4j:slf4j-api:jar:1.6.1 from the shaded jar.
[INFO] Excluding org.slf4j:slf4j-log4j12:jar:1.6.1 from the shaded jar.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar with /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT-shaded.jar
[INFO] Dependency-reduced POM written at: /data/hive-ptest/working/apache-svn-trunk-source/ql/dependency-reduced-pom.xml
[INFO] Dependency-reduced POM written at: /data/hive-ptest/working/apache-svn-trunk-source/ql/dependency-reduced-pom.xml
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-exec ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/ql/dependency-reduced-pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT.pom
[INFO] Installing
/data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-exec/0.13.0-SNAPSHOT/hive-exec-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Service 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-service ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/service (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-service ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/model added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/gen/thrift/gen-javabean added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-service ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-service ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-service ---
[INFO] Compiling 153 source files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/service/src/java/org/apache/hive/service/cli/operation/SQLOperation.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-service ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-service ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-service ---
[INFO] Compiling 7 source files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-service ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-service ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-service ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-service ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT.pom
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-service/0.13.0-SNAPSHOT/hive-service-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive JDBC 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-jdbc ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/jdbc (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-jdbc ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-jdbc ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-jdbc ---
[INFO] Compiling 30 source files to /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-jdbc ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-jdbc ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-jdbc ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-jdbc ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-jdbc ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/hive-jdbc-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-jdbc ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/hive-jdbc-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-jdbc/0.13.0-SNAPSHOT/hive-jdbc-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/jdbc/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-jdbc/0.13.0-SNAPSHOT/hive-jdbc-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Beeline 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-beeline ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/beeline (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-beeline ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-beeline ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-beeline ---
[INFO] Compiling 31 source files to /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/classes
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[28,16] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[29,16] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[31,64] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[44,23] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[37,24] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/java/org/apache/hive/beeline/SunSignalHandler.java:[37,5] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-beeline ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-beeline ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-beeline ---
[INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-beeline ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-beeline ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/hive-beeline-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-beeline ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/hive-beeline-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-beeline/0.13.0-SNAPSHOT/hive-beeline-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/beeline/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-beeline/0.13.0-SNAPSHOT/hive-beeline-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive CLI 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-cli ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/cli (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-cli ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/cli/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-cli ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-cli ---
[INFO] Compiling 4 source files to /data/hive-ptest/working/apache-svn-trunk-source/cli/target/classes
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[74,16] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[75,16] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[371,5] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[372,5] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[377,27] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[378,52] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[378,52] warning: sun.misc.SignalHandler is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[383,28] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[378,19] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
[WARNING] /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java:[439,9] warning: sun.misc.Signal is Sun proprietary API and may be removed in a future release
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/RCFileCat.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/cli/src/java/org/apache/hadoop/hive/cli/CliDriver.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-cli ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/cli/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-cli ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-cli ---
[INFO] Compiling 4 source files to /data/hive-ptest/working/apache-svn-trunk-source/cli/target/test-classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/cli/src/test/org/apache/hadoop/hive/cli/TestCliDriverMethods.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-cli ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-cli ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/hive-cli-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-cli ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/cli/target/hive-cli-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-cli/0.13.0-SNAPSHOT/hive-cli-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/cli/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-cli/0.13.0-SNAPSHOT/hive-cli-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Contrib 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-contrib ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/contrib (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-contrib ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-contrib ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-contrib ---
[INFO] Compiling 39 source files to /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/java/org/apache/hadoop/hive/contrib/udf/example/UDFExampleStructPrint.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-contrib ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-contrib ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-contrib ---
[INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/test-classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/test/org/apache/hadoop/hive/contrib/serde2/TestRegexSerDe.java uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-contrib ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-contrib ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/hive-contrib-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-contrib ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/hive-contrib-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-contrib/0.13.0-SNAPSHOT/hive-contrib-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/contrib/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-contrib/0.13.0-SNAPSHOT/hive-contrib-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HBase Handler 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hbase-handler ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hbase-handler ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-handler ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-handler ---
[INFO] Compiling 13 source files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hbase-handler ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hbase-handler ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hbase-handler ---
[INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hbase-handler ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hbase-handler ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/hive-hbase-handler-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hbase-handler ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/hive-hbase-handler-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-hbase-handler/0.13.0-SNAPSHOT/hive-hbase-handler-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-hbase-handler/0.13.0-SNAPSHOT/hive-hbase-handler-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hcatalog ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog/0.13.0-SNAPSHOT/hive-hcatalog-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog Core 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog-core ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-core ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-core ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-core ---
[INFO] Compiling 144 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-core ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-core ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-core ---
[INFO] Compiling 67 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-core ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hcatalog-core ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/hive-hcatalog-core-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-hcatalog-core ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/hive-hcatalog-core-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hcatalog-core ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/hive-hcatalog-core-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-core/0.13.0-SNAPSHOT/hive-hcatalog-core-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-core/0.13.0-SNAPSHOT/hive-hcatalog-core-0.13.0-SNAPSHOT.pom
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/hive-hcatalog-core-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-core/0.13.0-SNAPSHOT/hive-hcatalog-core-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog Pig Adapter 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog-pig-adapter ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-pig-adapter ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-pig-adapter ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-pig-adapter ---
[INFO] Compiling 10 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-pig-adapter ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-pig-adapter ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-pig-adapter ---
[INFO] Compiling 26 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-pig-adapter ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hcatalog-pig-adapter ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/hive-hcatalog-pig-adapter-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hcatalog-pig-adapter ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/hive-hcatalog-pig-adapter-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-pig-adapter/0.13.0-SNAPSHOT/hive-hcatalog-pig-adapter-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-pig-adapter/0.13.0-SNAPSHOT/hive-hcatalog-pig-adapter-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog Server Extensions 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hcatalog-server-extensions ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-server-extensions ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-server-extensions ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-server-extensions ---
[INFO] Compiling 38 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-server-extensions ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-server-extensions ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-server-extensions ---
[INFO] Compiling 4 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-server-extensions ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hcatalog-server-extensions ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/hive-hcatalog-server-extensions-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hcatalog-server-extensions ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/hive-hcatalog-server-extensions-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-server-extensions/0.13.0-SNAPSHOT/hive-hcatalog-server-extensions-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hcatalog-server-extensions/0.13.0-SNAPSHOT/hive-hcatalog-server-extensions-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog Webhcat Java Client 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-webhcat-java-client ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-webhcat-java-client ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-webhcat-java-client ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-webhcat-java-client ---
[INFO] Compiling 20 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-webhcat-java-client ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-webhcat-java-client ---
[INFO] Executing tasks
main:
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-webhcat-java-client ---
[INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-webhcat-java-client ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-webhcat-java-client ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/hive-webhcat-java-client-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-webhcat-java-client ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/hive-webhcat-java-client-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-webhcat-java-client/0.13.0-SNAPSHOT/hive-webhcat-java-client-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-webhcat-java-client/0.13.0-SNAPSHOT/hive-webhcat-java-client-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog Webhcat 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-webhcat ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-webhcat ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-webhcat ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-webhcat ---
[INFO] Compiling 65 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-javadoc-plugin:2.4:javadoc (resourcesdoc.xml) @ hive-webhcat ---
[INFO] Setting property: classpath.resource.loader.class => 'org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader'.
[INFO] Setting property: velocimacro.messages.on => 'false'.
[INFO] Setting property: resource.loader => 'classpath'.
[INFO] Setting property: resource.manager.logwhenfound => 'false'.
[INFO] **************************************************************
[INFO] Starting Jakarta Velocity v1.4
[INFO] RuntimeInstance initializing.
[INFO] Default Properties File: org/apache/velocity/runtime/defaults/velocity.properties
[INFO] Default ResourceManager initializing. (class org.apache.velocity.runtime.resource.ResourceManagerImpl)
[INFO] Resource Loader Instantiated: org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader
[INFO] ClasspathResourceLoader : initialization starting.
[INFO] ClasspathResourceLoader : initialization complete.
[INFO] ResourceCache : initialized. (class org.apache.velocity.runtime.resource.ResourceCacheImpl)
[INFO] Default ResourceManager initialization complete.
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Literal
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Macro
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Parse
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Include
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Foreach
[INFO] Created: 20 parsers.
[INFO] Velocimacro : initialization starting.
[INFO] Velocimacro : adding VMs from VM library template : VM_global_library.vm
[ERROR] ResourceManager : unable to find resource 'VM_global_library.vm' in any resource loader.
[INFO] Velocimacro : error using VM library template VM_global_library.vm : org.apache.velocity.exception.ResourceNotFoundException: Unable to find resource 'VM_global_library.vm'
[INFO] Velocimacro : VM library template macro registration complete.
[INFO] Velocimacro : allowInline = true : VMs can be defined inline in templates
[INFO] Velocimacro : allowInlineToOverride = false : VMs defined inline may NOT replace previous VM definitions
[INFO] Velocimacro : allowInlineLocal = false : VMs defined inline will be global in scope if allowed.
[INFO] Velocimacro : initialization complete.
[INFO] Velocity successfully started.
Loading source files for package org.apache.hive.hcatalog.templeton...
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SimpleExceptionMapper.java]
[parsing completed 30ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JsonBuilder.java]
[parsing completed 8ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JobItemBean.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JarDelegator.java]
[parsing completed 11ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/LauncherDelegator.java]
[parsing completed 21ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecServiceImpl.java]
[parsing completed 20ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DeleteDelegator.java]
[parsing completed 8ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecBean.java]
[parsing completed 5ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/PigDelegator.java]
[parsing completed 13ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TempletonDelegator.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StreamingDelegator.java]
[parsing completed 13ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DatabaseDesc.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CompleteBean.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HiveDelegator.java]
[parsing completed 19ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Server.java]
[parsing completed 130ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/BadParam.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ColumnDesc.java]
[parsing completed 2ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/PartitionDesc.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CatchallExceptionMapper.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HcatDelegator.java]
[parsing completed 52ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StatusDelegator.java]
[parsing completed 3ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/QueueStatusBean.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CompleteDelegator.java]
[parsing completed 13ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/GroupPermissionsDesc.java]
[parsing completed 5ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/AppConfig.java]
[parsing completed 12ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HcatException.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/QueueException.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TableLikeDesc.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TableDesc.java]
[parsing completed 12ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/WadlConfig.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/BusyException.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecService.java]
[parsing completed 4ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/NotAuthorizedException.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/EnqueueBean.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SimpleWebException.java]
[parsing completed 6ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ListDelegator.java]
[parsing completed 5ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Main.java]
[parsing completed 18ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SecureProxySupport.java]
[parsing completed 10ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/MaxByteArrayOutputStream.java]
[parsing completed 5ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ProxyUserSupport.java]
[parsing completed 13ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/UgiFactory.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CallbackFailedException.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TablePropertyDesc.java]
[parsing completed 1ms]
Loading source files for package org.apache.hive.hcatalog.templeton.tool...
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage.java]
[parsing completed 22ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NullRecordReader.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/PigJobIDParser.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonStorage.java]
[parsing completed 4ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobIDParser.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobState.java]
[parsing completed 12ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonUtils.java]
[parsing completed 18ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSStorage.java]
[parsing completed 25ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/DelegationTokenCache.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JarJobIDParser.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobSubmissionConstants.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/ZooKeeperCleanup.java]
[parsing completed 6ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobStateTracker.java]
[parsing completed 6ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NotFoundException.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/SingleInputFormat.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/LogRetriever.java]
[parsing completed 9ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NullSplit.java]
[parsing completed 2ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSCleanup.java]
[parsing completed 6ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob.java]
[parsing completed 6ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HiveJobIDParser.java]
[parsing completed 3ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/LaunchMapper.java]
[parsing completed 9ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TrivialExecService.java]
[parsing completed 2ms]
Constructing Javadoc information...
[search path for source files: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java]
[search path for class files: /usr/java/jdk1.6.0_34/jre/lib/resources.jar,/usr/java/jdk1.6.0_34/jre/lib/rt.jar,/usr/java/jdk1.6.0_34/jre/lib/sunrsasign.jar,/usr/java/jdk1.6.0_34/jre/lib/jsse.jar,/usr/java/jdk1.6.0_34/jre/lib/jce.jar,/usr/java/jdk1.6.0_34/jre/lib/charsets.jar,/usr/java/jdk1.6.0_34/jre/lib/modules/jdk.boot.jar,/usr/java/jdk1.6.0_34/jre/classes,/usr/java/jdk1.6.0_34/jre/lib/ext/localedata.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/sunpkcs11.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/sunjce_provider.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/dnsns.jar,/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar,/data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey-json-1.14.jar,/data/hive-ptest/working/maven/org/apache/zookeeper/zookeeper/3.4.3/zookeeper-3.4.3.jar,/data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/hive-shims-0.23-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar,/data/hive-ptest/working/maven/javax/mail/mail/1.4.1/mail-1.4.1.jar,/data/hive-ptest/working/maven/javax/mail/mail/1.4.1/activation.jar,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar,/data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar,/data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/hive-hcatalog-core-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9.2/jackson-core-asl-1.9.2.jar,/data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar,/data/hive-ptest/working/maven/org/eclipse/jetty/aggregate/jetty-all-server/7.6.0.v20120127/jetty-all-server-7.6.0.v20120127.jar,/data/hive-ptest/working/maven/xerces/xercesImpl/2.6.1/xercesImpl-2.6.1.jar,/data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar,/data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar,/data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/hive-shims-common-secure-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.1.3/httpclient-4.1.3.jar,/data/hive-ptest/working/apache-svn-trunk-source/serde/target/hive-serde-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar,/data/hive-ptest/working/maven/com/sun/jdmk/jmxtools/1.2.1/jmxtools-1.2.1.jar,/data/hive-ptest/working/maven/org/apache/velocity/velocity/1.5/velocity-1.5.jar,/data/hive-ptest/working/maven/com/jolbox/bonecp/0.7.1.RELEASE/bonecp-0.7.1.RELEASE.jar,/data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar,/data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar,/data/hive-ptest/working/maven/javax/jms/jms/1.1/jms-1.1.jar,/data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar,/data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar,/data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.1.3/httpcore-4.1.3.jar,/data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.4.1/snappy-java-1.0.4.1.jar,/data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar,/data/hive-ptest/working/maven/org/antlr/ST4/4.0.4/ST4-4.0.4.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-exec/1.1/commons-exec-1.1.jar,/data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar,/data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar,/data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar,/data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar,/data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar,/data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar,/data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar,/data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar,/usr/java/jdk1.6.0_34/jre/../lib/tools.jar,/data/hive-ptest/working/maven/org/apache/ant/ant/1.9.1/ant-1.9.1.jar,/data/hive-ptest/working/maven/io/netty/netty/3.4.0.Final/netty-3.4.0.Final.jar,/data/hive-ptest/working/maven/org/slf4j/jul-to-slf4j/1.6.1/jul-to-slf4j-1.6.1.jar,/data/hive-ptest/working/maven/com/sun/jersey/contribs/wadl-resourcedoc-doclet/1.4/wadl-resourcedoc-doclet-1.4.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar,/data/hive-ptest/working/maven/org/apache/avro/avro-mapred/1.7.1/avro-mapred-1.7.1.jar,/data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar,/data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar,/data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/hive-shims-0.20-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar,/data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar,/data/hive-ptest/working/maven/log4j/log4j/1.2.15/log4j-1.2.15.jar,/data/hive-ptest/working/apache-svn-trunk-source/common/target/hive-common-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/apache-svn-trunk-source/ql/target/hive-exec-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-annotation_1.0_spec/1.1.1/geronimo-annotation_1.0_spec-1.1.1.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar,/data/hive-ptest/working/maven/org/apache/avro/avro-ipc/1.7.1/avro-ipc-1.7.1.jar,/data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-servlet/1.14/jersey-servlet-1.14.jar,/data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar,/data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar,/data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar,/data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar,/data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/hive-shims-0.20S-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/jline/jline/0.9.94/jline-0.9.94.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar,/data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.6.1/slf4j-api-1.6.1.jar,/data/hive-ptest/working/maven/org/jboss/netty/netty/3.2.2.Final/netty-3.2.2.Final.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-1.9.2.jar,/data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar,/data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar,/data/hive-ptest/working/apache-svn-trunk-source/cli/target/hive-cli-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar,/data/hive-ptest/working/maven/org/codehaus/groovy/groovy-all/2.1.6/groovy-all-2.1.6.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_cs.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_de_DE.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_es.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_fr.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_hu.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_it.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ja_JP.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ko_KR.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_pl.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_pt_BR.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ru.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_zh_CN.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_zh_TW.jar,/data/hive-ptest/working/apache-svn-trunk-source/metastore/target/hive-metastore-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/asm/asm-commons/3.1/asm-commons-3.1.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-api.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/activation.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jsr173_1.0_api.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb1-impl.jar,/data/hive-ptest/working/apache-svn-trunk-source/service/target/hive-service-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar,/data/hive-ptest/working/maven/org/apache/hadoop/hadoop-tools/1.2.1/hadoop-tools-1.2.1.jar,/data/hive-ptest/working/maven/asm/asm-tree/3.1/asm-tree-3.1.jar,/data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.2/paranamer-2.2.jar,/data/hive-ptest/working/maven/commons-io/commons-io/2.1/commons-io-2.1.jar,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar,/data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar,/data/hive-ptest/working/maven/org/apache/avro/avro/1.7.1/avro-1.7.1.jar,/data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-jaspic_1.0_spec/1.0/geronimo-jaspic_1.0_spec-1.0.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar,/data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar,/data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/hive-shims-common-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar,/data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/hive-shims-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-jta_1.1_spec/1.1.1/geronimo-jta_1.1_spec-1.1.1.jar,/data/hive-ptest/working/maven/org/apache/ant/ant-launcher/1.9.1/ant-launcher-1.9.1.jar,/data/hive-ptest/working/maven/com/sun/jmx/jmxri/1.2.1/jmxri-1.2.1.jar,/data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar,/data/hive-ptest/working/apache-svn-trunk-source/ant/target/hive-ant-0.13.0-SNAPSHOT.jar,/data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar,/data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar]
[loading javax/ws/rs/core/Response.class(javax/ws/rs/core:Response.class)]
[loading javax/ws/rs/ext/ExceptionMapper.class(javax/ws/rs/ext:ExceptionMapper.class)]
[loading javax/ws/rs/ext/Provider.class(javax/ws/rs/ext:Provider.class)]
[loading java/io/IOException.class(java/io:IOException.class)]
[loading java/util/Map.class(java/util:Map.class)]
[loading java/util/HashMap.class(java/util:HashMap.class)]
[loading javax/ws/rs/core/MediaType.class(javax/ws/rs/core:MediaType.class)]
[loading org/codehaus/jackson/map/ObjectMapper.class(org/codehaus/jackson/map:ObjectMapper.class)]
[loading java/lang/Throwable.class(java/lang:Throwable.class)]
[loading java/io/Serializable.class(java/io:Serializable.class)]
[loading java/lang/Object.class(java/lang:Object.class)]
[loading java/lang/String.class(java/lang:String.class)]
[loading java/io/ByteArrayOutputStream.class(java/io:ByteArrayOutputStream.class)]
[loading org/apache/hadoop/hive/ql/ErrorMsg.class(org/apache/hadoop/hive/ql:ErrorMsg.class)]
[loading org/eclipse/jetty/http/HttpStatus.class(org/eclipse/jetty/http:HttpStatus.class)]
[loading java/lang/Integer.class(java/lang:Integer.class)]
[loading org/apache/hadoop/mapred/JobStatus.class(org/apache/hadoop/mapred:JobStatus.class)]
[loading org/apache/hadoop/mapred/JobProfile.class(org/apache/hadoop/mapred:JobProfile.class)]
[loading java/lang/Long.class(java/lang:Long.class)]
[loading java/util/ArrayList.class(java/util:ArrayList.class)]
[loading java/util/List.class(java/util:List.class)]
[loading org/apache/commons/logging/Log.class(org/apache/commons/logging:Log.class)]
[loading org/apache/commons/logging/LogFactory.class(org/apache/commons/logging:LogFactory.class)]
[loading org/apache/hadoop/conf/Configuration.class(org/apache/hadoop/conf:Configuration.class)]
[loading java/lang/Enum.class(java/lang:Enum.class)]
[loading java/lang/Comparable.class(java/lang:Comparable.class)]
[loading java/lang/Exception.class(java/lang:Exception.class)]
[loading java/io/FileNotFoundException.class(java/io:FileNotFoundException.class)]
[loading java/net/URISyntaxException.class(java/net:URISyntaxException.class)]
[loading org/apache/commons/exec/ExecuteException.class(org/apache/commons/exec:ExecuteException.class)]
[loading java/security/PrivilegedExceptionAction.class(java/security:PrivilegedExceptionAction.class)]
[loading org/apache/hadoop/fs/Path.class(org/apache/hadoop/fs:Path.class)]
[loading org/apache/hadoop/hive/conf/HiveConf.class(org/apache/hadoop/hive/conf:HiveConf.class)]
[loading org/apache/hadoop/security/UserGroupInformation.class(org/apache/hadoop/security:UserGroupInformation.class)]
[loading org/apache/hadoop/util/StringUtils.class(org/apache/hadoop/util:StringUtils.class)]
[loading org/apache/hadoop/util/ToolRunner.class(org/apache/hadoop/util:ToolRunner.class)]
[loading java/io/File.class(java/io:File.class)]
[loading java/net/URL.class(java/net:URL.class)]
[loading org/apache/hadoop/util/VersionInfo.class(org/apache/hadoop/util:VersionInfo.class)]
[loading java/lang/Iterable.class(java/lang:Iterable.class)]
[loading org/apache/hadoop/io/Writable.class(org/apache/hadoop/io:Writable.class)]
[loading java/lang/InterruptedException.class(java/lang:InterruptedException.class)]
[loading java/io/BufferedReader.class(java/io:BufferedReader.class)]
[loading java/io/InputStream.class(java/io:InputStream.class)]
[loading java/io/InputStreamReader.class(java/io:InputStreamReader.class)]
[loading java/io/OutputStream.class(java/io:OutputStream.class)]
[loading java/io/PrintWriter.class(java/io:PrintWriter.class)]
[loading java/util/Map$Entry.class(java/util:Map$Entry.class)]
[loading java/util/concurrent/Semaphore.class(java/util/concurrent:Semaphore.class)]
[loading org/apache/commons/exec/CommandLine.class(org/apache/commons/exec:CommandLine.class)]
[loading org/apache/commons/exec/DefaultExecutor.class(org/apache/commons/exec:DefaultExecutor.class)]
[loading org/apache/commons/exec/ExecuteWatchdog.class(org/apache/commons/exec:ExecuteWatchdog.class)]
[loading org/apache/commons/exec/PumpStreamHandler.class(org/apache/commons/exec:PumpStreamHandler.class)]
[loading org/apache/hadoop/util/Shell.class(org/apache/hadoop/util:Shell.class)]
[loading java/lang/Thread.class(java/lang:Thread.class)]
[loading java/lang/Runnable.class(java/lang:Runnable.class)]
[loading org/apache/hadoop/hive/shims/HadoopShims.class(org/apache/hadoop/hive/shims:HadoopShims.class)]
[loading org/apache/hadoop/hive/shims/HadoopShims$WebHCatJTShim.class(org/apache/hadoop/hive/shims:HadoopShims$WebHCatJTShim.class)]
[loading org/apache/hadoop/hive/shims/ShimLoader.class(org/apache/hadoop/hive/shims:ShimLoader.class)]
[loading org/apache/hadoop/mapred/JobID.class(org/apache/hadoop/mapred:JobID.class)]
[loading java/util/Arrays.class(java/util:Arrays.class)]
[loading javax/xml/bind/annotation/XmlRootElement.class(javax/xml/bind/annotation:XmlRootElement.class)]
[loading java/net/InetAddress.class(java/net:InetAddress.class)]
[loading java/net/UnknownHostException.class(java/net:UnknownHostException.class)]
[loading java/text/MessageFormat.class(java/text:MessageFormat.class)]
[loading java/util/Collections.class(java/util:Collections.class)]
[loading java/util/regex/Matcher.class(java/util/regex:Matcher.class)]
[loading java/util/regex/Pattern.class(java/util/regex:Pattern.class)]
[loading javax/servlet/http/HttpServletRequest.class(javax/servlet/http:HttpServletRequest.class)]
[loading javax/ws/rs/DELETE.class(javax/ws/rs:DELETE.class)]
[loading javax/ws/rs/FormParam.class(javax/ws/rs:FormParam.class)]
[loading javax/ws/rs/GET.class(javax/ws/rs:GET.class)]
[loading javax/ws/rs/POST.class(javax/ws/rs:POST.class)]
[loading javax/ws/rs/PUT.class(javax/ws/rs:PUT.class)]
[loading javax/ws/rs/Path.class(javax/ws/rs:Path.class)]
[loading javax/ws/rs/PathParam.class(javax/ws/rs:PathParam.class)]
[loading javax/ws/rs/Produces.class(javax/ws/rs:Produces.class)]
[loading javax/ws/rs/QueryParam.class(javax/ws/rs:QueryParam.class)]
[loading javax/ws/rs/core/Context.class(javax/ws/rs/core:Context.class)]
[loading javax/ws/rs/core/SecurityContext.class(javax/ws/rs/core:SecurityContext.class)]
[loading javax/ws/rs/core/UriInfo.class(javax/ws/rs/core:UriInfo.class)]
[loading org/apache/hadoop/security/authentication/client/PseudoAuthenticator.class(org/apache/hadoop/security/authentication/client:PseudoAuthenticator.class)]
[loading com/sun/jersey/api/NotFoundException.class(com/sun/jersey/api:NotFoundException.class)]
[loading java/net/URI.class(java/net:URI.class)]
[loading org/apache/commons/lang/StringUtils.class(org/apache/commons/lang:StringUtils.class)]
[loading org/apache/hadoop/fs/FileStatus.class(org/apache/hadoop/fs:FileStatus.class)]
[loading org/apache/hadoop/fs/FileSystem.class(org/apache/hadoop/fs:FileSystem.class)]
[loading java/util/Date.class(java/util:Date.class)]
[loading org/apache/hadoop/hive/common/classification/InterfaceAudience.class(org/apache/hadoop/hive/common/classification:InterfaceAudience.class)]
[loading org/apache/hadoop/hive/metastore/HiveMetaStoreClient.class(org/apache/hadoop/hive/metastore:HiveMetaStoreClient.class)]
[loading org/apache/hive/hcatalog/common/HCatUtil.class(org/apache/hive/hcatalog/common:HCatUtil.class)]
[loading org/apache/hadoop/hive/common/classification/InterfaceAudience$Private.class(org/apache/hadoop/hive/common/classification:InterfaceAudience$Private.class)]
[loading com/sun/jersey/api/wadl/config/WadlGeneratorConfig.class(com/sun/jersey/api/wadl/config:WadlGeneratorConfig.class)]
[loading com/sun/jersey/api/wadl/config/WadlGeneratorDescription.class(com/sun/jersey/api/wadl/config:WadlGeneratorDescription.class)]
[loading com/sun/jersey/server/wadl/generators/resourcedoc/WadlGeneratorResourceDocSupport.class(com/sun/jersey/server/wadl/generators/resourcedoc:WadlGeneratorResourceDocSupport.class)]
[loading com/sun/jersey/api/core/PackagesResourceConfig.class(com/sun/jersey/api/core:PackagesResourceConfig.class)]
[loading com/sun/jersey/spi/container/servlet/ServletContainer.class(com/sun/jersey/spi/container/servlet:ServletContainer.class)]
[loading org/apache/hadoop/hive/common/classification/InterfaceStability.class(org/apache/hadoop/hive/common/classification:InterfaceStability.class)]
[loading org/apache/hadoop/hdfs/web/AuthFilter.class(org/apache/hadoop/hdfs/web:AuthFilter.class)]
[loading org/apache/hadoop/util/GenericOptionsParser.class(org/apache/hadoop/util:GenericOptionsParser.class)]
[loading org/eclipse/jetty/rewrite/handler/RedirectPatternRule.class(org/eclipse/jetty/rewrite/handler:RedirectPatternRule.class)]
[loading org/eclipse/jetty/rewrite/handler/RewriteHandler.class(org/eclipse/jetty/rewrite/handler:RewriteHandler.class)]
[loading org/eclipse/jetty/server/Handler.class(org/eclipse/jetty/server:Handler.class)]
[loading org/eclipse/jetty/server/Server.class(org/eclipse/jetty/server:Server.class)]
[loading org/eclipse/jetty/server/handler/HandlerList.class(org/eclipse/jetty/server/handler:HandlerList.class)]
[loading org/eclipse/jetty/servlet/FilterHolder.class(org/eclipse/jetty/servlet:FilterHolder.class)]
[loading org/eclipse/jetty/servlet/FilterMapping.class(org/eclipse/jetty/servlet:FilterMapping.class)]
[loading org/eclipse/jetty/servlet/ServletContextHandler.class(org/eclipse/jetty/servlet:ServletContextHandler.class)]
[loading org/eclipse/jetty/servlet/ServletHolder.class(org/eclipse/jetty/servlet:ServletHolder.class)]
[loading org/slf4j/bridge/SLF4JBridgeHandler.class(org/slf4j/bridge:SLF4JBridgeHandler.class)]
[loading org/apache/hadoop/hive/common/classification/InterfaceAudience$LimitedPrivate.class(org/apache/hadoop/hive/common/classification:InterfaceAudience$LimitedPrivate.class)]
[loading org/apache/hadoop/hive/common/classification/InterfaceStability$Unstable.class(org/apache/hadoop/hive/common/classification:InterfaceStability$Unstable.class)]
[loading org/apache/hadoop/hive/metastore/api/MetaException.class(org/apache/hadoop/hive/metastore/api:MetaException.class)]
[loading org/apache/hadoop/io/Text.class(org/apache/hadoop/io:Text.class)]
[loading org/apache/hadoop/security/Credentials.class(org/apache/hadoop/security:Credentials.class)]
[loading org/apache/hadoop/security/token/Token.class(org/apache/hadoop/security/token:Token.class)]
[loading org/apache/thrift/TException.class(org/apache/thrift:TException.class)]
[loading java/io/Closeable.class(java/io:Closeable.class)]
[loading java/io/Flushable.class(java/io:Flushable.class)]
[loading org/apache/hadoop/security/Groups.class(org/apache/hadoop/security:Groups.class)]
[loading java/util/HashSet.class(java/util:HashSet.class)]
[loading java/util/Set.class(java/util:Set.class)]
[loading java/util/concurrent/ConcurrentHashMap.class(java/util/concurrent:ConcurrentHashMap.class)]
[loading java/io/UnsupportedEncodingException.class(java/io:UnsupportedEncodingException.class)]
[loading org/apache/zookeeper/CreateMode.class(org/apache/zookeeper:CreateMode.class)]
[loading org/apache/zookeeper/KeeperException.class(org/apache/zookeeper:KeeperException.class)]
[loading org/apache/zookeeper/WatchedEvent.class(org/apache/zookeeper:WatchedEvent.class)]
[loading org/apache/zookeeper/Watcher.class(org/apache/zookeeper:Watcher.class)]
[loading org/apache/zookeeper/ZooDefs.class(org/apache/zookeeper:ZooDefs.class)]
[loading org/apache/zookeeper/ZooDefs$Ids.class(org/apache/zookeeper:ZooDefs$Ids.class)]
[loading org/apache/zookeeper/ZooKeeper.class(org/apache/zookeeper:ZooKeeper.class)]
[loading org/apache/hadoop/io/NullWritable.class(org/apache/hadoop/io:NullWritable.class)]
[loading org/apache/hadoop/mapreduce/InputSplit.class(org/apache/hadoop/mapreduce:InputSplit.class)]
[loading org/apache/hadoop/mapreduce/RecordReader.class(org/apache/hadoop/mapreduce:RecordReader.class)]
[loading org/apache/hadoop/mapreduce/TaskAttemptContext.class(org/apache/hadoop/mapreduce:TaskAttemptContext.class)]
[loading java/net/URLConnection.class(java/net:URLConnection.class)]
[loading java/util/Collection.class(java/util:Collection.class)]
[loading javax/ws/rs/core/UriBuilder.class(javax/ws/rs/core:UriBuilder.class)]
[loading java/io/OutputStreamWriter.class(java/io:OutputStreamWriter.class)]
[loading org/apache/hadoop/hive/common/classification/InterfaceStability$Evolving.class(org/apache/hadoop/hive/common/classification:InterfaceStability$Evolving.class)]
[loading org/apache/zookeeper/data/Stat.class(org/apache/zookeeper/data:Stat.class)]
[loading org/apache/hadoop/mapreduce/InputFormat.class(org/apache/hadoop/mapreduce:InputFormat.class)]
[loading org/apache/hadoop/mapreduce/JobContext.class(org/apache/hadoop/mapreduce:JobContext.class)]
[loading org/apache/hadoop/mapred/JobClient.class(org/apache/hadoop/mapred:JobClient.class)]
[loading org/apache/hadoop/mapred/JobConf.class(org/apache/hadoop/mapred:JobConf.class)]
[loading org/apache/hadoop/mapred/RunningJob.class(org/apache/hadoop/mapred:RunningJob.class)]
[loading java/io/DataInput.class(java/io:DataInput.class)]
[loading java/io/DataOutput.class(java/io:DataOutput.class)]
[loading org/apache/hadoop/conf/Configured.class(org/apache/hadoop/conf:Configured.class)]
[loading org/apache/hadoop/fs/permission/FsPermission.class(org/apache/hadoop/fs/permission:FsPermission.class)]
[loading org/apache/hadoop/mapreduce/Job.class(org/apache/hadoop/mapreduce:Job.class)]
[loading org/apache/hadoop/mapreduce/JobID.class(org/apache/hadoop/mapreduce:JobID.class)]
[loading org/apache/hadoop/mapreduce/lib/output/NullOutputFormat.class(org/apache/hadoop/mapreduce/lib/output:NullOutputFormat.class)]
[loading org/apache/hadoop/mapreduce/security/token/delegation/DelegationTokenIdentifier.class(org/apache/hadoop/mapreduce/security/token/delegation:DelegationTokenIdentifier.class)]
[loading org/apache/hadoop/util/Tool.class(org/apache/hadoop/util:Tool.class)]
[loading org/apache/hadoop/conf/Configurable.class(org/apache/hadoop/conf:Configurable.class)]
[loading java/lang/ClassNotFoundException.class(java/lang:ClassNotFoundException.class)]
[loading org/apache/hadoop/mapreduce/Mapper.class(org/apache/hadoop/mapreduce:Mapper.class)]
[loading java/util/Iterator.class(java/util:Iterator.class)]
[loading java/util/LinkedList.class(java/util:LinkedList.class)]
[loading java/util/concurrent/ExecutorService.class(java/util/concurrent:ExecutorService.class)]
[loading java/util/concurrent/Executors.class(java/util/concurrent:Executors.class)]
[loading java/util/concurrent/TimeUnit.class(java/util/concurrent:TimeUnit.class)]
[loading org/apache/hadoop/mapreduce/Mapper$Context.class(org/apache/hadoop/mapreduce:Mapper$Context.class)]
[loading java/lang/Process.class(java/lang:Process.class)]
[loading java/lang/StringBuilder.class(java/lang:StringBuilder.class)]
[loading java/lang/ProcessBuilder.class(java/lang:ProcessBuilder.class)]
[loading java/lang/annotation/Target.class(java/lang/annotation:Target.class)]
[loading java/lang/annotation/ElementType.class(java/lang/annotation:ElementType.class)]
[loading java/lang/annotation/Retention.class(java/lang/annotation:Retention.class)]
[loading java/lang/annotation/RetentionPolicy.class(java/lang/annotation:RetentionPolicy.class)]
[loading java/lang/annotation/Annotation.class(java/lang/annotation:Annotation.class)]
[loading java/lang/SuppressWarnings.class(java/lang:SuppressWarnings.class)]
[loading java/lang/Override.class(java/lang:Override.class)]
[loading javax/ws/rs/HttpMethod.class(javax/ws/rs:HttpMethod.class)]
[loading java/lang/Deprecated.class(java/lang:Deprecated.class)]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$3.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/HcatDelegator$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/LauncherDelegator$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$2.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/HcatException$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$2.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$2$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/LogRetriever$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage$2.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonUtils$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/HDFSStorage$1.class]
[done in 7291 ms]
[WARNING] Javadoc Warnings
[WARNING] Nov 11, 2013 4:27:06 PM com.sun.jersey.wadl.resourcedoc.ResourceDoclet start
[WARNING] INFO: Wrote /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/resourcedoc.xml
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-webhcat ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-webhcat ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-webhcat ---
[INFO] Compiling 9 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/test-classes
[WARNING] Note: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/test/java/org/apache/hive/hcatalog/templeton/TestDesc.java uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-webhcat ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-webhcat ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/hive-webhcat-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-webhcat ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/hive-webhcat-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-webhcat/0.13.0-SNAPSHOT/hive-webhcat-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-webhcat/0.13.0-SNAPSHOT/hive-webhcat-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog HBase Storage Handler 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hbase-storage-handler ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-hbase-storage-handler ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/src/gen-java added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hbase-storage-handler ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-storage-handler ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-storage-handler ---
[INFO] Compiling 35 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: Some input files use unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hbase-storage-handler ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hbase-storage-handler ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hbase-storage-handler ---
[INFO] Compiling 21 source files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/test-classes
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hbase-storage-handler ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hbase-storage-handler ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-jar-plugin:2.2:test-jar (default) @ hive-hbase-storage-handler ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hbase-storage-handler ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hbase-storage-handler/0.13.0-SNAPSHOT/hive-hbase-storage-handler-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hbase-storage-handler/0.13.0-SNAPSHOT/hive-hbase-storage-handler-0.13.0-SNAPSHOT.pom
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/hive-hbase-storage-handler-0.13.0-SNAPSHOT-tests.jar to /data/hive-ptest/working/maven/org/apache/hive/hcatalog/hive-hbase-storage-handler/0.13.0-SNAPSHOT/hive-hbase-storage-handler-0.13.0-SNAPSHOT-tests.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HWI 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-hwi ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/hwi (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hwi ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hwi ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hwi ---
[INFO] Compiling 6 source files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hwi ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hwi ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hwi ---
[INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hwi ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-hwi ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/hive-hwi-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-hwi ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/hive-hwi-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-hwi/0.13.0-SNAPSHOT/hive-hwi-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/hwi/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-hwi/0.13.0-SNAPSHOT/hive-hwi-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive ODBC 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-odbc ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/odbc (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-odbc ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-odbc ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-odbc ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/odbc/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-odbc/0.13.0-SNAPSHOT/hive-odbc-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Aggregator 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-shims-aggregator ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/shims (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-aggregator ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-aggregator ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-shims-aggregator ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/shims/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-shims-aggregator/0.13.0-SNAPSHOT/hive-shims-aggregator-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive TestUtils 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-testutils ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/testutils (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-testutils ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-testutils ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-testutils ---
[INFO] Compiling 2 source files to /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-testutils ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-testutils ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-testutils ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-testutils ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-testutils ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/hive-testutils-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-testutils ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/hive-testutils-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-testutils/0.13.0-SNAPSHOT/hive-testutils-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/testutils/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-testutils/0.13.0-SNAPSHOT/hive-testutils-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Packaging 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-packaging ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/packaging (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-packaging ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-packaging ---
[INFO] Executing tasks
main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-packaging ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/packaging/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-packaging/0.13.0-SNAPSHOT/hive-packaging-0.13.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Hive .............................................. SUCCESS [2.844s]
[INFO] Hive Ant Utilities ................................ SUCCESS [7.022s]
[INFO] Hive Shims Common ................................. SUCCESS [3.186s]
[INFO] Hive Shims 0.20 ................................... SUCCESS [1.754s]
[INFO] Hive Shims Secure Common .......................... SUCCESS [3.194s]
[INFO] Hive Shims 0.20S .................................. SUCCESS [1.310s]
[INFO] Hive Shims 0.23 ................................... SUCCESS [3.845s]
[INFO] Hive Shims ........................................ SUCCESS [3.250s]
[INFO] Hive Common ....................................... SUCCESS [8.878s]
[INFO] Hive Serde ........................................ SUCCESS [11.298s]
[INFO] Hive Metastore .................................... SUCCESS [22.615s]
[INFO] Hive Query Language ............................... SUCCESS [50.980s]
[INFO] Hive Service ...................................... SUCCESS [5.724s]
[INFO] Hive JDBC ......................................... SUCCESS [1.125s]
[INFO] Hive Beeline ...................................... SUCCESS [1.381s]
[INFO] Hive CLI .......................................... SUCCESS [1.730s]
[INFO] Hive Contrib ...................................... SUCCESS [1.698s]
[INFO] Hive HBase Handler ................................ SUCCESS [1.236s]
[INFO] Hive HCatalog ..................................... SUCCESS [0.222s]
[INFO] Hive HCatalog Core ................................ SUCCESS [2.758s]
[INFO] Hive HCatalog Pig Adapter ......................... SUCCESS [0.954s]
[INFO] Hive HCatalog Server Extensions ................... SUCCESS [0.828s]
[INFO] Hive HCatalog Webhcat Java Client ................. SUCCESS [0.646s]
[INFO] Hive HCatalog Webhcat ............................. SUCCESS [10.384s]
[INFO] Hive HCatalog HBase Storage Handler ............... SUCCESS [1.870s]
[INFO] Hive HWI .......................................... SUCCESS [0.761s]
[INFO] Hive ODBC ......................................... SUCCESS [0.330s]
[INFO] Hive Shims Aggregator ............................. SUCCESS [0.236s]
[INFO] Hive TestUtils .................................... SUCCESS [0.225s]
[INFO] Hive Packaging .................................... SUCCESS [0.379s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:34.842s
[INFO] Finished at: Mon Nov 11 16:27:11 EST 2013
[INFO] Final Memory: 72M/425M
[INFO] ------------------------------------------------------------------------
+ mvn -B test -Dmaven.repo.local=/data/hive-ptest/working/maven -Dtest=TestDummy
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Hive
[INFO] Hive Ant Utilities
[INFO] Hive Shims Common
[INFO] Hive Shims 0.20
[INFO] Hive Shims Secure Common
[INFO] Hive Shims 0.20S
[INFO] Hive Shims 0.23
[INFO] Hive Shims
[INFO] Hive Common
[INFO] Hive Serde
[INFO] Hive Metastore
[INFO] Hive Query Language
[INFO] Hive Service
[INFO] Hive JDBC
[INFO] Hive Beeline
[INFO] Hive CLI
[INFO] Hive Contrib
[INFO] Hive HBase Handler
[INFO] Hive HCatalog
[INFO] Hive HCatalog Core
[INFO] Hive HCatalog Pig Adapter
[INFO] Hive HCatalog Server Extensions
[INFO] Hive HCatalog Webhcat Java Client
[INFO] Hive HCatalog Webhcat
[INFO] Hive HCatalog HBase Storage Handler
[INFO] Hive HWI
[INFO] Hive ODBC
[INFO] Hive Shims Aggregator
[INFO] Hive TestUtils
[INFO] Hive Packaging
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive ---
[INFO] Executing tasks
main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Ant Utilities 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-ant ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-ant ---
[INFO] Executing tasks
main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-ant ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-ant ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-ant ---
[INFO] Executing tasks
main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ant/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-ant ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-ant ---
[INFO] No tests to run.
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common ---
[INFO] No tests to run.
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.20 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20 ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20 ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20 ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20 ---
[INFO] No tests to run.
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Secure Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-common-secure ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-common-secure ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-common-secure ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-common-secure ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-common-secure ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-common-secure ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-common-secure ---
[INFO] No tests to run.
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.20S 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.20S ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.20S ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.20S ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.20S ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.20S ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.20S ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.20S ---
[INFO] No tests to run.
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.23 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims-0.23 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-0.23 ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims-0.23 ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims-0.23 ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-0.23 ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims-0.23 ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims-0.23 ---
[INFO] No tests to run.
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-shims ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-shims ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-shims ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-shims ---
[INFO] No tests to run.
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Common 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (generate-version-annotation) @ hive-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-common ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/common/src/gen added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-common ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-common ---
[INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-common ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-common ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/common/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-common ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-common ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Serde 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-serde ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/gen/thrift/gen-javabean added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-serde ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-serde ---
[INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-serde ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/serde/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-serde ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-serde ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Metastore 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-metastore ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/model added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/gen/thrift/gen-javabean added.
[INFO]
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-metastore ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java
ANTLR Parser Generator Version 3.4
Grammar /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java/org/apache/hadoop/hive/metastore/parser/Filter.g is up to date - build skipped
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-metastore ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-metastore ---
[INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
[INFO]
[INFO] --- datanucleus-maven-plugin:3.3.0-release:enhance (default) @ hive-metastore ---
[INFO] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
DataNucleus Enhancer : Classpath
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-maven-plugin/3.3.0-release/datanucleus-maven-plugin-3.3.0-release.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar
>>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-utils/3.0.8/plexus-utils-3.0.8.jar
>>  /data/hive-ptest/working/maven/org/codehaus/plexus/plexus-component-annotations/1.5.5/plexus-component-annotations-1.5.5.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-inject-bean/2.3.0/sisu-inject-bean-2.3.0.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guice/3.1.0/sisu-guice-3.1.0-no_aop.jar
>>  /data/hive-ptest/working/maven/org/sonatype/sisu/sisu-guava/0.9.9/sisu-guava-0.9.9.jar
>>  /data/hive-ptest/working/maven/org/apache/xbean/xbean-reflect/3.4/xbean-reflect-3.4.jar
>>  /data/hive-ptest/working/maven/log4j/log4j/1.2.12/log4j-1.2.12.jar
>>  /data/hive-ptest/working/maven/commons-logging/commons-logging-api/1.1/commons-logging-api-1.1.jar
>>  /data/hive-ptest/working/maven/com/google/collections/google-collections/1.0/google-collections-1.0.jar
>>  /data/hive-ptest/working/maven/junit/junit/3.8.2/junit-3.8.2.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes
>>  /data/hive-ptest/working/apache-svn-trunk-source/serde/target/classes
>>  /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes
>>  /data/hive-ptest/working/maven/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar
>>  /data/hive-ptest/working/maven/org/tukaani/xz/1.0/xz-1.0.jar
>>  /data/hive-ptest/working/maven/commons-codec/commons-codec/1.4/commons-codec-1.4.jar
>>  /data/hive-ptest/working/maven/org/apache/avro/avro/1.7.1/avro-1.7.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar
>>  /data/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar
>>  /data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1.0.4.1/snappy-java-1.0.4.1.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/assembly/target/classes
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/common/target/classes
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/common-secure/target/classes
>>  /data/hive-ptest/working/maven/org/apache/zookeeper/zookeeper/3.4.3/zookeeper-3.4.3.jar
>>  /data/hive-ptest/working/maven/jline/jline/0.9.94/jline-0.9.94.jar
>>  /data/hive-ptest/working/maven/org/jboss/netty/netty/3.2.2.Final/netty-3.2.2.Final.jar
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.20S/target/classes
>>  /data/hive-ptest/working/apache-svn-trunk-source/shims/0.23/target/classes
>>  /data/hive-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar
>>  /data/hive-ptest/working/maven/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar
>>  /data/hive-ptest/working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar
>>  /data/hive-ptest/working/maven/commons-lang/commons-lang/2.4/commons-lang-2.4.jar
>>  /data/hive-ptest/working/maven/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar
>>  /data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar
>>  /data/hive-ptest/working/maven/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar
>>  /data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar
>>  /data/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar
>>  /data/hive-ptest/working/maven/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar
>>  /data/hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.1.jar
>>  /data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-2.7.7.jar
>>  /data/hive-ptest/working/maven/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar
>>  /data/hive-ptest/working/maven/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar
>>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpclient/4.1.3/httpclient-4.1.3.jar
>>  /data/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.1.3/httpcore-4.1.3.jar
>>  /data/hive-ptest/working/maven/org/apache/hadoop/hadoop-core/1.2.1/hadoop-core-1.2.1.jar
>>  /data/hive-ptest/working/maven/xmlenc/xmlenc/0.52/xmlenc-0.52.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-core/1.8/jersey-core-1.8.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.8/jersey-json-1.8.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar
>>  /data/hive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar
>>  /data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar
>>  /data/hive-ptest/working/maven/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar
>>  /data/hive-ptest/working/maven/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar
>>  /data/hive-ptest/working/maven/javax/activation/activation/1.1/activation-1.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-jaxrs/1.7.1/jackson-jaxrs-1.7.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.7.1/jackson-xc-1.7.1.jar
>>  /data/hive-ptest/working/maven/com/sun/jersey/jersey-server/1.8/jersey-server-1.8.jar
>>  /data/hive-ptest/working/maven/asm/asm/3.1/asm-3.1.jar
>>  /data/hive-ptest/working/maven/commons-io/commons-io/2.1/commons-io-2.1.jar
>>  /data/hive-ptest/working/maven/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar
>>  /data/hive-ptest/working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar
>>  /data/hive-ptest/working/maven/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar
>>  /data/hive-ptest/working/maven/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar
>>  /data/hive-ptest/working/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar
>>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar
>>  /data/hive-ptest/working/maven/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar
>>  /data/hive-ptest/working/maven/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar
>>  /data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar
>>  /data/hive-ptest/working/maven/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar
>>  /data/hive-ptest/working/maven/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar
>>  /data/hive-ptest/working/maven/ant/ant/1.6.5/ant-1.6.5.jar
>>  /data/hive-ptest/working/maven/commons-el/commons-el/1.0/commons-el-1.0.jar
>>  /data/hive-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar
>>  /data/hive-ptest/working/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar
>>  /data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar
>>  /data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar
>>  /data/hive-ptest/working/maven/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar
>>  /data/hive-ptest/working/maven/org/slf4j/slf4j-api/1.6.1/slf4j-api-1.6.1.jar
>>  /data/hive-ptest/working/maven/org/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar
>>  /data/hive-ptest/working/maven/log4j/log4j/1.2.16/log4j-1.2.16.jar
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDatabase
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MFieldSchema
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MType
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTable
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MSerDeInfo
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MOrder
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MColumnDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStringList
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MStorageDescriptor
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartition
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MIndex
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRole
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MRoleMap
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MGlobalPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDBPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTablePrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnPrivilege
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionEvent
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MMasterKey
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MDelegationToken
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MTableColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MPartitionColumnStatistics
ENHANCED (PersistenceCapable) : org.apache.hadoop.hive.metastore.model.MVersionTable
DataNucleus Enhancer completed with success for 25 classes. Timings : input=728 ms, enhance=325 ms, total=1053 ms. Consult the log for full details
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-metastore ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-metastore ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-metastore ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-metastore ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Query Language 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (generate-sources) @ hive-exec ---
[INFO] Executing tasks

main:
Generating vector expression code
Generating vector expression test code
[INFO] Executed tasks
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-exec ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/protobuf/gen-java added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/src/gen/thrift/gen-javabean added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-sources/java added.
[INFO]
[INFO] --- antlr3-maven-plugin:3.4:antlr (default) @ hive-exec ---
[INFO] ANTLR: Processing source directory /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java
ANTLR Parser Generator  Version 3.4
Grammar /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g is up to date - build skipped
Grammar /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g is up to date - build skipped
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-exec ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-exec ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-exec ---
[INFO] Compiling 6 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-test-source (add-test-sources) @ hive-exec ---
[INFO] Test Source directory: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/generated-test-sources/java added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-exec ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-exec ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/ql/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-exec ---
[INFO] Compiling 4 source files to /data/hive-ptest/working/apache-svn-trunk-source/ql/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-exec ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Service 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-service ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/model added.
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/service/src/gen/thrift/gen-javabean added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-service ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-service ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-service ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-service ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/service/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-service ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/service/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/service/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-service ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-service ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive JDBC 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-jdbc ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-jdbc ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-jdbc ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-jdbc ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-jdbc ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/jdbc/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-jdbc ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-jdbc ---
[INFO] No tests to run.
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Beeline 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-beeline ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-beeline ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-beeline ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-beeline ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-beeline ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/beeline/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-beeline ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-beeline ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive CLI 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-cli ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/cli/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-cli ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-cli ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-cli ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/cli/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-cli ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/cli/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/cli/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-cli ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-cli ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Contrib 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-contrib ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-contrib ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-contrib ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-contrib ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-contrib ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/contrib/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-contrib ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-contrib ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HBase Handler 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hbase-handler ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-handler ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-handler ---
[INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hbase-handler ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hbase-handler ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hbase-handler ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hbase-handler ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog Core 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-core ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-core ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-core ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-core ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-core ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-core ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-core ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog Pig Adapter 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-pig-adapter ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-pig-adapter ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-pig-adapter ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-pig-adapter ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-pig-adapter ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/hcatalog-pig-adapter/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-pig-adapter ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-pig-adapter ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog Server Extensions 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hcatalog-server-extensions ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hcatalog-server-extensions ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hcatalog-server-extensions ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hcatalog-server-extensions ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hcatalog-server-extensions ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/server-extensions/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hcatalog-server-extensions ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hcatalog-server-extensions ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog Webhcat Java Client 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-webhcat-java-client ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-webhcat-java-client ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-webhcat-java-client ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-webhcat-java-client ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-webhcat-java-client ---
[INFO] Executing tasks

main:
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp
[delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/warehouse
[mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp/conf
[copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/java-client/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-webhcat-java-client ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-webhcat-java-client ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog Webhcat 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-webhcat ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-webhcat ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-webhcat ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-javadoc-plugin:2.4:javadoc (resourcesdoc.xml) @ hive-webhcat ---
[INFO] Setting property: classpath.resource.loader.class => 'org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader'.
[INFO] Setting property: velocimacro.messages.on => 'false'.
[INFO] Setting property: resource.loader => 'classpath'.
[INFO] Setting property: resource.manager.logwhenfound => 'false'.
[INFO] **************************************************************
[INFO] Starting Jakarta Velocity v1.4
[INFO] RuntimeInstance initializing.
[INFO] Default Properties File: org/apache/velocity/runtime/defaults/velocity.properties
[INFO] Default ResourceManager initializing. (class org.apache.velocity.runtime.resource.ResourceManagerImpl)
[INFO] Resource Loader Instantiated: org.codehaus.plexus.velocity.ContextClassLoaderResourceLoader
[INFO] ClasspathResourceLoader : initialization starting.
[INFO] ClasspathResourceLoader : initialization complete.
[INFO] ResourceCache : initialized. (class org.apache.velocity.runtime.resource.ResourceCacheImpl)
[INFO] Default ResourceManager initialization complete.
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Literal
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Macro
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Parse
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Include
[INFO] Loaded System Directive: org.apache.velocity.runtime.directive.Foreach
[INFO] Created: 20 parsers.
[INFO] Velocimacro : initialization starting.
[INFO] Velocimacro : adding VMs from VM library template : VM_global_library.vm
[ERROR] ResourceManager : unable to find resource 'VM_global_library.vm' in any resource loader.
[INFO] Velocimacro : error using VM library template VM_global_library.vm : org.apache.velocity.exception.ResourceNotFoundException: Unable to find resource 'VM_global_library.vm'
[INFO] Velocimacro : VM library template macro registration complete.
[INFO] Velocimacro : allowInline = true : VMs can be defined inline in templates
[INFO] Velocimacro : allowInlineToOverride = false : VMs defined inline may NOT replace previous VM definitions
[INFO] Velocimacro : allowInlineLocal = false : VMs defined inline will be global in scope if allowed.
[INFO] Velocimacro : initialization complete.
[INFO] Velocity successfully started.
Loading source files for package org.apache.hive.hcatalog.templeton...
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SimpleExceptionMapper.java]
[parsing completed 29ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JsonBuilder.java]
[parsing completed 7ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JobItemBean.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JarDelegator.java]
[parsing completed 11ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/LauncherDelegator.java]
[parsing completed 21ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecServiceImpl.java]
[parsing completed 20ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DeleteDelegator.java]
[parsing completed 5ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecBean.java]
[parsing completed 9ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/PigDelegator.java]
[parsing completed 15ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TempletonDelegator.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StreamingDelegator.java]
[parsing completed 11ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/DatabaseDesc.java]
[parsing completed 5ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CompleteBean.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HiveDelegator.java]
[parsing completed 18ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Server.java]
[parsing completed 125ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/BadParam.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ColumnDesc.java]
[parsing completed 4ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/PartitionDesc.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CatchallExceptionMapper.java]
[parsing completed 3ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HcatDelegator.java]
[parsing completed 42ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/StatusDelegator.java]
[parsing completed 13ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/QueueStatusBean.java]
[parsing completed 2ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CompleteDelegator.java]
[parsing completed 10ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/GroupPermissionsDesc.java]
[parsing completed 3ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/AppConfig.java]
[parsing completed 18ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HcatException.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/QueueException.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TableLikeDesc.java]
[parsing completed 4ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TableDesc.java]
[parsing completed 8ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/WadlConfig.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/BusyException.java]
[parsing completed 4ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ExecService.java]
[parsing completed 0ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/NotAuthorizedException.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/EnqueueBean.java]
[parsing completed 4ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SimpleWebException.java]
[parsing completed 2ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ListDelegator.java]
[parsing completed 5ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Main.java]
[parsing completed 13ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/SecureProxySupport.java]
[parsing completed 8ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/MaxByteArrayOutputStream.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/ProxyUserSupport.java]
[parsing completed 7ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/UgiFactory.java]
[parsing completed 3ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/CallbackFailedException.java]
[parsing completed 1ms]
[parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/=
webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/TablePropertyD= esc.java] [parsing completed 0ms] Loading source files for package org.apache.hive.hcatalog.templeton.tool... [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/ZooKeeper= Storage.java] [parsing completed 17ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NullRecor= dReader.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/PigJobIDP= arser.java] [parsing completed 0ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/Templeton= Storage.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobIDPars= er.java] [parsing completed 4ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobState.= java] [parsing completed 11ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/Templeton= Utils.java] [parsing completed 18ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSStora= ge.java] [parsing completed 12ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/Delegatio= nTokenCache.java] [parsing completed 4ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= 
webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JarJobIDP= arser.java] [parsing completed 13ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobSubmis= sionConstants.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/ZooKeeper= Cleanup.java] [parsing completed 3ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/JobStateT= racker.java] [parsing completed 3ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NotFoundE= xception.java] [parsing completed 1ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/SingleInp= utFormat.java] [parsing completed 0ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/LogRetrie= ver.java] [parsing completed 11ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/NullSplit= .java] [parsing completed 2ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HDFSClean= up.java] [parsing completed 7ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/Templeton= ControllerJob.java] [parsing completed 6ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/HiveJobID= 
Parser.java] [parsing completed 0ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/LaunchMap= per.java] [parsing completed 14ms] [parsing started /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/= webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/tool/TrivialEx= ecService.java] [parsing completed 1ms] Constructing Javadoc information... [search path for source files: /data/hive-ptest/working/apache-svn-trunk-so= urce/hcatalog/webhcat/svr/src/main/java] [search path for class files: /usr/java/jdk1.6.0_34/jre/lib/resources.jar,/= usr/java/jdk1.6.0_34/jre/lib/rt.jar,/usr/java/jdk1.6.0_34/jre/lib/sunrsasig= n.jar,/usr/java/jdk1.6.0_34/jre/lib/jsse.jar,/usr/java/jdk1.6.0_34/jre/lib/= jce.jar,/usr/java/jdk1.6.0_34/jre/lib/charsets.jar,/usr/java/jdk1.6.0_34/jr= e/lib/modules/jdk.boot.jar,/usr/java/jdk1.6.0_34/jre/classes,/usr/java/jdk1= .6.0_34/jre/lib/ext/localedata.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/sunpkc= s11.jar,/usr/java/jdk1.6.0_34/jre/lib/ext/sunjce_provider.jar,/usr/java/jdk= 1.6.0_34/jre/lib/ext/dnsns.jar,/data/hive-ptest/working/apache-svn-trunk-so= urce/hcatalog/webhcat/svr/target/classes,/data/hive-ptest/working/maven/org= /datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar,/data/= hive-ptest/working/maven/org/antlr/stringtemplate/3.2.1/stringtemplate-3.2.= 1.jar,/data/hive-ptest/working/maven/com/sun/jersey/jersey-json/1.14/jersey= -json-1.14.jar,/data/hive-ptest/working/maven/org/apache/zookeeper/zookeepe= r/3.4.3/zookeeper-3.4.3.jar,/data/hive-ptest/working/maven/commons-net/comm= ons-net/1.4.1/commons-net-1.4.1.jar,/data/hive-ptest/working/apache-svn-tru= nk-source/shims/0.23/target/classes,/data/hive-ptest/working/maven/javax/ma= il/mail/1.4.1/mail-1.4.1.jar,/data/hive-ptest/working/maven/javax/mail/mail= /1.4.1/activation.jar,/data/hive-ptest/working/maven/org/datanucleus/datanu= 
cleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar,/data/hive-ptest/working/mave= n/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar,= /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/target/class= es,/data/hive-ptest/working/maven/org/codehaus/jackson/jackson-core-asl/1.9= .2/jackson-core-asl-1.9.2.jar,/data/hive-ptest/working/maven/org/apache/thr= ift/libthrift/0.9.0/libthrift-0.9.0.jar,/data/hive-ptest/working/maven/org/= eclipse/jetty/aggregate/jetty-all-server/7.6.0.v20120127/jetty-all-server-7= .6.0.v20120127.jar,/data/hive-ptest/working/maven/xerces/xercesImpl/2.6.1/x= ercesImpl-2.6.1.jar,/data/hive-ptest/working/maven/antlr/antlr/2.7.7/antlr-= 2.7.7.jar,/data/hive-ptest/working/maven/commons-configuration/commons-conf= iguration/1.6/commons-configuration-1.6.jar,/data/hive-ptest/working/maven/= com/googlecode/javaewah/JavaEWAH/0.3.2/JavaEWAH-0.3.2.jar,/data/hive-ptest/= working/apache-svn-trunk-source/shims/common-secure/target/classes,/data/hi= ve-ptest/working/maven/org/apache/httpcomponents/httpclient/4.1.3/httpclien= t-4.1.3.jar,/data/hive-ptest/working/apache-svn-trunk-source/serde/target/c= lasses,/data/hive-ptest/working/maven/javax/servlet/servlet-api/2.5/servlet= -api-2.5.jar,/data/hive-ptest/working/maven/com/sun/jdmk/jmxtools/1.2.1/jmx= tools-1.2.1.jar,/data/hive-ptest/working/maven/org/apache/velocity/velocity= /1.5/velocity-1.5.jar,/data/hive-ptest/working/maven/com/jolbox/bonecp/0.7.= 1.RELEASE/bonecp-0.7.1.RELEASE.jar,/data/hive-ptest/working/maven/javax/xml= /stream/stax-api/1.0-2/stax-api-1.0-2.jar,/data/hive-ptest/working/maven/or= g/slf4j/slf4j-log4j12/1.6.1/slf4j-log4j12-1.6.1.jar,/data/hive-ptest/workin= g/maven/javax/jms/jms/1.1/jms-1.1.jar,/data/hive-ptest/working/maven/common= s-lang/commons-lang/2.4/commons-lang-2.4.jar,/data/hive-ptest/working/maven= /javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar,/data/hive-ptest/working/= 
maven/com/sun/jersey/jersey-core/1.14/jersey-core-1.14.jar,/data/hive-ptest= /working/maven/org/apache/commons/commons-math/2.1/commons-math-2.1.jar,/da= ta/hive-ptest/working/maven/org/apache/httpcomponents/httpcore/4.1.3/httpco= re-4.1.3.jar,/data/hive-ptest/working/maven/org/xerial/snappy/snappy-java/1= .0.4.1/snappy-java-1.0.4.1.jar,/data/hive-ptest/working/maven/commons-colle= ctions/commons-collections/3.2.1/commons-collections-3.2.1.jar,/data/hive-p= test/working/maven/org/antlr/ST4/4.0.4/ST4-4.0.4.jar,/data/hive-ptest/worki= ng/maven/org/apache/commons/commons-exec/1.1/commons-exec-1.1.jar,/data/hiv= e-ptest/working/maven/com/google/guava/guava/11.0.2/guava-11.0.2.jar,/data/= hive-ptest/working/maven/org/datanucleus/datanucleus-core/3.2.2/datanucleus= -core-3.2.2.jar,/data/hive-ptest/working/maven/org/apache/hadoop/hadoop-cor= e/1.2.1/hadoop-core-1.2.1.jar,/data/hive-ptest/working/maven/org/tukaani/xz= /1.0/xz-1.0.jar,/data/hive-ptest/working/maven/org/mortbay/jetty/servlet-ap= i-2.5/6.1.14/servlet-api-2.5-6.1.14.jar,/data/hive-ptest/working/maven/java= x/activation/activation/1.1/activation-1.1.jar,/data/hive-ptest/working/mav= en/org/codehaus/jackson/jackson-jaxrs/1.9.2/jackson-jaxrs-1.9.2.jar,/data/h= ive-ptest/working/maven/stax/stax-api/1.0.1/stax-api-1.0.1.jar,/data/hive-p= test/working/maven/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar,/dat= a/hive-ptest/working/maven/com/google/protobuf/protobuf-java/2.5.0/protobuf= -java-2.5.0.jar,/data/hive-ptest/working/maven/org/apache/commons/commons-c= ompress/1.4.1/commons-compress-1.4.1.jar,/data/hive-ptest/working/maven/org= /antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar,/data/hive-ptest/working/mav= en/com/sun/jersey/jersey-server/1.14/jersey-server-1.14.jar,/usr/java/jdk1.= 6.0_34/jre/../lib/tools.jar,/data/hive-ptest/working/maven/org/apache/ant/a= nt/1.9.1/ant-1.9.1.jar,/data/hive-ptest/working/maven/io/netty/netty/3.4.0.= 
Final/netty-3.4.0.Final.jar,/data/hive-ptest/working/maven/org/slf4j/jul-to= -slf4j/1.6.1/jul-to-slf4j-1.6.1.jar,/data/hive-ptest/working/maven/com/sun/= jersey/contribs/wadl-resourcedoc-doclet/1.4/wadl-resourcedoc-doclet-1.4.jar= ,/data/hive-ptest/working/maven/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26= .jar,/data/hive-ptest/working/maven/org/apache/avro/avro-mapred/1.7.1/avro-= mapred-1.7.1.jar,/data/hive-ptest/working/maven/oro/oro/2.0.8/oro-2.0.8.jar= ,/data/hive-ptest/working/maven/org/eclipse/jdt/core/3.1.1/core-3.1.1.jar,/= data/hive-ptest/working/apache-svn-trunk-source/shims/0.20/target/classes,/= data/hive-ptest/working/maven/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar,/da= ta/hive-ptest/working/maven/javax/transaction/jta/1.1/jta-1.1.jar,/data/hiv= e-ptest/working/maven/log4j/log4j/1.2.15/log4j-1.2.15.jar,/data/hive-ptest/= working/apache-svn-trunk-source/common/target/classes,/data/hive-ptest/work= ing/apache-svn-trunk-source/ql/target/classes,/data/hive-ptest/working/mave= n/org/apache/geronimo/specs/geronimo-annotation_1.0_spec/1.1.1/geronimo-ann= otation_1.0_spec-1.1.1.jar,/data/hive-ptest/working/maven/org/mortbay/jetty= /servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar,/data/hive-ptest/wor= king/maven/org/apache/avro/avro-ipc/1.7.1/avro-ipc-1.7.1.jar,/data/hive-pte= st/working/maven/javolution/javolution/5.5.1/javolution-5.5.1.jar,/data/hiv= e-ptest/working/maven/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar,/da= ta/hive-ptest/working/maven/com/sun/jersey/jersey-servlet/1.14/jersey-servl= et-1.14.jar,/data/hive-ptest/working/maven/commons-logging/commons-logging/= 1.1.3/commons-logging-1.1.3.jar,/data/hive-ptest/working/maven/com/google/c= ode/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar,/data/hive-ptest/working/maven/x= mlenc/xmlenc/0.52/xmlenc-0.52.jar,/data/hive-ptest/working/maven/org/mortba= y/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar,/data/hive-ptest/working/maven/co= 
mmons-el/commons-el/1.0/commons-el-1.0.jar,/data/hive-ptest/working/maven/c= om/esotericsoftware/kryo/kryo/2.22/kryo-2.22.jar,/data/hive-ptest/working/a= pache-svn-trunk-source/shims/0.20S/target/classes,/data/hive-ptest/working/= maven/jline/jline/0.9.94/jline-0.9.94.jar,/data/hive-ptest/working/maven/or= g/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar,/data/hive-ptest/wo= rking/maven/org/slf4j/slf4j-api/1.6.1/slf4j-api-1.6.1.jar,/data/hive-ptest/= working/maven/org/jboss/netty/netty/3.2.2.Final/netty-3.2.2.Final.jar,/data= /hive-ptest/working/maven/org/codehaus/jackson/jackson-xc/1.9.2/jackson-xc-= 1.9.2.jar,/data/hive-ptest/working/maven/commons-beanutils/commons-beanutil= s-core/1.8.0/commons-beanutils-core-1.8.0.jar,/data/hive-ptest/working/mave= n/ant/ant/1.6.5/ant-1.6.5.jar,/data/hive-ptest/working/apache-svn-trunk-sou= rce/cli/target/classes,/data/hive-ptest/working/maven/org/apache/thrift/lib= fb303/0.9.0/libfb303-0.9.0.jar,/data/hive-ptest/working/maven/org/codehaus/= groovy/groovy-all/2.1.6/groovy-all-2.1.6.jar,/data/hive-ptest/working/maven= /org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar,/data/hive-ptest/workin= g/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_cs.jar,/data/hive-ptest= /working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_de_DE.jar,/data/= hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_es.jar= ,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale= _fr.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derb= yLocale_hu.jar,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2= .0/derbyLocale_it.jar,/data/hive-ptest/working/maven/org/apache/derby/derby= /10.4.2.0/derbyLocale_ja_JP.jar,/data/hive-ptest/working/maven/org/apache/d= erby/derby/10.4.2.0/derbyLocale_ko_KR.jar,/data/hive-ptest/working/maven/or= g/apache/derby/derby/10.4.2.0/derbyLocale_pl.jar,/data/hive-ptest/working/m= 
aven/org/apache/derby/derby/10.4.2.0/derbyLocale_pt_BR.jar,/data/hive-ptest= /working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_ru.jar,/data/hiv= e-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale_zh_CN.jar= ,/data/hive-ptest/working/maven/org/apache/derby/derby/10.4.2.0/derbyLocale= _zh_TW.jar,/data/hive-ptest/working/apache-svn-trunk-source/metastore/targe= t/classes,/data/hive-ptest/working/maven/asm/asm-commons/3.1/asm-commons-3.= 1.jar,/data/hive-ptest/working/maven/org/iq80/snappy/snappy/0.2/snappy-0.2.= jar,/data/hive-ptest/working/apache-svn-trunk-source/service/target/classes= ,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-imp= l-2.2.3-1.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl/2.2= .3-1/jaxb-api.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jaxb-impl= /2.2.3-1/activation.jar,/data/hive-ptest/working/maven/com/sun/xml/bind/jax= b-impl/2.2.3-1/jsr173_1.0_api.jar,/data/hive-ptest/working/maven/com/sun/xm= l/bind/jaxb-impl/2.2.3-1/jaxb1-impl.jar,/data/hive-ptest/working/maven/comm= ons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar,/data/hiv= e-ptest/working/maven/org/apache/hadoop/hadoop-tools/1.2.1/hadoop-tools-1.2= .1.jar,/data/hive-ptest/working/maven/asm/asm-tree/3.1/asm-tree-3.1.jar,/da= ta/hive-ptest/working/maven/com/thoughtworks/paranamer/paranamer/2.2/parana= mer-2.2.jar,/data/hive-ptest/working/maven/commons-io/commons-io/2.1/common= s-io-2.1.jar,/data/hive-ptest/working/maven/tomcat/jasper-runtime/5.5.12/ja= sper-runtime-5.5.12.jar,/data/hive-ptest/working/maven/org/codehaus/jackson= /jackson-mapper-asl/1.9.2/jackson-mapper-asl-1.9.2.jar,/data/hive-ptest/wor= king/maven/org/apache/avro/avro/1.7.1/avro-1.7.1.jar,/data/hive-ptest/worki= ng/maven/commons-digester/commons-digester/1.8/commons-digester-1.8.jar,/da= ta/hive-ptest/working/maven/org/apache/geronimo/specs/geronimo-jaspic_1.0_s= 
pec/1.0/geronimo-jaspic_1.0_spec-1.0.jar,/data/hive-ptest/working/maven/org= /mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar,/data/hive-ptest/w= orking/maven/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar,/data/hive-ptest/wo= rking/apache-svn-trunk-source/shims/common/target/classes,/data/hive-ptest/= working/maven/commons-cli/commons-cli/1.2/commons-cli-1.2.jar,/data/hive-pt= est/working/apache-svn-trunk-source/shims/assembly/target/classes,/data/hiv= e-ptest/working/maven/org/apache/geronimo/specs/geronimo-jta_1.1_spec/1.1.1= /geronimo-jta_1.1_spec-1.1.1.jar,/data/hive-ptest/working/maven/org/json/js= on/20090211/json-20090211.jar,/data/hive-ptest/working/maven/org/apache/ant= /ant-launcher/1.9.1/ant-launcher-1.9.1.jar,/data/hive-ptest/working/maven/c= om/sun/jmx/jmxri/1.2.1/jmxri-1.2.1.jar,/data/hive-ptest/working/maven/tomca= t/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar,/data/hive-ptest/workin= g/apache-svn-trunk-source/ant/target/classes,/data/hive-ptest/working/maven= /asm/asm/3.1/asm-3.1.jar,/data/hive-ptest/working/maven/commons-codec/commo= ns-codec/1.4/commons-codec-1.4.jar] [loading javax/ws/rs/core/Response.class(javax/ws/rs/core:Response.class)] [loading javax/ws/rs/ext/ExceptionMapper.class(javax/ws/rs/ext:ExceptionMap= per.class)] [loading javax/ws/rs/ext/Provider.class(javax/ws/rs/ext:Provider.class)] [loading java/io/IOException.class(java/io:IOException.class)] [loading java/util/Map.class(java/util:Map.class)] [loading java/util/HashMap.class(java/util:HashMap.class)] [loading javax/ws/rs/core/MediaType.class(javax/ws/rs/core:MediaType.class)= ] [loading org/codehaus/jackson/map/ObjectMapper.class(org/codehaus/jackson/m= ap:ObjectMapper.class)] [loading java/lang/Throwable.class(java/lang:Throwable.class)] [loading java/io/Serializable.class(java/io:Serializable.class)] [loading java/lang/Object.class(java/lang:Object.class)] [loading java/lang/String.class(java/lang:String.class)] [loading 
java/io/ByteArrayOutputStream.class(java/io:ByteArrayOutputStream.= class)] [loading /data/hive-ptest/working/apache-svn-trunk-source/ql/target/classes= /org/apache/hadoop/hive/ql/ErrorMsg.class] [loading org/eclipse/jetty/http/HttpStatus.class(org/eclipse/jetty/http:Htt= pStatus.class)] [loading java/lang/Integer.class(java/lang:Integer.class)] [loading org/apache/hadoop/mapred/JobStatus.class(org/apache/hadoop/mapred:= JobStatus.class)] [loading org/apache/hadoop/mapred/JobProfile.class(org/apache/hadoop/mapred= :JobProfile.class)] [loading java/lang/Long.class(java/lang:Long.class)] [loading java/util/ArrayList.class(java/util:ArrayList.class)] [loading java/util/List.class(java/util:List.class)] [loading org/apache/commons/logging/Log.class(org/apache/commons/logging:Lo= g.class)] [loading org/apache/commons/logging/LogFactory.class(org/apache/commons/log= ging:LogFactory.class)] [loading org/apache/hadoop/conf/Configuration.class(org/apache/hadoop/conf:= Configuration.class)] [loading java/lang/Enum.class(java/lang:Enum.class)] [loading java/lang/Comparable.class(java/lang:Comparable.class)] [loading java/lang/Exception.class(java/lang:Exception.class)] [loading java/io/FileNotFoundException.class(java/io:FileNotFoundException.= class)] [loading java/net/URISyntaxException.class(java/net:URISyntaxException.clas= s)] [loading org/apache/commons/exec/ExecuteException.class(org/apache/commons/= exec:ExecuteException.class)] [loading java/security/PrivilegedExceptionAction.class(java/security:Privil= egedExceptionAction.class)] [loading org/apache/hadoop/fs/Path.class(org/apache/hadoop/fs:Path.class)] [loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/cla= sses/org/apache/hadoop/hive/conf/HiveConf.class] [loading org/apache/hadoop/security/UserGroupInformation.class(org/apache/h= adoop/security:UserGroupInformation.class)] [loading org/apache/hadoop/util/StringUtils.class(org/apache/hadoop/util:St= ringUtils.class)] [loading 
org/apache/hadoop/util/ToolRunner.class(org/apache/hadoop/util:Too= lRunner.class)] [loading java/io/File.class(java/io:File.class)] [loading java/net/URL.class(java/net:URL.class)] [loading org/apache/hadoop/util/VersionInfo.class(org/apache/hadoop/util:Ve= rsionInfo.class)] [loading java/lang/Iterable.class(java/lang:Iterable.class)] [loading org/apache/hadoop/io/Writable.class(org/apache/hadoop/io:Writable.= class)] [loading java/lang/InterruptedException.class(java/lang:InterruptedExceptio= n.class)] [loading java/io/BufferedReader.class(java/io:BufferedReader.class)] [loading java/io/InputStream.class(java/io:InputStream.class)] [loading java/io/InputStreamReader.class(java/io:InputStreamReader.class)] [loading java/io/OutputStream.class(java/io:OutputStream.class)] [loading java/io/PrintWriter.class(java/io:PrintWriter.class)] [loading java/util/Map$Entry.class(java/util:Map$Entry.class)] [loading java/util/concurrent/Semaphore.class(java/util/concurrent:Semaphor= e.class)] [loading org/apache/commons/exec/CommandLine.class(org/apache/commons/exec:= CommandLine.class)] [loading org/apache/commons/exec/DefaultExecutor.class(org/apache/commons/e= xec:DefaultExecutor.class)] [loading org/apache/commons/exec/ExecuteWatchdog.class(org/apache/commons/e= xec:ExecuteWatchdog.class)] [loading org/apache/commons/exec/PumpStreamHandler.class(org/apache/commons= /exec:PumpStreamHandler.class)] [loading org/apache/hadoop/util/Shell.class(org/apache/hadoop/util:Shell.cl= ass)] [loading java/lang/Thread.class(java/lang:Thread.class)] [loading java/lang/Runnable.class(java/lang:Runnable.class)] [loading /data/hive-ptest/working/apache-svn-trunk-source/shims/common/targ= et/classes/org/apache/hadoop/hive/shims/HadoopShims.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/shims/common/targ= et/classes/org/apache/hadoop/hive/shims/HadoopShims$WebHCatJTShim.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/shims/common/targ= 
et/classes/org/apache/hadoop/hive/shims/ShimLoader.class] [loading org/apache/hadoop/mapred/JobID.class(org/apache/hadoop/mapred:JobI= D.class)] [loading java/util/Arrays.class(java/util:Arrays.class)] [loading javax/xml/bind/annotation/XmlRootElement.class(javax/xml/bind/anno= tation:XmlRootElement.class)] [loading java/net/InetAddress.class(java/net:InetAddress.class)] [loading java/net/UnknownHostException.class(java/net:UnknownHostException.= class)] [loading java/text/MessageFormat.class(java/text:MessageFormat.class)] [loading java/util/Collections.class(java/util:Collections.class)] [loading java/util/regex/Matcher.class(java/util/regex:Matcher.class)] [loading java/util/regex/Pattern.class(java/util/regex:Pattern.class)] [loading javax/servlet/http/HttpServletRequest.class(javax/servlet/http:Htt= pServletRequest.class)] [loading javax/ws/rs/DELETE.class(javax/ws/rs:DELETE.class)] [loading javax/ws/rs/FormParam.class(javax/ws/rs:FormParam.class)] [loading javax/ws/rs/GET.class(javax/ws/rs:GET.class)] [loading javax/ws/rs/POST.class(javax/ws/rs:POST.class)] [loading javax/ws/rs/PUT.class(javax/ws/rs:PUT.class)] [loading javax/ws/rs/Path.class(javax/ws/rs:Path.class)] [loading javax/ws/rs/PathParam.class(javax/ws/rs:PathParam.class)] [loading javax/ws/rs/Produces.class(javax/ws/rs:Produces.class)] [loading javax/ws/rs/QueryParam.class(javax/ws/rs:QueryParam.class)] [loading javax/ws/rs/core/Context.class(javax/ws/rs/core:Context.class)] [loading javax/ws/rs/core/SecurityContext.class(javax/ws/rs/core:SecurityCo= ntext.class)] [loading javax/ws/rs/core/UriInfo.class(javax/ws/rs/core:UriInfo.class)] [loading org/apache/hadoop/security/authentication/client/PseudoAuthenticat= or.class(org/apache/hadoop/security/authentication/client:PseudoAuthenticat= or.class)] [loading com/sun/jersey/api/NotFoundException.class(com/sun/jersey/api:NotF= oundException.class)] [loading java/net/URI.class(java/net:URI.class)] [loading 
org/apache/commons/lang/StringUtils.class(org/apache/commons/lang:= StringUtils.class)] [loading org/apache/hadoop/fs/FileStatus.class(org/apache/hadoop/fs:FileSta= tus.class)] [loading org/apache/hadoop/fs/FileSystem.class(org/apache/hadoop/fs:FileSys= tem.class)] [loading java/util/Date.class(java/util:Date.class)] [loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/cla= sses/org/apache/hadoop/hive/common/classification/InterfaceAudience.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/= classes/org/apache/hadoop/hive/metastore/HiveMetaStoreClient.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/core/tar= get/classes/org/apache/hive/hcatalog/common/HCatUtil.class] [loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/cla= sses/org/apache/hadoop/hive/common/classification/InterfaceAudience$Private= .class] [loading com/sun/jersey/api/wadl/config/WadlGeneratorConfig.class(com/sun/j= ersey/api/wadl/config:WadlGeneratorConfig.class)] [loading com/sun/jersey/api/wadl/config/WadlGeneratorDescription.class(com/= sun/jersey/api/wadl/config:WadlGeneratorDescription.class)] [loading com/sun/jersey/server/wadl/generators/resourcedoc/WadlGeneratorRes= ourceDocSupport.class(com/sun/jersey/server/wadl/generators/resourcedoc:Wad= lGeneratorResourceDocSupport.class)] [loading com/sun/jersey/api/core/PackagesResourceConfig.class(com/sun/jerse= y/api/core:PackagesResourceConfig.class)] [loading com/sun/jersey/spi/container/servlet/ServletContainer.class(com/su= n/jersey/spi/container/servlet:ServletContainer.class)] [loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/cla= sses/org/apache/hadoop/hive/common/classification/InterfaceStability.class] [loading org/apache/hadoop/hdfs/web/AuthFilter.class(org/apache/hadoop/hdfs= /web:AuthFilter.class)] [loading org/apache/hadoop/util/GenericOptionsParser.class(org/apache/hadoo= p/util:GenericOptionsParser.class)] [loading 
org/eclipse/jetty/rewrite/handler/RedirectPatternRule.class(org/eclipse/jetty/rewrite/handler:RedirectPatternRule.class)]
[loading org/eclipse/jetty/rewrite/handler/RewriteHandler.class(org/eclipse/jetty/rewrite/handler:RewriteHandler.class)]
[loading org/eclipse/jetty/server/Handler.class(org/eclipse/jetty/server:Handler.class)]
[loading org/eclipse/jetty/server/Server.class(org/eclipse/jetty/server:Server.class)]
[loading org/eclipse/jetty/server/handler/HandlerList.class(org/eclipse/jetty/server/handler:HandlerList.class)]
[loading org/eclipse/jetty/servlet/FilterHolder.class(org/eclipse/jetty/servlet:FilterHolder.class)]
[loading org/eclipse/jetty/servlet/FilterMapping.class(org/eclipse/jetty/servlet:FilterMapping.class)]
[loading org/eclipse/jetty/servlet/ServletContextHandler.class(org/eclipse/jetty/servlet:ServletContextHandler.class)]
[loading org/eclipse/jetty/servlet/ServletHolder.class(org/eclipse/jetty/servlet:ServletHolder.class)]
[loading org/slf4j/bridge/SLF4JBridgeHandler.class(org/slf4j/bridge:SLF4JBridgeHandler.class)]
[loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes/org/apache/hadoop/hive/common/classification/InterfaceAudience$LimitedPrivate.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes/org/apache/hadoop/hive/common/classification/InterfaceStability$Unstable.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/metastore/target/classes/org/apache/hadoop/hive/metastore/api/MetaException.class]
[loading org/apache/hadoop/io/Text.class(org/apache/hadoop/io:Text.class)]
[loading org/apache/hadoop/security/Credentials.class(org/apache/hadoop/security:Credentials.class)]
[loading org/apache/hadoop/security/token/Token.class(org/apache/hadoop/security/token:Token.class)]
[loading org/apache/thrift/TException.class(org/apache/thrift:TException.class)]
[loading java/io/Closeable.class(java/io:Closeable.class)]
[loading java/io/Flushable.class(java/io:Flushable.class)]
[loading org/apache/hadoop/security/Groups.class(org/apache/hadoop/security:Groups.class)]
[loading java/util/HashSet.class(java/util:HashSet.class)]
[loading java/util/Set.class(java/util:Set.class)]
[loading java/util/concurrent/ConcurrentHashMap.class(java/util/concurrent:ConcurrentHashMap.class)]
[loading java/io/UnsupportedEncodingException.class(java/io:UnsupportedEncodingException.class)]
[loading org/apache/zookeeper/CreateMode.class(org/apache/zookeeper:CreateMode.class)]
[loading org/apache/zookeeper/KeeperException.class(org/apache/zookeeper:KeeperException.class)]
[loading org/apache/zookeeper/WatchedEvent.class(org/apache/zookeeper:WatchedEvent.class)]
[loading org/apache/zookeeper/Watcher.class(org/apache/zookeeper:Watcher.class)]
[loading org/apache/zookeeper/ZooDefs.class(org/apache/zookeeper:ZooDefs.class)]
[loading org/apache/zookeeper/ZooDefs$Ids.class(org/apache/zookeeper:ZooDefs$Ids.class)]
[loading org/apache/zookeeper/ZooKeeper.class(org/apache/zookeeper:ZooKeeper.class)]
[loading org/apache/hadoop/io/NullWritable.class(org/apache/hadoop/io:NullWritable.class)]
[loading org/apache/hadoop/mapreduce/InputSplit.class(org/apache/hadoop/mapreduce:InputSplit.class)]
[loading org/apache/hadoop/mapreduce/RecordReader.class(org/apache/hadoop/mapreduce:RecordReader.class)]
[loading org/apache/hadoop/mapreduce/TaskAttemptContext.class(org/apache/hadoop/mapreduce:TaskAttemptContext.class)]
[loading java/net/URLConnection.class(java/net:URLConnection.class)]
[loading java/util/Collection.class(java/util:Collection.class)]
[loading javax/ws/rs/core/UriBuilder.class(javax/ws/rs/core:UriBuilder.class)]
[loading java/io/OutputStreamWriter.class(java/io:OutputStreamWriter.class)]
[loading /data/hive-ptest/working/apache-svn-trunk-source/common/target/classes/org/apache/hadoop/hive/common/classification/InterfaceStability$Evolving.class]
[loading org/apache/zookeeper/data/Stat.class(org/apache/zookeeper/data:Stat.class)]
[loading org/apache/hadoop/mapreduce/InputFormat.class(org/apache/hadoop/mapreduce:InputFormat.class)]
[loading org/apache/hadoop/mapreduce/JobContext.class(org/apache/hadoop/mapreduce:JobContext.class)]
[loading org/apache/hadoop/mapred/JobClient.class(org/apache/hadoop/mapred:JobClient.class)]
[loading org/apache/hadoop/mapred/JobConf.class(org/apache/hadoop/mapred:JobConf.class)]
[loading org/apache/hadoop/mapred/RunningJob.class(org/apache/hadoop/mapred:RunningJob.class)]
[loading java/io/DataInput.class(java/io:DataInput.class)]
[loading java/io/DataOutput.class(java/io:DataOutput.class)]
[loading org/apache/hadoop/conf/Configured.class(org/apache/hadoop/conf:Configured.class)]
[loading org/apache/hadoop/fs/permission/FsPermission.class(org/apache/hadoop/fs/permission:FsPermission.class)]
[loading org/apache/hadoop/mapreduce/Job.class(org/apache/hadoop/mapreduce:Job.class)]
[loading org/apache/hadoop/mapreduce/JobID.class(org/apache/hadoop/mapreduce:JobID.class)]
[loading org/apache/hadoop/mapreduce/lib/output/NullOutputFormat.class(org/apache/hadoop/mapreduce/lib/output:NullOutputFormat.class)]
[loading org/apache/hadoop/mapreduce/security/token/delegation/DelegationTokenIdentifier.class(org/apache/hadoop/mapreduce/security/token/delegation:DelegationTokenIdentifier.class)]
[loading org/apache/hadoop/util/Tool.class(org/apache/hadoop/util:Tool.class)]
[loading org/apache/hadoop/conf/Configurable.class(org/apache/hadoop/conf:Configurable.class)]
[loading java/lang/ClassNotFoundException.class(java/lang:ClassNotFoundException.class)]
[loading org/apache/hadoop/mapreduce/Mapper.class(org/apache/hadoop/mapreduce:Mapper.class)]
[loading java/util/Iterator.class(java/util:Iterator.class)]
[loading java/util/LinkedList.class(java/util:LinkedList.class)]
[loading java/util/concurrent/ExecutorService.class(java/util/concurrent:ExecutorService.class)]
[loading java/util/concurrent/Executors.class(java/util/concurrent:Executors.class)]
[loading java/util/concurrent/TimeUnit.class(java/util/concurrent:TimeUnit.class)]
[loading org/apache/hadoop/mapreduce/Mapper$Context.class(org/apache/hadoop/mapreduce:Mapper$Context.class)]
[loading java/lang/Process.class(java/lang:Process.class)]
[loading java/lang/StringBuilder.class(java/lang:StringBuilder.class)]
[loading java/lang/ProcessBuilder.class(java/lang:ProcessBuilder.class)]
[loading java/lang/annotation/Target.class(java/lang/annotation:Target.class)]
[loading java/lang/annotation/ElementType.class(java/lang/annotation:ElementType.class)]
[loading java/lang/annotation/Retention.class(java/lang/annotation:Retention.class)]
[loading java/lang/annotation/RetentionPolicy.class(java/lang/annotation:RetentionPolicy.class)]
[loading java/lang/annotation/Annotation.class(java/lang/annotation:Annotation.class)]
[loading java/lang/SuppressWarnings.class(java/lang:SuppressWarnings.class)]
[loading java/lang/Override.class(java/lang:Override.class)]
[loading javax/ws/rs/HttpMethod.class(javax/ws/rs:HttpMethod.class)]
[loading java/lang/Deprecated.class(java/lang:Deprecated.class)]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$3.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/HcatDelegator$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/LauncherDelegator$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/SecureProxySupport$2.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/HcatException$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$2.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$2$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonControllerJob$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/LogRetriever$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/ZooKeeperStorage$2.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/TempletonUtils$1.class]
[loading /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/org/apache/hive/hcatalog/templeton/tool/HDFSStorage$1.class]
[done in 7970 ms]
[WARNING] Javadoc Warnings
[WARNING] Nov 11, 2013 4:28:05 PM com.sun.jersey.wadl.resourcedoc.ResourceDoclet start
[WARNING] INFO: Wrote /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/classes/resourcedoc.xml
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-webhcat ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-webhcat ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/webhcat/svr/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-webhcat ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-webhcat ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HCatalog HBase Storage Handler 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ hive-hbase-storage-handler ---
[INFO] Source directory: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/src/gen-java added.
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hbase-storage-handler ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hbase-storage-handler ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hbase-storage-handler ---
[INFO] Compiling 1 source file to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hbase-storage-handler ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hbase-storage-handler ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/storage-handlers/hbase/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hbase-storage-handler ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hbase-storage-handler ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive HWI 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-hwi ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-hwi ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-hwi ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-hwi ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-hwi ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/hwi/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-hwi ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-hwi ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive ODBC 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-odbc ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-odbc ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/odbc/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Shims Aggregator 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-shims-aggregator ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-shims-aggregator ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/shims/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/shims/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive TestUtils 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-testutils ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-testutils ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-testutils ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-testutils ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-testutils ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/testutils/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-testutils ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-testutils ---
[INFO] No tests to run.
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Packaging 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-packaging ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-packaging ---
[INFO] Executing tasks

main:
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp
   [delete] Deleting directory /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/packaging/target/tmp/conf
[INFO] Executed tasks
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Hive .............................................. SUCCESS [2.020s]
[INFO] Hive Ant Utilities ................................ SUCCESS [3.877s]
[INFO] Hive Shims Common ................................. SUCCESS [1.340s]
[INFO] Hive Shims 0.20 ................................... SUCCESS [0.580s]
[INFO] Hive Shims Secure Common .......................... SUCCESS [0.890s]
[INFO] Hive Shims 0.20S .................................. SUCCESS [0.390s]
[INFO] Hive Shims 0.23 ................................... SUCCESS [1.385s]
[INFO] Hive Shims ........................................ SUCCESS [0.250s]
[INFO] Hive Common ....................................... SUCCESS [3.676s]
[INFO] Hive Serde ........................................ SUCCESS [0.705s]
[INFO] Hive Metastore .................................... SUCCESS [5.853s]
[INFO] Hive Query Language ............................... SUCCESS [11.108s]
[INFO] Hive Service ...................................... SUCCESS [0.427s]
[INFO] Hive JDBC ......................................... SUCCESS [0.462s]
[INFO] Hive Beeline ...................................... SUCCESS [0.671s]
[INFO] Hive CLI .......................................... SUCCESS [0.912s]
[INFO] Hive Contrib ...................................... SUCCESS [0.830s]
[INFO] Hive HBase Handler ................................ SUCCESS [1.117s]
[INFO] Hive HCatalog ..................................... SUCCESS [0.452s]
[INFO] Hive HCatalog Core ................................ SUCCESS [0.780s]
[INFO] Hive HCatalog Pig Adapter ......................... SUCCESS [0.299s]
[INFO] Hive HCatalog Server Extensions ................... SUCCESS [0.391s]
[INFO] Hive HCatalog Webhcat Java Client ................. SUCCESS [0.555s]
[INFO] Hive HCatalog Webhcat ............................. SUCCESS [9.585s]
[INFO] Hive HCatalog HBase Storage Handler ............... SUCCESS [0.934s]
[INFO] Hive HWI .......................................... SUCCESS [0.510s]
[INFO] Hive ODBC ......................................... SUCCESS [0.372s]
[INFO] Hive Shims Aggregator ............................. SUCCESS [0.131s]
[INFO] Hive TestUtils .................................... SUCCESS [0.129s]
[INFO] Hive Packaging .................................... SUCCESS [0.215s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 53.202s
[INFO] Finished at: Mon Nov 11 16:28:07 EST 2013
[INFO] Final Memory: 39M/104M
[INFO] ------------------------------------------------------------------------
+ cd itests
+ mvn -B clean install -DskipTests -Dmaven.repo.local=/data/hive-ptest/working/maven
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] Hive Integration - Parent
[INFO] Hive Integration - Custom Serde
[INFO] Hive Integration - Testing Utilities
[INFO] Hive Integration - Unit Tests
[INFO] Hive Integration - HCatalog Unit Tests
[INFO] Hive Integration - Test Serde
[INFO] Hive Integration - QFile Tests
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Integration - Parent 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-it ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-it/0.13.0-SNAPSHOT/hive-it-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Integration - Custom Serde 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it-custom-serde ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-it-custom-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it-custom-serde ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-it-custom-serde ---
[INFO] Compiling 8 source files to /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.5:testResources (default-testResources) @ hive-it-custom-serde ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/src/test/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (setup-test-dirs) @ hive-it-custom-serde ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/warehouse
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp/conf
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/tmp/conf
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hive-it-custom-serde ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ hive-it-custom-serde ---
[INFO] Tests are skipped.
[INFO]
[INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ hive-it-custom-serde ---
[INFO] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/hive-it-custom-serde-0.13.0-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ hive-it-custom-serde ---
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/target/hive-it-custom-serde-0.13.0-SNAPSHOT.jar to /data/hive-ptest/working/maven/org/apache/hive/hive-it-custom-serde/0.13.0-SNAPSHOT/hive-it-custom-serde-0.13.0-SNAPSHOT.jar
[INFO] Installing /data/hive-ptest/working/apache-svn-trunk-source/itests/custom-serde/pom.xml to /data/hive-ptest/working/maven/org/apache/hive/hive-it-custom-serde/0.13.0-SNAPSHOT/hive-it-custom-serde-0.13.0-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Hive Integration - Testing Utilities 0.13.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hive-it-util ---
[INFO] Deleting /data/hive-ptest/working/apache-svn-trunk-source/itests/util (includes = [datanucleus.log, derby.log], excludes = [])
[INFO]
[INFO] --- maven-resources-plugin:2.5:resources (default-resources) @ hive-it-util ---
[debug] execute contextualize
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/resources
[INFO]
[INFO] --- maven-antrun-plugin:1.7:run (define-classpath) @ hive-it-util ---
[INFO] Executing tasks

main:
[INFO] Executed tasks
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hive-it-util ---
[INFO] Compiling 42 source files to /data/hive-ptest/working/apache-svn-trunk-source/itests/util/target/classes
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING :
[INFO] -------------------------------------------------------------
[WARNING] Note: Some input files use or override a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[INFO] 2 warnings
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[45,73] cannot find symbol
symbol  : variable HIVEJOBPROGRESS
location: class org.apache.hadoop.hive.conf.HiveConf.ConfVars
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[57,38] cannot find symbol
symbol  : method getCounters()
location: class org.apache.hadoop.hive.ql.exec.Operator
[INFO] 2 errors
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Hive Integration - Parent ......................... SUCCESS [3.940s]
[INFO] Hive Integration - Custom Serde ................... SUCCESS [9.284s]
[INFO] Hive Integration - Testing Utilities .............. FAILURE [5.506s]
[INFO] Hive Integration - Unit Tests ..................... SKIPPED
[INFO] Hive Integration - HCatalog Unit Tests ............ SKIPPED
[INFO] Hive Integration - Test Serde ..................... SKIPPED
[INFO] Hive Integration - QFile Tests .................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19.829s
[INFO] Finished at: Mon Nov 11 16:28:30 EST 2013
[INFO] Final Memory: 26M/63M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.1:compile (default-compile) on project hive-it-util: Compilation failure: Compilation failure:
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[45,73] cannot find symbol
[ERROR] symbol  : variable HIVEJOBPROGRESS
[ERROR] location: class org.apache.hadoop.hive.conf.HiveConf.ConfVars
[ERROR] /data/hive-ptest/working/apache-svn-trunk-source/itests/util/src/main/java/org/apache/hadoop/hive/ql/hooks/OptrStatGroupByHook.java:[57,38] cannot find symbol
[ERROR] symbol  : method getCounters()
[ERROR] location: class org.apache.hadoop.hive.ql.exec.Operator
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn -rf :hive-it-util
+ exit 1
'
{noformat}

This message is automatically generated.
ATTACHMENT ID: 12613111

> Counter Strike: Operation Operator
> ----------------------------------
>
>                 Key: HIVE-4518
>                 URL: https://issues.apache.org/jira/browse/HIVE-4518
>             Project: Hive
>          Issue Type: Improvement
>            Reporter: Gunther Hagleitner
>            Assignee: Gunther Hagleitner
>         Attachments: HIVE-4518.1.patch, HIVE-4518.2.patch, HIVE-4518.3.patch, HIVE-4518.4.patch, HIVE-4518.5.patch, HIVE-4518.6.patch.txt
>
>
> Queries of the form:
> from foo
> insert overwrite table bar partition (p) select ...
> insert overwrite table bar partition (p) select ...
> insert overwrite table bar partition (p) select ...
> generate a huge number of counters, because task.progress is turned on for dynamic-partitioning queries.
> The counters not only make queries slower than necessary (up to 50%); you will also eventually run out of them, because we wrap them in enum values to stay compatible with Hadoop 0.17.
> The real reason we turn task.progress on is that we need the CREATED_FILES and FATAL counters to ensure dynamic-partitioning queries don't go haywire.
> The counters have counter-intuitive names like C1 through C1000 and aren't really useful by themselves.
> With Hadoop 20+ you don't need to wrap the counters anymore; each operator can simply create and increment counters. That should simplify the code a lot.

--
This message was sent by Atlassian JIRA
(v6.1#6144)
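The enum-wrapping constraint described in the issue can be illustrated with a small, self-contained sketch. This is not Hive's actual API: the class, the `FixedSlots` enum, and the `Map`-backed `incrCounter` helper are hypothetical stand-ins for, respectively, the pre-0.20 enum pool (C1..C1000) and the Hadoop 0.20+ style of creating named counters on demand.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch contrasting the two counter styles from the issue.
public class CounterStyles {
    // Old style: every counter must be mapped onto a fixed, pre-declared
    // enum slot (Hive used C1..C1000 for Hadoop 0.17 compatibility), so
    // the names are opaque and the pool can be exhausted.
    enum FixedSlots { C1, C2, C3 } // deliberately tiny pool for the sketch

    static String wrap(int next) {
        FixedSlots[] pool = FixedSlots.values();
        if (next >= pool.length) {
            throw new IllegalStateException("out of counter slots");
        }
        return pool[next].name(); // opaque name like "C1"
    }

    // New style (Hadoop 0.20+): create and increment a descriptively named
    // counter directly; a Map stands in for the framework's counter store.
    static final Map<String, Long> counters = new LinkedHashMap<>();

    static void incrCounter(String group, String name, long amount) {
        counters.merge(group + "::" + name, amount, Long::sum);
    }

    public static void main(String[] args) {
        // Old style yields counter-intuitive names and a hard limit.
        System.out.println(wrap(0));
        // New style: descriptive, unbounded counter names.
        incrCounter("FileSinkOperator", "CREATED_FILES", 1);
        incrCounter("FileSinkOperator", "CREATED_FILES", 1);
        System.out.println(counters.get("FileSinkOperator::CREATED_FILES"));
    }
}
```

The sketch shows why dropping the wrappers simplifies the code: the new style needs no central registry of enum slots, and a counter's name documents itself.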