Delivered-To: apmail-hive-dev-archive@www.apache.org
Mailing-List: contact dev-help@hive.apache.org; run by ezmlm
Reply-To: dev@hive.apache.org
Date: Wed, 18 Sep 2013 00:39:53 +0000 (UTC)
From: "Hive QA (JIRA)"
To: hive-dev@hadoop.apache.org
Subject: [jira] [Commented] (HIVE-5156) HiveServer2 jdbc ResultSet.close should free up resources on server side
MIME-Version: 1.0
Content-Type: text/plain; charset=utf-8
Content-Transfer-Encoding: quoted-printable
X-JIRA-FingerPrint: 30527f35849b9dde25b450d4833f0394

    [ https://issues.apache.org/jira/browse/HIVE-5156?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13770231#comment-13770231 ]

Hive QA commented on HIVE-5156:
-------------------------------

{color:red}Overall{color}: -1 no tests executed

Here are the results of testing the latest attachment:
https://issues.apache.org/jira/secure/attachment/12603651/HIVE-5156.D12837.3.patch

Test results: https://builds.apache.org/job/PreCommit-HIVE-Build/789/testReport

Console output:
https://builds.apache.org/job/PreCommit-HIVE-Build/789/console

Messages:
{noformat}
Executing org.apache.hive.ptest.execution.PrepPhase
Tests failed with: NonZeroExitCodeException: Command 'bash /data/hive-ptest/working/scratch/source-prep.sh' failed with exit status 1 and output '+ [[ -n '' ]]
+ export 'ANT_OPTS=-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ ANT_OPTS='-Xmx1g -XX:MaxPermSize=256m -Dhttp.proxyHost=localhost -Dhttp.proxyPort=3128'
+ cd /data/hive-ptest/working/
+ tee /data/hive-ptest/logs/PreCommit-HIVE-Build-789/source-prep.txt
+ mkdir -p maven ivy
+ [[ svn = \s\v\n ]]
+ [[ -n '' ]]
+ [[ -d apache-svn-trunk-source ]]
+ [[ ! -d apache-svn-trunk-source/.svn ]]
+ [[ ! -d apache-svn-trunk-source ]]
+ cd apache-svn-trunk-source
+ svn revert -R .
Reverted 'common/src/java/org/apache/hadoop/hive/conf/HiveConf.java'
Reverted 'ql/src/test/results/clientnegative/alter_table_add_partition.q.out'
Reverted 'ql/src/test/results/clientnegative/alter_view_failure5.q.out'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/parse/TypeCheckProcFactory.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/parse/HiveLexer.g'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/parse/BaseSemanticAnalyzer.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/parse/DDLSemanticAnalyzer.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/ErrorMsg.java'
Reverted 'ql/src/java/org/apache/hadoop/hive/ql/exec/Utilities.java'
++ egrep -v '^X|^Performing status on external'
++ awk '{print $2}'
++ svn status --no-ignore
+ rm -rf build hcatalog/build hcatalog/core/build hcatalog/storage-handlers/hbase/build hcatalog/server-extensions/build hcatalog/webhcat/svr/build hcatalog/webhcat/java-client/build hcatalog/hcatalog-pig-adapter/build common/src/gen ql/src/test/results/clientnegative/illegal_partition_type.q.out ql/src/test/results/clientnegative/illegal_partition_type2.q.out ql/src/test/results/clientpositive/parititon_type_check.q.out ql/src/test/results/clientpositive/partition_type_check.q.out ql/src/test/queries/clientnegative/illegal_partition_type.q ql/src/test/queries/clientnegative/illegal_partition_type2.q ql/src/test/queries/clientpositive/partition_type_check.q
+ svn update
U    ql/src/test/queries/clientpositive/udaf_collect_set.q
U    ql/src/test/results/clientpositive/show_functions.q.out
U    ql/src/test/results/clientpositive/udaf_collect_set.q.out
U    ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java
U    ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFCollectSet.java
U    hcatalog/webhcat/svr/src/main/bin/webhcat_config.sh
U    hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Server.java
U    hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/HiveDelegator.java
U    hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/Main.java
A    hcatalog/webhcat/svr/src/main/java/org/apache/hive/hcatalog/templeton/JobItemBean.java
U    hcatalog/src/test/e2e/templeton/tests/jobsubmission.conf

Fetching external item into 'hcatalog/src/test/e2e/harness'
Updated external to revision 1524262.

Updated to revision 1524262.
+ patchCommandPath=/data/hive-ptest/working/scratch/smart-apply-patch.sh
+ patchFilePath=/data/hive-ptest/working/scratch/build.patch
+ [[ -f /data/hive-ptest/working/scratch/build.patch ]]
+ chmod +x /data/hive-ptest/working/scratch/smart-apply-patch.sh
+ /data/hive-ptest/working/scratch/smart-apply-patch.sh /data/hive-ptest/working/scratch/build.patch
Going to apply patch with: patch -p0
patching file jdbc/src/java/org/apache/hive/jdbc/HiveQueryResultSet.java
patching file jdbc/src/java/org/apache/hive/jdbc/HiveStatement.java
patching file jdbc/src/test/org/apache/hive/jdbc/TestJdbcDriver2.java
Hunk #13 succeeded at 848 (offset 7 lines).
Hunk #14 succeeded at 892 (offset 7 lines).
Hunk #15 succeeded at 919 (offset 7 lines).
Hunk #16 succeeded at 980 (offset 19 lines).
Hunk #17 succeeded at 1000 (offset 19 lines).
Hunk #18 succeeded at 1023 (offset 19 lines).
Hunk #19 succeeded at 1076 (offset 19 lines).
Hunk #20 succeeded at 1119 (offset 19 lines).
Hunk #21 succeeded at 1343 (offset 19 lines).
+ [[ true == \t\r\u\e ]]
+ rm -rf /data/hive-ptest/working/ivy /data/hive-ptest/working/maven
+ mkdir /data/hive-ptest/working/ivy /data/hive-ptest/working/maven
+ ant -Dtest.continue.on.failure=true -Dtest.silent=false -Divy.default.ivy.user.dir=/data/hive-ptest/working/ivy -Dmvn.local.repo=/data/hive-ptest/working/maven clean package test -Dtestcase=nothing
Buildfile: /data/hive-ptest/working/apache-svn-trunk-source/build.xml

clean:
     [echo] Project: hive

clean:
     [echo] Project: anttasks

clean:
     [echo] Project: shims

clean:
     [echo] Project: common

clean:
     [echo] Project: serde

clean:
     [echo] Project: metastore

clean:
     [echo] Project: ql

clean:
     [echo] Project: contrib

clean:
     [echo] Project: service

clean:
     [echo] Project: cli

clean:
     [echo] Project: jdbc

clean:
     [echo] Project: beeline

clean:
     [echo] Project: hwi

clean:
     [echo] Project: hbase-handler

clean:
     [echo] Project: testutils

clean:
     [echo] hcatalog

clean:
     [echo] hcatalog-core

clean:
     [echo] hcatalog-pig-adapter

clean:
     [echo] hcatalog-server-extensions

clean:
     [echo] webhcat

clean:
     [echo] webhcat-java-client

clean:

clean:
     [echo] Project: odbc
     [exec] rm -rf /data/hive-ptest/working/apache-svn-trunk-source/build/odbc /data/hive-ptest/working/apache-svn-trunk-source/build/service/objs /data/hive-ptest/working/apache-svn-trunk-source/build/ql/objs /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/objs

clean-online:
     [echo] Project: hive

clean-offline:

ivy-init-dirs:
     [echo] Project: hive
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ivy
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/maven

ivy-download:
     [echo] Project: hive
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.3.0/ivy-2.3.0.jar
      [get] To: /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/ivy-2.3.0.jar

ivy-probe-antlib:
     [echo] Project: hive

ivy-init-antlib:
     [echo] Project: hive

compile-ant-tasks:
     [echo] Project: hive

create-dirs:
     [echo] Project: anttasks
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jexl/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hadoopcore
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources does not exist.
init:
     [echo] Project: anttasks

ivy-init-settings:
     [echo] Project: anttasks

ivy-resolve:
     [echo] Project: anttasks
[ivy:resolve] :: Apache Ivy 2.3.0 - 20130110142753 :: http://ant.apache.org/ivy/ ::
[ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml
[ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-anttasks;0.13.0-SNAPSHOT
[ivy:resolve]   confs: [default]
[ivy:resolve]   found commons-lang#commons-lang;2.4 in maven2
[ivy:resolve]   found velocity#velocity;1.5 in maven2
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-lang/commons-lang/2.4/commons-lang-2.4.jar ...
[ivy:resolve] ..... (255kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] commons-lang#commons-lang;2.4!commons-lang.jar (30ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/velocity/velocity/1.5/velocity-1.5.jar ...
[ivy:resolve] ....... (382kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] velocity#velocity;1.5!velocity.jar (24ms)
[ivy:resolve] :: resolution report :: resolve 5625ms :: artifacts dl 75ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   2   |   2   |   2   |   0   ||   2   |   2   |
        ---------------------------------------------------------------------
[ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-anttasks-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-anttasks-default.html

ivy-retrieve:
     [echo] Project: anttasks
[ivy:retrieve] :: retrieving :: org.apache.hive#hive-anttasks
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  2 artifacts copied, 0 already retrieved (638kB/11ms)

compile:
     [echo] anttasks
    [javac] /data/hive-ptest/working/apache-svn-trunk-source/ant/build.xml:38: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds
    [javac] Compiling 3 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/classes
    [javac] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/org/apache/hadoop/hive/ant/DistinctElementsClassPath.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

deploy-ant-tasks:
     [echo] Project: hive

create-dirs:
     [echo] Project: anttasks
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources does not exist.

init:
     [echo] Project: anttasks

ivy-init-settings:
     [echo] Project: anttasks

ivy-resolve:
     [echo] Project: anttasks
[ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml
[ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-anttasks;0.13.0-SNAPSHOT
[ivy:resolve]   confs: [default]
[ivy:resolve]   found commons-lang#commons-lang;2.4 in maven2
[ivy:resolve]   found velocity#velocity;1.5 in maven2
[ivy:resolve] :: resolution report :: resolve 451ms :: artifacts dl 2ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   2   |   0   |   0   |   0   ||   2   |   0   |
        ---------------------------------------------------------------------
[ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-anttasks-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-anttasks-default.html

ivy-retrieve:
     [echo] Project: anttasks
[ivy:retrieve] :: retrieving :: org.apache.hive#hive-anttasks
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 2 already retrieved (0kB/8ms)

compile:
     [echo] anttasks
    [javac] /data/hive-ptest/working/apache-svn-trunk-source/ant/build.xml:38: warning: 'includeantruntime' was not set, defaulting to build.sysclasspath=last; set to false for repeatable builds

jar:
     [echo] anttasks
     [copy] Copying 1 file to /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/classes/org/apache/hadoop/hive/ant
      [jar] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/hive-anttasks-0.13.0-SNAPSHOT.jar

init:
     [echo] Project: hive

create-dirs:
     [echo] Project: anttasks
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/ant/src/test/resources does not exist.

init:
     [echo] Project: anttasks

create-dirs:
     [echo] Project: shims
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/shims/src/test/resources does not exist.
init:
     [echo] Project: shims

create-dirs:
     [echo] Project: common
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/common/test/resources
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/build/common/test/resources

init:
     [echo] Project: common

create-dirs:
     [echo] Project: serde
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources does not exist.
init:
     [echo] Project: serde

create-dirs:
     [echo] Project: metastore
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources does not exist.

init:
     [echo] Project: metastore

create-dirs:
     [echo] Project: ql
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/ql/test/resources
     [copy] Copying 4 files to /data/hive-ptest/working/apache-svn-trunk-source/build/ql/test/resources

init:
     [echo] Project: ql

create-dirs:
     [echo] Project: contrib
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/contrib/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/contrib/src/test/resources does not exist.

init:
     [echo] Project: contrib

create-dirs:
     [echo] Project: service
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/service/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/service/src/test/resources does not exist.

init:
     [echo] Project: service

create-dirs:
     [echo] Project: cli
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/cli/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/cli/src/test/resources does not exist.
init:
     [echo] Project: cli

create-dirs:
     [echo] Project: jdbc
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/jdbc/src/test/resources does not exist.

init:
     [echo] Project: jdbc

create-dirs:
     [echo] Project: beeline
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/beeline/src/test/resources does not exist.
init:
     [echo] Project: beeline

create-dirs:
     [echo] Project: hwi
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/hwi/src/test/resources does not exist.

init:
     [echo] Project: hwi

create-dirs:
     [echo] Project: hbase-handler
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/hbase-handler/src/test/resources does not exist.
init:
     [echo] Project: hbase-handler

create-dirs:
     [echo] Project: testutils
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils/test
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils/test/src
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils/test/classes
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/testutils/test/resources
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/testutils/src/test/resources does not exist.

init:
     [echo] Project: testutils

init:
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/hcatalog/build/hcatalog-0.13.0-SNAPSHOT

jar:
     [echo] Project: hive

ivy-init-settings:
     [echo] Project: shims

check-ivy:
     [echo] Project: shims

ivy-resolve:
     [echo] Project: shims
[ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml
[ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-shims;0.13.0-SNAPSHOT
[ivy:resolve]   confs: [default]
[ivy:resolve]   found org.apache.zookeeper#zookeeper;3.4.3 in maven2
[ivy:resolve]   found org.apache.thrift#libthrift;0.9.0 in maven2
[ivy:resolve]   found commons-logging#commons-logging;1.0.4 in maven2
[ivy:resolve]   found commons-logging#commons-logging-api;1.0.4 in maven2
[ivy:resolve]   found org.codehaus.jackson#jackson-core-asl;1.8.8 in maven2
[ivy:resolve]   found org.codehaus.jackson#jackson-mapper-asl;1.8.8 in maven2
[ivy:resolve]   found log4j#log4j;1.2.16 in maven2
[ivy:resolve]   found com.google.guava#guava;11.0.2 in maven2
[ivy:resolve]   found commons-io#commons-io;2.4 in maven2
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/zookeeper/zookeeper/3.4.3/zookeeper-3.4.3.jar ...
[ivy:resolve] .............. (749kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.apache.zookeeper#zookeeper;3.4.3!zookeeper.jar (50ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/thrift/libthrift/0.9.0/libthrift-0.9.0.jar ...
[ivy:resolve] ....... (339kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.apache.thrift#libthrift;0.9.0!libthrift.jar (12ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-logging/commons-logging/1.0.4/commons-logging-1.0.4.jar ...
[ivy:resolve] .. (37kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] commons-logging#commons-logging;1.0.4!commons-logging.jar (7ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-logging/commons-logging-api/1.0.4/commons-logging-api-1.0.4.jar ...
[ivy:resolve] .. (25kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] commons-logging#commons-logging-api;1.0.4!commons-logging-api.jar (9ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackson/jackson-core-asl/1.8.8/jackson-core-asl-1.8.8.jar ...
[ivy:resolve] ..... (222kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.codehaus.jackson#jackson-core-asl;1.8.8!jackson-core-asl.jar (46ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackson/jackson-mapper-asl/1.8.8/jackson-mapper-asl-1.8.8.jar ...
[ivy:resolve] ............. (652kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.codehaus.jackson#jackson-mapper-asl;1.8.8!jackson-mapper-asl.jar (18ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/log4j/log4j/1.2.16/log4j-1.2.16.jar ...
[ivy:resolve] ......... (470kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] log4j#log4j;1.2.16!log4j.jar(bundle) (15ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/guava/guava/11.0.2/guava-11.0.2.jar ...
[ivy:resolve] ........................... (1609kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] com.google.guava#guava;11.0.2!guava.jar (34ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-io/commons-io/2.4/commons-io-2.4.jar ...
[ivy:resolve] .... (180kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] commons-io#commons-io;2.4!commons-io.jar (9ms)
[ivy:resolve] :: resolution report :: resolve 9037ms :: artifacts dl 227ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   9   |   9   |   9   |   0   ||   9   |   9   |
        ---------------------------------------------------------------------
[ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-shims-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-shims-default.html

make-pom:
     [echo] Project: shims
     [echo] Writing POM to /data/hive-ptest/working/apache-svn-trunk-source/build/shims/pom.xml
[ivy:makepom] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:makepom] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml

create-dirs:
     [echo] Project: shims
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/shims/src/test/resources does not exist.
init:
     [echo] Project: shims

ivy-retrieve:
     [echo] Project: shims
[ivy:retrieve] :: retrieving :: org.apache.hive#hive-shims
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  9 artifacts copied, 0 already retrieved (4287kB/38ms)

compile:
     [echo] Project: shims
     [echo] Building shims 0.20

build-shims:
     [echo] Project: shims
     [echo] Compiling /data/hive-ptest/working/apache-svn-trunk-source/shims/src/common/java;/data/hive-ptest/working/apache-svn-trunk-source/shims/src/0.20/java against hadoop 0.20.2 (/data/hive-ptest/working/apache-svn-trunk-source/build/hadoopcore/hadoop-0.20.2)

ivy-init-settings:
     [echo] Project: shims

ivy-resolve-hadoop-shim:
     [echo] Project: shims
[ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml
[ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-shims;0.13.0-SNAPSHOT
[ivy:resolve]   confs: [hadoop0.20.shim]
[ivy:resolve]   found org.apache.hadoop#hadoop-core;0.20.2 in maven2
[ivy:resolve]   found commons-cli#commons-cli;1.2 in maven2
[ivy:resolve]   found xmlenc#xmlenc;0.52 in maven2
[ivy:resolve]   found commons-httpclient#commons-httpclient;3.0.1 in maven2
[ivy:resolve]   found commons-logging#commons-logging;1.0.3 in maven2
[ivy:resolve]   found commons-codec#commons-codec;1.3 in maven2
[ivy:resolve]   found commons-net#commons-net;1.4.1 in maven2
[ivy:resolve]   found oro#oro;2.0.8 in maven2
[ivy:resolve]   found org.mortbay.jetty#jetty;6.1.14 in maven2
[ivy:resolve]   found org.mortbay.jetty#jetty-util;6.1.14 in maven2
[ivy:resolve]   found org.mortbay.jetty#servlet-api-2.5;6.1.14 in maven2
[ivy:resolve]   found tomcat#jasper-runtime;5.5.12 in maven2
[ivy:resolve]   found tomcat#jasper-compiler;5.5.12 in maven2
[ivy:resolve]   found org.mortbay.jetty#jsp-api-2.1;6.1.14 in maven2
[ivy:resolve]   found org.mortbay.jetty#jsp-2.1;6.1.14 in maven2
[ivy:resolve]   found org.eclipse.jdt#core;3.1.1 in maven2
[ivy:resolve]   found ant#ant;1.6.5 in maven2
[ivy:resolve]   found commons-el#commons-el;1.0 in maven2
[ivy:resolve]   found net.java.dev.jets3t#jets3t;0.7.1 in maven2
[ivy:resolve]   found commons-logging#commons-logging;1.1.1 in maven2
[ivy:resolve]   found net.sf.kosmosfs#kfs;0.3 in maven2
[ivy:resolve]   found junit#junit;4.5 in maven2
[ivy:resolve]   found hsqldb#hsqldb;1.8.0.10 in maven2
[ivy:resolve]   found org.apache.hadoop#hadoop-tools;0.20.2 in maven2
[ivy:resolve]   found org.apache.hadoop#hadoop-test;0.20.2 in maven2
[ivy:resolve]   found org.apache.ftpserver#ftplet-api;1.0.0 in maven2
[ivy:resolve]   found org.apache.mina#mina-core;2.0.0-M5 in maven2
[ivy:resolve]   found org.slf4j#slf4j-api;1.5.2 in maven2
[ivy:resolve]   found org.apache.ftpserver#ftpserver-core;1.0.0 in maven2
[ivy:resolve]   found org.apache.ftpserver#ftpserver-deprecated;1.0.0-M2 in maven2
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-core/0.20.2/hadoop-core-0.20.2.jar ...
[ivy:resolve] ............................................ (2624kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.apache.hadoop#hadoop-core;0.20.2!hadoop-core.jar (55ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-tools/0.20.2/hadoop-tools-0.20.2.jar ...
[ivy:resolve] ... (68kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.apache.hadoop#hadoop-tools;0.20.2!hadoop-tools.jar (16ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-test/0.20.2/hadoop-test-0.20.2.jar ...
[ivy:resolve] .......................... (1527kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.apache.hadoop#hadoop-test;0.20.2!hadoop-test.jar (32ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-cli/commons-cli/1.2/commons-cli-1.2.jar ...
[ivy:resolve] .. (40kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] commons-cli#commons-cli;1.2!commons-cli.jar (6ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/xmlenc/xmlenc/0.52/xmlenc-0.52.jar ...
[ivy:resolve] .. (14kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] xmlenc#xmlenc;0.52!xmlenc.jar (6ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-httpclient/commons-httpclient/3.0.1/commons-httpclient-3.0.1.jar ...
[ivy:resolve] ...... (273kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] commons-httpclient#commons-httpclient;3.0.1!commons-httpclient.jar (11ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-codec/commons-codec/1.3/commons-codec-1.3.jar ...
[ivy:resolve] .. (45kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] commons-codec#commons-codec;1.3!commons-codec.jar (6ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-net/commons-net/1.4.1/commons-net-1.4.1.jar ...
[ivy:resolve] .... (176kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] commons-net#commons-net;1.4.1!commons-net.jar (8ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jetty/6.1.14/jetty-6.1.14.jar ...
[ivy:resolve] ......... (504kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.mortbay.jetty#jetty;6.1.14!jetty.jar (14ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jetty-util/6.1.14/jetty-util-6.1.14.jar ...
[ivy:resolve] .... (159kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.mortbay.jetty#jetty-util;6.1.14!jetty-util.jar (24ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/tomcat/jasper-runtime/5.5.12/jasper-runtime-5.5.12.jar ...
[ivy:resolve] ... (74kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] tomcat#jasper-runtime;5.5.12!jasper-runtime.jar (6ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/tomcat/jasper-compiler/5.5.12/jasper-compiler-5.5.12.jar ...
[ivy:resolve] ........ (395kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] tomcat#jasper-compiler;5.5.12!jasper-compiler.jar (13ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jsp-api-2.1/6.1.14/jsp-api-2.1-6.1.14.jar ...
[ivy:resolve] .... (131kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.mortbay.jetty#jsp-api-2.1;6.1.14!jsp-api-2.1.jar (8ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jsp-2.1/6.1.14/jsp-2.1-6.1.14.jar ...
[ivy:resolve] ................. (1000kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.mortbay.jetty#jsp-2.1;6.1.14!jsp-2.1.jar (23ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-el/commons-el/1.0/commons-el-1.0.jar ...
[ivy:resolve] ... (109kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] commons-el#commons-el;1.0!commons-el.jar (8ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/net/java/dev/jets3t/jets3t/0.7.1/jets3t-0.7.1.jar ...
[ivy:resolve] ....... (368kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] net.java.dev.jets3t#jets3t;0.7.1!jets3t.jar (42ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/servlet-api-2.5/6.1.14/servlet-api-2.5-6.1.14.jar ...
[ivy:resolve] .... (129kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] org.mortbay.jetty#servlet-api-2.5;6.1.14!servlet-api-2.5.jar (8ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/net/sf/kosmosfs/kfs/0.3/kfs-0.3.jar ...
[ivy:resolve] .. (11kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] net.sf.kosmosfs#kfs;0.3!kfs.jar (36ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/junit/junit/4.5/junit-4.5.jar ...
[ivy:resolve] ..... (194kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]   [SUCCESSFUL ] junit#junit;4.5!junit.jar (9ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/hsqldb/hsqldb/1.8.0.10/hsqldb-1.8.0.10.jar ...
[ivy:resolve] ............ (690kB)
[ivy:resolve] ..
(0kB) [ivy:resolve] =09[SUCCESSFUL ] hsqldb#hsqldb;1.8.0.10!hsqldb.jar (22ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/oro/oro/2.0.8/oro-2= .0.8.jar ... [ivy:resolve] .. (63kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] oro#oro;2.0.8!oro.jar (7ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/eclipse/jdt/cor= e/3.1.1/core-3.1.1.jar ... [ivy:resolve] .............................................................= ............................. (3483kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.eclipse.jdt#core;3.1.1!core.jar (75ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/ant/ant/1.6.5/ant-1= .6.5.jar ... [ivy:resolve] ................. (1009kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] ant#ant;1.6.5!ant.jar (33ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/commons-logging/com= mons-logging/1.1.1/commons-logging-1.1.1.jar ... [ivy:resolve] .. (59kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] commons-logging#commons-logging;1.1.1!common= s-logging.jar (6ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/ftpserve= r/ftplet-api/1.0.0/ftplet-api-1.0.0.jar ... [ivy:resolve] .. (22kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.ftpserver#ftplet-api;1.0.0!ftplet= -api.jar(bundle) (6ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/mina/min= a-core/2.0.0-M5/mina-core-2.0.0-M5.jar ... [ivy:resolve] ........... (622kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.mina#mina-core;2.0.0-M5!mina-core= .jar(bundle) (16ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/ftpserve= r/ftpserver-core/1.0.0/ftpserver-core-1.0.0.jar ... [ivy:resolve] ...... (264kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.ftpserver#ftpserver-core;1.0.0!ft= pserver-core.jar(bundle) (10ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/ftpserve= r/ftpserver-deprecated/1.0.0-M2/ftpserver-deprecated-1.0.0-M2.jar ... [ivy:resolve] .. (31kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.ftpserver#ftpserver-deprecated;1.= 0.0-M2!ftpserver-deprecated.jar (6ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/slf4j/slf4j-api= /1.5.2/slf4j-api-1.5.2.jar ... [ivy:resolve] .. (16kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.slf4j#slf4j-api;1.5.2!slf4j-api.jar (36m= s) [ivy:resolve] :: resolution report :: resolve 32966ms :: artifacts dl 658ms [ivy:resolve] =09:: evicted modules: [ivy:resolve] =09junit#junit;3.8.1 by [junit#junit;4.5] in [hadoop0.20.shim= ] [ivy:resolve] =09commons-logging#commons-logging;1.0.3 by [commons-logging#= commons-logging;1.1.1] in [hadoop0.20.shim] [ivy:resolve] =09commons-codec#commons-codec;1.2 by [commons-codec#commons-= codec;1.3] in [hadoop0.20.shim] [ivy:resolve] =09commons-httpclient#commons-httpclient;3.1 by [commons-http= client#commons-httpclient;3.0.1] in [hadoop0.20.shim] [ivy:resolve] =09org.apache.mina#mina-core;2.0.0-M4 by [org.apache.mina#min= a-core;2.0.0-M5] in [hadoop0.20.shim] [ivy:resolve] =09org.apache.ftpserver#ftplet-api;1.0.0-M2 by [org.apache.ft= pserver#ftplet-api;1.0.0] in [hadoop0.20.shim] [ivy:resolve] =09org.apache.ftpserver#ftpserver-core;1.0.0-M2 by [org.apach= e.ftpserver#ftpserver-core;1.0.0] in [hadoop0.20.shim] [ivy:resolve] =09org.apache.mina#mina-core;2.0.0-M2 by [org.apache.mina#min= a-core;2.0.0-M5] in [hadoop0.20.shim] =09--------------------------------------------------------------------- =09| | modules || artifacts | =09| conf | number| search|dwnlded|evicted|| number|dwnlded| =09--------------------------------------------------------------------- =09| hadoop0.20.shim | 37 | 30 | 30 | 8 || 29 | 29 | 
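The `:: evicted modules:` entries above are Ivy's conflict resolution at work: when two dependency paths pull different revisions of the same module into one configuration, Ivy keeps one revision and marks the rest as evicted. A minimal sketch of how such a conflict arises, using module names taken from the report above (the `org.example` coordinates of the enclosing module are hypothetical):

<!-- ivy.xml sketch: hadoop-core 0.20.2 transitively pulls commons-logging 1.0.3,
     while the direct dependency asks for 1.1.1. Ivy's default latest-revision
     conflict manager keeps 1.1.1 and reports 1.0.3 as evicted, as in the log. -->
<ivy-module version="2.0">
  <info organisation="org.example" module="eviction-demo"/>
  <dependencies>
    <dependency org="org.apache.hadoop" name="hadoop-core" rev="0.20.2"/>
    <dependency org="commons-logging" name="commons-logging" rev="1.1.1"/>
  </dependencies>
</ivy-module>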
    ---------------------------------------------------------------------
ivy-retrieve-hadoop-shim:
     [echo] Project: shims
[ivy:retrieve] :: retrieving :: org.apache.hive#hive-shims
[ivy:retrieve]     confs: [hadoop0.20.shim]
[ivy:retrieve]     29 artifacts copied, 0 already retrieved (14115kB/57ms)
    [javac] Compiling 17 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/shims/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/src/0.20/java/org/apache/hadoop/hive/shims/Hadoop20Shims.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
     [echo] Building shims 0.20S
build-shims:
     [echo] Project: shims
     [echo] Compiling /data/hive-ptest/working/apache-svn-trunk-source/shims/src/common/java;/data/hive-ptest/working/apache-svn-trunk-source/shims/src/common-secure/java;/data/hive-ptest/working/apache-svn-trunk-source/shims/src/0.20S/java against hadoop 1.1.2 (/data/hive-ptest/working/apache-svn-trunk-source/build/hadoopcore/hadoop-1.1.2)
ivy-init-settings:
     [echo] Project: shims
ivy-resolve-hadoop-shim:
     [echo] Project: shims
[ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml
[ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-shims;0.13.0-SNAPSHOT
[ivy:resolve]     confs: [hadoop0.20S.shim]
[ivy:resolve]     found org.apache.hadoop#hadoop-core;1.1.2 in maven2
[ivy:resolve]     found commons-cli#commons-cli;1.2 in maven2
[ivy:resolve]     found xmlenc#xmlenc;0.52 in maven2
[ivy:resolve]     found com.sun.jersey#jersey-core;1.8 in maven2
[ivy:resolve]     found com.sun.jersey#jersey-json;1.8 in maven2
[ivy:resolve]     found org.codehaus.jettison#jettison;1.1 in maven2
[ivy:resolve]     found stax#stax-api;1.0.1 in maven2
[ivy:resolve]     found com.sun.xml.bind#jaxb-impl;2.2.3-1 in maven2
[ivy:resolve]     found javax.xml.bind#jaxb-api;2.2.2 in maven2
[ivy:resolve]     found javax.xml.stream#stax-api;1.0-2 in maven2
[ivy:resolve]     found javax.activation#activation;1.1 in maven2
[ivy:resolve]     found org.codehaus.jackson#jackson-core-asl;1.7.1 in maven2
[ivy:resolve]     found org.codehaus.jackson#jackson-mapper-asl;1.7.1 in maven2
[ivy:resolve]     found org.codehaus.jackson#jackson-jaxrs;1.7.1 in maven2
[ivy:resolve]     found org.codehaus.jackson#jackson-xc;1.7.1 in maven2
[ivy:resolve]     found com.sun.jersey#jersey-server;1.8 in maven2
[ivy:resolve]     found asm#asm;3.1 in maven2
[ivy:resolve]     found commons-io#commons-io;2.1 in maven2
[ivy:resolve]     found commons-httpclient#commons-httpclient;3.0.1 in maven2
[ivy:resolve]     found junit#junit;3.8.1 in maven2
[ivy:resolve]     found commons-logging#commons-logging;1.0.3 in maven2
[ivy:resolve]     found commons-codec#commons-codec;1.4 in maven2
[ivy:resolve]     found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve]     found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve]     found commons-collections#commons-collections;3.2.1 in maven2
[ivy:resolve]     found commons-lang#commons-lang;2.4 in maven2
[ivy:resolve]     found commons-logging#commons-logging;1.1.1 in maven2
[ivy:resolve]     found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve]     found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve]     found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve]     found commons-net#commons-net;1.4.1 in maven2
[ivy:resolve]     found oro#oro;2.0.8 in maven2
[ivy:resolve]     found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve]     found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve]     found org.mortbay.jetty#servlet-api;2.5-20081211 in maven2
[ivy:resolve]     found tomcat#jasper-runtime;5.5.12 in maven2
[ivy:resolve]     found tomcat#jasper-compiler;5.5.12 in maven2
[ivy:resolve]     found org.mortbay.jetty#jsp-api-2.1;6.1.14 in maven2
[ivy:resolve]     found org.mortbay.jetty#servlet-api-2.5;6.1.14 in maven2
[ivy:resolve]     found org.mortbay.jetty#jsp-2.1;6.1.14 in maven2
[ivy:resolve]     found org.eclipse.jdt#core;3.1.1 in maven2
[ivy:resolve]     found ant#ant;1.6.5 in maven2
[ivy:resolve]     found commons-el#commons-el;1.0 in maven2
[ivy:resolve]     found net.java.dev.jets3t#jets3t;0.6.1 in maven2
[ivy:resolve]     found hsqldb#hsqldb;1.8.0.10 in maven2
[ivy:resolve]     found org.codehaus.jackson#jackson-mapper-asl;1.8.8 in maven2
[ivy:resolve]     found org.codehaus.jackson#jackson-core-asl;1.8.8 in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-tools;1.1.2 in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-test;1.1.2 in maven2
[ivy:resolve]     found org.apache.ftpserver#ftplet-api;1.0.0 in maven2
[ivy:resolve]     found org.apache.mina#mina-core;2.0.0-M5 in maven2
[ivy:resolve]     found org.slf4j#slf4j-api;1.5.2 in maven2
[ivy:resolve]     found org.apache.ftpserver#ftpserver-core;1.0.0 in maven2
[ivy:resolve]     found org.apache.ftpserver#ftpserver-deprecated;1.0.0-M2 in maven2
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-core/1.1.2/hadoop-core-1.1.2.jar ...
[ivy:resolve] ............................................................................................... (3941kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-core;1.1.2!hadoop-core.jar (82ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-tools/1.1.2/hadoop-tools-1.1.2.jar ...
[ivy:resolve] ...... (299kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-tools;1.1.2!hadoop-tools.jar (11ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-test/1.1.2/hadoop-test-1.1.2.jar ...
[ivy:resolve] .............................................. (2712kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-test;1.1.2!hadoop-test.jar (53ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/jersey/jersey-core/1.8/jersey-core-1.8.jar ...
[ivy:resolve] ........ (447kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] com.sun.jersey#jersey-core;1.8!jersey-core.jar(bundle) (14ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/jersey/jersey-json/1.8/jersey-json-1.8.jar ...
[ivy:resolve] .... (144kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] com.sun.jersey#jersey-json;1.8!jersey-json.jar(bundle) (8ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/jersey/jersey-server/1.8/jersey-server-1.8.jar ...
[ivy:resolve] ............. (678kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] com.sun.jersey#jersey-server;1.8!jersey-server.jar(bundle) (17ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-io/commons-io/2.1/commons-io-2.1.jar ...
[ivy:resolve] .... (159kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] commons-io#commons-io;2.1!commons-io.jar (9ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-codec/commons-codec/1.4/commons-codec-1.4.jar ...
[ivy:resolve] .. (56kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] commons-codec#commons-codec;1.4!commons-codec.jar (6ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/commons/commons-math/2.1/commons-math-2.1.jar ...
[ivy:resolve] ............... (812kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.commons#commons-math;2.1!commons-math.jar (20ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-configuration/commons-configuration/1.6/commons-configuration-1.6.jar ...
[ivy:resolve] ...... (291kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] commons-configuration#commons-configuration;1.6!commons-configuration.jar (11ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jetty/6.1.26/jetty-6.1.26.jar ...
[ivy:resolve] .......... (527kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.mortbay.jetty#jetty;6.1.26!jetty.jar (15ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar ...
[ivy:resolve] .... (172kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.mortbay.jetty#jetty-util;6.1.26!jetty-util.jar (13ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/net/java/dev/jets3t/jets3t/0.6.1/jets3t-0.6.1.jar ...
[ivy:resolve] ...... (314kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] net.java.dev.jets3t#jets3t;0.6.1!jets3t.jar (25ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jettison/jettison/1.1/jettison-1.1.jar ...
[ivy:resolve] ... (66kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.codehaus.jettison#jettison;1.1!jettison.jar(bundle) (10ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/xml/bind/jaxb-impl/2.2.3-1/jaxb-impl-2.2.3-1.jar ...
[ivy:resolve] ............... (869kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] com.sun.xml.bind#jaxb-impl;2.2.3-1!jaxb-impl.jar (37ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackson/jackson-jaxrs/1.7.1/jackson-jaxrs-1.7.1.jar ...
[ivy:resolve] .. (17kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.codehaus.jackson#jackson-jaxrs;1.7.1!jackson-jaxrs.jar (13ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackson/jackson-xc/1.7.1/jackson-xc-1.7.1.jar ...
[ivy:resolve] .. (30kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.codehaus.jackson#jackson-xc;1.7.1!jackson-xc.jar (15ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/stax/stax-api/1.0.1/stax-api-1.0.1.jar ...
[ivy:resolve] .. (25kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] stax#stax-api;1.0.1!stax-api.jar (12ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar ...
[ivy:resolve] ... (102kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] javax.xml.bind#jaxb-api;2.2.2!jaxb-api.jar (21ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar ...
[ivy:resolve] .. (22kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] javax.xml.stream#stax-api;1.0-2!stax-api.jar (17ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/javax/activation/activation/1.1/activation-1.1.jar ...
[ivy:resolve] .. (61kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] javax.activation#activation;1.1!activation.jar (15ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/asm/asm/3.1/asm-3.1.jar ...
[ivy:resolve] .. (42kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] asm#asm;3.1!asm.jar (15ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/junit/junit/3.8.1/junit-3.8.1.jar ...
[ivy:resolve] ... (118kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] junit#junit;3.8.1!junit.jar (21ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar ...
[ivy:resolve] .......... (561kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] commons-collections#commons-collections;3.2.1!commons-collections.jar (35ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-digester/commons-digester/1.8/commons-digester-1.8.jar ...
[ivy:resolve] .... (140kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] commons-digester#commons-digester;1.8!commons-digester.jar (12ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-beanutils/commons-beanutils-core/1.8.0/commons-beanutils-core-1.8.0.jar ...
[ivy:resolve] ..... (201kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] commons-beanutils#commons-beanutils-core;1.8.0!commons-beanutils-core.jar (23ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-beanutils/commons-beanutils/1.7.0/commons-beanutils-1.7.0.jar ...
[ivy:resolve] .... (184kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] commons-beanutils#commons-beanutils;1.7.0!commons-beanutils.jar (21ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mortbay/jetty/servlet-api/2.5-20081211/servlet-api-2.5-20081211.jar ...
[ivy:resolve] .... (130kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.mortbay.jetty#servlet-api;2.5-20081211!servlet-api.jar (9ms)
[ivy:resolve] :: resolution report :: resolve 32216ms :: artifacts dl 713ms
[ivy:resolve]     :: evicted modules:
[ivy:resolve]     org.codehaus.jackson#jackson-core-asl;1.7.1 by [org.codehaus.jackson#jackson-core-asl;1.8.8] in [hadoop0.20S.shim]
[ivy:resolve]     org.codehaus.jackson#jackson-mapper-asl;1.7.1 by [org.codehaus.jackson#jackson-mapper-asl;1.8.8] in [hadoop0.20S.shim]
[ivy:resolve]     commons-logging#commons-logging;1.0.3 by [commons-logging#commons-logging;1.1.1] in [hadoop0.20S.shim]
[ivy:resolve]     commons-codec#commons-codec;1.2 by [commons-codec#commons-codec;1.4] in [hadoop0.20S.shim]
[ivy:resolve]     commons-logging#commons-logging;1.1 by [commons-logging#commons-logging;1.1.1] in [hadoop0.20S.shim]
[ivy:resolve]     commons-codec#commons-codec;1.3 by [commons-codec#commons-codec;1.4] in [hadoop0.20S.shim]
[ivy:resolve]     commons-httpclient#commons-httpclient;3.1 by [commons-httpclient#commons-httpclient;3.0.1] in [hadoop0.20S.shim]
[ivy:resolve]     org.apache.mina#mina-core;2.0.0-M4 by [org.apache.mina#mina-core;2.0.0-M5] in [hadoop0.20S.shim]
[ivy:resolve]     org.apache.ftpserver#ftplet-api;1.0.0-M2 by [org.apache.ftpserver#ftplet-api;1.0.0] in [hadoop0.20S.shim]
[ivy:resolve]     org.apache.ftpserver#ftpserver-core;1.0.0-M2 by [org.apache.ftpserver#ftpserver-core;1.0.0] in [hadoop0.20S.shim]
[ivy:resolve]     org.apache.mina#mina-core;2.0.0-M2 by [org.apache.mina#mina-core;2.0.0-M5] in [hadoop0.20S.shim]
    ---------------------------------------------------------------------
    |                  |            modules            ||   artifacts   |
    |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
    ---------------------------------------------------------------------
    | hadoop0.20S.shim |   62  |   30  |   30  |   11  ||   51  |   28  |
    ---------------------------------------------------------------------
ivy-retrieve-hadoop-shim:
     [echo] Project: shims
[ivy:retrieve] :: retrieving :: org.apache.hive#hive-shims
[ivy:retrieve]     confs: [hadoop0.20S.shim]
[ivy:retrieve]     51 artifacts copied, 0 already retrieved (22876kB/86ms)
    [javac] Compiling 15 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/shims/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
     [echo] Building shims 0.23
build-shims:
     [echo] Project: shims
     [echo] Compiling /data/hive-ptest/working/apache-svn-trunk-source/shims/src/common/java;/data/hive-ptest/working/apache-svn-trunk-source/shims/src/common-secure/java;/data/hive-ptest/working/apache-svn-trunk-source/shims/src/0.23/java against hadoop 2.1.0-beta (/data/hive-ptest/working/apache-svn-trunk-source/build/hadoopcore/hadoop-2.1.0-beta)
ivy-init-settings:
     [echo] Project: shims
ivy-resolve-hadoop-shim:
     [echo] Project: shims
[ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml
[ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-shims;0.13.0-SNAPSHOT
[ivy:resolve]     confs: [hadoop0.23.shim]
[ivy:resolve]     found org.apache.hadoop#hadoop-common;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-annotations;2.1.0-beta in maven2
[ivy:resolve]     found com.google.guava#guava;11.0.2 in maven2
[ivy:resolve]     found com.google.code.findbugs#jsr305;1.3.9 in maven2
[ivy:resolve]     found commons-cli#commons-cli;1.2 in maven2
[ivy:resolve]     found org.apache.commons#commons-math;2.1 in maven2
[ivy:resolve]     found xmlenc#xmlenc;0.52 in maven2
[ivy:resolve]     found commons-httpclient#commons-httpclient;3.1 in maven2
[ivy:resolve]     found commons-logging#commons-logging;1.1.1 in maven2
[ivy:resolve]     found commons-codec#commons-codec;1.4 in maven2
[ivy:resolve]     found commons-io#commons-io;2.1 in maven2
[ivy:resolve]     found commons-net#commons-net;3.1 in maven2
[ivy:resolve]     found javax.servlet#servlet-api;2.5 in maven2
[ivy:resolve]     found org.mortbay.jetty#jetty;6.1.26 in maven2
[ivy:resolve]     found org.mortbay.jetty#jetty-util;6.1.26 in maven2
[ivy:resolve]     found com.sun.jersey#jersey-core;1.8 in maven2
[ivy:resolve]     found com.sun.jersey#jersey-json;1.8 in maven2
[ivy:resolve]     found org.codehaus.jettison#jettison;1.1 in maven2
[ivy:resolve]     found stax#stax-api;1.0.1 in maven2
[ivy:resolve]     found com.sun.xml.bind#jaxb-impl;2.2.3-1 in maven2
[ivy:resolve]     found javax.xml.bind#jaxb-api;2.2.2 in maven2
[ivy:resolve]     found javax.activation#activation;1.1 in maven2
[ivy:resolve]     found org.codehaus.jackson#jackson-core-asl;1.8.8 in maven2
[ivy:resolve]     found org.codehaus.jackson#jackson-mapper-asl;1.8.8 in maven2
[ivy:resolve]     found org.codehaus.jackson#jackson-jaxrs;1.8.8 in maven2
[ivy:resolve]     found org.codehaus.jackson#jackson-xc;1.8.8 in maven2
[ivy:resolve]     found com.sun.jersey#jersey-server;1.8 in maven2
[ivy:resolve]     found asm#asm;3.2 in maven2
[ivy:resolve]     found log4j#log4j;1.2.17 in maven2
[ivy:resolve]     found net.java.dev.jets3t#jets3t;0.6.1 in maven2
[ivy:resolve]     found commons-lang#commons-lang;2.5 in maven2
[ivy:resolve]     found commons-configuration#commons-configuration;1.6 in maven2
[ivy:resolve]     found commons-collections#commons-collections;3.2.1 in maven2
[ivy:resolve]     found commons-digester#commons-digester;1.8 in maven2
[ivy:resolve]     found commons-beanutils#commons-beanutils;1.7.0 in maven2
[ivy:resolve]     found commons-beanutils#commons-beanutils-core;1.8.0 in maven2
[ivy:resolve]     found org.slf4j#slf4j-api;1.6.1 in maven2
[ivy:resolve]     found org.apache.avro#avro;1.5.3 in maven2
[ivy:resolve]     found com.thoughtworks.paranamer#paranamer;2.3 in maven2
[ivy:resolve]     found org.xerial.snappy#snappy-java;1.0.3.2 in maven2
[ivy:resolve]     found com.google.protobuf#protobuf-java;2.5.0 in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-auth;2.1.0-beta in maven2
[ivy:resolve]     found org.slf4j#slf4j-log4j12;1.6.1 in maven2
[ivy:resolve]     found com.jcraft#jsch;0.1.42 in maven2
[ivy:resolve]     found org.apache.zookeeper#zookeeper;3.4.2 in maven2
[ivy:resolve]     found org.apache.commons#commons-compress;1.4 in maven2
[ivy:resolve]     found org.tukaani#xz;1.0 in maven2
[ivy:resolve]     found tomcat#jasper-compiler;5.5.23 in maven2
[ivy:resolve]     found tomcat#jasper-runtime;5.5.23 in maven2
[ivy:resolve]     found commons-el#commons-el;1.0 in maven2
[ivy:resolve]     found javax.servlet.jsp#jsp-api;2.1 in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-mapreduce-client-core;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-yarn-common;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-yarn-api;2.1.0-beta in maven2
[ivy:resolve]     found com.google.inject.extensions#guice-servlet;3.0 in maven2
[ivy:resolve]     found com.google.inject#guice;3.0 in maven2
[ivy:resolve]     found javax.inject#javax.inject;1 in maven2
[ivy:resolve]     found aopalliance#aopalliance;1.0 in maven2
[ivy:resolve]     found org.sonatype.sisu.inject#cglib;2.2.1-v20090111 in maven2
[ivy:resolve]     found io.netty#netty;3.5.11.Final in maven2
[ivy:resolve]     found com.sun.jersey.jersey-test-framework#jersey-test-framework-grizzly2;1.8 in maven2
[ivy:resolve]     found com.sun.jersey.contribs#jersey-guice;1.8 in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-archives;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-hdfs;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-mapreduce-client-common;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-yarn-client;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-yarn-server-common;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-yarn-server-tests;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-yarn-server-nodemanager;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-yarn-server-resourcemanager;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-yarn-server-web-proxy;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-mapreduce-client-app;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-mapreduce-client-shuffle;2.1.0-beta in maven2
[ivy:resolve]     found org.apache.hadoop#hadoop-mapreduce-client-hs;2.1.0-beta in maven2
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/2.1.0-beta/hadoop-common-2.1.0-beta.jar ...
[ivy:resolve] ................................................. (2656kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-common;2.1.0-beta!hadoop-common.jar (91ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-common/2.1.0-beta/hadoop-common-2.1.0-beta-tests.jar ...
[ivy:resolve] ....................... (1321kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-common;2.1.0-beta!hadoop-common.jar(tests) (38ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-core/2.1.0-beta/hadoop-mapreduce-client-core-2.1.0-beta.jar ...
[ivy:resolve] ....................... (1340kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-core;2.1.0-beta!hadoop-mapreduce-client-core.jar (47ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-archives/2.1.0-beta/hadoop-archives-2.1.0-beta.jar ...
[ivy:resolve] .. (20kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-archives;2.1.0-beta!hadoop-archives.jar (6ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/2.1.0-beta/hadoop-hdfs-2.1.0-beta.jar ...
[ivy:resolve] ........................................ (5092kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-hdfs;2.1.0-beta!hadoop-hdfs.jar (404ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-hdfs/2.1.0-beta/hadoop-hdfs-2.1.0-beta-tests.jar ...
[ivy:resolve] ................................ (1897kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-hdfs;2.1.0-beta!hadoop-hdfs.jar(tests) (50ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.1.0-beta/hadoop-mapreduce-client-jobclient-2.1.0-beta.jar ...
[ivy:resolve] .. (33kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.1.0-beta!hadoop-mapreduce-client-jobclient.jar (25ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-jobclient/2.1.0-beta/hadoop-mapreduce-client-jobclient-2.1.0-beta-tests.jar ...
[ivy:resolve] ....................... (1395kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-jobclient;2.1.0-beta!hadoop-mapreduce-client-jobclient.jar(tests) (47ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-common/2.1.0-beta/hadoop-mapreduce-client-common-2.1.0-beta.jar ...
[ivy:resolve] ............ (638kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-common;2.1.0-beta!hadoop-mapreduce-client-common.jar (23ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-yarn-server-tests/2.1.0-beta/hadoop-yarn-server-tests-2.1.0-beta-tests.jar ...
[ivy:resolve] .. (33kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-tests;2.1.0-beta!hadoop-yarn-server-tests.jar(tests) (14ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-app/2.1.0-beta/hadoop-mapreduce-client-app-2.1.0-beta.jar ...
[ivy:resolve] ......... (461kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-app;2.1.0-beta!hadoop-mapreduce-client-app.jar (30ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-mapreduce-client-hs/2.1.0-beta/hadoop-mapreduce-client-hs-2.1.0-beta.jar ...
[ivy:resolve] ... (113kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-hs;2.1.0-beta!hadoop-mapreduce-client-hs.jar (7ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/hadoop-annotations/2.1.0-beta/hadoop-annotations-2.1.0-beta.jar ...
[ivy:resolve] .. (16kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] org.apache.hadoop#hadoop-annotations;2.1.0-beta!hadoop-annotations.jar (44ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-httpclient/commons-httpclient/3.1/commons-httpclient-3.1.jar ...
[ivy:resolve] ...... (297kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] commons-httpclient#commons-httpclient;3.1!commons-httpclient.jar (11ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-net/commons-net/3.1/commons-net-3.1.jar ...
[ivy:resolve] ...... (266kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] commons-net#commons-net;3.1!commons-net.jar (10ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar ...
[ivy:resolve] ... (102kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] javax.servlet#servlet-api;2.5!servlet-api.jar (7ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/log4j/log4j/1.2.17/log4j-1.2.17.jar ...
[ivy:resolve] ......... (478kB)
[ivy:resolve] .. (0kB)
[ivy:resolve]     [SUCCESSFUL ] log4j#log4j;1.2.17!log4j.jar(bundle) (13ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-lang/commons-lang/2.5/commons-lang-2.5.jar ...
[ivy:resolve] ...... (272kB)
[ivy:resolve] ..
(0kB) [ivy:resolve] =09[SUCCESSFUL ] commons-lang#commons-lang;2.5!commons-lang.j= ar (9ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/slf4j/slf4j-api= /1.6.1/slf4j-api-1.6.1.jar ... [ivy:resolve] .. (24kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.slf4j#slf4j-api;1.6.1!slf4j-api.jar (5ms= ) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/avro/avr= o/1.5.3/avro-1.5.3.jar ... [ivy:resolve] ...... (257kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.avro#avro;1.5.3!avro.jar (15ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/protobuf= /protobuf-java/2.5.0/protobuf-java-2.5.0.jar ... [ivy:resolve] .......... (520kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.google.protobuf#protobuf-java;2.5.0!prot= obuf-java.jar(bundle) (33ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/h= adoop-auth/2.1.0-beta/hadoop-auth-2.1.0-beta.jar ... [ivy:resolve] .. (46kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.hadoop#hadoop-auth;2.1.0-beta!had= oop-auth.jar (11ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/jcraft/jsch/0.1= .42/jsch-0.1.42.jar ... [ivy:resolve] .... (181kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.jcraft#jsch;0.1.42!jsch.jar (9ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/zookeepe= r/zookeeper/3.4.2/zookeeper-3.4.2.jar ... [ivy:resolve] ............. (746kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.zookeeper#zookeeper;3.4.2!zookeep= er.jar (25ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/commons/= commons-compress/1.4/commons-compress-1.4.jar ... [ivy:resolve] ..... (233kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.commons#commons-compress;1.4!comm= ons-compress.jar (29ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/code/fin= dbugs/jsr305/1.3.9/jsr305-1.3.9.jar ... [ivy:resolve] .. (32kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.google.code.findbugs#jsr305;1.3.9!jsr305= .jar (6ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackso= n/jackson-jaxrs/1.8.8/jackson-jaxrs-1.8.8.jar ... [ivy:resolve] .. (17kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.codehaus.jackson#jackson-jaxrs;1.8.8!jac= kson-jaxrs.jar (15ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/codehaus/jackso= n/jackson-xc/1.8.8/jackson-xc-1.8.8.jar ... [ivy:resolve] .. (31kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.codehaus.jackson#jackson-xc;1.8.8!jackso= n-xc.jar (59ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/asm/asm/3.2/asm-3.2= .jar ... [ivy:resolve] .. (42kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] asm#asm;3.2!asm.jar (5ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/thoughtworks/pa= ranamer/paranamer/2.3/paranamer-2.3.jar ... [ivy:resolve] .. (28kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.thoughtworks.paranamer#paranamer;2.3!par= anamer.jar (6ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/xerial/snappy/s= nappy-java/1.0.3.2/snappy-java-1.0.3.2.jar ... [ivy:resolve] ................. (972kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.xerial.snappy#snappy-java;1.0.3.2!snappy= -java.jar(bundle) (23ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/slf4j/slf4j-log= 4j12/1.6.1/slf4j-log4j12-1.6.1.jar ... [ivy:resolve] .. (9kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.slf4j#slf4j-log4j12;1.6.1!slf4j-log4j12.= jar (5ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/tukaani/xz/1.0/= xz-1.0.jar ... [ivy:resolve] ... 
(92kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.tukaani#xz;1.0!xz.jar (7ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/tomcat/jasper-compi= ler/5.5.23/jasper-compiler-5.5.23.jar ... [ivy:resolve] ........ (398kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] tomcat#jasper-compiler;5.5.23!jasper-compile= r.jar (13ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/tomcat/jasper-runti= me/5.5.23/jasper-runtime-5.5.23.jar ... [ivy:resolve] ... (75kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] tomcat#jasper-runtime;5.5.23!jasper-runtime.= jar (7ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/javax/servlet/jsp/j= sp-api/2.1/jsp-api-2.1.jar ... [ivy:resolve] ... (98kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] javax.servlet.jsp#jsp-api;2.1!jsp-api.jar (7= ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/h= adoop-yarn-common/2.1.0-beta/hadoop-yarn-common-2.1.0-beta.jar ... [ivy:resolve] ...................... (1263kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-common;2.1.0-b= eta!hadoop-yarn-common.jar (69ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/inject/e= xtensions/guice-servlet/3.0/guice-servlet-3.0.jar ... [ivy:resolve] .. (63kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.google.inject.extensions#guice-servlet;3= .0!guice-servlet.jar (7ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/io/netty/netty/3.5.= 11.Final/netty-3.5.11.Final.jar ... [ivy:resolve] ................... (1106kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] io.netty#netty;3.5.11.Final!netty.jar(bundle= ) (27ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/h= adoop-yarn-api/2.1.0-beta/hadoop-yarn-api-2.1.0-beta.jar ... [ivy:resolve] .................... (1125kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-api;2.1.0-beta= !hadoop-yarn-api.jar (30ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/inject/g= uice/3.0/guice-3.0.jar ... [ivy:resolve] ............. (693kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.google.inject#guice;3.0!guice.jar (18ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/jersey/jers= ey-test-framework/jersey-test-framework-grizzly2/1.8/jersey-test-framework-= grizzly2-1.8.jar ... [ivy:resolve] .. (12kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.sun.jersey.jersey-test-framework#jersey-= test-framework-grizzly2;1.8!jersey-test-framework-grizzly2.jar (6ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/sun/jersey/cont= ribs/jersey-guice/1.8/jersey-guice-1.8.jar ... [ivy:resolve] .. (14kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.sun.jersey.contribs#jersey-guice;1.8!jer= sey-guice.jar (5ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/javax/inject/javax.= inject/1/javax.inject-1.jar ... [ivy:resolve] .. (2kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] javax.inject#javax.inject;1!javax.inject.jar= (5ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/aopalliance/aopalli= ance/1.0/aopalliance-1.0.jar ... [ivy:resolve] .. (4kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] aopalliance#aopalliance;1.0!aopalliance.jar = (27ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/sonatype/sisu/i= nject/cglib/2.2.1-v20090111/cglib-2.2.1-v20090111.jar ... [ivy:resolve] ...... (272kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.sonatype.sisu.inject#cglib;2.2.1-v200901= 11!cglib.jar (17ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/h= adoop-yarn-client/2.1.0-beta/hadoop-yarn-client-2.1.0-beta.jar ... [ivy:resolve] ... (84kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-client;2.1.0-b= eta!hadoop-yarn-client.jar (11ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/h= adoop-yarn-server-common/2.1.0-beta/hadoop-yarn-server-common-2.1.0-beta.ja= r ... [ivy:resolve] .... (171kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-common;= 2.1.0-beta!hadoop-yarn-server-common.jar (27ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/h= adoop-yarn-server-nodemanager/2.1.0-beta/hadoop-yarn-server-nodemanager-2.1= .0-beta.jar ... [ivy:resolve] ......... (451kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-nodeman= ager;2.1.0-beta!hadoop-yarn-server-nodemanager.jar (20ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/h= adoop-yarn-server-resourcemanager/2.1.0-beta/hadoop-yarn-server-resourceman= ager-2.1.0-beta.jar ... [ivy:resolve] ............ (585kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-resourc= emanager;2.1.0-beta!hadoop-yarn-server-resourcemanager.jar (24ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/h= adoop-yarn-server-web-proxy/2.1.0-beta/hadoop-yarn-server-web-proxy-2.1.0-b= eta.jar ... [ivy:resolve] .. (24kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.hadoop#hadoop-yarn-server-web-pro= xy;2.1.0-beta!hadoop-yarn-server-web-proxy.jar (11ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/hadoop/h= adoop-mapreduce-client-shuffle/2.1.0-beta/hadoop-mapreduce-client-shuffle-2= .1.0-beta.jar ... [ivy:resolve] .. (21kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.hadoop#hadoop-mapreduce-client-sh= uffle;2.1.0-beta!hadoop-mapreduce-client-shuffle.jar (34ms) [ivy:resolve] :: resolution report :: resolve 52812ms :: artifacts dl 1614m= s =09--------------------------------------------------------------------- =09| | modules || artifacts | =09| conf | number| search|dwnlded|evicted|| number|dwnlded| =09--------------------------------------------------------------------- =09| hadoop0.23.shim | 75 | 49 | 49 | 0 || 78 | 52 | =09--------------------------------------------------------------------- ivy-retrieve-hadoop-shim: [echo] Project: shims [ivy:retrieve] :: retrieving :: org.apache.hive#hive-shims [ivy:retrieve] =09confs: [hadoop0.23.shim] [ivy:retrieve] =0978 artifacts copied, 0 already retrieved (34675kB/154ms) [javac] Compiling 3 source files to /data/hive-ptest/working/apache-svn= -trunk-source/build/shims/classes [javac] Note: /data/hive-ptest/working/apache-svn-trunk-source/shims/sr= c/0.23/java/org/apache/hadoop/hive/shims/Hadoop23Shims.java uses or overrid= es a deprecated API. [javac] Note: Recompile with -Xlint:deprecation for details. 
jar:
     [echo] Project: shims
      [jar] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/build/shims/hive-shims-0.13.0-SNAPSHOT.jar
[ivy:publish] :: delivering :: org.apache.hive#hive-shims;0.13.0-SNAPSHOT :: 0.13.0-SNAPSHOT :: integration :: Tue Sep 17 20:36:02 EDT 2013
[ivy:publish] 	delivering ivy file to /data/hive-ptest/working/apache-svn-trunk-source/build/shims/ivy-0.13.0-SNAPSHOT.xml
[ivy:publish] :: publishing :: org.apache.hive#hive-shims
[ivy:publish] 	published hive-shims to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-shims/0.13.0-SNAPSHOT/jars/hive-shims.jar
[ivy:publish] 	published ivy to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-shims/0.13.0-SNAPSHOT/ivys/ivy.xml

ivy-init-settings:
     [echo] Project: common

check-ivy:
     [echo] Project: common

ivy-resolve:
     [echo] Project: common
[ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml
[ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-common;0.13.0-SNAPSHOT
[ivy:resolve] 	confs: [default]
[ivy:resolve] 	found org.apache.hive#hive-shims;0.13.0-SNAPSHOT in local
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in maven2
[ivy:resolve] 	found org.apache.commons#commons-compress;1.4.1 in maven2
[ivy:resolve] 	found org.tukaani#xz;1.0 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.16 in maven2
[ivy:resolve] downloading /data/hive-ptest/working/ivy/local/org.apache.hive/hive-shims/0.13.0-SNAPSHOT/jars/hive-shims.jar ...
[ivy:resolve] .... (145kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.apache.hive#hive-shims;0.13.0-SNAPSHOT!hive-shims.jar (4ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar ...
[ivy:resolve] ..... (235kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.apache.commons#commons-compress;1.4.1!commons-compress.jar (10ms)
[ivy:resolve] :: resolution report :: resolve 1402ms :: artifacts dl 19ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   6   |   2   |   2   |   0   ||   6   |   2   |
	---------------------------------------------------------------------
[ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-common-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-common-default.html

make-pom:
     [echo] Project: common
     [echo] Writing POM to /data/hive-ptest/working/apache-svn-trunk-source/build/common/pom.xml
[ivy:makepom] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:makepom] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml

create-dirs:
     [echo] Project: common

init:
     [echo] Project: common

setup:
     [echo] Project: common

ivy-retrieve:
     [echo] Project: common
[ivy:retrieve] :: retrieving :: org.apache.hive#hive-common
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	4 artifacts copied, 2 already retrieved (513kB/9ms)

compile:
     [echo] Project: common
    [javac] Compiling 27 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/common/classes
    [javac] Note: /data/hive-ptest/working/apache-svn-trunk-source/common/src/java/org/apache/hadoop/hive/common/ObjectPair.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
     [copy] Copying 1 file to /data/hive-ptest/working/apache-svn-trunk-source/build/common/classes

jar:
     [echo] Project: common
      [jar] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/build/common/hive-common-0.13.0-SNAPSHOT.jar
[ivy:publish] :: delivering :: org.apache.hive#hive-common;0.13.0-SNAPSHOT :: 0.13.0-SNAPSHOT :: integration :: Tue Sep 17 20:36:07 EDT 2013
[ivy:publish] 	delivering ivy file to /data/hive-ptest/working/apache-svn-trunk-source/build/common/ivy-0.13.0-SNAPSHOT.xml
[ivy:publish] :: publishing :: org.apache.hive#hive-common
[ivy:publish] 	published hive-common to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-common/0.13.0-SNAPSHOT/jars/hive-common.jar
[ivy:publish] 	published ivy to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-common/0.13.0-SNAPSHOT/ivys/ivy.xml

ivy-init-settings:
     [echo] Project: serde

check-ivy:
     [echo] Project: serde

ivy-resolve:
     [echo] Project: serde
[ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml
[ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-serde;0.13.0-SNAPSHOT
[ivy:resolve] 	confs: [default]
[ivy:resolve] 	found org.apache.hive#hive-common;0.13.0-SNAPSHOT in local
[ivy:resolve] 	found org.apache.hive#hive-shims;0.13.0-SNAPSHOT in local
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in maven2
[ivy:resolve] 	found org.apache.commons#commons-compress;1.4.1 in maven2
[ivy:resolve] 	found org.tukaani#xz;1.0 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.16 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.6.1 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.6.1 in maven2
[ivy:resolve] 	found org.mockito#mockito-all;1.8.2 in maven2
[ivy:resolve] 	found org.apache.thrift#libfb303;0.9.0 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.4 in maven2
[ivy:resolve] 	found org.apache.avro#avro;1.7.1 in maven2
[ivy:resolve] 	found org.apache.avro#avro-mapred;1.7.1 in maven2
[ivy:resolve] downloading /data/hive-ptest/working/ivy/local/org.apache.hive/hive-common/0.13.0-SNAPSHOT/jars/hive-common.jar ...
[ivy:resolve] ... (97kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.apache.hive#hive-common;0.13.0-SNAPSHOT!hive-common.jar (4ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/mockito/mockito-all/1.8.2/mockito-all-1.8.2.jar ...
[ivy:resolve] ...................... (1315kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.mockito#mockito-all;1.8.2!mockito-all.jar (35ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/thrift/libfb303/0.9.0/libfb303-0.9.0.jar ...
[ivy:resolve] ...... (268kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.apache.thrift#libfb303;0.9.0!libfb303.jar (10ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/avro/avro/1.7.1/avro-1.7.1.jar ...
[ivy:resolve] ...... (290kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.apache.avro#avro;1.7.1!avro.jar (14ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/avro/avro-mapred/1.7.1/avro-mapred-1.7.1.jar ...
[ivy:resolve] .... (164kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.apache.avro#avro-mapred;1.7.1!avro-mapred.jar (10ms)
[ivy:resolve] :: resolution report :: resolve 6362ms :: artifacts dl 84ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   14  |   5   |   5   |   0   ||   14  |   5   |
	---------------------------------------------------------------------
[ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-serde-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-serde-default.html

make-pom:
     [echo] Project: serde
     [echo] Writing POM to /data/hive-ptest/working/apache-svn-trunk-source/build/serde/pom.xml
[ivy:makepom] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:makepom] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml

create-dirs:
     [echo] Project: serde
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/serde/src/test/resources does not exist.

init:
     [echo] Project: serde

ivy-retrieve:
     [echo] Project: serde
[ivy:retrieve] :: retrieving :: org.apache.hive#hive-serde
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	8 artifacts copied, 6 already retrieved (2229kB/33ms)

dynamic-serde:

compile:
     [echo] Project: serde
    [javac] Compiling 338 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/serde/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] Creating empty /data/hive-ptest/working/apache-svn-trunk-source/build/serde/classes/org/apache/hadoop/hive/serde2/typeinfo/package-info.class

jar:
     [echo] Project: serde
      [jar] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/build/serde/hive-serde-0.13.0-SNAPSHOT.jar
[ivy:publish] :: delivering :: org.apache.hive#hive-serde;0.13.0-SNAPSHOT :: 0.13.0-SNAPSHOT :: integration :: Tue Sep 17 20:36:25 EDT 2013
[ivy:publish] 	delivering ivy file to /data/hive-ptest/working/apache-svn-trunk-source/build/serde/ivy-0.13.0-SNAPSHOT.xml
[ivy:publish] :: publishing :: org.apache.hive#hive-serde
[ivy:publish] 	published hive-serde to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-serde/0.13.0-SNAPSHOT/jars/hive-serde.jar
[ivy:publish] 	published ivy to /data/hive-ptest/working/ivy/local/org.apache.hive/hive-serde/0.13.0-SNAPSHOT/ivys/ivy.xml

ivy-init-settings:
     [echo] Project: metastore

check-ivy:
     [echo] Project: metastore

ivy-resolve:
     [echo] Project: metastore
[ivy:resolve] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml
[ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-metastore;0.13.0-SNAPSHOT
[ivy:resolve] 	confs: [default]
[ivy:resolve] 	found org.apache.hive#hive-serde;0.13.0-SNAPSHOT in local
[ivy:resolve] 	found org.apache.hive#hive-common;0.13.0-SNAPSHOT in local
[ivy:resolve] 	found org.apache.hive#hive-shims;0.13.0-SNAPSHOT in local
[ivy:resolve] 	found commons-cli#commons-cli;1.2 in maven2
[ivy:resolve] 	found org.apache.commons#commons-compress;1.4.1 in maven2
[ivy:resolve] 	found org.tukaani#xz;1.0 in maven2
[ivy:resolve] 	found commons-lang#commons-lang;2.4 in maven2
[ivy:resolve] 	found log4j#log4j;1.2.16 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-api;1.6.1 in maven2
[ivy:resolve] 	found org.slf4j#slf4j-log4j12;1.6.1 in maven2
[ivy:resolve] 	found org.mockito#mockito-all;1.8.2 in maven2
[ivy:resolve] 	found org.apache.thrift#libfb303;0.9.0 in maven2
[ivy:resolve] 	found commons-codec#commons-codec;1.4 in maven2
[ivy:resolve] 	found org.apache.avro#avro;1.7.1 in maven2
[ivy:resolve] 	found org.apache.avro#avro-mapred;1.7.1 in maven2
[ivy:resolve] 	found org.antlr#antlr;3.4 in maven2
[ivy:resolve] 	found org.antlr#antlr-runtime;3.4 in maven2
[ivy:resolve] 	found org.antlr#ST4;4.0.4 in maven2
[ivy:resolve] 	found com.jolbox#bonecp;0.7.1.RELEASE in maven2
[ivy:resolve] 	found com.google.guava#guava;r08 in maven2
[ivy:resolve] 	found commons-pool#commons-pool;1.5.4 in maven2
[ivy:resolve] 	found org.datanucleus#datanucleus-api-jdo;3.2.1 in maven2
[ivy:resolve] 	found org.datanucleus#datanucleus-core;3.2.2 in maven2
[ivy:resolve] 	found org.datanucleus#datanucleus-rdbms;3.2.1 in maven2
[ivy:resolve] 	found javax.jdo#jdo-api;3.0.1 in maven2
[ivy:resolve] 	found org.apache.derby#derby;10.4.2.0 in maven2
[ivy:resolve] downloading /data/hive-ptest/working/ivy/local/org.apache.hive/hive-serde/0.13.0-SNAPSHOT/jars/hive-serde.jar ...
[ivy:resolve] ............ (693kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.apache.hive#hive-serde;0.13.0-SNAPSHOT!hive-serde.jar (12ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/antlr/antlr/3.4/antlr-3.4.jar ...
[ivy:resolve] .................. (1086kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.antlr#antlr;3.4!antlr.jar (23ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/antlr/antlr-runtime/3.4/antlr-runtime-3.4.jar ...
[ivy:resolve] .... (160kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.antlr#antlr-runtime;3.4!antlr-runtime.jar (8ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/antlr/ST4/4.0.4/ST4-4.0.4.jar ...
[ivy:resolve] ..... (231kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.antlr#ST4;4.0.4!ST4.jar (9ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/com/jolbox/bonecp/0.7.1.RELEASE/bonecp-0.7.1.RELEASE.jar ...
[ivy:resolve] ... (112kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] com.jolbox#bonecp;0.7.1.RELEASE!bonecp.jar(bundle) (7ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/commons-pool/commons-pool/1.5.4/commons-pool-1.5.4.jar ...
[ivy:resolve] ... (93kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] commons-pool#commons-pool;1.5.4!commons-pool.jar (7ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/datanucleus/datanucleus-api-jdo/3.2.1/datanucleus-api-jdo-3.2.1.jar ...
[ivy:resolve] ....... (329kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.datanucleus#datanucleus-api-jdo;3.2.1!datanucleus-api-jdo.jar (10ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/datanucleus/datanucleus-core/3.2.2/datanucleus-core-3.2.2.jar ...
[ivy:resolve] .............................. (1759kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.datanucleus#datanucleus-core;3.2.2!datanucleus-core.jar (34ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/datanucleus/datanucleus-rdbms/3.2.1/datanucleus-rdbms-3.2.1.jar ...
[ivy:resolve] .............................. (1728kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.datanucleus#datanucleus-rdbms;3.2.1!datanucleus-rdbms.jar (42ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/javax/jdo/jdo-api/3.0.1/jdo-api-3.0.1.jar ...
[ivy:resolve] ..... (196kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] javax.jdo#jdo-api;3.0.1!jdo-api.jar (30ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/org/apache/derby/derby/10.4.2.0/derby-10.4.2.0.jar ...
[ivy:resolve] ......................................... (2389kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] org.apache.derby#derby;10.4.2.0!derby.jar (51ms)
[ivy:resolve] downloading http://repo1.maven.org/maven2/com/google/guava/guava/r08/guava-r08.jar ...
[ivy:resolve] ................... (1088kB)
[ivy:resolve] .. (0kB)
[ivy:resolve] 	[SUCCESSFUL ] com.google.guava#guava;r08!guava.jar (28ms)
[ivy:resolve] :: resolution report :: resolve 10147ms :: artifacts dl 286ms
[ivy:resolve] 	:: evicted modules:
[ivy:resolve] 	org.slf4j#slf4j-api;1.5.10 by [org.slf4j#slf4j-api;1.6.1] in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   27  |   12  |   12  |   1   ||   26  |   12  |
	---------------------------------------------------------------------
[ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/resolution-cache/org.apache.hive-hive-metastore-default.xml to /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hive-metastore-default.html

make-pom:
     [echo] Project: metastore
     [echo] Writing POM to /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/pom.xml
[ivy:makepom] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:makepom] :: loading settings :: file = /data/hive-ptest/working/apache-svn-trunk-source/ivy/ivysettings.xml

create-dirs:
     [echo] Project: metastore
     [copy] Warning: /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/test/resources does not exist.
init:
     [echo] Project: metastore

metastore-init:
     [echo] Project: metastore
    [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/gen/antlr/gen-java/org/apache/hadoop/hive/metastore/parser

ivy-retrieve:
     [echo] Project: metastore
[ivy:retrieve] :: retrieving :: org.apache.hive#hive-metastore
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	12 artifacts copied, 14 already retrieved (9868kB/32ms)

build-grammar:
     [echo] Project: metastore
     [echo] Building Grammar /data/hive-ptest/working/apache-svn-trunk-source/metastore/src/java/org/apache/hadoop/hive/metastore/parser/Filter.g ....

model-compile:
     [echo] Project: metastore
    [javac] Compiling 24 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes
     [copy] Copying 1 file to /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes

core-compile:
     [echo] Project: metastore
    [javac] Compiling 103 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] Creating empty /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes/org/apache/hadoop/hive/metastore/parser/package-info.class

model-enhance:
     [echo] Project: metastore
[datanucleusenhancer] log4j:WARN No appenders could be found for logger (DataNucleus.General).
[datanucleusenhancer] log4j:WARN Please initialize the log4j system properly.
[datanucleusenhancer] log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[datanucleusenhancer] DataNucleus Enhancer (version 3.2.2) for API "JDO" using JRE "1.6"
[datanucleusenhancer] DataNucleus Enhancer : Classpath
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/service/classes
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/common/classes
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/serde/classes
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/metastore/classes
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ql/classes
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/beeline/classes
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/cli/classes
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/shims/classes
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/hwi/classes
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/jdbc/classes
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/hbase-handler/classes
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/anttasks/hive-anttasks-0.13.0-SNAPSHOT.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/common/hive-common-0.13.0-SNAPSHOT.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/serde/hive-serde-0.13.0-SNAPSHOT.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/shims/hive-shims-0.13.0-SNAPSHOT.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/activation-1.1.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/ant-1.6.5.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/asm-3.1.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-beanutils-1.7.0.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-beanutils-core-1.8.0.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-cli-1.2.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-codec-1.4.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-collections-3.2.1.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-configuration-1.6.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-digester-1.8.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-el-1.0.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-httpclient-3.0.1.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-io-2.1.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-lang-2.4.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-logging-1.1.1.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-math-2.1.jar
[datanucleusenhancer] >>  /data/hive-ptest/working/apache-svn-trunk-source/build/ivy/lib/hadoop0.20S.shim/commons-net-1.4.1.jar
[datanucleusenhancer] >>
/data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/core-3.1.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/ftplet-api-1.0.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/ftpserver-core-1.0.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/ftpserver-deprecated-1.0.0-M2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/hadoop-core-1.1.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/hadoop-test-1.1.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/hadoop-tools-1.1.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/hsqldb-1.8.0.10.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jackson-core-asl-1.8.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jackson-jaxrs-1.7.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jackson-mapper-asl-1.8.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jackson-xc-1.7.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jasper-compiler-5.5.12.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jasper-runtime-5.5.12.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jaxb-api-2.2.2.jar [datanucleusenhancer] >> 
/data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jaxb-impl-2.2.3-1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jersey-core-1.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jersey-json-1.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jersey-server-1.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jets3t-0.6.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jettison-1.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jetty-6.1.26.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jetty-util-6.1.26.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jsp-2.1-6.1.14.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/jsp-api-2.1-6.1.14.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/junit-3.8.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/mina-core-2.0.0-M5.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/oro-2.0.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/servlet-api-2.5-20081211.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/servlet-api-2.5-6.1.14.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= 
build/ivy/lib/hadoop0.20S.shim/slf4j-api-1.5.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/stax-api-1.0-2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/stax-api-1.0.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/hadoop0.20S.shim/xmlenc-0.52.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/ST4-4.0.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/antlr-3.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/antlr-runtime-3.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/avro-1.7.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/avro-mapred-1.7.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/bonecp-0.7.1.RELEASE.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/commons-cli-1.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/commons-codec-1.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/commons-compress-1.4.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/commons-io-2.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/commons-lang-2.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/commons-logging-1.0.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/commons-logging-api-1.0.4.jar [datanucleusenhancer] 
>> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/commons-pool-1.5.4.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/datanucleus-api-jdo-3.2.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/datanucleus-core-3.2.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/datanucleus-rdbms-3.2.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/derby-10.4.2.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/guava-11.0.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/guava-r08.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/hive-common-0.13.0-SNAPSHOT.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/hive-serde-0.13.0-SNAPSHOT.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/hive-shims-0.13.0-SNAPSHOT.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/jackson-core-asl-1.8.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/jackson-mapper-asl-1.8.8.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/jdo-api-3.0.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/libfb303-0.9.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/libthrift-0.9.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/log4j-1.2.16.jar [datanucleusenhancer] >> 
/data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/mockito-all-1.8.2.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/slf4j-api-1.6.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/slf4j-log4j12-1.6.1.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/velocity-1.5.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/xz-1.0.jar [datanucleusenhancer] >> /data/hive-ptest/working/apache-svn-trunk-source/= build/ivy/lib/default/zookeeper-3.4.3.jar [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MDatabase [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MFieldSchema [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MType [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MTable [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MSerDeInfo [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MOrder [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MColumnDescriptor [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MStringList [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MStorageDescriptor [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MPartition [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MIndex [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MRole [datanucleusenhancer] ENHANCED (PersistenceCapable) : 
org.apache.hadoop.hiv= e.metastore.model.MRoleMap [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MGlobalPrivilege [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MDBPrivilege [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MTablePrivilege [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MPartitionPrivilege [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MTableColumnPrivilege [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MPartitionColumnPrivilege [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MPartitionEvent [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MMasterKey [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MDelegationToken [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MTableColumnStatistics [datanucleusenhancer] ENHANCED (PersistenceCapable) : org.apache.hadoop.hiv= e.metastore.model.MPartitionColumnStatistics [datanucleusenhancer] DataNucleus Enhancer completed with success for 24 cl= asses. Timings : input=3D738 ms, enhance=3D1164 ms, total=3D1902 ms. 
Consul= t the log for full details compile: [echo] Project: metastore jar: [echo] Project: metastore [jar] Building jar: /data/hive-ptest/working/apache-svn-trunk-source/= build/metastore/hive-metastore-0.13.0-SNAPSHOT.jar [ivy:publish] :: delivering :: org.apache.hive#hive-metastore;0.13.0-SNAPSH= OT :: 0.13.0-SNAPSHOT :: integration :: Tue Sep 17 20:37:05 EDT 2013 [ivy:publish] =09delivering ivy file to /data/hive-ptest/working/apache-svn= -trunk-source/build/metastore/ivy-0.13.0-SNAPSHOT.xml [ivy:publish] :: publishing :: org.apache.hive#hive-metastore [ivy:publish] =09published hive-metastore to /data/hive-ptest/working/ivy/l= ocal/org.apache.hive/hive-metastore/0.13.0-SNAPSHOT/jars/hive-metastore.jar [ivy:publish] =09published ivy to /data/hive-ptest/working/ivy/local/org.ap= ache.hive/hive-metastore/0.13.0-SNAPSHOT/ivys/ivy.xml ivy-init-settings: [echo] Project: ql check-ivy: [echo] Project: ql ivy-resolve: [echo] Project: ql [ivy:resolve] :: loading settings :: file =3D /data/hive-ptest/working/apac= he-svn-trunk-source/ivy/ivysettings.xml [ivy:resolve] :: resolving dependencies :: org.apache.hive#hive-exec;0.13.0= -SNAPSHOT [ivy:resolve] =09confs: [default] [ivy:resolve] =09found org.apache.hive#hive-metastore;0.13.0-SNAPSHOT in lo= cal [ivy:resolve] =09found org.apache.hive#hive-serde;0.13.0-SNAPSHOT in local [ivy:resolve] =09found org.apache.hive#hive-common;0.13.0-SNAPSHOT in local [ivy:resolve] =09found org.apache.hive#hive-shims;0.13.0-SNAPSHOT in local [ivy:resolve] =09found commons-cli#commons-cli;1.2 in maven2 [ivy:resolve] =09found org.apache.commons#commons-compress;1.4.1 in maven2 [ivy:resolve] =09found org.tukaani#xz;1.0 in maven2 [ivy:resolve] =09found commons-lang#commons-lang;2.4 in maven2 [ivy:resolve] =09found log4j#log4j;1.2.16 in maven2 [ivy:resolve] =09found org.slf4j#slf4j-api;1.6.1 in maven2 [ivy:resolve] =09found org.slf4j#slf4j-log4j12;1.6.1 in maven2 [ivy:resolve] =09found org.mockito#mockito-all;1.8.2 in maven2 [ivy:resolve] 
=09found org.apache.thrift#libfb303;0.9.0 in maven2 [ivy:resolve] =09found commons-codec#commons-codec;1.4 in maven2 [ivy:resolve] =09found org.apache.avro#avro;1.7.1 in maven2 [ivy:resolve] =09found org.apache.avro#avro-mapred;1.7.1 in maven2 [ivy:resolve] =09found org.antlr#antlr;3.4 in maven2 [ivy:resolve] =09found org.antlr#antlr-runtime;3.4 in maven2 [ivy:resolve] =09found org.antlr#ST4;4.0.4 in maven2 [ivy:resolve] =09found com.jolbox#bonecp;0.7.1.RELEASE in maven2 [ivy:resolve] =09found com.google.guava#guava;r08 in maven2 [ivy:resolve] =09found commons-pool#commons-pool;1.5.4 in maven2 [ivy:resolve] =09found org.datanucleus#datanucleus-api-jdo;3.2.1 in maven2 [ivy:resolve] =09found org.datanucleus#datanucleus-core;3.2.2 in maven2 [ivy:resolve] =09found org.datanucleus#datanucleus-rdbms;3.2.1 in maven2 [ivy:resolve] =09found javax.jdo#jdo-api;3.0.1 in maven2 [ivy:resolve] =09found org.apache.derby#derby;10.4.2.0 in maven2 [ivy:resolve] =09found com.google.protobuf#protobuf-java;2.5.0 in maven2 [ivy:resolve] =09found org.iq80.snappy#snappy;0.2 in maven2 [ivy:resolve] =09found com.esotericsoftware.kryo#kryo;2.22-SNAPSHOT in sona= type-snapshot [ivy:resolve] =09found com.esotericsoftware.reflectasm#reflectasm;1.07 in m= aven2 [ivy:resolve] =09found org.ow2.asm#asm;4.0 in maven2 [ivy:resolve] =09found com.esotericsoftware.minlog#minlog;1.2 in maven2 [ivy:resolve] =09found org.objenesis#objenesis;1.2 in maven2 [ivy:resolve] =09found org.json#json;20090211 in maven2 [ivy:resolve] =09found commons-collections#commons-collections;3.2.1 in mav= en2 [ivy:resolve] =09found commons-configuration#commons-configuration;1.6 in m= aven2 [ivy:resolve] =09found com.googlecode.javaewah#JavaEWAH;0.3.2 in maven2 [ivy:resolve] =09found javolution#javolution;5.5.1 in maven2 [ivy:resolve] =09found jline#jline;0.9.94 in maven2 [ivy:resolve] =09found com.google.guava#guava;11.0.2 in maven2 [ivy:resolve] =09found com.google.code.findbugs#jsr305;1.3.9 in maven2 [ivy:resolve] 
downloading /data/hive-ptest/working/ivy/local/org.apache.hiv= e/hive-metastore/0.13.0-SNAPSHOT/jars/hive-metastore.jar ... [ivy:resolve] .................................................... (3257kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.apache.hive#hive-metastore;0.13.0-SNAPSH= OT!hive-metastore.jar (51ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/iq80/snappy/sna= ppy/0.2/snappy-0.2.jar ... [ivy:resolve] .. (47kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.iq80.snappy#snappy;0.2!snappy.jar (12ms) [ivy:resolve] downloading https://oss.sonatype.org/content/repositories/sna= pshots/com/esotericsoftware/kryo/kryo/2.22-SNAPSHOT/kryo-2.22-20130903.0847= 24-39.jar ... [ivy:resolve] .............................................................= ............................... (420kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.esotericsoftware.kryo#kryo;2.22-SNAPSHOT= !kryo.jar(bundle) (676ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/json/json/20090= 211/json-20090211.jar ... [ivy:resolve] .. (44kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.json#json;20090211!json.jar (9ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/googlecode/java= ewah/JavaEWAH/0.3.2/JavaEWAH-0.3.2.jar ... [ivy:resolve] .. (16kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.googlecode.javaewah#JavaEWAH;0.3.2!JavaE= WAH.jar (5ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/javolution/javoluti= on/5.5.1/javolution-5.5.1.jar ... [ivy:resolve] ........ (385kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] javolution#javolution;5.5.1!javolution.jar(b= undle) (11ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/jline/jline/0.9.94/= jline-0.9.94.jar ... [ivy:resolve] ... (85kB) [ivy:resolve] .. 
(0kB) [ivy:resolve] =09[SUCCESSFUL ] jline#jline;0.9.94!jline.jar (23ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/esotericsoftwar= e/reflectasm/reflectasm/1.07/reflectasm-1.07-shaded.jar ... [ivy:resolve] ... (64kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.esotericsoftware.reflectasm#reflectasm;1= .07!reflectasm.jar (40ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/com/esotericsoftwar= e/minlog/minlog/1.2/minlog-1.2.jar ... [ivy:resolve] .. (4kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] com.esotericsoftware.minlog#minlog;1.2!minlo= g.jar (9ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/objenesis/objen= esis/1.2/objenesis-1.2.jar ... [ivy:resolve] .. (35kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.objenesis#objenesis;1.2!objenesis.jar (2= 2ms) [ivy:resolve] downloading http://repo1.maven.org/maven2/org/ow2/asm/asm/4.0= /asm-4.0.jar ... [ivy:resolve] .. (44kB) [ivy:resolve] .. (0kB) [ivy:resolve] =09[SUCCESSFUL ] org.ow2.asm#asm;4.0!asm.jar (26ms) [ivy:resolve] :: resolution report :: resolve 17137ms :: artifacts dl 922ms [ivy:resolve] =09:: evicted modules: [ivy:resolve] =09com.google.guava#guava;r08 by [com.google.guava#guava;11.0= .2] in [default] [ivy:resolve] =09org.slf4j#slf4j-api;1.5.10 by [org.slf4j#slf4j-api;1.6.1] = in [default] =09--------------------------------------------------------------------- =09| | modules || artifacts | =09| conf | number| search|dwnlded|evicted|| number|dwnlded| =09--------------------------------------------------------------------- =09| default | 43 | 11 | 11 | 2 || 41 | 11 | =09--------------------------------------------------------------------- [ivy:report] Processing /data/hive-ptest/working/apache-svn-trunk-source/bu= ild/ivy/resolution-cache/org.apache.hive-hive-exec-default.xml to /data/hiv= e-ptest/working/apache-svn-trunk-source/build/ivy/report/org.apache.hive-hi= ve-exec-default.html make-pom: [echo] Project: ql [echo] 
Writing POM to /data/hive-ptest/working/apache-svn-trunk-sourc= e/build/ql/pom.xml [ivy:makepom] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.= file' instead [ivy:makepom] :: loading settings :: file =3D /data/hive-ptest/working/apac= he-svn-trunk-source/ivy/ivysettings.xml create-dirs: [echo] Project: ql init: [echo] Project: ql ql-init: [echo] Project: ql [mkdir] Created dir: /data/hive-ptest/working/apache-svn-trunk-source/b= uild/ql/gen/antlr/gen-java/org/apache/hadoop/hive/ql/parse ivy-retrieve: [echo] Project: ql [ivy:retrieve] :: retrieving :: org.apache.hive#hive-exec [ivy:retrieve] =09confs: [default] [ivy:retrieve] =0915 artifacts copied, 26 already retrieved (5813kB/29ms) build-grammar: [echo] Project: ql [echo] Building Grammar /data/hive-ptest/working/apache-svn-trunk-sour= ce/ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g .... [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:867:5:=20 [java] Decision can match input such as "Identifier KW_RENAME KW_TO" u= sing multiple alternatives: 1, 10 [java]=20 [java] As a result, alternative(s) 10 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1168:5:=20 [java] Decision can match input such as "KW_TEXTFILE" using multiple a= lternatives: 2, 6 [java]=20 [java] As a result, alternative(s) 6 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1168:5:=20 [java] Decision can match input such as "KW_SEQUENCEFILE" using multip= le alternatives: 1, 6 [java]=20 [java] As a result, alternative(s) 6 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1168:5:=20 [java] Decision can match input such as 
"KW_ORCFILE" using multiple al= ternatives: 4, 6 [java]=20 [java] As a result, alternative(s) 6 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1168:5:=20 [java] Decision can match input such as "KW_RCFILE" using multiple alt= ernatives: 3, 6 [java]=20 [java] As a result, alternative(s) 6 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1181:23:=20 [java] Decision can match input such as "KW_KEY_TYPE" using multiple a= lternatives: 2, 4 [java]=20 [java] As a result, alternative(s) 4 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1181:23:=20 [java] Decision can match input such as "KW_ELEM_TYPE" using multiple = alternatives: 1, 4 [java]=20 [java] As a result, alternative(s) 4 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1181:23:=20 [java] Decision can match input such as "KW_VALUE_TYPE" using multiple= alternatives: 3, 4 [java]=20 [java] As a result, alternative(s) 4 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1188:23:=20 [java] Decision can match input such as "KW_VALUE_TYPE" using multiple= alternatives: 3, 4 [java]=20 [java] As a result, alternative(s) 4 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1188:23:=20 [java] Decision can match input such as "KW_ELEM_TYPE" using multiple = alternatives: 1, 4 [java]=20 [java] As a result, alternative(s) 4 were disabled for that input [java] warning(200): 
/data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1188:23:=20 [java] Decision can match input such as "KW_KEY_TYPE" using multiple a= lternatives: 2, 4 [java]=20 [java] As a result, alternative(s) 4 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1206:29:=20 [java] Decision can match input such as "KW_PRETTY KW_PARTITION" using= multiple alternatives: 3, 4 [java]=20 [java] As a result, alternative(s) 4 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1206:29:=20 [java] Decision can match input such as "KW_PRETTY {KW_ADD..KW_AFTER, = KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE..KW_COLLECTION, KW_= COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE, = KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT= , KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GR= OUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW= _LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DR= OP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PA= RTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_S= EMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_T= RIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_V= IEW, KW_WHILE, KW_WITH}" using multiple alternatives: 3, 4 [java]=20 [java] As a result, alternative(s) 4 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1206:29:=20 [java] Decision can match input such as "KW_PRETTY Identifier" using m= ultiple alternatives: 3, 4 [java]=20 [java] As a result, 
alternative(s) 4 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1206:29:=20 [java] Decision can match input such as "KW_FORMATTED {KW_ADD..KW_AFTE= R, KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE..KW_COLLECTION, = KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABL= E, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXP= ORT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW= _GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS.= .KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO= _DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW= _PARTITIONED..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, K= W_SEMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, K= W_TRIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, K= W_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 4 [java]=20 [java] As a result, alternative(s) 4 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1206:29:=20 [java] Decision can match input such as "KW_FORMATTED KW_PARTITION" us= ing multiple alternatives: 1, 4 [java]=20 [java] As a result, alternative(s) 4 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1206:29:=20 [java] Decision can match input such as "KW_FORMATTED Identifier" usin= g multiple alternatives: 1, 4 [java]=20 [java] As a result, alternative(s) 4 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1477:116:=20 [java] Decision can match input such as "KW_STORED 
KW_AS KW_DIRECTORIE= S" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1600:5:=20 [java] Decision can match input such as "KW_STORED KW_AS KW_SEQUENCEFI= LE" using multiple alternatives: 1, 7 [java]=20 [java] As a result, alternative(s) 7 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1600:5:=20 [java] Decision can match input such as "KW_STORED KW_AS KW_TEXTFILE" = using multiple alternatives: 2, 7 [java]=20 [java] As a result, alternative(s) 7 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1600:5:=20 [java] Decision can match input such as "KW_STORED KW_AS KW_INPUTFORMA= T" using multiple alternatives: 5, 7 [java]=20 [java] As a result, alternative(s) 7 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1600:5:=20 [java] Decision can match input such as "KW_STORED KW_AS KW_RCFILE" us= ing multiple alternatives: 3, 7 [java]=20 [java] As a result, alternative(s) 7 were disabled for that input [java] warning(200): /data/hive-ptest/working/apache-svn-trunk-source/= ql/src/java/org/apache/hadoop/hive/ql/parse/HiveParser.g:1600:5:=20 [java] Decision can match input such as "KW_STORED KW_AS KW_ORCFILE" u= sing multiple alternatives: 4, 7 [java]=20 [java] As a result, alternative(s) 7 were disabled for that input [java] warning(200): SelectClauseParser.g:149:5:=20 [java] Decision can match input such as "KW_NULL DOT Identifier" using= multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): 
SelectClauseParser.g:149:5:=20 [java] Decision can match input such as "KW_NULL DOT {KW_ADD..KW_AFTER= , KW_ALTER..KW_ANALYZE, KW_ARCHIVE..KW_CASCADE, KW_CHANGE..KW_COLLECTION, K= W_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES..KW_DISABLE= , KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPO= RT, KW_EXTERNAL..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_= GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..= KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_= DROP..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_= PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_S= EMI..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_T= RIGGER..KW_UNARCHIVE, KW_UNDO..KW_UNIONTYPE, KW_UNLOCK..KW_VALUE_TYPE, KW_V= IEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:127:2:=20 [java] Decision can match input such as "KW_LATERAL KW_VIEW KW_OUTER" = using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:25:=20 [java] Decision can match input such as "LPAREN StringLiteral EQUAL" u= sing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:25:=20 [java] Decision can match input such as "LPAREN StringLiteral COMMA" u= sing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:25:=20 [java] Decision can match input such as "LPAREN StringLiteral RPAREN" = using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] 
Decision can match input such as "Identifier LPAREN KW_DATE" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN BigintLiter= al" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN KW_FALSE" u= sing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN KW_NOT" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN KW_TRUE" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN TinyintLite= ral" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN Identifier"= using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN KW_UNIONTYP= E" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN SmallintLit= eral" using multiple alternatives: 1, 2 [java]=20 [java] As a 
result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN KW_CASE" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN KW_IF" usin= g multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN KW_NULL" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN CharSetName= " using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN KW_STRUCT" = using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN Number" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN StringLiter= al" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN DecimalLite= ral" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match 
input such as "Identifier LPAREN LPAREN" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN KW_CAST" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN KW_MAP" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN {MINUS, PLU= S, TILDE}" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN {KW_ADD..KW= _AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE..KW_= COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASE= S, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCA= PED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_F= ORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPER= TIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..= KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_= ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..K= W_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TAB= LES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_= TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIE= W, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): 
FromClauseParser.g:179:68:=20 [java] Decision can match input such as "Identifier LPAREN KW_ARRAY" u= sing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_DATE" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN BigintLiter= al" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_FALSE" u= sing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_NOT" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_TRUE" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN TinyintLite= ral" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN Identifier"= using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_UNIONTYP= E" using multiple alternatives: 
1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN SmallintLit= eral" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_CASE" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_IF" usin= g multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_NULL" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN CharSetName= " using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_STRUCT" = using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN Number" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN StringLiter= al" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 
[java] Decision can match input such as "Identifier LPAREN DecimalLite= ral" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN LPAREN" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_CAST" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_MAP" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN {MINUS, PLU= S, TILDE}" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN {KW_ADD..KW= _AFTER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE..KW_= COLLECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASE= S, KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCA= PED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_F= ORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPER= TIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..= KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_= ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..K= W_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TAB= LES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, 
KW_TRIGGER, KW_= TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIE= W, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): FromClauseParser.g:237:16:=20 [java] Decision can match input such as "Identifier LPAREN KW_ARRAY" u= sing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN Number" using m= ultiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL GREATERTHAN" u= sing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT KW_FALSE" using= multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_CASE Identifier" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL GREATERTHANORE= QUALTO" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT KW_TRUE" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL LESSTHAN" usin= g multiple alternatives: 1, 2 [java]=20 
[java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_CASE {KW_ADD..KW_AF= TER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE..KW_COL= LECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, = KW_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED= , KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORM= ATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIE= S, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_= MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORC= FILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_R= ECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES= , KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRU= NCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, = KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL LESSTHANOREQUA= LTO" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL DOT" using mul= tiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT CharSetName" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_CASE CharSetName" u= sing multiple 
alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN CharSetName" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_CASE KW_ARRAY" usin= g multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL NOTEQUAL" usin= g multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT StringLiteral" = using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL EQUAL_NS" usin= g multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN Identifier" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL {DIV..DIVIDE, = MOD, STAR}" using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL BITWISEXOR" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 
[java] Decision can match input such as "LPAREN KW_CASE KW_STRUCT" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL EQUAL" using m= ultiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT KW_ARRAY" using= multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_CASE KW_UNIONTYPE" = using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT KW_STRUCT" usin= g multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT Identifier" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT KW_NOT" using m= ultiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_CASE KW_NOT" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN KW_NOT" using m= ultiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input 
[java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT KW_DATE" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN TinyintLiteral"= using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT KW_UNIONTYPE" u= sing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL RPAREN" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN DecimalLiteral"= using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_CASE KW_NULL" using= multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN BigintLiteral" = using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_CASE StringLiteral"= using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN SmallintLiteral= " using multiple alternatives: 1, 2 
[java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL KW_AND" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_CAST LPAREN" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL BITWISEOR" usi= ng multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL KW_BETWEEN" us= ing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT {KW_ADD..KW_AFT= ER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE..KW_COLL= ECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, K= W_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED,= KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMA= TTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES= , KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_M= INUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCF= ILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RE= CORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES,= KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUN= CATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, K= W_WHILE, KW_WITH}" using multiple 
alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT KW_NULL" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT KW_CASE" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_CASE KW_CASE" using= multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN KW_CASE" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL KW_NOT" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN StringLiteral" = using multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN KW_ARRAY" using= multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NULL KW_IN" using m= ultiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as 
"LPAREN LPAREN KW_FALSE" using= multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN KW_STRUCT" usin= g multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_NOT LPAREN" using m= ultiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN KW_CASE LPAREN" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN KW_NULL" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN LPAREN" using m= ultiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN KW_UNIONTYPE" u= sing multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN KW_TRUE" using = multiple alternatives: 1, 2 [java]=20 [java] As a result, alternative(s) 2 were disabled for that input [java] warning(200): IdentifiersParser.g:68:4:=20 [java] Decision can match input such as "LPAREN LPAREN {KW_ADD..KW_AFT= ER, KW_ALTER..KW_ANALYZE, KW_ARCHIVE, KW_AS..KW_CASCADE, KW_CHANGE..KW_COLL= ECTION, KW_COLUMNS..KW_CREATE, KW_CUBE, KW_CURSOR..KW_DATA, KW_DATABASES, K= 
W_DATETIME..KW_DISABLE, KW_DISTRIBUTE..KW_ELEM_TYPE, KW_ENABLE, KW_ESCAPED, KW_EXCLUSIVE..KW_EXPORT, KW_EXTERNAL, KW_FETCH..KW_FLOAT, KW_FOR..KW_FORMATTED, KW_FULL, KW_FUNCTIONS..KW_GROUPING, KW_HOLD_DDLTIME..KW_IDXPROPERTIES, KW_IGNORE..KW_ITEMS, KW_KEYS..KW_LEFT, KW_LIKE..KW_LONG, KW_MAPJOIN..KW_MINUS, KW_MSCK..KW_NOSCAN, KW_NO_DROP, KW_OF..KW_OFFLINE, KW_OPTION, KW_ORCFILE..KW_OUTPUTFORMAT, KW_OVERWRITE, KW_PARTITION..KW_PLUS, KW_PRETTY..KW_RECORDWRITER, KW_REGEXP..KW_SCHEMAS, KW_SEMI..KW_STRING, KW_TABLE..KW_TABLES, KW_TBLPROPERTIES..KW_TEXTFILE, KW_TIMESTAMP..KW_TOUCH, KW_TRIGGER, KW_TRUNCATE..KW_UNARCHIVE, KW_UNDO..KW_UNION, KW_UNLOCK..KW_VALUE_TYPE, KW_VIEW, KW_WHILE, KW_WITH}" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT BigintLiteral" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_IF" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_IF" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_IF" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL AMPERSAND" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LSQUARE" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_MAP" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_MAP" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_MAP" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_DATE" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL {KW_LIKE, KW_REGEXP, KW_RLIKE}" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE Number" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL LPAREN" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT Number" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE DecimalLiteral" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE TinyintLiteral" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL {MINUS, PLUS}" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE SmallintLiteral" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN CharSetName CharSetLiteral" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE BigintLiteral" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_DATE" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_TRUE" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_WHEN" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_FALSE" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_IS" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN StringLiteral StringLiteral" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN {MINUS, PLUS, TILDE}" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_DATE StringLiteral" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NULL KW_OR" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT TinyintLiteral" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT SmallintLiteral" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT KW_CAST" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_CASE KW_CAST" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN LPAREN KW_CAST" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:68:4: Decision can match input such as "LPAREN KW_NOT DecimalLiteral" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:108:5: Decision can match input such as "KW_ORDER KW_BY LPAREN" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:121:5: Decision can match input such as "KW_CLUSTER KW_BY LPAREN" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:133:5: Decision can match input such as "KW_PARTITION KW_BY LPAREN" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:144:5: Decision can match input such as "KW_DISTRIBUTE KW_BY LPAREN" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:155:5: Decision can match input such as "KW_SORT KW_BY LPAREN" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:172:7: Decision can match input such as "STAR" using multiple alternatives: 1, 2; alternative(s) 2 were disabled for that input
    [java] warning(200): IdentifiersParser.g:185:5: Decision can match input such as "KW_STRUCT" using multiple alternatives: 4, 6; alternative(s) 6 were disabled for that input
    [java] warning(200): IdentifiersParser.g:185:5: Decision can match input such as "KW_UNIONTYPE" using multiple alternatives: 5, 6; alternative(s) 6 were disabled for that input
    [java] warning(200): IdentifiersParser.g:185:5: Decision can match input such as "KW_ARRAY" using multiple alternatives: 2, 6; alternative(s) 6 were disabled for that input
    [java] warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_NULL" using multiple alternatives: 1, 8; alternative(s) 8 were disabled for that input
    [java] warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_DATE StringLiteral" using multiple alternatives: 2, 3; alternative(s) 3 were disabled for that input
    [java] warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_FALSE" using multiple alternatives: 3, 8; alternative(s) 8 were disabled for that input
    [java] warning(200): IdentifiersParser.g:267:5: Decision can match input such as "KW_TRUE" using multiple alternatives: 3, 8; alternative(s) 8 were disabled for that input
    [java] warning(200): IdentifiersParser.g:390:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_CLUSTER KW_BY" using multiple alternatives: 2, 7; alternative(s) 7 were disabled for that input
    [java] warning(200): IdentifiersParser.g:390:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_MAP LPAREN" using multiple alternatives: 2, 7; alternative(s) 7 were disabled for that input
    [java] warning(200): IdentifiersParser.g:390:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_GROUP KW_BY" using multiple alternatives: 2, 7; alternative(s) 7 were disabled for that input
    [java] warning(200): IdentifiersParser.g:390:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_INTO" using multiple alternatives: 2, 7; alternative(s) 7 were disabled for that input
    [java] warning(200): IdentifiersParser.g:390:5: Decision can match input such as "KW_BETWEEN KW_MAP LPAREN" using multiple alternatives: 6, 7; alternative(s) 7 were disabled for that input
    [java] warning(200): IdentifiersParser.g:390:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_ORDER KW_BY" using multiple alternatives: 2, 7; alternative(s) 7 were disabled for that input
    [java] warning(200): IdentifiersParser.g:390:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_LATERAL KW_VIEW" using multiple alternatives: 2, 7; alternative(s) 7 were disabled for that input
    [java] warning(200): IdentifiersParser.g:390:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_SORT KW_BY" using multiple alternatives: 2, 7; alternative(s) 7 were disabled for that input
    [java] warning(200): IdentifiersParser.g:390:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_INSERT KW_OVERWRITE" using multiple alternatives: 2, 7; alternative(s) 7 were disabled for that input
    [java] warning(200): IdentifiersParser.g:390:5: Decision can match input such as "{KW_LIKE, KW_REGEXP, KW_RLIKE} KW_DISTRIBUTE KW_BY" using multiple alternatives: 2, 7; alternative(s) 7 were disabled for that input
    [java] warning(200): IdentifiersParser.g:514:5: Decision can match input such as "{AMPERSAND..BITWISEXOR, DIV..DIVIDE, EQUAL..EQUAL_NS, GREATERTHAN..GREATERTHANOREQUALTO, KW_AND, KW_ARRAY, KW_BETWEEN..KW_BOOLEAN, KW_CASE, KW_DOUBLE, KW_FLOAT, KW_IF, KW_IN, KW_INT, KW_LIKE, KW_MAP, KW_NOT, KW_OR, KW_REGEXP, KW_RLIKE, KW_SMALLINT, KW_STRING..KW_STRUCT, KW_TINYINT, KW_UNIONTYPE, KW_WHEN, LESSTHAN..LESSTHANOREQUALTO, MINUS..NOTEQUAL, PLUS, STAR, TILDE}" using multiple alternatives: 1, 3; alternative(s) 3 were disabled for that input

compile:
     [echo] Project: ql
    [javac] Compiling 922 source files to /data/hive-ptest/working/apache-svn-trunk-source/build/ql/classes
    [javac] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFCollectSet.java:25: package org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMkCollectionEvaluator does not exist
    [javac] import org.apache.hadoop.hive.ql.udf.generic.GenericUDAFMkCollectionEvaluator.BufferType;
    [javac]                                              ^
    [javac] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/exec/FunctionRegistry.java:386: cannot find symbol
    [javac] symbol  : class GenericUDAFCollectList
    [javac] location: class org.apache.hadoop.hive.ql.exec.FunctionRegistry
    [javac]     registerGenericUDAF("collect_list", new GenericUDAFCollectList());
    [javac]                                             ^
    [javac] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFCollectSet.java:52: cannot find symbol
    [javac] symbol  : class GenericUDAFMkCollectionEvaluator
    [javac] location: class org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCollectSet
    [javac]     return new GenericUDAFMkCollectionEvaluator(BufferType.SET);
    [javac]            ^
    [javac] /data/hive-ptest/working/apache-svn-trunk-source/ql/src/java/org/apache/hadoop/hive/ql/udf/generic/GenericUDAFCollectSet.java:52: cannot find symbol
    [javac] symbol  : variable BufferType
    [javac] location: class org.apache.hadoop.hive.ql.udf.generic.GenericUDAFCollectSet
    [javac]     return new GenericUDAFMkCollectionEvaluator(BufferType.SET);
    [javac]                                                 ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] 4 errors

BUILD FAILED
/data/hive-ptest/working/apache-svn-trunk-source/build.xml:327: The following error occurred while executing this line:
/data/hive-ptest/working/apache-svn-trunk-source/build.xml:166: The following error occurred while executing this line:
/data/hive-ptest/working/apache-svn-trunk-source/build.xml:168: The following error occurred while executing this line:
/data/hive-ptest/working/apache-svn-trunk-source/ql/build.xml:198: Compile failed; see the compiler error output for details.

Total time: 4 minutes 32 seconds
+ exit 1
'
{noformat}

This message is automatically generated.
> HiveServer2 jdbc ResultSet.close should free up resources on server side
> ------------------------------------------------------------------------
>
>                 Key: HIVE-5156
>                 URL: https://issues.apache.org/jira/browse/HIVE-5156
>             Project: Hive
>          Issue Type: Bug
>          Components: HiveServer2
>    Affects Versions: 0.12.0
>            Reporter: Vaibhav Gumashta
>            Assignee: Vaibhav Gumashta
>            Priority: Minor
>        Attachments: HIVE-5156.D12837.3.patch
>
>
> ResultSet.close does not free up any resources (tmp files etc) on hive server.

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
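[Editorial note] The client-side pattern this issue concerns can be sketched with a minimal example. The `Tracked` class below is a hypothetical stand-in, not Hive's JDBC driver: try-with-resources closes resources in reverse declaration order, so a client that opens Connection, then Statement, then ResultSet closes the ResultSet first. HIVE-5156 asks that this first close also release server-side resources (tmp files etc.) rather than waiting for the statement or session to close.

```java
import java.util.ArrayList;
import java.util.List;

public class CloseOrder {
    // Records the order in which close() is invoked.
    static final List<String> closed = new ArrayList<>();

    // Hypothetical stand-in for Connection/Statement/ResultSet.
    static class Tracked implements AutoCloseable {
        final String name;
        Tracked(String name) { this.name = name; }
        @Override public void close() { closed.add(name); }
    }

    public static void main(String[] args) {
        try (Tracked conn = new Tracked("connection");
             Tracked stmt = new Tracked("statement");
             Tracked rs = new Tracked("resultset")) {
            // ... a real client would iterate over the ResultSet here ...
        }
        // try-with-resources closes in reverse declaration order:
        // resultset first, then statement, then connection.
        System.out.println(closed);  // prints [resultset, statement, connection]
    }
}
```

Because the ResultSet is the first thing closed, having its close() propagate a cleanup request to HiveServer2 frees server resources at the earliest possible point.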