hadoop-hive-dev mailing list archives

From Ashish Thusoo <athu...@facebook.com>
Subject RE: *UNIT TEST FAILURE for apache HIVE* Hadoop.Version=0.17.1 based on SVN Rev# 730302.4
Date Wed, 31 Dec 2008 17:49:31 GMT
That explains it. Something is wrong with our filer configuration then; I do see intermittent problems when our build tries to delete directories. Now that the tmp directory is in build/ql/tmp, we have started hitting this in tests. Originally it happened as part of test cleanup, when we tried to delete build. No idea why this is happening. Is it possible to run this on the local disk for now, while we try to figure out what is going wrong with the filer?

Ashish
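
[Editorial note: the retry idea below is not part of the Hive build; it is a minimal sketch of the kind of workaround discussed above, assuming the intermittent failures are transient NFS errors (e.g. lingering .nfsXXXX files briefly holding a directory open). The class and method names are hypothetical.]

```java
import java.io.File;

public class RetryDelete {
    // Try to delete a directory tree, retrying a few times to ride out
    // transient filer/NFS errors that make a delete fail once and then succeed.
    static boolean deleteWithRetry(File dir, int attempts, long waitMillis)
            throws InterruptedException {
        for (int i = 0; i < attempts; i++) {
            deleteRecursively(dir);
            if (!dir.exists()) {
                return true;
            }
            Thread.sleep(waitMillis); // give the filer time to release handles
        }
        return !dir.exists();
    }

    static void deleteRecursively(File f) {
        File[] children = f.listFiles();
        if (children != null) {
            for (File c : children) {
                deleteRecursively(c);
            }
        }
        f.delete(); // may fail transiently on a filer; the caller retries
    }
}
```

Whether a bounded retry like this actually masks the problem depends on why the filer refuses the delete; running the build on local disk, as suggested above, sidesteps the question entirely.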
________________________________________
From: Murli Varadachari
Sent: Wednesday, December 31, 2008 9:38 AM
To: hive-dev@hadoop.apache.org; Ashish Thusoo
Cc: Murli Varadachari
Subject: Re: *UNIT TEST FAILURE for apache HIVE* Hadoop.Version=0.17.1 based on SVN Rev# 730302.4

It is on a filer! Local disk space on the build hosts is rather limited.

Cheers
murli


On 12/31/08 9:03 AM, "Ashish Thusoo" <athusoo@facebook.com> wrote:

I have seen this happen on a filer (when the build is on a filer as opposed to a local disk). Can you verify that /usr/local/continuous_builds/src/hiveopensource-0.17.1/hiveopensource_0_17_1 is not on a filer...

Thanks,
Ashish
________________________________________
From: Murli Varadachari [mvaradachari@facebook.com]
Sent: Tuesday, December 30, 2008 7:51 PM
To: hive-dev@hadoop.apache.org
Subject: *UNIT TEST FAILURE for apache HIVE* Hadoop.Version=0.17.1 based on SVN Rev# 730302.4

Compiling hiveopensource at /usr/local/continuous_builds/src/hiveopensource-0.17.1/hiveopensource_0_17_1
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Buildfile: build.xml

clean:

clean:
     [echo] Cleaning: anttasks
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks

clean:
     [echo] Cleaning: cli
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli

clean:
     [echo] Cleaning: common
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common

clean:
     [echo] Cleaning: metastore
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore
Overriding previous definition of reference to test.classpath

clean:
     [echo] Cleaning: ql
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql

clean:
     [echo] Cleaning: serde
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde

clean:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build

BUILD SUCCESSFUL
Total time: 20 seconds
Buildfile: build.xml

deploy:

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/jexl/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
[ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#common;working@devbuild001.snc1.facebook.com
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.17.1 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 132ms :: artifacts dl 5ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
        ---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  1 artifacts copied, 0 already retrieved (14096kB/368ms)

install-hadoopcore:
    [untar] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1.tar.gz into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore
    [touch] Creating /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1.installed

compile:
     [echo] Compiling: common
    [javac] Compiling 1 source file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/classes

jar:
     [echo] Jar: common
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/hive_common.jar

deploy:
     [echo] hive: common
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/classes

dynamic-serde:

compile:
     [echo] Compiling: serde
    [javac] Compiling 128 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/classes
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

jar:
     [echo] Jar: serde
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/hive_serde.jar

deploy:
     [echo] hive: serde
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/classes

model-compile:
    [javac] Compiling 8 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes

core-compile:
     [echo] Compiling:
    [javac] Compiling 38 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/classes
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

model-enhance:
     [echo] Enhancing model classes with JPOX stuff....
     [java] JPOX Enhancer (version 1.2.2) : Enhancement of classes

     [java] JPOX Enhancer completed with success for 8 classes. Timings : input=170 ms, enhance=180 ms, total=350 ms. Consult the log for full details

compile:

jar:
     [echo] Jar: metastore
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/hive_metastore.jar

deploy:
     [echo] hive: metastore
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build
Overriding previous definition of reference to test.classpath

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes

ql-init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/gen-java/org/apache/hadoop/hive/ql/parse

build-grammar:
     [echo] Building Grammar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g  ....
     [java] ANTLR Parser Generator  Version 3.0.1 (August 13, 2007)  1989-2007

compile-ant-tasks:

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/test/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve]  confs: [default]
[ivy:retrieve] :: resolution report :: resolve 10ms :: artifacts dl 0ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/13ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks
    [javac] Compiling 2 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/classes
    [javac] Note: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.

deploy-ant-tasks:

init:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve]  confs: [default]
[ivy:retrieve] :: resolution report :: resolve 16ms :: artifacts dl 0ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/2ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks

jar:
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/classes/org/apache/hadoop/hive/ant
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/anttasks/hive_anttasks.jar

deploy:
     [echo] hive: anttasks
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build

configure:
     [copy] Copying 239 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/java

compile:
     [echo] Compiling: ql
    [javac] Compiling 241 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

jar:
     [echo] Jar: ql
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/lib/commons-jexl-1.1.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/jexl/classes
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/libthrift.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/thrift/classes
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/commons-lang-2.4.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/commons-lang/classes
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/hive_exec.jar

deploy:
     [echo] hive: ql
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#cli;working@devbuild001.snc1.facebook.com
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.17.1 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 43ms :: artifacts dl 2ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
        ---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#cli
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/4ms)

install-hadoopcore:

compile:
     [echo] Compiling: cli
    [javac] Compiling 5 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/classes
    [javac] Note: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/cli/src/java/org/apache/hadoop/hive/cli/OptionsProcessor.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

jar:
     [echo] Jar: cli
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/hive_cli.jar

deploy:
     [echo] hive: cli
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/classes

core-compile:
    [javac] Compiling 6 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/classes

compile:

jar:
     [echo] Jar: service
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/hive_service.jar

deploy:
     [echo] hive: service
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build

package:
     [echo] Deploying Hive jars to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/files
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/queries
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/py
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/php
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin
     [copy] Copying 5 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin/ext
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/bin
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/conf
     [copy] Copying 6 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/php
     [copy] Copying 12 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/py
     [copy] Copied 3 empty directories to 1 empty directory under /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib/py
     [copy] Copying 35 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/lib
     [copy] Copying 16 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/files
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist
     [copy] Copying 41 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/dist/examples/queries

BUILD SUCCESSFUL
Total time: 40 seconds
RUNNING TEST FOR HIVE OPENSOURCE - ant test
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Buildfile: build.xml

clean-test:

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test
Overriding previous definition of reference to test.classpath

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test

BUILD SUCCESSFUL
Total time: 1 second
Buildfile: build.xml

clean-test:

clean-test:

clean-test:

clean-test:
Overriding previous definition of reference to test.classpath

clean-test:

clean-test:

clean-test:

deploy:

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/common/test/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
[ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#common;working@devbuild001.snc1.facebook.com
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.17.1 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 96ms :: artifacts dl 3ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
        ---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/5ms)

install-hadoopcore:

compile:
     [echo] Compiling: common

jar:
     [echo] Jar: common

deploy:
     [echo] hive: common

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/serde/test/classes

dynamic-serde:

compile:
     [echo] Compiling: serde

jar:
     [echo] Jar: serde

deploy:
     [echo] hive: serde

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/classes

model-compile:

core-compile:
     [echo] Compiling:

model-enhance:

compile:

jar:
     [echo] Jar: metastore

deploy:
     [echo] hive: metastore
Overriding previous definition of reference to test.classpath

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes

ql-init:

build-grammar:

compile-ant-tasks:

init:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve]  confs: [default]
[ivy:retrieve] :: resolution report :: resolve 37ms :: artifacts dl 0ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/12ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

init:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve]  confs: [default]
[ivy:retrieve] :: resolution report :: resolve 9ms :: artifacts dl 0ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/3ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks

jar:

deploy:
     [echo] hive: anttasks

configure:

compile:
     [echo] Compiling: ql
    [javac] Compiling 8 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes

jar:
     [echo] Jar: ql
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/lib/commons-jexl-1.1.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/jexl/classes
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/libthrift.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/thrift/classes
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/lib/commons-lang-2.4.jar into /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/commons-lang/classes

deploy:
     [echo] hive: ql

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/cli/test/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#cli;working@devbuild001.snc1.facebook.com
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  found hadoop#core;0.17.1 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 29ms :: artifacts dl 2ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
        ---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#cli
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 1 already retrieved (0kB/4ms)

install-hadoopcore:

compile:
     [echo] Compiling: cli

jar:
     [echo] Jar: cli

deploy:
     [echo] hive: cli

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/service/test/classes

core-compile:

compile:

jar:
     [echo] Jar: service

deploy:
     [echo] hive: service

test:

test:
     [echo] Nothing to do!

test:
     [echo] Nothing to do!

test-conditions:

gen-test:

init:

model-compile:

core-compile:
     [echo] Compiling:

model-enhance:

compile:

compile-test:
    [javac] Compiling 14 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/classes

test-jar:

test-init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/data
     [copy] Copying 18 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/data
     [copy] Copied 5 empty directories to 3 empty directories under /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/data

test:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/metastore/test/logs
    [junit] Running org.apache.hadoop.hive.metastore.TestAlter
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.069 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestCreateDB
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.041 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestDBGetName
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.033 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestDrop
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.083 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestGetDBs
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.043 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestGetSchema
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.069 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestGetTable
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.063 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestGetTables
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.072 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestHiveMetaStore
    [junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 9.318 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestPartitions
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.066 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestTableExists
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.055 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestTablePath
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.061 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestTruncate
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.074 sec
Overriding previous definition of reference to test.classpath

test-conditions:

init:

compile-ant-tasks:

init:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve]  confs: [default]
[ivy:retrieve] :: resolution report :: resolve 31ms :: artifacts dl 0ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/3ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

init:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve]  confs: [default]
[ivy:retrieve] :: resolution report :: resolve 9ms :: artifacts dl 0ms
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve]  confs: [default]
[ivy:retrieve]  0 artifacts copied, 0 already retrieved (0kB/2ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks

jar:

deploy:
     [echo] hive: anttasks

gen-test:
 [qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
 [qtestgen] Dec 30, 2008 7:41:10 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
 [qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/ql/parse/TestParse.java from template TestParse.vm
 [qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
 [qtestgen] Dec 30, 2008 7:41:10 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
 [qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/ql/parse/TestParseNegative.java from template TestParseNegative.vm
 [qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
 [qtestgen] Dec 30, 2008 7:41:10 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
 [qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/cli/TestCliDriver.java from template TestCliDriver.vm
 [qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates
 [qtestgen] Dec 30, 2008 7:41:10 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/templates'
 [qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/src/org/apache/hadoop/hive/cli/TestNegativeCliDriver.java from template TestNegativeCliDriver.vm

ql-init:

build-grammar:

configure:

compile:
     [echo] Compiling: ql
    [javac] Compiling 8 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/classes

compile-test:
    [javac] Compiling 15 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] Compiling 4 source files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/classes

test-jar:
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar

test-init:
     [copy] Copying 18 files to /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data
     [copy] Copied 4 empty directories to 2 empty directories under /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data

test:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/logs
    [junit] Running org.apache.hadoop.hive.cli.TestCliDriver
    [junit] Begin query: mapreduce1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] plan = /tmp/plan60431.xml
    [junit] 08/12/30 19:41:25 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:41:26 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:41:26 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:41:26 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:41:26 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:41:26 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:41:26 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:41:26 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:41:26 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:41:26 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:41:26 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:41:26 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:41:26 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:26 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:41:26 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:26 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:41:26 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:41:26 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: alias=src
    [junit] 08/12/30 19:41:26 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:26 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:41:26 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:26 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:41:27 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 19:41:27 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:27 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:27 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:41:27 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:41:27 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-834949849
    [junit] 08/12/30 19:41:27 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:41:27 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:41:27 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:41:27 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:41:27 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:27 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:27 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:27 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:27 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:41:27 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:41:27 INFO mapred.TaskRunner: Task 'reduce_cy2095' done.
    [junit] 08/12/30 19:41:27 INFO mapred.TaskRunner: Saved output of task 'reduce_cy2095' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-834949849
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:41:27 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:41:27 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/mapreduce1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/mapreduce1.q.out
    [junit] Done query: mapreduce1.q
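The `diff -I \(file:\)\|\(/tmp/.*\)` invocation above is how the test harness decides pass/fail: it compares the fresh query log against the checked-in golden output while ignoring hunks whose changed lines all match the pattern, so absolute `file:` URIs and `/tmp` scratch paths that differ per host do not cause false failures. A minimal self-contained illustration (the file names and contents here are hypothetical, not from the build):

```shell
# Two outputs that agree except for a host-specific file: URI.
printf 'result 1\nfile:/build/hostA/out\n' > expected.txt
printf 'result 1\nfile:/build/hostB/out\n' > actual.txt

# -I RE: ignore hunks in which every inserted/deleted line matches RE.
# Both differing lines match \(file:\), so diff reports no difference.
diff -I '\(file:\)\|\(/tmp/.*\)' expected.txt actual.txt && echo MATCH
```

Running this prints `MATCH`; without the `-I` flag the same comparison would exit non-zero on the differing `file:` lines.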
    [junit] Begin query: mapreduce3.q
    [junit] plan = /tmp/plan60432.xml
    [junit] 08/12/30 19:41:31 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:41:31 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:41:31 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:41:31 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:41:32 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:41:32 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:41:32 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:41:32 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:41:32 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:41:32 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:41:32 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:41:32 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:41:32 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:32 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:41:32 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:32 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:41:32 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:41:32 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: alias=src
    [junit] 08/12/30 19:41:32 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:32 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:41:32 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:32 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:32 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:41:32 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:41:32 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-728768704
    [junit] 08/12/30 19:41:32 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:41:32 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:41:33 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:41:33 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:41:33 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:33 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:33 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:33 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:33 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:41:33 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:41:33 INFO mapred.TaskRunner: Task 'reduce_1za0qz' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:41:33 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:41:33 INFO mapred.TaskRunner: Saved output of task 'reduce_1za0qz' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-728768704
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:41:34 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/mapreduce3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/mapreduce3.q.out
    [junit] Done query: mapreduce3.q
    [junit] Begin query: alter1.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/alter1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/alter1.q.out
    [junit] Done query: alter1.q
    [junit] Begin query: showparts.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/showparts.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/showparts.q.out
    [junit] Done query: showparts.q
    [junit] Begin query: mapreduce5.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_mapreduce5(TestCliDriver.java:405)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
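The `Unable to delete directory` failure above comes from `QTestUtil.cleanUp` dropping the `src` table between queries, and is consistent with the filer theory discussed in this thread: on an NFS-mounted build, a file still held open by another process is silly-renamed to `.nfsXXXX`, which keeps its parent directory non-empty and undeletable until the client reaps it. A hedged workaround sketch (not the actual Hive cleanup code; the directory here is a local stand-in for `build/ql/test/data/warehouse/src`) is to retry the delete with a short backoff:

```shell
# Hypothetical retry loop around the cleanup that intermittently fails on
# the filer. On NFS, rm -rf can "succeed" per-file yet leave .nfsXXXX
# entries behind, so re-check that the directory is really gone.
WAREHOUSE_DIR=$(mktemp -d)              # stand-in for the warehouse/src dir
touch "$WAREHOUSE_DIR/kv1.txt"
for attempt in 1 2 3; do
    if rm -rf "$WAREHOUSE_DIR" 2>/dev/null && [ ! -d "$WAREHOUSE_DIR" ]; then
        echo "deleted on attempt $attempt"
        break
    fi
    sleep 1                             # give the NFS client time to reap .nfs files
done
```

On a local disk (as Ashish suggests moving the build to) the first attempt always succeeds, which is why the failure only shows up intermittently on the filer.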
    [junit] Begin query: subq2.q
    [junit] plan = /tmp/plan60433.xml
    [junit] 08/12/30 19:41:43 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:41:43 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:41:43 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:41:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:41:43 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:41:43 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:41:43 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:41:43 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:41:43 INFO exec.MapOperator: Adding alias a:b to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:41:43 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:41:43 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:41:43 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:41:43 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:43 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:43 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:43 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:41:43 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:43 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:44 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:44 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:41:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:44 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:41:44 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:41:44 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp172943633
    [junit] 08/12/30 19:41:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:44 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:41:44 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:41:44 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:44 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:41:44 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:41:44 INFO mapred.TaskRunner: Task 'reduce_eqe5uc' done.
    [junit] 08/12/30 19:41:44 INFO mapred.TaskRunner: Saved output of task 'reduce_eqe5uc' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp172943633
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:41:44 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:41:44 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60434.xml
    [junit] 08/12/30 19:41:46 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:41:46 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1282612442/823007644.10002
    [junit] 08/12/30 19:41:46 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:41:46 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:41:47 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:41:47 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:41:47 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:41:47 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:41:47 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1282612442/823007644.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1282612442/823007644.10002/reduce_eqe5uc
    [junit] 08/12/30 19:41:47 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:41:47 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:47 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:41:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:47 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:47 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1282612442/823007644.10002/reduce_eqe5uc:0+11875
    [junit] 08/12/30 19:41:47 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:41:47 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-359312783
    [junit] 08/12/30 19:41:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:47 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:41:47 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:47 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:41:47 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:47 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:47 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:41:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:47 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:41:47 INFO exec.FilterOperator: PASSED:258
    [junit] 08/12/30 19:41:47 INFO exec.FilterOperator: FILTERED:51
    [junit] 08/12/30 19:41:47 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:41:47 INFO mapred.TaskRunner: Task 'reduce_x1fq4d' done.
    [junit] 08/12/30 19:41:47 INFO mapred.TaskRunner: Saved output of task 'reduce_x1fq4d' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-359312783
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:41:48 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:41:48 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/subq2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/subq2.q.out
    [junit] Done query: subq2.q
    [junit] Begin query: input_limit.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_limit(TestCliDriver.java:455)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input11_limit.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input11_limit(TestCliDriver.java:480)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input20.q
    [junit] plan = /tmp/plan60435.xml
    [junit] 08/12/30 19:41:51 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:41:52 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:41:52 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:41:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:41:52 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:41:52 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:41:52 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:41:52 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:41:52 INFO exec.MapOperator: Adding alias tmap:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:41:52 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:41:52 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:41:52 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:41:52 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:52 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:41:52 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:52 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 19:41:52 INFO exec.ScriptOperator: alias=tmap:src
    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:52 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:52 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:41:53 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:53 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:41:53 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:41:53 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1155092949
    [junit] 08/12/30 19:41:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:53 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:41:53 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:53 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: Executing [/usr/bin/uniq, -c, |, sed, s@^ *@@, |, sed, s@\t@_@, |, sed, s@ @\t@]
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: tablename=null
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: partname=null
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: alias=null
    [junit] 08/12/30 19:41:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:53 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] /usr/bin/uniq: extra operand `s@^ *@@'
    [junit] Try `/usr/bin/uniq --help' for more information.
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:53 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:53 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:41:53 INFO mapred.TaskRunner: Task 'reduce_a3xv04' done.
    [junit] 08/12/30 19:41:53 INFO mapred.TaskRunner: Saved output of task 'reduce_a3xv04' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1155092949
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:41:53 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:41:53 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input20.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input20.q.out
    [junit] Done query: input20.q
    [junit] Begin query: input14_limit.q
    [junit] plan = /tmp/plan60436.xml
    [junit] 08/12/30 19:41:57 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:41:57 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:41:57 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:41:57 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:41:57 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:41:57 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:41:57 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:41:58 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:41:58 INFO exec.MapOperator: Adding alias tmap:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:41:58 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:41:58 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:41:58 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:41:58 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:41:58 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:41:58 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:58 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:41:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: alias=tmap:src
    [junit] 08/12/30 19:41:58 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:41:58 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:41:58 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:58 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:41:58 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:41:58 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:41:58 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-581691282
    [junit] 08/12/30 19:41:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:41:58 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:41:58 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:41:58 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:41:58 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:41:58 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:41:58 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:41:58 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:41:58 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:41:58 INFO mapred.TaskRunner: Task 'reduce_n004sn' done.
    [junit] 08/12/30 19:41:58 INFO mapred.TaskRunner: Saved output of task 'reduce_n004sn' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-581691282
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:41:58 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:41:58 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60437.xml
    [junit] 08/12/30 19:42:00 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 19:42:00 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/806332980/254283961.10001
    [junit] 08/12/30 19:42:00 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:00 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:01 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:01 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:01 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:01 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:01 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/806332980/254283961.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/806332980/254283961.10001/reduce_n004sn
    [junit] 08/12/30 19:42:01 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:42:01 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:01 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:42:01 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:42:01 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:42:01 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:01 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/806332980/254283961.10001/reduce_n004sn:0+900
    [junit] 08/12/30 19:42:01 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:01 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-916040072
    [junit] 08/12/30 19:42:01 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:42:01 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:42:01 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:42:01 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:42:01 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:42:01 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:42:01 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:42:01 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:01 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:01 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:42:01 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:42:01 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:42:01 INFO exec.FilterOperator: PASSED:5
    [junit] 08/12/30 19:42:01 INFO exec.FilterOperator: FILTERED:15
    [junit] 08/12/30 19:42:01 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:42:01 INFO mapred.TaskRunner: Task 'reduce_fvz6nf' done.
    [junit] 08/12/30 19:42:01 INFO mapred.TaskRunner: Saved output of task 'reduce_fvz6nf' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-916040072
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:42:02 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:42:02 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input14_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input14_limit.q.out
    [junit] Done query: input14_limit.q
    [junit] Begin query: sample2.q
    [junit] plan = /tmp/plan60438.xml
    [junit] 08/12/30 19:42:05 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:42:05 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 19:42:05 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:05 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:05 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:06 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:06 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:42:06 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:06 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 19:42:06 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:42:06 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:06 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:06 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:06 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:06 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:06 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:06 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
    [junit] 08/12/30 19:42:06 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:06 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1347919543
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:42:07 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:42:07 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/sample2.q.out
    [junit] Done query: sample2.q
    [junit] Begin query: inputddl1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl1(TestCliDriver.java:580)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: sample4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample4(TestCliDriver.java:605)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: inputddl3.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/inputddl3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/inputddl3.q.out
    [junit] Done query: inputddl3.q
    [junit] Begin query: groupby2_map.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby2_map(TestCliDriver.java:655)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: sample6.q
    [junit] plan = /tmp/plan60439.xml
    [junit] 08/12/30 19:42:12 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:42:12 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 19:42:12 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:12 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:12 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:12 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:13 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:42:13 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:13 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 19:42:13 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:42:13 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:13 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:13 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:42:13 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:13 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:13 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:13 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:42:13 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:13 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:13 INFO exec.FilterOperator: PASSED:118
    [junit] 08/12/30 19:42:13 INFO exec.FilterOperator: FILTERED:382
    [junit] 08/12/30 19:42:13 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
    [junit] 08/12/30 19:42:13 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:13 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-692599239
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:42:13 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:42:13 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/sample6.q.out
    [junit] Done query: sample6.q
    [junit] Begin query: groupby4_map.q
    [junit] plan = /tmp/plan60440.xml
    [junit] 08/12/30 19:42:17 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 19:42:17 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:42:17 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:17 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:17 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:18 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:18 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:18 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:18 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:42:18 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:42:18 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:18 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:18 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:18 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:18 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:42:18 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:42:18 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:18 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:42:18 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:18 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:18 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:42:18 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:18 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:18 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:18 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:42:18 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:18 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2130867396
    [junit] 08/12/30 19:42:18 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:18 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:18 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:42:18 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:42:18 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:18 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:18 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:18 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:18 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:18 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:18 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:18 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:42:18 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:42:18 INFO mapred.TaskRunner: Task 'reduce_fuxh92' done.
    [junit] 08/12/30 19:42:18 INFO mapred.TaskRunner: Saved output of task 'reduce_fuxh92' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2130867396
    [junit] 08/12/30 19:42:19 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:42:19 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby4_map.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby4_map.q.out
    [junit] Done query: groupby4_map.q
    [junit] Begin query: inputddl5.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl5(TestCliDriver.java:730)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: sample8.q
    [junit] plan = /tmp/plan60441.xml
    [junit] 08/12/30 19:42:22 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:42:23 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11
    [junit] 08/12/30 19:42:23 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12
    [junit] 08/12/30 19:42:23 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11
    [junit] 08/12/30 19:42:23 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12
    [junit] 08/12/30 19:42:23 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:23 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:23 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:42:23 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:42:23 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:42:23 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:42:23 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:23 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: Adding alias t to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:24 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:42:24 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:42:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:42:24 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:42:24 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 19:42:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: FILTERED:458
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: PASSED:42
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: PASSED:500
    [junit] 08/12/30 19:42:24 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt:0+5812
    [junit] 08/12/30 19:42:24 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:24 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2024050855
    [junit] 08/12/30 19:42:24 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: Adding alias t to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:24 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:42:24 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:42:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:42:24 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: FILTERED:916
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:42:24 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt:0+5812
    [junit] 08/12/30 19:42:24 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
    [junit] 08/12/30 19:42:24 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2024050855
    [junit] 08/12/30 19:42:24 INFO mapred.MapTask: numReduceTasks: 1
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:42:24 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: Adding alias t to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11/kv1.txt
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:24 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:42:24 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:24 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:42:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:42:24 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:24 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:24 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: FILTERED:1374
    [junit] 08/12/30 19:42:24 INFO exec.FilterOperator: PASSED:126
    [junit] 08/12/30 19:42:24 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11/kv1.txt:0+5812
    [junit] 08/12/30 19:42:24 INFO mapred.TaskRunner: Task 'job_local_1_map_0002' done.
    [junit] 08/12/30 19:42:24 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0002' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2024050855
    [junit] 08/12/30 19:42:24 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:25 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:25 INFO exec.MapOperator: Adding alias t to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12/kv1.txt
    [junit] 08/12/30 19:42:25 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 19:42:25 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:25 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:25 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:25 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:25 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:42:25 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:42:25 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:25 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:42:25 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:25 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:25 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:42:25 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:25 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:25 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:25 INFO exec.FilterOperator: FILTERED:1832
    [junit] 08/12/30 19:42:25 INFO exec.FilterOperator: PASSED:168
    [junit] 08/12/30 19:42:25 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12/kv1.txt:0+5812
    [junit] 08/12/30 19:42:25 INFO mapred.TaskRunner: Task 'job_local_1_map_0003' done.
    [junit] 08/12/30 19:42:25 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0003' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2024050855
    [junit] 08/12/30 19:42:25 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:25 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:25 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 19:42:25 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 19:42:25 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:42:25 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:42:25 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:25 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:25 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:25 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:25 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:42:25 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 19:42:26 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 19:42:26 INFO exec.FilterOperator: PASSED:84000
    [junit] 08/12/30 19:42:26 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:42:26 INFO mapred.TaskRunner: Task 'reduce_g4mjzk' done.
    [junit] 08/12/30 19:42:26 INFO mapred.TaskRunner: Saved output of task 'reduce_g4mjzk' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2024050855
    [junit] 08/12/30 19:42:26 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:42:26 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60442.xml
    [junit] 08/12/30 19:42:28 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:42:29 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/740036349/12174667.10002
    [junit] 08/12/30 19:42:29 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:29 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:29 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:42:29 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:29 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:29 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:29 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/740036349/12174667.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/740036349/12174667.10002/reduce_g4mjzk
    [junit] 08/12/30 19:42:29 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:42:29 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:29 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:42:29 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:42:29 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit]  map = 0%,  reduce =0%
    [junit] 08/12/30 19:42:30 INFO exec.ExecDriver:  map = 0%,  reduce =0%
    [junit] 08/12/30 19:42:31 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:32 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/740036349/12174667.10002/reduce_g4mjzk:0+3446412
    [junit] 08/12/30 19:42:32 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:32 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1513614079
    [junit] 08/12/30 19:42:32 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:42:32 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:42:32 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:42:32 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:42:32 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:32 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:42:32 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:42:33 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:42:33 INFO mapred.TaskRunner: Task 'reduce_fbgk1u' done.
    [junit] 08/12/30 19:42:33 INFO mapred.TaskRunner: Saved output of task 'reduce_fbgk1u' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1513614079
    [junit] 08/12/30 19:42:33 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:42:33 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample8.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/sample8.q.out
    [junit] Done query: sample8.q
    [junit] Begin query: inputddl7.q
    [junit] plan = /tmp/plan60443.xml
    [junit] 08/12/30 19:42:38 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:42:38 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t1
    [junit] 08/12/30 19:42:38 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:38 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:38 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:39 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:39 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:39 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:39 INFO exec.MapOperator: Adding alias t1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t1/kv1.txt
    [junit] 08/12/30 19:42:39 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:42:39 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:39 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:39 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:39 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:39 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:39 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:42:39 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:39 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:39 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:39 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:39 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:39 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t1/kv1.txt:0+5812
    [junit] 08/12/30 19:42:39 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:39 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1499926855
    [junit] 08/12/30 19:42:39 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:39 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:39 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:42:39 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:42:39 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:39 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:42:39 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:42:39 INFO mapred.TaskRunner: Task 'reduce_bpoece' done.
    [junit] 08/12/30 19:42:39 INFO mapred.TaskRunner: Saved output of task 'reduce_bpoece' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1499926855
    [junit] 08/12/30 19:42:40 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:42:40 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60444.xml
    [junit] 08/12/30 19:42:41 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:42:41 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/393512487/659329717.10002
    [junit] 08/12/30 19:42:41 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:41 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:42 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:42 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:42 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:42 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:42 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/393512487/659329717.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/393512487/659329717.10002/reduce_bpoece
    [junit] 08/12/30 19:42:42 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:42:42 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:42 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:42:42 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:42 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:42 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:42 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/393512487/659329717.10002/reduce_bpoece:0+124
    [junit] 08/12/30 19:42:42 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:42 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1241865249
    [junit] 08/12/30 19:42:42 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:42 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:42 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:42:42 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:42:42 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:42 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:42 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:42 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:42 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:42:42 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:42:42 INFO mapred.TaskRunner: Task 'reduce_q6d6eh' done.
    [junit] 08/12/30 19:42:42 INFO mapred.TaskRunner: Saved output of task 'reduce_q6d6eh' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1241865249
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:42:43 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:42:43 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60445.xml
    [junit] 08/12/30 19:42:45 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:42:45 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t2
    [junit] 08/12/30 19:42:45 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:45 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:46 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:46 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:46 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:46 INFO exec.MapOperator: Adding alias t2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t2/kv1.seq
    [junit] 08/12/30 19:42:46 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:42:46 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:46 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:46 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:46 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:46 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:46 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:42:46 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:46 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:46 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:46 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:46 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t2/kv1.seq:0+10508
    [junit] 08/12/30 19:42:46 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:46 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1932912131
    [junit] 08/12/30 19:42:46 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:46 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:46 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:42:46 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:42:46 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:46 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:42:46 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:42:46 INFO mapred.TaskRunner: Task 'reduce_k4jsdj' done.
    [junit] 08/12/30 19:42:46 INFO mapred.TaskRunner: Saved output of task 'reduce_k4jsdj' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1932912131
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:42:47 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:42:47 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60446.xml
    [junit] 08/12/30 19:42:48 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:42:48 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/34186417/44740138.10002
    [junit] 08/12/30 19:42:49 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:49 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:49 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:49 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:49 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:49 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:49 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/34186417/44740138.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/34186417/44740138.10002/reduce_k4jsdj
    [junit] 08/12/30 19:42:49 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:42:49 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:49 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:42:49 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:49 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:49 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:49 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/34186417/44740138.10002/reduce_k4jsdj:0+124
    [junit] 08/12/30 19:42:49 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:49 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1155490603
    [junit] 08/12/30 19:42:49 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:49 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:49 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:42:49 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:42:49 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:49 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:49 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:50 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:50 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:42:50 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:42:50 INFO mapred.TaskRunner: Task 'reduce_s5ms69' done.
    [junit] 08/12/30 19:42:50 INFO mapred.TaskRunner: Saved output of task 'reduce_s5ms69' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1155490603
    [junit] 08/12/30 19:42:50 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:42:50 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60447.xml
    [junit] 08/12/30 19:42:52 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:42:52 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t3/ds=2008-04-09
    [junit] 08/12/30 19:42:52 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:52 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:52 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:53 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:53 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:53 INFO exec.MapOperator: Adding alias t3 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t3/ds=2008-04-09/kv1.txt
    [junit] 08/12/30 19:42:53 INFO exec.MapOperator: Got partitions: ds
    [junit] 08/12/30 19:42:53 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:53 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:53 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:42:53 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:42:53 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:53 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:42:53 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:53 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:53 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:42:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:53 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:53 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:53 INFO exec.FilterOperator: PASSED:500
    [junit] 08/12/30 19:42:53 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 19:42:53 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t3/ds=2008-04-09/kv1.txt:0+5812
    [junit] 08/12/30 19:42:53 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:53 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp644898716
    [junit] 08/12/30 19:42:53 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:53 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:53 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:42:53 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:42:53 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:53 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:42:53 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:42:53 INFO mapred.TaskRunner: Task 'reduce_4pxuvo' done.
    [junit] 08/12/30 19:42:53 INFO mapred.TaskRunner: Saved output of task 'reduce_4pxuvo' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp644898716
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:42:53 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:42:53 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60448.xml
    [junit] 08/12/30 19:42:55 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:42:55 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/531167270/404871161.10002
    [junit] 08/12/30 19:42:55 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:55 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:56 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:56 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:56 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:56 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:56 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/531167270/404871161.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/531167270/404871161.10002/reduce_4pxuvo
    [junit] 08/12/30 19:42:56 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:42:56 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:56 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:42:56 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:56 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:56 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:56 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/531167270/404871161.10002/reduce_4pxuvo:0+124
    [junit] 08/12/30 19:42:56 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:56 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-554175630
    [junit] 08/12/30 19:42:56 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:56 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:56 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:42:56 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:42:56 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:56 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:56 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:56 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:56 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:42:56 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:42:56 INFO mapred.TaskRunner: Task 'reduce_n5vtyh' done.
    [junit] 08/12/30 19:42:56 INFO mapred.TaskRunner: Saved output of task 'reduce_n5vtyh' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-554175630
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:42:57 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:42:57 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60449.xml
    [junit] 08/12/30 19:42:58 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:42:59 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t4/ds=2008-04-09
    [junit] 08/12/30 19:42:59 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:42:59 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:42:59 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:59 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:42:59 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:42:59 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:42:59 INFO exec.MapOperator: Adding alias t4 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t4/ds=2008-04-09/kv1.seq
    [junit] 08/12/30 19:42:59 INFO exec.MapOperator: Got partitions: ds
    [junit] 08/12/30 19:42:59 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:42:59 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:42:59 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:42:59 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:42:59 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:42:59 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:42:59 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:42:59 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:42:59 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:59 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:42:59 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:42:59 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:42:59 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:42:59 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:42:59 INFO exec.FilterOperator: PASSED:500
    [junit] 08/12/30 19:42:59 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 19:42:59 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/t4/ds=2008-04-09/kv1.seq:0+10508
    [junit] 08/12/30 19:42:59 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:42:59 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-872598019
    [junit] 08/12/30 19:43:00 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:43:00 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:43:00 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:43:00 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:43:00 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:00 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:43:00 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:43:00 INFO mapred.TaskRunner: Task 'reduce_v79p5m' done.
    [junit] 08/12/30 19:43:00 INFO mapred.TaskRunner: Saved output of task 'reduce_v79p5m' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-872598019
    [junit] 08/12/30 19:43:00 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:43:00 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60450.xml
    [junit] 08/12/30 19:43:01 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:43:02 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/836098050/748329414.10002
    [junit] 08/12/30 19:43:02 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:02 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:02 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:02 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:02 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:43:02 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:02 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/836098050/748329414.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/836098050/748329414.10002/reduce_v79p5m
    [junit] 08/12/30 19:43:03 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:03 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:03 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:43:03 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:43:03 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:43:03 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:03 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/836098050/748329414.10002/reduce_v79p5m:0+124
    [junit] 08/12/30 19:43:03 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:03 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp353225185
    [junit] 08/12/30 19:43:03 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:43:03 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:43:03 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:43:03 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:43:03 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:03 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:03 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:03 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:03 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:43:03 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:43:03 INFO mapred.TaskRunner: Task 'reduce_59q566' done.
    [junit] 08/12/30 19:43:03 INFO mapred.TaskRunner: Saved output of task 'reduce_59q566' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp353225185
    [junit] 08/12/30 19:43:03 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:43:03 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/inputddl7.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/inputddl7.q.out
    [junit] Done query: inputddl7.q
    [junit] Begin query: notable_alias1.q
    [junit] plan = /tmp/plan60451.xml
    [junit] 08/12/30 19:43:07 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:43:07 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:43:07 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:07 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:07 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:07 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:07 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:43:07 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:07 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:43:07 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:07 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:07 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:07 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:07 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:07 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:43:07 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:43:07 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:07 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:43:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:08 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:43:08 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:08 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:08 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:08 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:43:08 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:43:08 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:43:08 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:08 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-367828813
    [junit] 08/12/30 19:43:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:08 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:43:08 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:43:08 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:08 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:43:08 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:43:08 INFO mapred.TaskRunner: Task 'reduce_z0ofwt' done.
    [junit] 08/12/30 19:43:08 INFO mapred.TaskRunner: Saved output of task 'reduce_z0ofwt' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-367828813
    [junit] 08/12/30 19:43:08 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:43:08 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60452.xml
    [junit] 08/12/30 19:43:10 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:43:10 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/750556676/186838243.10001
    [junit] 08/12/30 19:43:10 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:10 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:10 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:11 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:11 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:43:11 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:11 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/750556676/186838243.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/750556676/186838243.10001/reduce_z0ofwt
    [junit] 08/12/30 19:43:11 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:11 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:11 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:43:11 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:11 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:11 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:11 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/750556676/186838243.10001/reduce_z0ofwt:0+2219
    [junit] 08/12/30 19:43:11 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:11 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp965090589
    [junit] 08/12/30 19:43:11 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:11 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:11 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:43:11 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:43:11 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:11 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:11 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:11 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:11 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:11 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:11 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:11 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:43:11 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:43:11 INFO mapred.TaskRunner: Task 'reduce_rgwcq9' done.
    [junit] 08/12/30 19:43:11 INFO mapred.TaskRunner: Saved output of task 'reduce_rgwcq9' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp965090589
    [junit] 08/12/30 19:43:12 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:43:12 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/notable_alias1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/notable_alias1.q.out
    [junit] Done query: notable_alias1.q
    [junit] Begin query: input0.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input0.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input0.q.out
    [junit] Done query: input0.q
    [junit] Begin query: join1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join1(TestCliDriver.java:855)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
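[Editor's note: the trace above ends in `Warehouse.deleteDir`, and the thread attributes the failure to intermittent directory deletes on a filer (network filesystem), where a delete can fail transiently while handles are still open and then succeed moments later. The sketch below is purely illustrative — `RetryingDelete` and `deleteDirWithRetry` are hypothetical names, not Hive code — showing the retry-on-delete pattern one might use to paper over such transient failures.]

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Comparator;
import java.util.stream.Stream;

// Hypothetical helper (not part of Hive): retry a recursive directory delete
// a few times before giving up, since deletes on a filer/NFS mount can fail
// transiently and succeed on a later attempt.
public class RetryingDelete {

    public static boolean deleteDirWithRetry(Path dir, int attempts, long sleepMs)
            throws InterruptedException {
        for (int i = 0; i < attempts; i++) {
            if (!Files.exists(dir)) {
                return true; // nothing left to delete
            }
            try (Stream<Path> walk = Files.walk(dir)) {
                // Walk bottom-up so files are removed before their parent dirs.
                walk.sorted(Comparator.reverseOrder())
                    .forEach(p -> p.toFile().delete());
            } catch (IOException e) {
                // Transient failure (e.g. stale NFS handle): fall through and retry.
            }
            if (!Files.exists(dir)) {
                return true;
            }
            Thread.sleep(sleepMs); // back off before the next attempt
        }
        return !Files.exists(dir);
    }

    public static void main(String[] args) throws Exception {
        // Smoke test on a local temp directory.
        Path tmp = Files.createTempDirectory("retrydel");
        Files.createFile(tmp.resolve("kv1.txt"));
        boolean ok = deleteDirWithRetry(tmp, 3, 10);
        System.out.println("deleted=" + ok);
    }
}
```

On a local disk the first attempt succeeds immediately; the retry loop only matters on the flaky-filer case the thread describes, and even then it is a mitigation, not a fix for the underlying filesystem issue.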
    [junit] Begin query: input2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input2(TestCliDriver.java:880)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: join3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join3(TestCliDriver.java:905)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input4.q
    [junit] plan = /tmp/plan60453.xml
    [junit] 08/12/30 19:43:16 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:43:16 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/input4
    [junit] 08/12/30 19:43:16 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:16 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:17 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:17 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:17 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:43:17 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:17 INFO exec.MapOperator: Adding alias input4 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/input4/kv1.txt
    [junit] 08/12/30 19:43:17 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:17 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:17 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:17 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:17 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:17 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:17 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/input4/kv1.txt:0+5812
    [junit] 08/12/30 19:43:17 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:17 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1038905456
    [junit] 08/12/30 19:43:18 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:43:18 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] Exception: Client Execution failed with error code = 9
    [junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
    [junit]     at junit.framework.Assert.fail(Assert.java:47)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input4(TestCliDriver.java:933)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: describe_xpath.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/describe_xpath.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/describe_xpath.q.out
    [junit] Done query: describe_xpath.q
    [junit] Begin query: join5.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join5(TestCliDriver.java:980)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input_testxpath2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_testxpath2(TestCliDriver.java:1005)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input6.q
    [junit] plan = /tmp/plan60454.xml
    [junit] 08/12/30 19:43:22 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:43:22 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1
    [junit] 08/12/30 19:43:22 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:22 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:22 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:43:23 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:23 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:43:23 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:23 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt
    [junit] 08/12/30 19:43:23 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:23 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:23 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:23 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:43:23 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:43:23 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:23 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:23 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:23 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:23 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:43:23 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:23 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:23 INFO exec.FilterOperator: PASSED:0
    [junit] 08/12/30 19:43:23 INFO exec.FilterOperator: FILTERED:25
    [junit] 08/12/30 19:43:23 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt:0+216
    [junit] 08/12/30 19:43:23 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:23 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1651100476
    [junit] 08/12/30 19:43:24 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:43:24 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input6.q.out
    [junit] Done query: input6.q
    [junit] Begin query: join7.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join7(TestCliDriver.java:1055)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input8.q
    [junit] plan = /tmp/plan60455.xml
    [junit] 08/12/30 19:43:27 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:43:27 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1
    [junit] 08/12/30 19:43:27 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:27 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:27 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:43:28 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:28 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:43:28 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:28 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt
    [junit] 08/12/30 19:43:28 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:28 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:28 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:28 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:28 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:28 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:28 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:28 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:28 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:28 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:28 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:28 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:28 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:28 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:28 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:28 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt:0+216
    [junit] 08/12/30 19:43:28 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:28 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1355202995
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:43:29 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:43:29 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input8.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input8.q.out
    [junit] Done query: input8.q
    [junit] Begin query: union.q
    [junit] plan = /tmp/plan60456.xml
    [junit] 08/12/30 19:43:31 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:43:32 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:43:32 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:32 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:32 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:43:32 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:32 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:43:32 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.MapOperator: Adding alias null-subquery1:unioninput-subquery1:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:43:32 INFO exec.MapOperator: Adding alias null-subquery2:unioninput-subquery2:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:43:32 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:32 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:32 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:32 INFO exec.ForwardOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.ForwardOperator: Initializing children:
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:32 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:32 INFO exec.ForwardOperator: Initialization Done
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:32 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:43:32 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:32 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:32 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:32 INFO exec.ForwardOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.ForwardOperator: Initializing children:
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:32 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:32 INFO exec.ForwardOperator: Initialization Done
    [junit] 08/12/30 19:43:32 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:32 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:43:32 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:32 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:32 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:43:32 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:43:32 INFO exec.FilterOperator: PASSED:414
    [junit] 08/12/30 19:43:32 INFO exec.FilterOperator: FILTERED:86
    [junit] 08/12/30 19:43:32 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:43:32 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:32 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-893314146
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:43:33 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:43:33 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/union.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/union.q.out
    [junit] Done query: union.q
    [junit] Begin query: join9.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join9(TestCliDriver.java:1130)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: udf2.q
    [junit] plan = /tmp/plan60457.xml
    [junit] 08/12/30 19:43:36 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:43:36 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:43:36 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:36 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:36 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:37 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:37 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:43:37 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:37 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:43:37 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:37 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:37 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:37 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:37 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:37 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:43:37 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:43:37 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:37 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:37 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:37 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:37 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:43:37 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:37 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:37 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:37 INFO exec.FilterOperator: FILTERED:499
    [junit] 08/12/30 19:43:37 INFO exec.FilterOperator: PASSED:1
    [junit] 08/12/30 19:43:37 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:43:37 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:37 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1943736504
    [junit] 08/12/30 19:43:38 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:43:38 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60458.xml
    [junit] 08/12/30 19:43:39 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:43:39 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1
    [junit] 08/12/30 19:43:39 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:39 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:39 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:43:39 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:40 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:43:40 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:40 INFO exec.MapOperator: Adding alias dest1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/job_local_1_map_0000
    [junit] 08/12/30 19:43:40 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:40 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:40 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:40 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:40 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:40 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:40 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/job_local_1_map_0000:0+8
    [junit] 08/12/30 19:43:40 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:40 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1064025164
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:43:40 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:43:40 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/udf2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/udf2.q.out
    [junit] Done query: udf2.q
    [junit] Begin query: input10.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input10.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input10.q.out
    [junit] Done query: input10.q
    [junit] Begin query: join11.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join11(TestCliDriver.java:1205)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input4_cb_delim.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input4_cb_delim(TestCliDriver.java:1230)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: udf4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udf4(TestCliDriver.java:1255)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input12.q
    [junit] plan = /tmp/plan60459.xml
    [junit] 08/12/30 19:43:45 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:43:45 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:43:45 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:45 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:46 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:46 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:43:46 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:43:46 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:46 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:46 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:46 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:46 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:43:46 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:46 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: PASSED:105
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: FILTERED:395
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: PASSED:311
    [junit] 08/12/30 19:43:46 INFO exec.FilterOperator: FILTERED:189
    [junit] 08/12/30 19:43:46 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:43:46 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:46 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-539878002
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:43:47 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:43:47 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input12.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input12.q.out
    [junit] Done query: input12.q
    [junit] Begin query: join13.q
    [junit] plan = /tmp/plan60460.xml
    [junit] 08/12/30 19:43:50 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:43:50 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:43:50 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:50 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:50 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:43:50 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:50 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:43:50 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:50 INFO exec.MapOperator: Adding alias src2:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:43:50 INFO exec.MapOperator: Adding alias src1:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:43:50 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:50 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:50 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:50 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:50 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:50 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:50 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:43:50 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:50 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:50 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:50 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:50 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:50 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:50 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:50 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:50 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:43:50 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:43:50 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:50 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 19:43:50 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:50 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:50 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:43:50 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:50 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:50 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:50 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:43:50 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:43:51 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:43:51 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:51 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1730471231
    [junit] 08/12/30 19:43:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:51 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 19:43:51 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 19:43:51 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:51 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 19:43:51 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:43:51 INFO mapred.TaskRunner: Task 'reduce_d0z35a' done.
    [junit] 08/12/30 19:43:51 INFO mapred.TaskRunner: Saved output of task 'reduce_d0z35a' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1730471231
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:43:51 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:43:51 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60461.xml
    [junit] 08/12/30 19:43:53 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:43:53 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/402662064/644367676.10002
    [junit] 08/12/30 19:43:53 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:43:53 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:53 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:53 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:43:53 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:53 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:54 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:43:54 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:54 INFO exec.MapOperator: Adding alias $INTNAME to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/402662064/644367676.10002/reduce_d0z35a
    [junit] 08/12/30 19:43:54 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:54 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:54 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 19:43:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:54 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:54 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/402662064/644367676.10002/reduce_d0z35a:0+9116
    [junit] 08/12/30 19:43:54 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:54 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-103985407
    [junit] 08/12/30 19:43:54 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:43:54 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:54 INFO exec.MapOperator: Adding alias src3:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:43:54 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:54 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:54 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:54 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:54 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:54 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:43:54 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:43:54 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:54 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:43:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:54 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:43:54 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:54 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:54 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:54 INFO exec.FilterOperator: FILTERED:311
    [junit] 08/12/30 19:43:54 INFO exec.FilterOperator: PASSED:189
    [junit] 08/12/30 19:43:54 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:43:54 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
    [junit] 08/12/30 19:43:54 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-103985407
    [junit] 08/12/30 19:43:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:54 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 19:43:54 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 19:43:54 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:54 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:54 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:54 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:54 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 19:43:54 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:43:54 INFO mapred.TaskRunner: Task 'reduce_3g3v46' done.
    [junit] 08/12/30 19:43:54 INFO mapred.TaskRunner: Saved output of task 'reduce_3g3v46' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-103985407
    [junit] 08/12/30 19:43:54 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:43:54 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join13.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join13.q.out
    [junit] Done query: join13.q
    [junit] Begin query: input14.q
    [junit] plan = /tmp/plan60462.xml
    [junit] 08/12/30 19:43:57 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:43:57 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:43:57 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:43:57 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:43:58 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:58 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:43:58 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:43:58 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:43:58 INFO exec.MapOperator: Adding alias tmap:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:43:58 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:43:58 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:43:58 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:43:58 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:58 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:58 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:43:58 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:43:58 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:58 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:43:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:58 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:43:58 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 19:43:58 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 19:43:58 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 19:43:58 INFO exec.ScriptOperator: alias=tmap:src
    [junit] 08/12/30 19:43:58 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:58 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:43:58 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:58 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:43:58 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 19:43:58 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:58 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:43:58 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:43:58 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:43:58 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp701362659
    [junit] 08/12/30 19:43:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:43:58 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:43:58 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:43:58 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:43:58 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:43:58 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:58 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:58 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:43:58 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:43:58 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:43:58 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:58 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:43:58 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:43:58 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:43:58 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:43:58 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:43:59 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:43:59 INFO mapred.TaskRunner: Task 'reduce_vx8p4m' done.
    [junit] 08/12/30 19:43:59 INFO mapred.TaskRunner: Saved output of task 'reduce_vx8p4m' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp701362659
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:43:59 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:43:59 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input14.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input14.q.out
    [junit] Done query: input14.q
    [junit] Begin query: join15.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join15(TestCliDriver.java:1355)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input16.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input16(TestCliDriver.java:1380)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input_part1.q
    [junit] plan = /tmp/plan60463.xml
    [junit] 08/12/30 19:44:02 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:44:02 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12
    [junit] 08/12/30 19:44:02 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:02 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:03 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:03 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:03 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:44:03 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:03 INFO exec.MapOperator: Adding alias srcpart to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt
    [junit] 08/12/30 19:44:03 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 19:44:03 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:44:03 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:44:03 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:44:03 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:44:03 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:03 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:03 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:03 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:03 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:03 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:03 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:03 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:44:03 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:44:03 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:03 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:44:03 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:44:03 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt:0+5812
    [junit] 08/12/30 19:44:03 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:03 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-354946865
    [junit] 08/12/30 19:44:04 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:44:04 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_part1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_part1.q.out
    [junit] Done query: input_part1.q
    [junit] Begin query: join17.q
    [junit] plan = /tmp/plan60464.xml
    [junit] 08/12/30 19:44:07 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:44:07 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:44:07 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:07 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:07 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:07 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:07 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:08 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:08 INFO exec.MapOperator: Adding alias src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:44:08 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:44:08 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:08 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:44:08 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:44:08 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:08 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:44:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:08 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:44:08 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:44:08 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:44:08 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:08 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 19:44:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:08 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:44:08 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:08 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:44:08 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:08 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1919795971
    [junit] 08/12/30 19:44:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:08 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 19:44:08 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 19:44:08 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:08 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:08 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:08 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:08 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:08 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:08 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:08 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 19:44:08 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:08 INFO mapred.TaskRunner: Task 'reduce_wwgxg4' done.
    [junit] 08/12/30 19:44:08 INFO mapred.TaskRunner: Saved output of task 'reduce_wwgxg4' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1919795971
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:08 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:08 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join17.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join17.q.out
    [junit] Done query: join17.q
    [junit] Begin query: input18.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input18(TestCliDriver.java:1455)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input_part3.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_part3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_part3.q.out
    [junit] Done query: input_part3.q
    [junit] Begin query: groupby2.q
    [junit] plan = /tmp/plan60465.xml
    [junit] 08/12/30 19:44:13 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:44:13 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:44:13 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:13 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:13 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:44:13 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:13 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:13 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:13 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:44:13 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:13 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:44:13 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:44:13 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:13 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:44:13 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:13 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:14 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:44:14 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:14 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:44:14 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:14 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1474341512
    [junit] 08/12/30 19:44:14 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:14 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:14 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:44:14 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:44:14 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:14 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:44:14 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:14 INFO mapred.TaskRunner: Task 'reduce_4tu1zp' done.
    [junit] 08/12/30 19:44:14 INFO mapred.TaskRunner: Saved output of task 'reduce_4tu1zp' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1474341512
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:44:14 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:14 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60466.xml
    [junit] 08/12/30 19:44:16 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:44:16 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/427320263/150344277.10001
    [junit] 08/12/30 19:44:16 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:16 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:16 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:16 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:16 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:17 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:17 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/427320263/150344277.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/427320263/150344277.10001/reduce_4tu1zp
    [junit] 08/12/30 19:44:17 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:17 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:17 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:44:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:17 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:17 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/427320263/150344277.10001/reduce_4tu1zp:0+566
    [junit] 08/12/30 19:44:17 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:17 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-957717483
    [junit] 08/12/30 19:44:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:17 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:44:17 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:44:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:17 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:17 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:44:17 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:17 INFO mapred.TaskRunner: Task 'reduce_jat4up' done.
    [junit] 08/12/30 19:44:17 INFO mapred.TaskRunner: Saved output of task 'reduce_jat4up' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-957717483
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:17 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:17 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby2.q.out
    [junit] Done query: groupby2.q
    [junit] Begin query: show_tables.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_show_tables(TestCliDriver.java:1530)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input_part5.q
    [junit] plan = /tmp/plan60467.xml
    [junit] 08/12/30 19:44:20 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:44:21 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11
    [junit] 08/12/30 19:44:21 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12
    [junit] 08/12/30 19:44:21 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:21 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:21 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:44:21 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:21 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:21 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:44:21 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:21 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt
    [junit] 08/12/30 19:44:21 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 19:44:21 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:44:21 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:44:21 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:44:21 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:44:21 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:21 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:21 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:21 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:21 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:44:21 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:44:21 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:21 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:44:21 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:44:21 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt:0+5812
    [junit] 08/12/30 19:44:21 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:21 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp759710918
    [junit] 08/12/30 19:44:21 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:44:21 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:21 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt
    [junit] 08/12/30 19:44:21 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 19:44:21 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:44:21 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:44:21 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:44:21 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:44:21 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:21 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:21 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:21 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:21 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:44:21 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:44:21 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:21 INFO exec.FilterOperator: FILTERED:832
    [junit] 08/12/30 19:44:21 INFO exec.FilterOperator: PASSED:168
    [junit] 08/12/30 19:44:21 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt:0+5812
    [junit] 08/12/30 19:44:21 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
    [junit] 08/12/30 19:44:21 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp759710918
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:44:22 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:44:22 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_part5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_part5.q.out
    [junit] Done query: input_part5.q
    [junit] Begin query: groupby4.q
    [junit] plan = /tmp/plan60468.xml
    [junit] 08/12/30 19:44:24 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:44:25 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:44:25 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:25 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:25 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:44:25 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:25 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:25 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:25 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:44:25 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:25 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:44:25 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:44:25 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:25 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:25 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:25 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:44:25 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:25 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:25 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:25 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:44:25 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:25 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:44:25 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:25 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp261110492
    [junit] 08/12/30 19:44:25 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:25 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:25 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:44:25 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:44:25 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:25 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:44:26 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:26 INFO mapred.TaskRunner: Task 'reduce_fdfwjm' done.
    [junit] 08/12/30 19:44:26 INFO mapred.TaskRunner: Saved output of task 'reduce_fdfwjm' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp261110492
    [junit] 08/12/30 19:44:26 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:26 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60469.xml
    [junit] 08/12/30 19:44:27 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:44:28 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/608918424/303228676.10001
    [junit] 08/12/30 19:44:28 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:28 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:28 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:28 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:28 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:28 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:28 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/608918424/303228676.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/608918424/303228676.10001/reduce_fdfwjm
    [junit] 08/12/30 19:44:28 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:28 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:28 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:44:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:28 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:28 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/608918424/303228676.10001/reduce_fdfwjm:0+346
    [junit] 08/12/30 19:44:28 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:28 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp882218643
    [junit] 08/12/30 19:44:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:28 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:44:28 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:44:28 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:28 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:28 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:28 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:28 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:44:28 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:28 INFO mapred.TaskRunner: Task 'reduce_tt16k' done.
    [junit] 08/12/30 19:44:28 INFO mapred.TaskRunner: Saved output of task 'reduce_tt16k' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp882218643
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:29 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:44:29 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby4.q.out
    [junit] Done query: groupby4.q
    [junit] Begin query: groupby6.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby6(TestCliDriver.java:1605)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input1_limit.q
    [junit] plan = /tmp/plan60470.xml
    [junit] 08/12/30 19:44:32 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 19:44:32 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:44:32 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:32 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:32 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:33 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:33 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:33 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:44:33 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:33 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:44:33 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:44:33 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:33 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:44:33 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:44:33 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:44:33 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:44:33 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:44:33 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:33 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:44:33 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:44:33 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:33 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:44:33 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:44:33 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:33 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:44:33 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:44:33 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:33 INFO exec.FilterOperator: FILTERED:90
    [junit] 08/12/30 19:44:33 INFO exec.FilterOperator: PASSED:13
    [junit] 08/12/30 19:44:33 INFO exec.FilterOperator: FILTERED:90
    [junit] 08/12/30 19:44:33 INFO exec.FilterOperator: PASSED:13
    [junit] 08/12/30 19:44:33 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:44:33 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:33 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1606494983
    [junit] 08/12/30 19:44:33 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:44:33 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:44:33 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:44:33 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:44:33 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:33 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:33 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:33 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:44:33 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:44:33 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:33 INFO mapred.TaskRunner: Task 'reduce_mkzh4m' done.
    [junit] 08/12/30 19:44:33 INFO mapred.TaskRunner: Saved output of task 'reduce_mkzh4m' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1606494983
    [junit] 08/12/30 19:44:34 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:34 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60471.xml
    [junit] 08/12/30 19:44:35 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 19:44:35 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/929618386/303358759.10002
    [junit] 08/12/30 19:44:35 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:35 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:35 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:36 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:36 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:36 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:36 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/929618386/303358759.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/929618386/303358759.10002/job_local_1_map_0000
    [junit] 08/12/30 19:44:36 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:36 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:36 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:44:36 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:44:36 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:44:36 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:36 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/929618386/303358759.10002/job_local_1_map_0000:0+291
    [junit] 08/12/30 19:44:36 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:36 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1544386957
    [junit] 08/12/30 19:44:36 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:44:36 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:44:36 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:44:36 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:44:36 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:44:36 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:44:36 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:36 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:36 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:36 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:36 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:44:36 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:44:36 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:36 INFO mapred.TaskRunner: Task 'reduce_iiw28b' done.
    [junit] 08/12/30 19:44:36 INFO mapred.TaskRunner: Saved output of task 'reduce_iiw28b' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1544386957
    [junit] 08/12/30 19:44:37 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:37 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input1_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input1_limit.q.out
    [junit] Done query: input1_limit.q
    [junit] Begin query: groupby8.q
    [junit] plan = /tmp/plan60472.xml
    [junit] 08/12/30 19:44:39 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:44:39 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:44:39 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:39 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:40 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:44:40 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:40 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:40 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:40 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:44:40 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:40 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:44:40 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:44:40 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:40 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:44:40 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:40 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:40 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:40 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:44:40 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:40 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:44:40 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:40 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-980163658
    [junit] 08/12/30 19:44:40 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:40 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:40 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:44:40 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:44:40 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:40 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:44:40 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:40 INFO mapred.TaskRunner: Task 'reduce_yqtorj' done.
    [junit] 08/12/30 19:44:40 INFO mapred.TaskRunner: Saved output of task 'reduce_yqtorj' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-980163658
    [junit] 08/12/30 19:44:41 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:41 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60473.xml
    [junit] 08/12/30 19:44:42 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:44:42 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10002
    [junit] 08/12/30 19:44:42 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:42 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:42 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:44:43 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:43 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:43 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:43 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10002/reduce_yqtorj
    [junit] 08/12/30 19:44:43 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:43 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:43 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:44:43 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:43 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:43 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:43 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10002/reduce_yqtorj:0+11875
    [junit] 08/12/30 19:44:43 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:43 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp856173151
    [junit] 08/12/30 19:44:43 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:43 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:43 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:44:43 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:44:43 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:43 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:43 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:43 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:43 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:43 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:43 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:43 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:44:43 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:43 INFO mapred.TaskRunner: Task 'reduce_pfcihi' done.
    [junit] 08/12/30 19:44:43 INFO mapred.TaskRunner: Saved output of task 'reduce_pfcihi' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp856173151
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:44:44 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:44 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60474.xml
    [junit] 08/12/30 19:44:45 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:44:45 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10003
    [junit] 08/12/30 19:44:45 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:45 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:44:46 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:46 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:46 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:46 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10003 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10003/job_local_1_map_0000
    [junit] 08/12/30 19:44:46 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:46 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:46 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:44:46 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:46 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:46 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:46 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10003/job_local_1_map_0000:0+20608
    [junit] 08/12/30 19:44:46 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:46 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1466309530
    [junit] 08/12/30 19:44:46 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:46 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 19:44:46 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:44:46 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:44:46 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:46 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:44:46 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:46 INFO mapred.TaskRunner: Task 'reduce_6dxbxz' done.
    [junit] 08/12/30 19:44:46 INFO mapred.TaskRunner: Saved output of task 'reduce_6dxbxz' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1466309530
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:47 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:47 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60475.xml
    [junit] 08/12/30 19:44:48 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:44:48 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10004
    [junit] 08/12/30 19:44:48 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:48 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:48 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:44:48 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:48 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:49 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:49 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10004 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10004/reduce_6dxbxz
    [junit] 08/12/30 19:44:49 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:49 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:49 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:44:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:49 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:49 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/222232298/302270672.10004/reduce_6dxbxz:0+11875
    [junit] 08/12/30 19:44:49 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:49 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2143036169
    [junit] 08/12/30 19:44:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:49 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:44:49 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:44:49 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:49 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:49 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:49 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:49 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:49 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:49 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:49 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:44:49 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:49 INFO mapred.TaskRunner: Task 'reduce_d4hd5d' done.
    [junit] 08/12/30 19:44:49 INFO mapred.TaskRunner: Saved output of task 'reduce_d4hd5d' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-2143036169
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:44:49 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:49 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby8.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby8.q.out
    [junit] Done query: groupby8.q
    [junit] Begin query: input2_limit.q
    [junit] plan = /tmp/plan60476.xml
    [junit] 08/12/30 19:44:52 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:44:52 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:44:52 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:52 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:44:53 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:53 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:44:53 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:53 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:44:53 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:53 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:44:53 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:44:53 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:44:53 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:44:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:53 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:44:53 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:44:53 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:53 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:44:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:53 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:44:53 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:44:53 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:53 INFO exec.FilterOperator: PASSED:8
    [junit] 08/12/30 19:44:53 INFO exec.FilterOperator: FILTERED:3
    [junit] 08/12/30 19:44:53 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:44:53 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:53 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp993313108
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:44:54 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:44:54 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input2_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input2_limit.q.out
    [junit] Done query: input2_limit.q
    [junit] Begin query: input3_limit.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input3_limit(TestCliDriver.java:1705)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: create_1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_create_1(TestCliDriver.java:1730)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: scriptfile1.q
    [junit] plan = /tmp/plan60477.xml
    [junit] 08/12/30 19:44:56 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:44:57 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:44:57 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:44:57 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:44:57 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:57 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:44:57 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:44:57 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:44:57 INFO exec.MapOperator: Adding alias tmap:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:44:57 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:44:57 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:44:57 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:44:57 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:57 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:57 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:44:57 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:44:57 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:57 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:44:57 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:57 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:57 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:44:57 INFO exec.ScriptOperator: Executing [/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/./src/test/scripts/testgrep]
    [junit] 08/12/30 19:44:57 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 19:44:57 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 19:44:57 INFO exec.ScriptOperator: alias=tmap:src
    [junit] 08/12/30 19:44:57 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:57 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:44:57 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:57 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:44:57 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 19:44:57 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:57 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:44:58 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:44:58 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:44:58 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1660726314
    [junit] 08/12/30 19:44:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:58 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:44:58 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:44:58 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:44:58 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:58 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:58 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:44:58 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:44:58 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:44:58 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:58 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:44:58 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:44:58 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:44:58 INFO mapred.TaskRunner: Task 'reduce_entbbp' done.
    [junit] 08/12/30 19:44:58 INFO mapred.TaskRunner: Saved output of task 'reduce_entbbp' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1660726314
    [junit] 08/12/30 19:44:58 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:44:58 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/scriptfile1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/scriptfile1.q.out
    [junit] Done query: scriptfile1.q
    [junit] Begin query: case_sensitivity.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_case_sensitivity(TestCliDriver.java:1780)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: sort.q
    [junit] plan = /tmp/plan60478.xml
    [junit] 08/12/30 19:45:01 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:45:01 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:45:01 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:01 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:01 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:45:01 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:01 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:45:01 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:01 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:45:01 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:01 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:01 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:01 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:01 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:45:01 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:01 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:01 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:02 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:02 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:45:02 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:02 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1892394453
    [junit] 08/12/30 19:45:02 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:02 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:02 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:45:02 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:45:02 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:02 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:45:02 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:45:02 INFO mapred.TaskRunner: Task 'reduce_9uhlv' done.
    [junit] 08/12/30 19:45:02 INFO mapred.TaskRunner: Saved output of task 'reduce_9uhlv' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1892394453
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:45:02 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:45:02 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sort.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/sort.q.out
    [junit] Done query: sort.q
    [junit] Begin query: mapreduce2.q
    [junit] plan = /tmp/plan60479.xml
    [junit] 08/12/30 19:45:05 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:45:05 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:45:05 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:05 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:05 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:45:05 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:05 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:45:06 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:06 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:45:06 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:06 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:06 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:06 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:06 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:06 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:45:06 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:45:06 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:06 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:45:06 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:45:06 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:45:06 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:45:06 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 19:45:06 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 19:45:06 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 19:45:06 INFO exec.ScriptOperator: alias=src
    [junit] 08/12/30 19:45:06 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:06 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:06 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:06 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:45:06 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 19:45:06 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:06 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:06 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:45:06 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:06 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp879187018
    [junit] 08/12/30 19:45:06 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:45:06 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:45:06 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:45:06 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:45:06 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:06 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:06 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:06 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:06 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:45:06 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:45:06 INFO mapred.TaskRunner: Task 'reduce_2ymfpb' done.
    [junit] 08/12/30 19:45:06 INFO mapred.TaskRunner: Saved output of task 'reduce_2ymfpb' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp879187018
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:45:06 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:45:06 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60480.xml
    [junit] 08/12/30 19:45:08 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:45:08 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1
    [junit] 08/12/30 19:45:08 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:08 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:08 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:45:08 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:08 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:45:08 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:08 INFO exec.MapOperator: Adding alias t:dest1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/reduce_2ymfpb
    [junit] 08/12/30 19:45:09 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:09 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:09 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:09 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:09 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:09 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:09 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:45:09 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
    [junit] 08/12/30 19:45:09 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
    [junit] 08/12/30 19:45:09 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:09 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:09 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:09 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/reduce_2ymfpb:0+8228
    [junit] 08/12/30 19:45:09 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:09 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1263670961
    [junit] 08/12/30 19:45:09 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
    [junit] 08/12/30 19:45:09 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
    [junit] 08/12/30 19:45:09 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:45:09 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:45:09 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:09 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:09 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:09 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:09 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:45:09 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:45:09 INFO mapred.TaskRunner: Task 'reduce_iufv63' done.
    [junit] 08/12/30 19:45:09 INFO mapred.TaskRunner: Saved output of task 'reduce_iufv63' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1263670961
    [junit] 08/12/30 19:45:09 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:45:09 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/mapreduce2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/mapreduce2.q.out
    [junit] Done query: mapreduce2.q
    [junit] Begin query: mapreduce4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_mapreduce4(TestCliDriver.java:1855)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: nullinput.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_nullinput(TestCliDriver.java:1880)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: mapreduce6.q
    [junit] plan = /tmp/plan60481.xml
    [junit] 08/12/30 19:45:12 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:45:12 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:45:12 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:12 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:12 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:12 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:13 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:45:13 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:13 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:45:13 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:13 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:13 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:13 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:13 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:13 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:13 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:45:13 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 19:45:13 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 19:45:13 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:13 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:13 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:13 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:45:13 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:13 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1718548026
    [junit] 08/12/30 19:45:13 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 19:45:13 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 19:45:13 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:45:13 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:45:13 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:13 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:13 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:13 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:13 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:45:13 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:45:13 INFO mapred.TaskRunner: Task 'reduce_qt4kj8' done.
    [junit] 08/12/30 19:45:13 INFO mapred.TaskRunner: Saved output of task 'reduce_qt4kj8' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1718548026
    [junit] 08/12/30 19:45:14 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:45:14 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/mapreduce6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/mapreduce6.q.out
    [junit] Done query: mapreduce6.q
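[Editor's note: the `diff -I \(file:\)\|\(/tmp/.*\)` command the harness logs after each query compares the actual `.q.out` against the checked-in expected output while ignoring hunks whose changed lines all match the regex, i.e. machine-specific `file:` URIs and `/tmp/` scratch paths. A small demonstration with throwaway files (paths are the editor's, not the harness's):]

```shell
#!/bin/sh
# Two outputs that differ only in a machine-specific file: URI line.
printf 'a\nfile:/build/x\nb\n' > /tmp/actual.out
printf 'a\nfile:/other/y\nb\n' > /tmp/expected.out

# -I suppresses hunks in which every changed line matches the basic
# regex, so the path difference is ignored and diff exits 0.
diff -I '\(file:\)\|\(/tmp/.*\)' /tmp/actual.out /tmp/expected.out && echo MATCH
```

This is why the logged comparisons pass even though each machine embeds its own build paths in the query output.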
    [junit] Begin query: sample1.q
    [junit] plan = /tmp/plan60482.xml
    [junit] 08/12/30 19:45:16 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:45:16 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11
    [junit] 08/12/30 19:45:16 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:16 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:17 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:17 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:17 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:45:17 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:17 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt
    [junit] 08/12/30 19:45:17 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 19:45:17 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:17 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:17 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:45:17 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:45:17 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:45:17 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:45:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:17 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:17 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:45:17 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:45:17 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:17 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:17 INFO exec.FilterOperator: PASSED:500
    [junit] 08/12/30 19:45:17 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 19:45:17 INFO exec.FilterOperator: PASSED:500
    [junit] 08/12/30 19:45:17 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 19:45:17 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt:0+5812
    [junit] 08/12/30 19:45:17 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:17 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp561166710
    [junit] 08/12/30 19:45:18 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:45:18 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/sample1.q.out
    [junit] Done query: sample1.q
    [junit] Begin query: sample3.q
    [junit] plan = /tmp/plan60483.xml
    [junit] 08/12/30 19:45:20 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:45:20 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket
    [junit] 08/12/30 19:45:20 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:20 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:21 INFO mapred.FileInputFormat: Total input paths to process : 2
    [junit] 08/12/30 19:45:21 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:21 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:45:21 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 19:45:21 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:21 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:21 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:21 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:21 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:21 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:21 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:21 INFO exec.FilterOperator: PASSED:98
    [junit] 08/12/30 19:45:21 INFO exec.FilterOperator: FILTERED:402
    [junit] 08/12/30 19:45:21 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
    [junit] 08/12/30 19:45:21 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:21 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1119457404
    [junit] 08/12/30 19:45:21 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:45:21 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv2.txt
    [junit] 08/12/30 19:45:21 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:21 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:21 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:21 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:21 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:45:21 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:21 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:21 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:21 INFO exec.FilterOperator: PASSED:207
    [junit] 08/12/30 19:45:21 INFO exec.FilterOperator: FILTERED:793
    [junit] 08/12/30 19:45:21 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv2.txt:0+5791
    [junit] 08/12/30 19:45:21 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
    [junit] 08/12/30 19:45:21 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1119457404
    [junit] 08/12/30 19:45:22 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:45:22 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/sample3.q.out
    [junit] Done query: sample3.q
    [junit] Begin query: groupby1_map.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby1_map(TestCliDriver.java:1980)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: inputddl2.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/inputddl2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/inputddl2.q.out
    [junit] Done query: inputddl2.q
    [junit] Begin query: groupby1_limit.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby1_limit(TestCliDriver.java:2030)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: sample5.q
    [junit] plan = /tmp/plan60484.xml
    [junit] 08/12/30 19:45:26 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:45:26 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket
    [junit] 08/12/30 19:45:26 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:26 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:26 INFO mapred.FileInputFormat: Total input paths to process : 2
    [junit] 08/12/30 19:45:26 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:26 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:45:26 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 19:45:26 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:26 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:26 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:26 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:26 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:45:26 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:26 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:26 INFO exec.FilterOperator: PASSED:98
    [junit] 08/12/30 19:45:26 INFO exec.FilterOperator: FILTERED:402
    [junit] 08/12/30 19:45:26 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
    [junit] 08/12/30 19:45:26 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:26 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp517161350
    [junit] 08/12/30 19:45:26 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:45:26 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv2.txt
    [junit] 08/12/30 19:45:26 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:26 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:26 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:26 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:26 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:26 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:45:26 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:27 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:27 INFO exec.FilterOperator: PASSED:207
    [junit] 08/12/30 19:45:27 INFO exec.FilterOperator: FILTERED:793
    [junit] 08/12/30 19:45:27 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket/kv2.txt:0+5791
    [junit] 08/12/30 19:45:27 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
    [junit] 08/12/30 19:45:27 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp517161350
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:45:27 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:45:27 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/sample5.q.out
    [junit] Done query: sample5.q
    [junit] Begin query: groupby3_map.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby3_map(TestCliDriver.java:2080)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: groupby2_limit.q
    [junit] plan = /tmp/plan60485.xml
    [junit] 08/12/30 19:45:30 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 31
    [junit] 08/12/30 19:45:30 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:45:30 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:30 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:30 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:45:30 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:30 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:45:30 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:30 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:45:30 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:30 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:30 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:30 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:30 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:45:30 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:30 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:31 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:31 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:31 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:45:31 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:31 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1673523259
    [junit] 08/12/30 19:45:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:31 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:45:31 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:45:31 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:31 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:45:31 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:45:31 INFO mapred.TaskRunner: Task 'reduce_k5gbir' done.
    [junit] 08/12/30 19:45:31 INFO mapred.TaskRunner: Saved output of task 'reduce_k5gbir' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1673523259
    [junit] 08/12/30 19:45:31 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:45:31 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60486.xml
    [junit] 08/12/30 19:45:33 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 31
    [junit] 08/12/30 19:45:33 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/891093564/1250965928.10002
    [junit] 08/12/30 19:45:33 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:33 WARN exec.ExecDriver: Number of reduce tasks inferred based on input size to : 1
    [junit] 08/12/30 19:45:33 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:33 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:33 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:45:34 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:34 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/891093564/1250965928.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/891093564/1250965928.10002/reduce_k5gbir
    [junit] 08/12/30 19:45:34 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:34 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:34 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:45:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:34 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:34 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/891093564/1250965928.10002/reduce_k5gbir:0+11875
    [junit] 08/12/30 19:45:34 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:34 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-319014862
    [junit] 08/12/30 19:45:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:34 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:45:34 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:45:34 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:34 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:34 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:45:34 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:45:34 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:34 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:45:34 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:34 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:45:34 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:45:34 INFO mapred.TaskRunner: Task 'reduce_8wz0pl' done.
    [junit] 08/12/30 19:45:34 INFO mapred.TaskRunner: Saved output of task 'reduce_8wz0pl' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-319014862
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:45:34 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:45:34 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby2_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby2_limit.q.out
    [junit] Done query: groupby2_limit.q
    [junit] Begin query: inputddl4.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/inputddl4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/inputddl4.q.out
    [junit] Done query: inputddl4.q
    [junit] Begin query: sample7.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample7(TestCliDriver.java:2155)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: groupby5_map.q
    [junit] plan = /tmp/plan60487.xml
    [junit] 08/12/30 19:45:38 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 19:45:39 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:45:39 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:39 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:39 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:39 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:39 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:45:39 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:39 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:45:39 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:39 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:39 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:39 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:39 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:39 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:45:39 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:45:39 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:39 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:45:39 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:45:39 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:45:39 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:45:39 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:39 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:39 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:39 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:45:39 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:39 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-717326559
    [junit] 08/12/30 19:45:40 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:45:40 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:45:40 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:45:40 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:45:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:40 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:40 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:45:40 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:45:40 INFO mapred.TaskRunner: Task 'reduce_rko5gl' done.
    [junit] 08/12/30 19:45:40 INFO mapred.TaskRunner: Saved output of task 'reduce_rko5gl' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-717326559
    [junit] 08/12/30 19:45:40 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:45:40 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby5_map.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby5_map.q.out
    [junit] Done query: groupby5_map.q
    [junit] Begin query: inputddl6.q
    [junit] Exception: Client Execution failed with error code = 9
    [junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
    [junit]     at junit.framework.Assert.fail(Assert.java:47)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl6(TestCliDriver.java:2208)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: input16_cc.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input16_cc(TestCliDriver.java:2230)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: cast1.q
    [junit] plan = /tmp/plan60488.xml
    [junit] 08/12/30 19:45:44 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:45:45 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:45:45 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:45 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:45 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:45 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:45:45 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:45 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:45:45 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:45 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:45 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:45 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:45 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:45 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:45:45 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:45:45 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:45 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:45 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:45 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:45 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:45 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:45 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:45 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:45:45 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:45 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:45 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:45 INFO exec.FilterOperator: FILTERED:499
    [junit] 08/12/30 19:45:45 INFO exec.FilterOperator: PASSED:1
    [junit] 08/12/30 19:45:45 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:45:45 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:45 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1271130844
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:45:46 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:45:46 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/cast1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/cast1.q.out
    [junit] Done query: cast1.q
    [junit] Begin query: inputddl8.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/inputddl8.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/inputddl8.q.out
    [junit] Done query: inputddl8.q
    [junit] Begin query: quote1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_quote1(TestCliDriver.java:2305)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: join0.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join0(TestCliDriver.java:2330)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: notable_alias2.q
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_notable_alias2(TestCliDriver.java:2355)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: input1.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input1(TestCliDriver.java:2380)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] Begin query: cluster.q
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_cluster(TestCliDriver.java:2405)
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: join2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join2(TestCliDriver.java:2430)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input3.q
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input3(TestCliDriver.java:2455)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: join4.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join4(TestCliDriver.java:2480)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] Begin query: input5.q
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input5(TestCliDriver.java:2505)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: join6.q
    [junit] plan = /tmp/plan60489.xml
    [junit] 08/12/30 19:45:52 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:45:52 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:45:52 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:52 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:45:52 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:52 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:45:53 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.MapOperator: Adding alias c:a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:45:53 INFO exec.MapOperator: Adding alias c:b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:45:53 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:53 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:53 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:53 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 19:45:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:53 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:45:53 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:53 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:53 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:53 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:45:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:53 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:45:53 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:53 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:53 INFO exec.FilterOperator: PASSED:9
    [junit] 08/12/30 19:45:53 INFO exec.FilterOperator: FILTERED:491
    [junit] 08/12/30 19:45:53 INFO exec.FilterOperator: PASSED:7
    [junit] 08/12/30 19:45:53 INFO exec.FilterOperator: FILTERED:493
    [junit] 08/12/30 19:45:53 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:45:53 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:53 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp580268533
    [junit] 08/12/30 19:45:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:45:53 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:53 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:53 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 19:45:53 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:45:53 INFO mapred.TaskRunner: Task 'reduce_f2z7ai' done.
    [junit] 08/12/30 19:45:53 INFO mapred.TaskRunner: Saved output of task 'reduce_f2z7ai' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp580268533
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:45:53 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:45:53 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join6.q.out
    [junit] Done query: join6.q
    [junit] Begin query: input_testxpath3.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_testxpath3(TestCliDriver.java:2555)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] Begin query: input_dynamicserde.q
    [junit] plan = /tmp/plan60490.xml
    [junit] 08/12/30 19:45:56 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:45:56 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift
    [junit] 08/12/30 19:45:56 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:56 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:57 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:57 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:45:57 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:45:57 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:45:57 INFO exec.MapOperator: Adding alias src_thrift to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift/complex.seq
    [junit] 08/12/30 19:45:57 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:45:57 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:45:57 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:45:57 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:57 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:57 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:45:57 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:45:57 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:45:57 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:57 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:45:57 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:45:57 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:45:57 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift/complex.seq:0+1491
    [junit] 08/12/30 19:45:57 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:45:57 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1297721480
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:45:58 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:45:58 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60491.xml
    [junit] 08/12/30 19:45:59 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:45:59 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1
    [junit] 08/12/30 19:45:59 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:45:59 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:45:59 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:46:00 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:00 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:46:00 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:00 INFO exec.MapOperator: Adding alias dest1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/job_local_1_map_0000
    [junit] 08/12/30 19:46:00 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:00 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:00 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:00 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:00 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:00 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:00 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:00 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:00 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:00 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/job_local_1_map_0000:0+532
    [junit] 08/12/30 19:46:00 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:00 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1553450543
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:46:01 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:46:01 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_dynamicserde.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_dynamicserde.q.out
    [junit] Done query: input_dynamicserde.q
    [junit] Begin query: input7.q
    [junit] plan = /tmp/plan60492.xml
    [junit] 08/12/30 19:46:03 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:46:03 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1
    [junit] 08/12/30 19:46:03 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:03 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:03 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:46:04 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:04 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:46:04 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:04 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt
    [junit] 08/12/30 19:46:04 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:04 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:04 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:04 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:04 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:04 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:04 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:04 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:04 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:04 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:04 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:04 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:04 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:04 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:04 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:04 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt:0+216
    [junit] 08/12/30 19:46:04 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:04 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp355374469
    [junit] 08/12/30 19:46:05 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:46:05 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input7.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input7.q.out
    [junit] Done query: input7.q
    [junit] Begin query: input_testsequencefile.q
    [junit] plan = /tmp/plan60493.xml
    [junit] 08/12/30 19:46:07 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:46:07 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:46:07 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:07 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:08 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:46:08 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:08 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:46:08 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 19:46:08 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 19:46:08 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:08 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:46:08 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:08 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:08 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:08 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:08 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:08 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:08 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:08 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:08 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:08 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:08 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:08 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:08 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:46:08 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:08 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-670049282
    [junit] 08/12/30 19:46:09 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:46:09 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testsequencefile.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_testsequencefile.q.out
    [junit] Done query: input_testsequencefile.q
    [junit] Begin query: join8.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join8(TestCliDriver.java:2655)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input9.q
    [junit] plan = /tmp/plan60494.xml
    [junit] 08/12/30 19:46:12 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:46:12 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1
    [junit] 08/12/30 19:46:12 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:12 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:12 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:46:12 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:12 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:46:12 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:12 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt
    [junit] 08/12/30 19:46:12 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:12 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:12 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:12 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:12 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:12 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:12 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:12 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:12 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:12 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:12 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:12 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:12 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:12 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:12 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:12 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:12 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:12 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:12 INFO exec.FilterOperator: FILTERED:25
    [junit] 08/12/30 19:46:12 INFO exec.FilterOperator: PASSED:0
    [junit] 08/12/30 19:46:12 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src1/kv3.txt:0+216
    [junit] 08/12/30 19:46:12 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:12 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1987955842
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:46:13 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:46:13 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input9.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input9.q.out
    [junit] Done query: input9.q
    [junit] Begin query: input_dfs.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_dfs.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_dfs.q.out
    [junit] Done query: input_dfs.q
    [junit] Begin query: input.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input.q.out
    [junit] Done query: input.q
    [junit] Begin query: udf1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udf1(TestCliDriver.java:2755)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: join10.q
    [junit] plan = /tmp/plan60495.xml
    [junit] 08/12/30 19:46:18 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:46:19 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:46:19 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:19 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:19 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:19 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:19 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:46:19 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:19 INFO exec.MapOperator: Adding alias x:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:46:19 INFO exec.MapOperator: Adding alias y:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:46:19 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:19 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:19 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:19 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:19 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 19:46:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:19 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:19 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:19 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:19 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:19 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:46:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:19 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:19 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:19 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:46:20 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:20 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp482211992
    [junit] 08/12/30 19:46:20 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:20 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:20 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 19:46:20 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 19:46:20 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:20 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:20 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:20 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:20 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 19:46:20 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:46:20 INFO mapred.TaskRunner: Task 'reduce_btmxjs' done.
    [junit] 08/12/30 19:46:20 INFO mapred.TaskRunner: Saved output of task 'reduce_btmxjs' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp482211992
    [junit] 08/12/30 19:46:20 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:46:20 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join10.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join10.q.out
    [junit] Done query: join10.q
    [junit] Begin query: input11.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input11(TestCliDriver.java:2805)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: noalias_subq1.q
    [junit] plan = /tmp/plan60496.xml
    [junit] 08/12/30 19:46:23 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:46:23 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:46:23 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:23 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:23 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:23 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:23 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:46:23 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:23 INFO exec.MapOperator: Adding alias x:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:46:23 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:23 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:23 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:23 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:23 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:23 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:23 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:23 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:23 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:23 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:23 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:23 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:23 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:23 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:23 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:23 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:46:23 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:46:23 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:46:23 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:23 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1698580569
    [junit] 08/12/30 19:46:24 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:46:24 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/noalias_subq1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/noalias_subq1.q.out
    [junit] Done query: noalias_subq1.q
    [junit] Begin query: udf3.q
    [junit] plan = /tmp/plan60497.xml
    [junit] 08/12/30 19:46:27 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:46:27 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:46:27 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:27 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:27 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:27 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:27 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:46:28 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:28 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:46:28 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:28 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:28 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:28 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:28 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:28 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:28 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:46:28 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:46:28 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:46:28 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:28 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:28 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:28 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:46:28 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:28 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp953358224
    [junit] 08/12/30 19:46:28 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:46:28 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:46:28 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:46:28 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:46:28 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:28 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:46:28 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:46:28 INFO mapred.TaskRunner: Task 'reduce_vi1zwi' done.
    [junit] 08/12/30 19:46:28 INFO mapred.TaskRunner: Saved output of task 'reduce_vi1zwi' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp953358224
    [junit] 08/12/30 19:46:28 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:46:28 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60498.xml
    [junit] 08/12/30 19:46:30 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:46:30 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/247117564/638849060.10001
    [junit] 08/12/30 19:46:30 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:30 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:30 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:31 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:31 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:46:31 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:31 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/247117564/638849060.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/247117564/638849060.10001/reduce_vi1zwi
    [junit] 08/12/30 19:46:31 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:31 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:31 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:46:31 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:46:31 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:46:31 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:31 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/247117564/638849060.10001/reduce_vi1zwi:0+124
    [junit] 08/12/30 19:46:31 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:31 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1921448718
    [junit] 08/12/30 19:46:31 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:46:31 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 19:46:31 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:46:31 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:46:31 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:31 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:31 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:31 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:31 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:46:31 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:46:31 INFO mapred.TaskRunner: Task 'reduce_1qfm5o' done.
    [junit] 08/12/30 19:46:31 INFO mapred.TaskRunner: Saved output of task 'reduce_1qfm5o' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1921448718
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:46:32 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:46:32 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/udf3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/udf3.q.out
    [junit] Done query: udf3.q
    [junit] Begin query: join12.q
    [junit] plan = /tmp/plan60499.xml
    [junit] 08/12/30 19:46:34 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:46:35 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:46:35 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:35 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:35 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:46:35 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:35 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:46:35 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.MapOperator: Adding alias src2:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:46:35 INFO exec.MapOperator: Adding alias src1:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:46:35 INFO exec.MapOperator: Adding alias src3:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:46:35 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:35 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:35 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:46:35 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:35 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:35 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:35 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:35 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:35 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 19:46:35 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:35 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:35 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:35 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:35 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:35 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:35 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.ReduceSinkOperator: Using tag = 2
    [junit] 08/12/30 19:46:35 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:35 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:35 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:35 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:35 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:35 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:46:35 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:46:35 INFO exec.FilterOperator: FILTERED:436
    [junit] 08/12/30 19:46:35 INFO exec.FilterOperator: PASSED:64
    [junit] 08/12/30 19:46:35 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:46:35 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:35 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1816098626
    [junit] 08/12/30 19:46:35 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:35 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:35 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:35 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:35 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:35 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 19:46:36 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:46:36 INFO mapred.TaskRunner: Task 'reduce_x27eg3' done.
    [junit] 08/12/30 19:46:36 INFO mapred.TaskRunner: Saved output of task 'reduce_x27eg3' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1816098626
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:46:36 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:46:36 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join12.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join12.q.out
    [junit] Done query: join12.q
    [junit] Begin query: input_testxpath.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_testxpath(TestCliDriver.java:2905)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input13.q
    [junit] plan = /tmp/plan60500.xml
    [junit] 08/12/30 19:46:39 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:46:39 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:46:39 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:39 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:39 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:39 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:40 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:46:40 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:46:40 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:40 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:40 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:40 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: FILTERED:395
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: PASSED:105
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: FILTERED:397
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: PASSED:103
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: FILTERED:292
    [junit] 08/12/30 19:46:40 INFO exec.FilterOperator: PASSED:208
    [junit] 08/12/30 19:46:40 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:46:40 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:40 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp290426236
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:46:40 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:46:40 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input13.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input13.q.out
    [junit] Done query: input13.q
    [junit] Begin query: udf5.q
    [junit] plan = /tmp/plan60501.xml
    [junit] 08/12/30 19:46:44 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:46:44 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:46:44 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:44 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:44 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:46:44 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:44 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:46:44 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:44 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:46:44 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:44 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:44 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:44 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:44 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:44 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:44 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:44 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:44 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:44 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:44 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:44 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:44 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:44 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:44 INFO exec.FilterOperator: FILTERED:499
    [junit] 08/12/30 19:46:44 INFO exec.FilterOperator: PASSED:1
    [junit] 08/12/30 19:46:44 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:46:44 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:44 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1210781005
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:46:45 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:46:45 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60502.xml
    [junit] 08/12/30 19:46:47 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:46:47 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1
    [junit] 08/12/30 19:46:47 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:47 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:47 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:46:47 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:47 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:46:47 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:47 INFO exec.MapOperator: Adding alias dest1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/job_local_1_map_0000
    [junit] 08/12/30 19:46:47 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:47 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:47 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:47 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:47 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:47 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:47 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/dest1/job_local_1_map_0000:0+8
    [junit] 08/12/30 19:46:47 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:47 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp80660779
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:46:48 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:46:48 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/udf5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/udf5.q.out
    [junit] Done query: udf5.q
    [junit] Begin query: join14.q
    [junit] plan = /tmp/plan60503.xml
    [junit] 08/12/30 19:46:51 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:46:52 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11
    [junit] 08/12/30 19:46:52 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12
    [junit] 08/12/30 19:46:52 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:46:52 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:52 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:46:52 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:46:52 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:46:52 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:52 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:46:52 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:52 INFO exec.MapOperator: Adding alias srcpart to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt
    [junit] 08/12/30 19:46:52 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 19:46:52 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:52 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:52 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:52 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:52 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:52 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:52 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:52 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:46:52 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:52 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:52 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:52 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:52 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:52 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:52 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 19:46:52 INFO exec.FilterOperator: PASSED:500
    [junit] 08/12/30 19:46:52 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt:0+5812
    [junit] 08/12/30 19:46:52 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:52 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1652474160
    [junit] 08/12/30 19:46:52 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:46:53 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.MapOperator: Adding alias srcpart to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt
    [junit] 08/12/30 19:46:53 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 19:46:53 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:53 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:53 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:46:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:53 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:53 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:53 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:53 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 19:46:53 INFO exec.FilterOperator: PASSED:1000
    [junit] 08/12/30 19:46:53 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt:0+5812
    [junit] 08/12/30 19:46:53 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
    [junit] 08/12/30 19:46:53 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1652474160
    [junit] 08/12/30 19:46:53 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:46:53 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:46:53 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:53 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:53 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:46:53 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 19:46:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:53 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:53 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:53 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:46:53 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:53 INFO exec.FilterOperator: FILTERED:86
    [junit] 08/12/30 19:46:53 INFO exec.FilterOperator: PASSED:414
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:46:53 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:46:53 INFO mapred.TaskRunner: Task 'job_local_1_map_0002' done.
    [junit] 08/12/30 19:46:53 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0002' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1652474160
    [junit] 08/12/30 19:46:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:53 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:53 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:53 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 19:46:53 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:46:53 INFO mapred.TaskRunner: Task 'reduce_5jj1sj' done.
    [junit] 08/12/30 19:46:53 INFO mapred.TaskRunner: Saved output of task 'reduce_5jj1sj' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1652474160
    [junit] 08/12/30 19:46:54 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:46:54 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join14.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join14.q.out
    [junit] Done query: join14.q
    [junit] Begin query: input_part0.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_part0.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_part0.q.out
    [junit] Done query: input_part0.q
    [junit] Begin query: input15.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input15(TestCliDriver.java:3030)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
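[Editor's note: the `Unable to delete directory` failure above is raised by `Warehouse.deleteDir` when the first delete attempt fails, which is what the thread attributes to intermittent filer behavior. A minimal sketch of the kind of retry wrapper that could ride out such transient delete failures — this is a hypothetical helper (`deleteWithRetry`), not Hive code:]

```java
import java.io.File;

public class RetryDelete {
    // Try to delete a directory tree several times, sleeping between
    // attempts, to tolerate transient filesystem (e.g. NFS filer) errors.
    // Hypothetical helper for illustration; not part of Hive.
    public static boolean deleteWithRetry(File dir, int attempts, long sleepMs)
            throws InterruptedException {
        for (int i = 0; i < attempts; i++) {
            if (deleteRecursively(dir)) {
                return true;
            }
            Thread.sleep(sleepMs);
        }
        // A concurrent cleanup may have removed it for us.
        return !dir.exists();
    }

    // Depth-first delete: children first, then the directory itself.
    private static boolean deleteRecursively(File f) {
        File[] children = f.listFiles();
        if (children != null) {
            for (File c : children) {
                deleteRecursively(c);
            }
        }
        return f.delete();
    }

    public static void main(String[] args) throws Exception {
        File tmp = new File(System.getProperty("java.io.tmpdir"), "retry-delete-demo");
        new File(tmp, "sub").mkdirs();
        System.out.println(deleteWithRetry(tmp, 3, 100L));
    }
}
```

[On a healthy local disk the delete succeeds on the first attempt and `main` prints `true`; on a flaky filer the extra attempts give the mount time to release the directory.]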
    [junit] Begin query: join16.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join16(TestCliDriver.java:3055)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input_part2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_part2(TestCliDriver.java:3080)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input17.q
    [junit] plan = /tmp/plan60504.xml
    [junit] 08/12/30 19:46:58 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:46:59 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift
    [junit] 08/12/30 19:46:59 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:46:59 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:46:59 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:59 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:46:59 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:46:59 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:46:59 INFO exec.MapOperator: Adding alias tmap:src_thrift to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift/complex.seq
    [junit] 08/12/30 19:46:59 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:46:59 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:46:59 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:59 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:46:59 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:46:59 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:59 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:46:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:59 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:46:59 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 19:46:59 INFO exec.ScriptOperator: tablename=src_thrift
    [junit] 08/12/30 19:46:59 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 19:46:59 INFO exec.ScriptOperator: alias=tmap:src_thrift
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:59 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:46:59 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:59 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:46:59 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 19:46:59 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:59 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:46:59 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src_thrift/complex.seq:0+1491
    [junit] 08/12/30 19:46:59 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:46:59 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1818472828
    [junit] 08/12/30 19:46:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:46:59 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:46:59 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:46:59 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:59 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:46:59 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:47:00 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:47:00 INFO mapred.TaskRunner: Task 'reduce_ajcqkt' done.
    [junit] 08/12/30 19:47:00 INFO mapred.TaskRunner: Saved output of task 'reduce_ajcqkt' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1818472828
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:47:00 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:47:00 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input17.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input17.q.out
    [junit] Done query: input17.q
    [junit] Begin query: groupby1.q
    [junit] plan = /tmp/plan60505.xml
    [junit] 08/12/30 19:47:03 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:47:03 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:47:03 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:03 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:03 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:03 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:03 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:47:03 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:03 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:47:03 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:03 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:47:03 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:47:03 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:03 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:47:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:04 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:47:04 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:04 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:47:04 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:04 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp791132221
    [junit] 08/12/30 19:47:04 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:04 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:04 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:47:04 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:47:04 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:04 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:47:04 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:47:04 INFO mapred.TaskRunner: Task 'reduce_mkq65u' done.
    [junit] 08/12/30 19:47:04 INFO mapred.TaskRunner: Saved output of task 'reduce_mkq65u' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp791132221
    [junit] 08/12/30 19:47:04 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:47:04 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60506.xml
    [junit] 08/12/30 19:47:06 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:47:06 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1258766110/143636603.10001
    [junit] 08/12/30 19:47:06 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:06 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:06 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:47:06 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:06 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:47:07 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:07 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1258766110/143636603.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1258766110/143636603.10001/reduce_mkq65u
    [junit] 08/12/30 19:47:07 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:07 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:07 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:47:07 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:07 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:07 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:07 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/1258766110/143636603.10001/reduce_mkq65u:0+11875
    [junit] 08/12/30 19:47:07 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:07 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1905067233
    [junit] 08/12/30 19:47:07 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:07 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:07 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:47:07 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:47:07 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:07 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:07 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:07 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:07 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:07 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:07 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:07 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:47:07 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:47:07 INFO mapred.TaskRunner: Task 'reduce_6fqk7b' done.
    [junit] 08/12/30 19:47:07 INFO mapred.TaskRunner: Saved output of task 'reduce_6fqk7b' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1905067233
    [junit] 08/12/30 19:47:07 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:47:07 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby1.q.out
    [junit] Done query: groupby1.q
    [junit] Begin query: join18.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join18(TestCliDriver.java:3155)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input_part4.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_part4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/input_part4.q.out
    [junit] Done query: input_part4.q
    [junit] Begin query: input19.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input19(TestCliDriver.java:3205)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: groupby3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby3(TestCliDriver.java:3230)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: subq.q
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_subq(TestCliDriver.java:3255)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: union2.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_union2(TestCliDriver.java:3280)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input_part6.q
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_part6(TestCliDriver.java:3305)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: groupby5.q
    [junit] plan = /tmp/plan60507.xml
    [junit] 08/12/30 19:47:13 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:47:13 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:47:13 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:13 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:13 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:13 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:13 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:47:13 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:13 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:47:13 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:13 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:47:13 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:47:13 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:13 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:47:13 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:13 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:13 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:47:13 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:14 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:47:14 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:14 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1953371683
    [junit] 08/12/30 19:47:14 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:14 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:14 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:47:14 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:47:14 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:14 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:47:14 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:47:14 INFO mapred.TaskRunner: Task 'reduce_1umq7z' done.
    [junit] 08/12/30 19:47:14 INFO mapred.TaskRunner: Saved output of task 'reduce_1umq7z' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1953371683
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:47:14 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:47:14 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60508.xml
    [junit] 08/12/30 19:47:16 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:47:16 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/823496377/423046779.10001
    [junit] 08/12/30 19:47:16 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:16 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:16 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:47:16 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:16 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:47:16 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:16 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/823496377/423046779.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/823496377/423046779.10001/reduce_1umq7z
    [junit] 08/12/30 19:47:16 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:16 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:16 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:47:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:16 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:17 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/823496377/423046779.10001/reduce_1umq7z:0+11875
    [junit] 08/12/30 19:47:17 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:17 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp735337091
    [junit] 08/12/30 19:47:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:17 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:47:17 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:47:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:17 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:17 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:47:17 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:47:17 INFO mapred.TaskRunner: Task 'reduce_pfnu93' done.
    [junit] 08/12/30 19:47:17 INFO mapred.TaskRunner: Saved output of task 'reduce_pfnu93' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp735337091
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:47:17 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:47:17 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby5.q.out
    [junit] Done query: groupby5.q
    [junit] Begin query: groupby7.q
    [junit] plan = /tmp/plan60509.xml
    [junit] 08/12/30 19:47:20 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:47:20 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:47:20 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:20 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:20 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:47:20 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:20 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:47:21 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:21 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:47:21 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:21 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:47:21 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:47:21 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:21 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:47:21 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:21 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:21 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:21 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 19:47:21 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 19:47:21 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:47:21 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:21 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:47:21 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:21 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1264024568
    [junit] 08/12/30 19:47:21 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:21 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:21 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:47:21 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:47:21 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:21 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:47:21 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:47:21 INFO mapred.TaskRunner: Task 'reduce_kr1crc' done.
    [junit] 08/12/30 19:47:21 INFO mapred.TaskRunner: Saved output of task 'reduce_kr1crc' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1264024568
    [junit] 08/12/30 19:47:21 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:47:21 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60510.xml
    [junit] 08/12/30 19:47:23 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:47:23 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10002
    [junit] 08/12/30 19:47:23 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:23 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:23 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:47:24 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:24 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:47:24 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 19:47:24 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 19:47:24 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:24 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10002/reduce_kr1crc
    [junit] 08/12/30 19:47:24 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:24 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:24 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:47:24 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:24 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:24 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:24 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10002/reduce_kr1crc:0+13424
    [junit] 08/12/30 19:47:24 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:24 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1515800641
    [junit] 08/12/30 19:47:24 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:24 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:24 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:47:24 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:47:24 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:24 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:24 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:24 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:24 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:24 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:24 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:24 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:47:24 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:47:24 INFO mapred.TaskRunner: Task 'reduce_ts5oy9' done.
    [junit] 08/12/30 19:47:24 INFO mapred.TaskRunner: Saved output of task 'reduce_ts5oy9' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1515800641
    [junit] 08/12/30 19:47:25 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:47:25 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60511.xml
    [junit] 08/12/30 19:47:26 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:47:26 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10003
    [junit] 08/12/30 19:47:26 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:26 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:26 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:27 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:27 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:47:27 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 19:47:27 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 19:47:27 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:27 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10003 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10003/job_local_1_map_0000
    [junit] 08/12/30 19:47:27 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:27 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:27 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:47:27 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:27 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:27 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:27 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10003/job_local_1_map_0000:0+23755
    [junit] 08/12/30 19:47:27 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:27 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1898304819
    [junit] 08/12/30 19:47:27 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:27 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:27 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:47:27 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:47:27 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:27 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:47:27 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:47:27 INFO mapred.TaskRunner: Task 'reduce_u5yyp8' done.
    [junit] 08/12/30 19:47:27 INFO mapred.TaskRunner: Saved output of task 'reduce_u5yyp8' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1898304819
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:47:28 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:47:28 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60512.xml
    [junit] 08/12/30 19:47:29 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 19:47:29 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10004
    [junit] 08/12/30 19:47:29 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:29 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:29 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:30 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:30 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:47:30 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 19:47:30 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 19:47:30 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:30 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10004 to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10004/reduce_u5yyp8
    [junit] 08/12/30 19:47:30 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:30 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:30 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:47:30 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:30 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:30 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:30 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/212049751/670706395.10004/reduce_u5yyp8:0+13424
    [junit] 08/12/30 19:47:30 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:30 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp757533838
    [junit] 08/12/30 19:47:30 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:30 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:47:30 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 19:47:30 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 19:47:30 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:30 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:30 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:30 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:30 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:30 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:30 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:30 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 19:47:30 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:47:30 INFO mapred.TaskRunner: Task 'reduce_ir5rxd' done.
    [junit] 08/12/30 19:47:30 INFO mapred.TaskRunner: Saved output of task 'reduce_ir5rxd' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp757533838
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:47:31 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:47:31 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby7.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/groupby7.q.out
    [junit] Done query: groupby7.q
    [junit] Begin query: udf_testlength.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udf_testlength(TestCliDriver.java:3380)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: fileformat_text.q
    [junit] plan = /tmp/plan60513.xml
    [junit] 08/12/30 19:47:33 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:47:34 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:47:34 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:34 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:34 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:47:34 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:34 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:47:34 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:34 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:47:34 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:34 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:47:34 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:47:34 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:47:34 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:47:34 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:34 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:34 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:34 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:34 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:34 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:34 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:34 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:47:34 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:47:34 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:34 INFO exec.FilterOperator: PASSED:10
    [junit] 08/12/30 19:47:34 INFO exec.FilterOperator: FILTERED:490
    [junit] 08/12/30 19:47:34 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:47:34 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:34 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1019581609
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:47:35 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:47:35 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/fileformat_text.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/fileformat_text.q.out
    [junit] Done query: fileformat_text.q
    [junit] Begin query: fileformat_sequencefile.q
    [junit] plan = /tmp/plan60514.xml
    [junit] 08/12/30 19:47:38 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:47:39 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:47:39 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:39 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:39 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:39 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:39 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:47:39 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:39 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:47:39 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:39 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:47:39 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:47:39 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:47:39 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:47:39 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:39 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:39 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:39 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:39 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:39 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:39 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:39 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:47:39 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:47:39 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:39 INFO exec.FilterOperator: PASSED:10
    [junit] 08/12/30 19:47:39 INFO exec.FilterOperator: FILTERED:490
    [junit] 08/12/30 19:47:39 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:47:39 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:39 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp254721235
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:47:40 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:47:40 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/fileformat_sequencefile.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/fileformat_sequencefile.q.out
    [junit] Done query: fileformat_sequencefile.q
    [junit] Begin query: udf_round.q
    [junit] plan = /tmp/plan60515.xml
    [junit] 08/12/30 19:47:42 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:47:43 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:47:43 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:43 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:43 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:43 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:47:43 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:43 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:47:43 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:43 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:47:43 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:47:43 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:43 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:43 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:43 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:43 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:47:43 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:47:43 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:43 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:47:43 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:43 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:43 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:47:43 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:43 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:47:43 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:43 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-61371351
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:47:44 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:47:44 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60516.xml
    [junit] 08/12/30 19:47:46 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:47:46 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:47:46 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:46 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:46 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:46 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:46 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:47:46 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:46 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:47:46 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:46 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:47:46 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:47:46 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:46 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:46 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:46 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:46 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:47:46 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:47:46 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:46 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:47:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:46 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:47:46 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:46 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:47:46 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:46 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1043896704
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:47:47 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:47:47 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60517.xml
    [junit] 08/12/30 19:47:49 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:47:49 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:47:49 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:49 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:49 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:49 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:50 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:47:50 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:50 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:47:50 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:50 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:47:50 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:47:50 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:50 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:50 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:50 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:50 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:47:50 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:47:50 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:50 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:47:50 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:50 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:50 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:47:50 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:50 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:47:50 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:50 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp679923394
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:47:50 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:47:50 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] plan = /tmp/plan60518.xml
    [junit] 08/12/30 19:47:52 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:47:53 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:47:53 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:53 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:53 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:47:53 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:53 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:47:53 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:53 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:47:53 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:53 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:47:53 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:47:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:53 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:47:53 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:47:53 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:53 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:47:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:53 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:47:53 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:53 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:47:53 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:53 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp743107856
    [junit] 08/12/30 19:47:54 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:47:54 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] plan = /tmp/plan60519.xml
    [junit] 08/12/30 19:47:55 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 19:47:55 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:47:55 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:47:55 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:47:56 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:47:56 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:47:56 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:47:56 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:47:56 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:47:56 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:47:56 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 19:47:56 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 19:47:56 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:56 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:56 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:47:56 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:47:56 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 19:47:56 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 19:47:56 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:47:56 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 19:47:56 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:56 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:47:56 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 19:47:56 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:47:56 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:47:56 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:47:56 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1416399510
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:47:57 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:47:57 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/udf_round.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/udf_round.q.out
    [junit] Done query: udf_round.q
    [junit] Begin query: fileformat_void.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/fileformat_void.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/fileformat_void.q.out
    [junit] Done query: fileformat_void.q
    [junit] Begin query: join19.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join19.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientpositive/join19.q.out
    [junit] Done query: join19.q
    [junit] Tests run: 129, Failures: 58, Errors: 0, Time elapsed: 405.911 sec
    [junit] Test org.apache.hadoop.hive.cli.TestCliDriver FAILED
    [junit] Running org.apache.hadoop.hive.cli.TestNegativeCliDriver
    [junit] Begin query: input1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientnegative/input1.q.out
    [junit] Done query: input1.q
    [junit] Begin query: notable_alias3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias3(TestNegativeCliDriver.java:117)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: notable_alias4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias4(TestNegativeCliDriver.java:142)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input2.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientnegative/input2.q.out
    [junit] Done query: input2.q
    [junit] Begin query: bad_sample_clause.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_bad_sample_clause(TestNegativeCliDriver.java:192)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: input_testxpath4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input_testxpath4(TestNegativeCliDriver.java:217)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: invalid_tbl_name.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_invalid_tbl_name(TestNegativeCliDriver.java:242)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: union.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/union.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientnegative/union.q.out
    [junit] Done query: union.q
    [junit] Begin query: joinneg.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_joinneg(TestNegativeCliDriver.java:292)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: invalid_create_tbl1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_invalid_create_tbl1(TestNegativeCliDriver.java:317)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: invalid_create_tbl2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_invalid_create_tbl2(TestNegativeCliDriver.java:342)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: subq_insert.q
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_subq_insert(TestNegativeCliDriver.java:367)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: load_wrong_fileformat.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_load_wrong_fileformat(TestNegativeCliDriver.java:392)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: describe_xpath1.q
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='11') failed with exit code= 9
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='11') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_describe_xpath1(TestNegativeCliDriver.java:417)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: clusterbydistributeby.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clusterbydistributeby(TestNegativeCliDriver.java:442)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: describe_xpath2.q
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_describe_xpath2(TestNegativeCliDriver.java:467)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: describe_xpath3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_describe_xpath3(TestNegativeCliDriver.java:492)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: describe_xpath4.q
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_describe_xpath4(TestNegativeCliDriver.java:517)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit] Begin query: strict_pruning.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_strict_pruning(TestNegativeCliDriver.java:542)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: clusterbysortby.q
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clusterbysortby(TestNegativeCliDriver.java:567)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: fileformat_bad_class.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_bad_class(TestNegativeCliDriver.java:592)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 21 more
    [junit] Begin query: fileformat_void_input.q
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit]     at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_input(TestNegativeCliDriver.java:617)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: fileformat_void_output.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/fileformat_void_output.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/clientnegative/fileformat_void_output.q.out
    [junit] Done query: fileformat_void_output.q
    [junit] Tests run: 23, Failures: 19, Errors: 0, Time elapsed: 20.426 sec
    [junit] Test org.apache.hadoop.hive.cli.TestNegativeCliDriver FAILED
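The recurring MetaException(message:Unable to delete directory: ...) above matches the filer hypothesis discussed in this thread: on NFS-backed storage, removing a directory tree can fail while a client still holds a file open, because the server leaves `.nfsXXXX` placeholder files behind until the handle is released. A quick way to probe whether a given directory tree deletes cleanly is sketched below; this script is illustrative only (the probe names and the default path are assumptions, not part of the Hive build), but it exercises the same create/write/recursive-delete sequence that Warehouse.deleteDir performs.

```shell
# Diagnostic sketch: check that a directory tree under DIR can be
# created and recursively deleted without leftovers. On an NFS filer,
# stale handles can leave .nfsXXXX files that make the delete fail.
DIR="${1:-${TMPDIR:-/tmp}}"          # pass the warehouse path to test it
probe="$DIR/.delete_probe.$$"        # hypothetical probe directory name

mkdir -p "$probe"                    # simulate table directory creation
: > "$probe/kv1.txt"                 # simulate a loaded table file
rm -rf "$probe"                      # the operation that fails in the log

if [ -e "$probe" ]; then
  echo "delete FAILED under $DIR (stale filer handles?)"
else
  echo "delete ok under $DIR"
fi
```

Running this against build/ql/test/data/warehouse on both the filer and a local disk should show whether the delete failure is specific to the filer mount.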
    [junit] Running org.apache.hadoop.hive.ql.exec.TestExecDriver
    [junit] Beginning testMapPlan1
    [junit] Generating plan file /tmp/plan1551.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan1551.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F-452750732 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-1034274976
    [junit] plan = /tmp/plan1551.xml
    [junit] 08/12/30 19:48:28 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 0
    [junit] 08/12/30 19:48:28 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:48:28 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:48:28 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:48:28 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:48:28 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:48:28 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:48:28 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:48:28 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:48:28 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:48:28 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:48:28 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:48:28 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:29 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 19:48:29 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 19:48:29 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:48:29 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:29 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:48:29 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:48:29 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:48:29 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:48:29 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1430928179
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 19:48:29 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:48:29 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] Ended Job = job_local_1
    [junit] testMapPlan1 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
    [junit] Beginning testMapPlan2
    [junit] Generating plan file /tmp/plan1552.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan1552.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F1998953133 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F1998806799
    [junit] plan = /tmp/plan1552.xml
    [junit] 08/12/30 19:48:31 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 0
    [junit] 08/12/30 19:48:31 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:48:31 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:48:31 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:48:31 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:48:31 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:48:31 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 19:48:31 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:48:31 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:48:31 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:48:31 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:48:31 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:48:31 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:48:31 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:48:31 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:31 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:48:31 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 19:48:31 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 19:48:31 INFO exec.ScriptOperator: partname=null
    [junit] 08/12/30 19:48:31 INFO exec.ScriptOperator: alias=a
    [junit] 08/12/30 19:48:31 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:48:31 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:31 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:48:31 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:48:31 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:48:31 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 19:48:31 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:31 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:31 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:48:31 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:48:31 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp1437609533
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:48:32 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 19:48:32 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] testMapPlan2 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
    [junit] Beginning testMapRedPlan1
    [junit] Generating plan file /tmp/plan1553.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan1553.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F-1409714016 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-877707691
    [junit] plan = /tmp/plan1553.xml
    [junit] 08/12/30 19:48:33 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 19:48:33 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:48:34 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:48:34 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:48:34 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:48:34 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:48:34 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:48:34 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:48:34 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:48:34 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:48:34 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:34 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:48:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:34 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:34 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:48:34 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:48:34 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-705271556
    [junit] 08/12/30 19:48:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:34 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:34 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:48:34 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:48:34 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:34 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:48:34 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:48:34 INFO mapred.TaskRunner: Task 'reduce_o3qz98' done.
    [junit] 08/12/30 19:48:34 INFO mapred.TaskRunner: Saved output of task 'reduce_o3qz98' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-705271556
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:48:35 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:48:35 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] testMapRedPlan1 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
    [junit] Beginning testMapPlan2
    [junit] Generating plan file /tmp/plan1554.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan1554.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F-1983727710 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-686413515
    [junit] plan = /tmp/plan1554.xml
    [junit] 08/12/30 19:48:36 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 19:48:36 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:48:36 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:48:36 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:48:36 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:48:37 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 19:48:37 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:48:37 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:48:37 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:48:37 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:48:37 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:37 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:48:37 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:37 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:37 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:37 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:48:37 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:48:37 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp131007775
    [junit] 08/12/30 19:48:37 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:37 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:37 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:48:37 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:48:37 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:48:37 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:48:37 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:37 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:48:37 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:48:37 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:48:37 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:48:37 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:48:37 INFO mapred.TaskRunner: Task 'reduce_us6q8j' done.
    [junit] 08/12/30 19:48:37 INFO mapred.TaskRunner: Saved output of task 'reduce_us6q8j' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp131007775
    [junit] 08/12/30 19:48:38 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:48:38 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] testMapRedPlan2 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
    [junit] Beginning testMapPlan3
    [junit] Generating plan file /tmp/plan1555.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan1555.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F1972859496 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-315253964
    [junit] plan = /tmp/plan1555.xml
    [junit] 08/12/30 19:48:39 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 5
    [junit] 08/12/30 19:48:39 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:48:39 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src2
    [junit] 08/12/30 19:48:39 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:48:39 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:48:39 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:48:39 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:48:39 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:48:40 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:48:40 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:48:40 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:48:40 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:48:40 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:40 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 19:48:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:40 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:40 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:48:40 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:48:40 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp941076790
    [junit] 08/12/30 19:48:40 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:48:40 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:48:40 INFO exec.MapOperator: Adding alias b to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src2/kv2.txt
    [junit] 08/12/30 19:48:40 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:48:40 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:40 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 19:48:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:40 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:40 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src2/kv2.txt:0+5791
    [junit] 08/12/30 19:48:40 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
    [junit] 08/12/30 19:48:40 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0001' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp941076790
    [junit] 08/12/30 19:48:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:48:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:48:40 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:48:40 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:48:40 INFO mapred.TaskRunner: Task 'reduce_ds4bhn' done.
    [junit] 08/12/30 19:48:40 INFO mapred.TaskRunner: Saved output of task 'reduce_ds4bhn' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp941076790
    [junit] 08/12/30 19:48:40 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:48:40 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] testMapRedPlan3 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
    [junit] Beginning testMapPlan4
    [junit] Generating plan file /tmp/plan1556.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan1556.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F-752098700 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F648342823
    [junit] plan = /tmp/plan1556.xml
    [junit] 08/12/30 19:48:42 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 19:48:42 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:48:42 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:48:42 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:48:42 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:48:42 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:48:42 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:48:43 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:48:43 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:48:43 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:48:43 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:48:43 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:48:43 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:48:43 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:48:43 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:43 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:48:43 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:43 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:43 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:48:43 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 19:48:43 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 19:48:43 INFO exec.ScriptOperator: partname=null
    [junit] 08/12/30 19:48:43 INFO exec.ScriptOperator: alias=a
    [junit] 08/12/30 19:48:43 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:48:43 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:43 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:48:43 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 19:48:43 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:43 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:43 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:48:43 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:48:43 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp668374721
    [junit] 08/12/30 19:48:43 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:43 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:43 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:48:43 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:48:43 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:43 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:48:43 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:48:43 INFO mapred.TaskRunner: Task 'reduce_ltxzwn' done.
    [junit] 08/12/30 19:48:43 INFO mapred.TaskRunner: Saved output of task 'reduce_ltxzwn' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp668374721
    [junit] 08/12/30 19:48:43 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:48:43 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] testMapRedPlan4 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
    [junit] Beginning testMapPlan5
    [junit] Generating plan file /tmp/plan1557.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan1557.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F1375104176 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-1555446494
    [junit] plan = /tmp/plan1557.xml
    [junit] 08/12/30 19:48:45 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 19:48:45 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:48:45 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:48:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:48:45 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:48:45 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:48:45 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:48:45 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:48:45 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:48:45 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:48:45 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:48:45 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:48:45 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:45 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:48:45 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:45 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:45 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:48:45 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:45 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:48:45 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:48:45 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-329869742
    [junit] 08/12/30 19:48:45 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:45 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:45 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:48:45 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:48:45 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:45 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:48:46 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:48:46 INFO mapred.TaskRunner: Task 'reduce_yf5kqz' done.
    [junit] 08/12/30 19:48:46 INFO mapred.TaskRunner: Saved output of task 'reduce_yf5kqz' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-329869742
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:48:46 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 19:48:46 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] testMapRedPlan5 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
    [junit] Beginning testMapPlan6
    [junit] Generating plan file /tmp/plan1558.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hadoopcore/hadoop-0.17.1/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan1558.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-0.17.1%2Fhiveopensource_0_17_1%2Fbuild%2Fhadoopcore%2Fhadoop-0.17.1%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F1566754684 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F957709116
    [junit] plan = /tmp/plan1558.xml
    [junit] 08/12/30 19:48:47 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 19:48:48 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src
    [junit] 08/12/30 19:48:48 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/data/files/TestSerDe.jar
    [junit] 08/12/30 19:48:48 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 19:48:48 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 19:48:48 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 19:48:48 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 19:48:48 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 19:48:48 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 19:48:48 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 19:48:48 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 19:48:48 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 19:48:48 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 19:48:48 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 19:48:48 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:48 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 19:48:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:48 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 19:48:48 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 19:48:48 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 19:48:48 INFO exec.ScriptOperator: partname=null
    [junit] 08/12/30 19:48:48 INFO exec.ScriptOperator: alias=a
    [junit] 08/12/30 19:48:48 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 19:48:48 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:48 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 19:48:48 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 19:48:48 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:48 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 19:48:48 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 19:48:48 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
    [junit] 08/12/30 19:48:48 INFO mapred.TaskRunner: Saved output of task 'job_local_1_map_0000' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1891744155
    [junit] 08/12/30 19:48:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:48 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 19:48:48 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 19:48:48 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 19:48:48 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 19:48:48 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 19:48:48 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 19:48:48 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 19:48:48 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 19:48:49 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 19:48:49 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 19:48:49 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 19:48:49 INFO mapred.TaskRunner: Task 'reduce_83czoh' done.
    [junit] 08/12/30 19:48:49 INFO mapred.TaskRunner: Saved output of task 'reduce_83czoh' to file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/tmp-1891744155
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 19:48:49 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_1
    [junit] 08/12/30 19:48:49 INFO exec.ExecDriver: Ended Job = job_local_1
    [junit] testMapRedPlan6 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../data/files
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 22.508 sec
    [junit] Running org.apache.hadoop.hive.ql.exec.TestExpressionEvaluator
    [junit] ExprNodeColumnEvaluator ok
    [junit] ExprNodeFuncEvaluator ok
    [junit] testExprNodeConversionEvaluator ok
    [junit] Evaluating 1 + 2 for 10000000 times
    [junit] Evaluation finished: 0.704 seconds, 0.070 seconds/million call.
    [junit] Evaluating 1 + 2 - 3 for 10000000 times
    [junit] Evaluation finished: 1.594 seconds, 0.159 seconds/million call.
    [junit] Evaluating 1 + 2 - 3 + 4 for 10000000 times
    [junit] Evaluation finished: 2.112 seconds, 0.211 seconds/million call.
    [junit] Evaluating concat("1", "2") for 10000000 times
    [junit] Evaluation finished: 1.740 seconds, 0.174 seconds/million call.
    [junit] Evaluating concat(concat("1", "2"), "3") for 10000000 times
    [junit] Evaluation finished: 3.376 seconds, 0.338 seconds/million call.
    [junit] Evaluating concat(concat(concat("1", "2"), "3"), "4") for 10000000 times
    [junit] Evaluation finished: 5.128 seconds, 0.513 seconds/million call.
    [junit] Evaluating concat(col1[1], cola[1]) for 1000000 times
    [junit] Evaluation finished: 0.293 seconds, 0.293 seconds/million call.
    [junit] Evaluating concat(concat(col1[1], cola[1]), col1[2]) for 1000000 times
    [junit] Evaluation finished: 0.544 seconds, 0.544 seconds/million call.
    [junit] Evaluating concat(concat(concat(col1[1], cola[1]), col1[2]), cola[2]) for 1000000 times
    [junit] Evaluation finished: 0.810 seconds, 0.810 seconds/million call.
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 16.479 sec
    [junit] Running org.apache.hadoop.hive.ql.exec.TestJEXL
    [junit] JEXL library test ok
    [junit] Evaluating 1 + 2 for 10000000 times
    [junit] Evaluation finished: 0.781 seconds, 0.078 seconds/million call.
    [junit] Evaluating __udf__concat.evaluate("1", "2") for 1000000 times
    [junit] Evaluation finished: 1.387 seconds, 1.387 seconds/million call.
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 2.526 sec
    [junit] Running org.apache.hadoop.hive.ql.exec.TestOperators
    [junit] Testing Filter Operator
    [junit] filtered = 4
    [junit] passed = 1
    [junit] Filter Operator ok
    [junit] Testing FileSink Operator
    [junit] FileSink Operator ok
    [junit] Testing Script Operator
    [junit] [0] io.o=[1, 01]
    [junit] [0] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@16f8f7db
    [junit] [1] io.o=[2, 11]
    [junit] [1] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@16f8f7db
    [junit] [2] io.o=[3, 21]
    [junit] [2] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@16f8f7db
    [junit] [3] io.o=[4, 31]
    [junit] [3] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@16f8f7db
    [junit] [4] io.o=[5, 41]
    [junit] [4] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@16f8f7db
    [junit] Script Operator ok
    [junit] Testing Map Operator
    [junit] io1.o.toString() = [[0, 1, 2]]
    [junit] io2.o.toString() = [[0, 1, 2]]
    [junit] answer.toString() = [[0, 1, 2]]
    [junit] io1.o.toString() = [[1, 2, 3]]
    [junit] io2.o.toString() = [[1, 2, 3]]
    [junit] answer.toString() = [[1, 2, 3]]
    [junit] io1.o.toString() = [[2, 3, 4]]
    [junit] io2.o.toString() = [[2, 3, 4]]
    [junit] answer.toString() = [[2, 3, 4]]
    [junit] io1.o.toString() = [[3, 4, 5]]
    [junit] io2.o.toString() = [[3, 4, 5]]
    [junit] answer.toString() = [[3, 4, 5]]
    [junit] io1.o.toString() = [[4, 5, 6]]
    [junit] io2.o.toString() = [[4, 5, 6]]
    [junit] answer.toString() = [[4, 5, 6]]
    [junit] Map Operator ok
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 1.058 sec
    [junit] Running org.apache.hadoop.hive.ql.exec.TestPlan
    [junit] Serialization/Deserialization of plan successful
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.967 sec
    [junit] Running org.apache.hadoop.hive.ql.io.TestFlatFileInputFormat
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.89 sec
    [junit] Running org.apache.hadoop.hive.ql.metadata.TestHive
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 7.368 sec
    [junit] Running org.apache.hadoop.hive.ql.metadata.TestPartition
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.194 sec
    [junit] Running org.apache.hadoop.hive.ql.parse.TestParse
    [junit] Begin query: case_sensitivity.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/case_sensitivity.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/case_sensitivity.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/case_sensitivity.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/case_sensitivity.q.xml
    [junit] Done query: case_sensitivity.q
    [junit] Begin query: input20.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input20.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input20.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input20.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input20.q.xml
    [junit] Done query: input20.q
    [junit] Begin query: sample1.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample1(TestParse.java:178)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: sample2.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample2(TestParse.java:204)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] Begin query: sample3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample3(TestParse.java:230)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: sample4.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/sample4.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample4.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/sample4.q.xml
    [junit] Done query: sample4.q
    [junit] Begin query: sample5.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/sample5.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample5.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/sample5.q.xml
    [junit] Done query: sample5.q
    [junit] Begin query: sample6.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/sample6.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/sample6.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/sample6.q.xml
    [junit] Done query: sample6.q
    [junit] Begin query: sample7.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample7(TestParse.java:334)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: cast1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_cast1(TestParse.java:360)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: join1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/join1.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join1.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/join1.q.xml
    [junit] Done query: join1.q
    [junit] Begin query: input1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input1(TestParse.java:412)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: join2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/join2.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join2.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/join2.q.xml
    [junit] Done query: join2.q
    [junit] Begin query: input2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input2.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input2.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input2.q.xml
    [junit] Done query: input2.q
    [junit] Begin query: join3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join3(TestParse.java:490)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: input3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input3(TestParse.java:516)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: input4.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input4.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input4.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input4.q.xml
    [junit] Done query: input4.q
    [junit] Begin query: join4.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/join4.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join4.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/join4.q.xml
    [junit] Done query: join4.q
    [junit] Begin query: input5.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input5.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input5.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input5.q.xml
    [junit] Done query: input5.q
    [junit] Begin query: join5.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join5(TestParse.java:620)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: input6.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input6(TestParse.java:646)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: input_testxpath2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testxpath2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input_testxpath2.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testxpath2.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input_testxpath2.q.xml
    [junit] Done query: input_testxpath2.q
    [junit] Begin query: join6.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/join6.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/join6.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/join6.q.xml
    [junit] Done query: join6.q
    [junit] Begin query: input7.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input7(TestParse.java:724)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: join7.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join7(TestParse.java:750)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: input8.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input8.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input8.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input8.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input8.q.xml
    [junit] Done query: input8.q
    [junit] Begin query: input_testsequencefile.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testsequencefile.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input_testsequencefile.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_testsequencefile.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input_testsequencefile.q.xml
    [junit] Done query: input_testsequencefile.q
    [junit] Begin query: join8.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join8(TestParse.java:828)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: union.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_union(TestParse.java:854)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: input9.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input9.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input9.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input9.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input9.q.xml
    [junit] Done query: input9.q
    [junit] Begin query: udf1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/udf1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/udf1.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/udf1.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/udf1.q.xml
    [junit] Done query: udf1.q
    [junit] Begin query: udf4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf4(TestParse.java:932)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: input_testxpath.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input_testxpath(TestParse.java:958)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: input_part1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_part1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/input_part1.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/input_part1.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/input_part1.q.xml
    [junit] Done query: input_part1.q
    [junit] Begin query: groupby1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/groupby1.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby1.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/groupby1.q.xml
    [junit] Done query: groupby1.q
    [junit] Begin query: groupby2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_groupby2(TestParse.java:1036)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: groupby3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_groupby3(TestParse.java:1062)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: subq.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/subq.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/subq.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/subq.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/subq.q.xml
    [junit] Done query: subq.q
    [junit] Begin query: groupby4.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/groupby4.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby4.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/groupby4.q.xml
    [junit] Done query: groupby4.q
    [junit] Begin query: groupby5.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParse.testParse_groupby5(TestParse.java:1140)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: groupby6.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/parse/groupby6.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/groupby6.q.xml /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/plan/groupby6.q.xml
    [junit] Done query: groupby6.q
    [junit] Tests run: 41, Failures: 19, Errors: 0, Time elapsed: 53.784 sec
    [junit] Test org.apache.hadoop.hive.ql.parse.TestParse FAILED
    [junit] Running org.apache.hadoop.hive.ql.parse.TestParseNegative
    [junit] Begin query: insert_wrong_number_columns.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/insert_wrong_number_columns.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/insert_wrong_number_columns.q.out
    [junit] Done query: insert_wrong_number_columns.q
    [junit] Begin query: duplicate_alias.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/duplicate_alias.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/duplicate_alias.q.out
    [junit] Done query: duplicate_alias.q
    [junit] Begin query: unknown_function1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_function1(TestParseNegative.java:164)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
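[Editor's note: the recurring `MetaException(message:Unable to delete directory: ...)` in the traces above comes from `Warehouse.deleteDir` failing during test cleanup, which the thread attributes to the build directory living on a filer rather than local disk. A minimal, hypothetical Java sketch — not Hive's actual cleanup code — of one common mitigation: a bounded retry around a recursive delete, assuming the filer only transiently refuses the removal (e.g. briefly held-open files).]

```java
import java.io.File;

public class RetryingDelete {
    // Hypothetical helper (not part of Hive): delete 'dir' recursively,
    // retrying up to 'attempts' times with a short pause between tries,
    // to ride out transient failures on networked storage.
    public static boolean deleteWithRetry(File dir, int attempts, long pauseMillis) {
        for (int i = 0; i < attempts; i++) {
            if (deleteRecursively(dir)) {
                return true;
            }
            try {
                Thread.sleep(pauseMillis);  // give the filer a moment to release handles
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return !dir.exists();
    }

    // Depth-first delete; treats "already gone" as success.
    private static boolean deleteRecursively(File f) {
        File[] children = f.listFiles();
        if (children != null) {
            for (File child : children) {
                deleteRecursively(child);
            }
        }
        return f.delete() || !f.exists();
    }

    public static void main(String[] args) {
        File tmp = new File(System.getProperty("java.io.tmpdir"), "retry-delete-demo");
        new File(tmp, "sub").mkdirs();
        System.out.println(deleteWithRetry(tmp, 3, 100L));  // prints "true"
    }
}
```

On local disk the retry is a no-op (the first attempt succeeds); on a filer it narrows, but does not eliminate, the window in which `dropTable` cleanup can fail — which is consistent with the suggestion in the thread to run the build on local disk instead.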
    [junit] Begin query: unknown_function2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_function2(TestParseNegative.java:195)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: unknown_table1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_table1(TestParseNegative.java:226)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: unknown_function3.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_function3.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_function3.q.out
    [junit] Done query: unknown_function3.q
    [junit] Begin query: quoted_string.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_quoted_string(TestParseNegative.java:288)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: unknown_table2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_table2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_table2.q.out
    [junit] Done query: unknown_table2.q
    [junit] Begin query: unknown_function4.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_function4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_function4.q.out
    [junit] Done query: unknown_function4.q
    [junit] Begin query: garbage.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/garbage.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/garbage.q.out
    [junit] Done query: garbage.q
    [junit] Begin query: unknown_function5.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_function5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_function5.q.out
    [junit] Done query: unknown_function5.q
    [junit] Begin query: invalid_list_index2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/invalid_list_index2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/invalid_list_index2.q.out
    [junit] Done query: invalid_list_index2.q
    [junit] Begin query: invalid_dot.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_dot(TestParseNegative.java:474)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: invalid_function_param1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/invalid_function_param1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/invalid_function_param1.q.out
    [junit] Done query: invalid_function_param1.q
    [junit] Begin query: invalid_map_index2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_map_index2(TestParseNegative.java:536)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: unknown_column1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_column1(TestParseNegative.java:567)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcpart)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: invalid_function_param2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_function_param2(TestParseNegative.java:598)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: unknown_column2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_column2.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_column2.q.out
    [junit] Done query: unknown_column2.q
    [junit] Begin query: unknown_column3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_column3(TestParseNegative.java:660)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: unknown_column4.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_column4.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_column4.q.out
    [junit] Done query: unknown_column4.q
    [junit] Begin query: unknown_column5.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_column5.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_column5.q.out
    [junit] Done query: unknown_column5.q
    [junit] Begin query: unknown_column6.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/unknown_column6.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/unknown_column6.q.out
    [junit] Done query: unknown_column6.q
    [junit] Begin query: invalid_list_index.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_list_index(TestParseNegative.java:784)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] Begin query: nonkey_groupby.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_nonkey_groupby(TestParseNegative.java:815)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/srcbucket)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: invalid_map_index.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/invalid_map_index.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/invalid_map_index.q.out
    [junit] Done query: invalid_map_index.q
    [junit] Begin query: invalid_index.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_index(TestParseNegative.java:877)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Begin query: wrong_distinct1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/wrong_distinct1.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/wrong_distinct1.q.out
    [junit] Done query: wrong_distinct1.q
    [junit] Begin query: missing_overwrite.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/../build/ql/test/logs/missing_overwrite.q.out /usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/ql/src/test/results/compiler/errors/missing_overwrite.q.out
    [junit] Done query: missing_overwrite.q
    [junit] Begin query: wrong_distinct2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit]     at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit]     at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_wrong_distinct2(TestParseNegative.java:970)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit]     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit]     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit]     at java.lang.reflect.Method.invoke(Method.java:597)
    [junit]     at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit]     at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit]     at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit]     at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit]     at junit.framework.TestResult.run(TestResult.java:109)
    [junit]     at junit.framework.TestCase.run(TestCase.java:120)
    [junit]     at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit]     at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit]     at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build/ql/test/data/warehouse/src)
    [junit]     at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit]     at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit]     at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit]     ... 20 more
    [junit] Tests run: 29, Failures: 13, Errors: 0, Time elapsed: 41.601 sec
    [junit] Test org.apache.hadoop.hive.ql.parse.TestParseNegative FAILED
    [junit] Running org.apache.hadoop.hive.ql.tool.TestLineageInfo
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.496 sec

BUILD FAILED
/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build.xml:104: The following error occurred while executing this line:
/usr/fbtools/continuous_builds/hiveopensource-0.17.1/hiveopensource_0_17_1/build-common.xml:261: Tests failed!

Total time: 10 minutes 18 seconds
EXIT VALUE IS 1 for runtests
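Editor's note: the repeated `MetaException(message:Unable to delete directory: ...)` failures above all originate in `Warehouse.deleteDir`, and the thread attributes them to the build running on a filer (NFS-backed storage), where directory deletes can fail transiently, e.g. when `.nfsXXXX` silly-rename files linger until stale handles are reaped. A minimal sketch of a retry-based cleanup is below. This is a hypothetical illustration of the workaround pattern, not code from the Hive build; the class and method names are invented for this example.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

// Hypothetical helper (not part of Hive): retry a recursive directory
// delete a few times before giving up, since deletes on NFS-backed
// storage can fail transiently while stale file handles linger.
public class RetryingDelete {

    // Depth-first delete; returns true only if the directory itself
    // was successfully removed.
    static boolean deleteRecursively(File dir) {
        File[] children = dir.listFiles();
        if (children != null) {
            for (File child : children) {
                deleteRecursively(child);
            }
        }
        return dir.delete();
    }

    // Retry the delete up to `attempts` times, sleeping between tries
    // to give the filer time to release lingering handles.
    static boolean deleteWithRetry(File dir, int attempts, long backoffMillis)
            throws InterruptedException {
        for (int i = 0; i < attempts; i++) {
            if (deleteRecursively(dir)) {
                return true;
            }
            Thread.sleep(backoffMillis);
        }
        return !dir.exists();
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        // Smoke test on a local temp directory standing in for
        // build/ql/test/data/warehouse/src.
        File tmp = Files.createTempDirectory("warehouse-src").toFile();
        new File(tmp, "part-00000").createNewFile();
        System.out.println(deleteWithRetry(tmp, 3, 100L));
    }
}
```

On a local disk (as suggested in the thread) the first attempt succeeds and the retries never run; the backoff only matters on the filer, which is why moving the build off the filer sidesteps the problem entirely.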

