hadoop-hive-dev mailing list archives

From mvaradach...@facebook.com (Murli Varadachari)
Subject *UNIT TEST FAILURE for apache HIVE* based on SVN Rev# 730288.15
Date Wed, 31 Dec 2008 01:23:38 GMT
Compiling hiveopensource at /usr/local/continuous_builds/src/hiveopensource-trunk/hiveopensource_trunk
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Buildfile: build.xml

clean:

clean:
     [echo] Cleaning: anttasks
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/anttasks

clean:
     [echo] Cleaning: cli
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli

clean:
     [echo] Cleaning: common
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common

clean:
     [echo] Cleaning: metastore
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore
Overriding previous definition of reference to test.classpath

clean:
     [echo] Cleaning: ql
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql

clean:
     [echo] Cleaning: serde
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde

clean:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build

BUILD SUCCESSFUL
Total time: 9 seconds
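The log above corresponds to an `ant clean` pass, followed below by a full build (`ant package`) and the test run (`ant test`). A minimal sketch of that driver sequence, assuming the source root shown in the log — the `plan` function name and the `-f` invocation style are illustrative, not the actual Facebook continuous-build harness:

```shell
# Hypothetical sketch of the CI sequence this log implies; the actual
# harness is not shown in the message, so names and flags are assumptions.
HIVE_SRC=/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk

plan() {
  # Print, without executing, the three Ant invocations the log corresponds to:
  # clean (above), package, then test (below).
  for target in clean package test; do
    echo "ant -f ${HIVE_SRC}/build.xml ${target}"
  done
}
```

Calling `plan` only prints the commands; dropping the `echo` would run them in order against the checked-out trunk.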
Buildfile: build.xml

deploy:

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/jexl/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common/test/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
[ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#common;working@devbuild001.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	found hadoop#core;0.19.0 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 106ms :: artifacts dl 5ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	1 artifacts copied, 0 already retrieved (41275kB/799ms)

install-hadoopcore:
    [untar] Expanding: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore/hadoop-0.19.0.tar.gz into /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore
    [touch] Creating /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore/hadoop-0.19.0.installed

compile:
     [echo] Compiling: common
    [javac] Compiling 1 source file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common/classes

jar:
     [echo] Jar: common
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common/hive_common.jar

deploy:
     [echo] hive: common
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde/test/classes

dynamic-serde:

compile:
     [echo] Compiling: serde
    [javac] Compiling 128 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

jar:
     [echo] Jar: serde
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde/hive_serde.jar

deploy:
     [echo] hive: serde
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test/classes

model-compile:
    [javac] Compiling 8 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/classes
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/classes

core-compile:
     [echo] Compiling: 
    [javac] Compiling 38 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/classes
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

model-enhance:
     [echo] Enhancing model classes with JPOX stuff....
     [java] JPOX Enhancer (version 1.2.2) : Enhancement of classes

     [java] JPOX Enhancer completed with success for 8 classes. Timings : input=168 ms, enhance=182 ms, total=350 ms. Consult the log for full details

compile:

jar:
     [echo] Jar: metastore
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/hive_metastore.jar

deploy:
     [echo] hive: metastore
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build
Overriding previous definition of reference to test.classpath

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/classes

ql-init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/gen-java/org/apache/hadoop/hive/ql/parse

build-grammar:
     [echo] Building Grammar /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/java/org/apache/hadoop/hive/ql/parse/Hive.g  ....
     [java] ANTLR Parser Generator  Version 3.0.1 (August 13, 2007)  1989-2007

compile-ant-tasks:

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/anttasks
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/anttasks/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/anttasks/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/anttasks/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/anttasks/test/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] :: resolution report :: resolve 22ms :: artifacts dl 0ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 0 already retrieved (0kB/2ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks
    [javac] Compiling 2 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/anttasks/classes
    [javac] Note: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ant/src/org/apache/hadoop/hive/ant/QTestGenTask.java uses or overrides a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.

deploy-ant-tasks:

init:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] :: resolution report :: resolve 13ms :: artifacts dl 0ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 0 already retrieved (0kB/2ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks

jar:
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/anttasks/classes/org/apache/hadoop/hive/ant
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/anttasks/hive_anttasks.jar

deploy:
     [echo] hive: anttasks
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build

configure:
     [copy] Copying 239 files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/java

compile:
     [echo] Compiling: ql
    [javac] Compiling 241 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

jar:
     [echo] Jar: ql
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/lib/commons-jexl-1.1.jar into /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/jexl/classes
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/lib/libthrift.jar into /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/thrift/classes
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/lib/commons-lang-2.4.jar into /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/commons-lang/classes
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/hive_exec.jar

deploy:
     [echo] hive: ql
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli/test/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#cli;working@devbuild001.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	found hadoop#core;0.19.0 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 41ms :: artifacts dl 2ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#cli
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 1 already retrieved (0kB/5ms)

install-hadoopcore:

compile:
     [echo] Compiling: cli
    [javac] Compiling 5 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli/classes
    [javac] Note: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/cli/src/java/org/apache/hadoop/hive/cli/OptionsProcessor.java uses unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.

jar:
     [echo] Jar: cli
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli/hive_cli.jar

deploy:
     [echo] hive: cli
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service/classes
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service/test/classes

core-compile:
    [javac] Compiling 6 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service/classes

compile:

jar:
     [echo] Jar: service
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service/hive_service.jar

deploy:
     [echo] hive: service
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build

package:
     [echo] Deploying Hive jars to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/lib
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/conf
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/bin
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/examples
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/examples/files
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/examples/queries
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/lib/py
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/lib/php
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/bin
     [copy] Copying 5 files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/bin/ext
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/bin
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/conf
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/conf
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/conf
     [copy] Copying 6 files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/lib/php
     [copy] Copying 12 files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/lib/py
     [copy] Copied 3 empty directories to 1 empty directory under /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/lib/py
     [copy] Copying 35 files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/lib
     [copy] Copying 16 files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/examples/files
     [copy] Copying 1 file to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist
     [copy] Copying 41 files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/dist/examples/queries

BUILD SUCCESSFUL
Total time: 42 seconds
RUNNING TEST FOR HIVE OPENSOURCE - ant test
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Buildfile: build.xml

clean-test:

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli/test

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common/test

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test
Overriding previous definition of reference to test.classpath

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde/test

clean-test:
   [delete] Deleting directory /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service/test

BUILD SUCCESSFUL
Total time: 1 second
Buildfile: build.xml

clean-test:

clean-test:

clean-test:

clean-test:
Overriding previous definition of reference to test.classpath

clean-test:

clean-test:

clean-test:

deploy:

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/common/test/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
[ivy:retrieve] :: Ivy 2.0.0-rc2 - 20081028224207 :: http://ant.apache.org/ivy/ ::
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#common;working@devbuild001.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	found hadoop#core;0.19.0 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 89ms :: artifacts dl 4ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#common
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 1 already retrieved (0kB/5ms)

install-hadoopcore:

compile:
     [echo] Compiling: common

jar:
     [echo] Jar: common

deploy:
     [echo] hive: common

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/serde/test/classes

dynamic-serde:

compile:
     [echo] Compiling: serde

jar:
     [echo] Jar: serde

deploy:
     [echo] hive: serde

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test/classes

model-compile:

core-compile:
     [echo] Compiling: 

model-enhance:

compile:

jar:
     [echo] Jar: metastore

deploy:
     [echo] hive: metastore
Overriding previous definition of reference to test.classpath

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/classes

ql-init:

build-grammar:

compile-ant-tasks:

init:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] :: resolution report :: resolve 8ms :: artifacts dl 0ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 0 already retrieved (0kB/2ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

init:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] :: resolution report :: resolve 8ms :: artifacts dl 0ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 0 already retrieved (0kB/3ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks

jar:

deploy:
     [echo] hive: anttasks

configure:

compile:
     [echo] Compiling: ql
    [javac] Compiling 8 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/classes

jar:
     [echo] Jar: ql
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/lib/commons-jexl-1.1.jar into /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/jexl/classes
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/lib/libthrift.jar into /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/thrift/classes
    [unzip] Expanding: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/lib/commons-lang-2.4.jar into /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/commons-lang/classes

deploy:
     [echo] hive: ql

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/cli/test/classes

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#cli;working@devbuild001.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	found hadoop#core;0.19.0 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 35ms :: artifacts dl 2ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#cli
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 1 already retrieved (0kB/3ms)

install-hadoopcore:

compile:
     [echo] Compiling: cli

jar:
     [echo] Jar: cli

deploy:
     [echo] hive: cli

init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service/test
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service/test/src
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/service/test/classes

core-compile:

compile:

jar:
     [echo] Jar: service

deploy:
     [echo] hive: service

test:

test:
     [echo] Nothing to do!

test:
     [echo] Nothing to do!

test-conditions:

gen-test:

init:

model-compile:

core-compile:
     [echo] Compiling: 

model-enhance:

compile:

compile-test:
    [javac] Compiling 14 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test/classes

test-jar:

test-init:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test/data
     [copy] Copying 18 files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test/data
     [copy] Copied 5 empty directories to 2 empty directories under /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test/data

test:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/metastore/test/logs
    [junit] Running org.apache.hadoop.hive.metastore.TestAlter
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.06 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestCreateDB
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.048 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestDBGetName
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.032 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestDrop
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.06 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestGetDBs
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.049 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestGetSchema
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.055 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestGetTable
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.056 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestGetTables
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.059 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestHiveMetaStore
    [junit] Tests run: 7, Failures: 0, Errors: 0, Time elapsed: 8.017 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestPartitions
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.069 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestTableExists
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.054 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestTablePath
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.032 sec
    [junit] Running org.apache.hadoop.hive.metastore.TestTruncate
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.069 sec
Overriding previous definition of reference to test.classpath

test-conditions:

init:

compile-ant-tasks:

init:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] :: resolution report :: resolve 9ms :: artifacts dl 0ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 0 already retrieved (0kB/3ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks

deploy-ant-tasks:

init:

download-ivy:

init-ivy:

settings-ivy:

resolve:
:: loading settings :: file = /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#ant;working@devbuild001.snc1.facebook.com
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] :: resolution report :: resolve 7ms :: artifacts dl 0ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   0   |   0   |   0   |   0   ||   0   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#ant
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 0 already retrieved (0kB/2ms)

install-hadoopcore:

compile:
     [echo] Compiling: anttasks

jar:

deploy:
     [echo] hive: anttasks

gen-test:
 [qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/templates
 [qtestgen] Dec 30, 2008 5:12:20 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/templates'
 [qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/src/org/apache/hadoop/hive/ql/parse/TestParse.java from template TestParse.vm
 [qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/templates
 [qtestgen] Dec 30, 2008 5:12:20 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/templates'
 [qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/src/org/apache/hadoop/hive/ql/parse/TestParseNegative.java from template TestParseNegative.vm
 [qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/templates
 [qtestgen] Dec 30, 2008 5:12:20 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/templates'
 [qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/src/org/apache/hadoop/hive/cli/TestCliDriver.java from template TestCliDriver.vm
 [qtestgen] Template Path:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/templates
 [qtestgen] Dec 30, 2008 5:12:20 PM org.apache.velocity.runtime.log.JdkLogChute log
 [qtestgen] INFO: FileResourceLoader : adding path '/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/templates'
 [qtestgen] Generated /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/src/org/apache/hadoop/hive/cli/TestNegativeCliDriver.java from template TestNegativeCliDriver.vm

ql-init:

build-grammar:

configure:

compile:
     [echo] Compiling: ql
    [javac] Compiling 8 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/classes

compile-test:
    [javac] Compiling 15 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/classes
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] Note: Some input files use unchecked or unsafe operations.
    [javac] Note: Recompile with -Xlint:unchecked for details.
    [javac] Compiling 4 source files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/classes

test-jar:
      [jar] Building jar: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar

test-init:
     [copy] Copying 18 files to /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data
     [copy] Copied 4 empty directories to 2 empty directories under /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data

test:
    [mkdir] Created dir: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/logs
    [junit] Running org.apache.hadoop.hive.cli.TestCliDriver
    [junit] Begin query: mapreduce1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] plan = /tmp/plan1375.xml
    [junit] 08/12/30 17:12:34 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:12:34 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:12:34 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:12:34 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:12:34 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:12:34 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:12:34 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:12:35 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:12:35 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:12:35 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:12:35 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:12:35 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:12:35 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:12:35 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:12:35 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:12:35 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:12:35 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:12:35 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:12:35 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:12:35 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:12:35 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:12:35 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:12:35 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:12:35 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:12:35 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:12:35 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:12:35 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 17:12:35 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 17:12:35 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 17:12:35 INFO exec.ScriptOperator: alias=src
    [junit] 08/12/30 17:12:35 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:12:35 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:12:35 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:35 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:12:35 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:12:35 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:35 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:35 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:12:35 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:12:35 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:12:35 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:12:35 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:12:35 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:12:35 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:12:35 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:12:35 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 29146 bytes
    [junit] 08/12/30 17:12:35 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:12:35 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:12:35 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:12:35 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:12:35 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:12:35 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:12:35 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:12:35 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:12:35 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:12:35 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:12:35 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp554598750
    [junit] 08/12/30 17:12:35 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:12:35 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:12:36 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:12:36 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/mapreduce1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/mapreduce1.q.out
    [junit] Done query: mapreduce1.q
    [junit] Begin query: mapreduce3.q
    [junit] plan = /tmp/plan1376.xml
    [junit] 08/12/30 17:12:40 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:12:40 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:12:40 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:12:40 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:12:40 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:12:41 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:12:41 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:12:41 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:12:41 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:12:41 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:12:41 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:12:41 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:12:41 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:12:41 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:12:41 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:12:41 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:12:41 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:12:41 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:12:41 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:12:41 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:12:41 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:12:41 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:12:41 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:12:41 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:12:41 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:12:41 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:12:41 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 17:12:41 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 17:12:41 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 17:12:41 INFO exec.ScriptOperator: alias=src
    [junit] 08/12/30 17:12:41 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:12:41 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:12:41 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:41 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:12:41 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:12:41 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:41 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:41 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:12:41 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:12:41 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:12:41 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:12:41 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:12:41 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:12:41 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:12:41 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:12:41 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 32542 bytes
    [junit] 08/12/30 17:12:41 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:12:41 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:12:41 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:12:41 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:12:41 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:12:41 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:12:41 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:12:41 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:12:41 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:12:41 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:12:41 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1698090832
    [junit] 08/12/30 17:12:41 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:12:42 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:12:42 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:12:42 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/mapreduce3.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/mapreduce3.q.out
    [junit] Done query: mapreduce3.q
    [junit] Begin query: alter1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_alter1(TestCliDriver.java:353)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: showparts.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/showparts.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/showparts.q.out
    [junit] Done query: showparts.q
    [junit] Begin query: mapreduce5.q
    [junit] plan = /tmp/plan1377.xml
    [junit] 08/12/30 17:12:48 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:12:48 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:12:48 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:12:48 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:12:48 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:12:48 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:12:48 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:12:48 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:12:48 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:12:48 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:12:49 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:12:49 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:12:49 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:12:49 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:12:49 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:12:49 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:12:49 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:12:49 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:12:49 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:12:49 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:12:49 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:12:49 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:12:49 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:12:49 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:12:49 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:12:49 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:49 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:12:49 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:12:49 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:12:49 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:12:49 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:12:49 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:12:49 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:12:49 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:12:49 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 29314 bytes
    [junit] 08/12/30 17:12:49 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:12:49 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:12:49 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:12:49 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:12:49 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:12:49 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:12:49 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:12:49 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:12:49 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:12:49 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:12:49 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp182571786
    [junit] 08/12/30 17:12:49 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:12:49 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:12:49 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:12:49 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/mapreduce5.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/mapreduce5.q.out
    [junit] Done query: mapreduce5.q
    [junit] Begin query: subq2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_subq2(TestCliDriver.java:428)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input_limit.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input_limit.q.out
    [junit] Done query: input_limit.q
    [junit] Begin query: input11_limit.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input11_limit(TestCliDriver.java:478)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input20.q
    [junit] plan = /tmp/plan1378.xml
    [junit] 08/12/30 17:12:55 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:12:55 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:12:55 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:12:55 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:12:55 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:12:56 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:12:56 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:12:56 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:12:56 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:12:56 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:12:56 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:12:56 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:12:56 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:12:56 INFO exec.MapOperator: Adding alias tmap:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:12:56 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:12:56 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:12:56 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:12:56 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:12:56 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:12:56 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:12:56 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:12:56 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:12:56 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:12:56 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:12:56 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: alias=tmap:src
    [junit] 08/12/30 17:12:56 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:12:56 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:12:56 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:12:56 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:56 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:12:56 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:12:56 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:12:56 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:12:56 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:12:56 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:12:56 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:12:56 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:12:56 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 18002 bytes
    [junit] 08/12/30 17:12:56 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:12:56 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:12:56 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:12:56 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:12:56 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:12:56 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:12:56 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:12:56 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: Executing [/usr/bin/uniq, -c, |, sed, s@^ *@@, |, sed, s@\t@_@, |, sed, s@ @\t@]
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: tablename=null
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: partname=null
    [junit] 08/12/30 17:12:56 INFO exec.ScriptOperator: alias=null
    [junit] 08/12/30 17:12:57 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:12:57 INFO exec.ExtractOperator: Initialization Done
    [junit] /usr/bin/uniq: extra operand `s@^ *@@'
    [junit] Try `/usr/bin/uniq --help' for more information.
    [junit] 08/12/30 17:12:57 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:12:57 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:12:57 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:57 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:12:57 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:12:57 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:12:57 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:12:57 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1619846689
    [junit] 08/12/30 17:12:57 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:12:57 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:12:57 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:12:57 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input20.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input20.q.out
    [junit] Done query: input20.q
    [junit] Begin query: input14_limit.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input14_limit(TestCliDriver.java:528)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: sample2.q
    [junit] plan = /tmp/plan1379.xml
    [junit] 08/12/30 17:13:00 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:13:00 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 17:13:00 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:00 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:00 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:00 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:01 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:01 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:01 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:13:01 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:01 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 17:13:01 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:01 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:13:01 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:13:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:01 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:01 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:13:01 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:01 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:01 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:01 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:01 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp549594733
    [junit] 08/12/30 17:13:01 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
    [junit] 08/12/30 17:13:01 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:02 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:13:02 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/sample2.q.out
    [junit] Done query: sample2.q
    [junit] Begin query: inputddl1.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/inputddl1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/inputddl1.q.out
    [junit] Done query: inputddl1.q
    [junit] Begin query: sample4.q
    [junit] plan = /tmp/plan1380.xml
    [junit] 08/12/30 17:13:07 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:13:07 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 17:13:07 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:07 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:07 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:07 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:07 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:07 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:07 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:13:07 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:07 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 17:13:07 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:07 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:13:07 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:13:07 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:07 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:07 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:07 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:07 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:07 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:07 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:07 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:13:07 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:07 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:07 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:07 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:07 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1723420520
    [junit] 08/12/30 17:13:07 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
    [junit] 08/12/30 17:13:07 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:08 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:13:08 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample4.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/sample4.q.out
    [junit] Done query: sample4.q
    [junit] Begin query: inputddl3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl3(TestCliDriver.java:628)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: groupby2_map.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby2_map(TestCliDriver.java:653)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: sample6.q
    [junit] plan = /tmp/plan1381.xml
    [junit] 08/12/30 17:13:11 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:13:11 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 17:13:11 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:12 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:12 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:12 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:12 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:12 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:12 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:13:12 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:12 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 17:13:12 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:12 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:13:12 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:13:12 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:13:12 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:13:12 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:12 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:12 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:12 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:12 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:12 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:12 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:12 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:13:12 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:13:12 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:12 INFO exec.FilterOperator: PASSED:118
    [junit] 08/12/30 17:13:12 INFO exec.FilterOperator: FILTERED:382
    [junit] 08/12/30 17:13:12 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:12 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:12 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:12 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1789177063
    [junit] 08/12/30 17:13:12 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
    [junit] 08/12/30 17:13:12 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:13 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:13:13 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample6.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/sample6.q.out
    [junit] Done query: sample6.q
    [junit] Begin query: groupby4_map.q
    [junit] plan = /tmp/plan1382.xml
    [junit] 08/12/30 17:13:16 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 17:13:16 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:13:16 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:16 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:16 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:16 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:17 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:17 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:17 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:17 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:17 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:17 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:17 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:17 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:13:17 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:17 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:13:17 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:13:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:17 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:17 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:17 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:17 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:17 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:17 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:17 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:17 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:13:17 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:17 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:17 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:17 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:17 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:13:17 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:17 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:17 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:17 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:17 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 11002 bytes
    [junit] 08/12/30 17:13:17 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:17 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:17 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:17 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:17 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:17 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:17 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:17 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1332401354
    [junit] 08/12/30 17:13:17 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:13:17 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:18 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:13:18 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby4_map.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/groupby4_map.q.out
    [junit] Done query: groupby4_map.q
    [junit] Begin query: inputddl5.q
    [junit] plan = /tmp/plan1383.xml
    [junit] 08/12/30 17:13:21 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:13:21 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/inputddl5
    [junit] 08/12/30 17:13:21 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:21 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:21 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:21 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:21 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:21 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:21 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:13:21 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:21 INFO exec.MapOperator: Adding alias inputddl5 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/inputddl5/kv4.txt
    [junit] 08/12/30 17:13:21 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:21 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:13:21 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:13:21 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:21 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:21 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:21 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:21 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:13:21 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:21 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:21 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:21 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:21 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp2031330907
    [junit] 08/12/30 17:13:21 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/inputddl5/kv4.txt:0+6
    [junit] 08/12/30 17:13:21 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:22 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:13:22 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1384.xml
    [junit] 08/12/30 17:13:23 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:13:24 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/inputddl5
    [junit] 08/12/30 17:13:24 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:24 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:24 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:24 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:24 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:24 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:24 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:24 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:24 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:24 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:24 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:24 INFO exec.MapOperator: Adding alias inputddl5 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/inputddl5/kv4.txt
    [junit] 08/12/30 17:13:24 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:24 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:13:24 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:13:24 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:13:24 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:13:24 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:24 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:24 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:13:24 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:13:24 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:24 INFO exec.FilterOperator: PASSED:1
    [junit] 08/12/30 17:13:24 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 17:13:24 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:24 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:24 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:24 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/inputddl5/kv4.txt:0+6
    [junit] 08/12/30 17:13:24 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:24 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:24 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:24 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 20 bytes
    [junit] 08/12/30 17:13:24 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:24 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:24 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:24 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:24 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:24 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:24 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:24 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp485066972
    [junit] 08/12/30 17:13:24 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:13:24 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:25 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:13:25 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1385.xml
    [junit] 08/12/30 17:13:26 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:13:26 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/131788175/85073467.10002
    [junit] 08/12/30 17:13:26 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:26 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:26 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:27 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:27 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:27 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:27 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:27 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:27 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:27 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:27 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:27 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/131788175/85073467.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/131788175/85073467.10002/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:13:27 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:27 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:27 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:27 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:27 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:27 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:27 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:27 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:27 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:27 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/131788175/85073467.10002/attempt_local_0001_r_000000_0:0+124
    [junit] 08/12/30 17:13:27 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:27 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:27 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:27 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:27 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 24 bytes
    [junit] 08/12/30 17:13:27 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:27 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:27 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:27 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:27 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:27 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:27 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:27 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:27 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:27 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:27 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp369228729
    [junit] 08/12/30 17:13:27 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:13:27 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:28 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:13:28 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] Exception: Client Execution failed with error code = 9
    [junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
    [junit] 	at junit.framework.Assert.fail(Assert.java:47)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl5(TestCliDriver.java:731)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: sample8.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample8(TestCliDriver.java:753)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: inputddl7.q
    [junit] plan = /tmp/plan1386.xml
    [junit] 08/12/30 17:13:31 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:13:31 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t1
    [junit] 08/12/30 17:13:31 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:31 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:31 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:31 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:31 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:31 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:31 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:31 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:32 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:32 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:32 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:32 INFO exec.MapOperator: Adding alias t1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t1/kv1.txt
    [junit] 08/12/30 17:13:32 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:32 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:13:32 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:13:32 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:32 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:32 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:32 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:32 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:32 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:32 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:32 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:13:32 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:32 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:32 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:32 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:32 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t1/kv1.txt:0+5812
    [junit] 08/12/30 17:13:32 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:32 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:32 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:32 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:32 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 9002 bytes
    [junit] 08/12/30 17:13:32 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:32 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:32 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:32 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:32 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:32 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:32 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:32 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp618889346
    [junit] 08/12/30 17:13:32 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:13:32 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:32 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:13:32 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1387.xml
    [junit] 08/12/30 17:13:34 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:13:34 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/72828951/309227858.10002
    [junit] 08/12/30 17:13:34 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:34 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:34 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:34 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:34 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:34 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:34 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:34 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:34 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:34 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:34 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:34 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/72828951/309227858.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/72828951/309227858.10002/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:13:34 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:34 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:34 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:34 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:34 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:34 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:34 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:34 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:34 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:34 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/72828951/309227858.10002/attempt_local_0001_r_000000_0:0+124
    [junit] 08/12/30 17:13:34 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:35 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:35 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:35 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:35 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 24 bytes
    [junit] 08/12/30 17:13:35 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:35 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:35 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:35 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:35 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:35 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:35 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:35 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:35 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:35 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:35 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp287745858
    [junit] 08/12/30 17:13:35 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:13:35 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:35 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:13:35 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1388.xml
    [junit] 08/12/30 17:13:37 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:13:37 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t2
    [junit] 08/12/30 17:13:37 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:37 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:37 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:37 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:37 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:37 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:37 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:37 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:37 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:37 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:37 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:37 INFO exec.MapOperator: Adding alias t2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t2/kv1.seq
    [junit] 08/12/30 17:13:37 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:37 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:13:37 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:13:37 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:37 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:37 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:37 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:37 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:37 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:37 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:37 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:13:38 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:38 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:38 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:38 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:38 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t2/kv1.seq:0+10508
    [junit] 08/12/30 17:13:38 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:38 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:38 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:38 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:38 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 9002 bytes
    [junit] 08/12/30 17:13:38 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:38 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:38 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:38 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:38 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:38 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:38 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:38 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp188711858
    [junit] 08/12/30 17:13:38 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:13:38 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:38 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:13:38 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1389.xml
    [junit] 08/12/30 17:13:39 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:13:40 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/14968518/269820109.10002
    [junit] 08/12/30 17:13:40 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:40 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:40 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:40 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:40 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:40 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:40 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:40 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:40 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:40 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:40 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:40 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/14968518/269820109.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/14968518/269820109.10002/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:13:40 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:40 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:40 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:40 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:40 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:40 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:40 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:40 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:40 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:40 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/14968518/269820109.10002/attempt_local_0001_r_000000_0:0+124
    [junit] 08/12/30 17:13:40 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:40 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:40 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:40 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:40 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 24 bytes
    [junit] 08/12/30 17:13:41 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:41 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:41 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:41 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:41 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:41 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:41 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:41 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:41 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:41 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:41 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-689320418
    [junit] 08/12/30 17:13:41 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:13:41 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:13:41 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:13:41 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1390.xml
    [junit] 08/12/30 17:13:43 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:13:43 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t3/ds=2008-04-09
    [junit] 08/12/30 17:13:43 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:43 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:43 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:43 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:43 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:43 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:44 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:44 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:44 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:44 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:44 INFO exec.MapOperator: Adding alias t3 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t3/ds=2008-04-09/kv1.txt
    [junit] 08/12/30 17:13:44 INFO exec.MapOperator: Got partitions: ds
    [junit] 08/12/30 17:13:44 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:13:44 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:13:44 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:44 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:44 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:13:44 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:13:44 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:44 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:44 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:44 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:44 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:13:44 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:44 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:13:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:44 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 17:13:44 INFO exec.FilterOperator: PASSED:500
    [junit] 08/12/30 17:13:44 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:44 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:44 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:44 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t3/ds=2008-04-09/kv1.txt:0+5812
    [junit] 08/12/30 17:13:44 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:44 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:44 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:44 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:44 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 9002 bytes
    [junit] 08/12/30 17:13:44 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:44 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:44 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:44 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:44 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:44 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:44 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:44 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-972274266
    [junit] 08/12/30 17:13:44 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:13:44 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:13:44 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:13:44 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1391.xml
    [junit] 08/12/30 17:13:46 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:13:46 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/500219325/64425560.10002
    [junit] 08/12/30 17:13:46 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:46 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:46 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:46 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:47 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:47 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:47 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:47 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:47 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:47 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:47 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:47 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/500219325/64425560.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/500219325/64425560.10002/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:13:47 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:47 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:47 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:47 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:47 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:47 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:47 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:47 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:47 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:47 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/500219325/64425560.10002/attempt_local_0001_r_000000_0:0+124
    [junit] 08/12/30 17:13:47 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:47 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:47 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:47 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:47 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 24 bytes
    [junit] 08/12/30 17:13:47 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:47 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:47 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:47 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:47 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:47 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:47 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:47 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1701204428
    [junit] 08/12/30 17:13:47 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:13:47 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:13:48 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:48 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1392.xml
    [junit] 08/12/30 17:13:49 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:13:49 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t4/ds=2008-04-09
    [junit] 08/12/30 17:13:49 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:49 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:49 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:50 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:50 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:50 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:50 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:50 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:50 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:50 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:50 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:50 INFO exec.MapOperator: Adding alias t4 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t4/ds=2008-04-09/kv1.seq
    [junit] 08/12/30 17:13:50 INFO exec.MapOperator: Got partitions: ds
    [junit] 08/12/30 17:13:50 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:13:50 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:13:50 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:50 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:50 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:13:50 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:13:50 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:50 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:50 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:50 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:50 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:13:50 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:50 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:13:51 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:51 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 17:13:51 INFO exec.FilterOperator: PASSED:500
    [junit] 08/12/30 17:13:51 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:51 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:51 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:51 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t4/ds=2008-04-09/kv1.seq:0+10508
    [junit] 08/12/30 17:13:51 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:51 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:51 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:51 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:51 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 9002 bytes
    [junit] 08/12/30 17:13:51 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:51 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:51 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:51 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:51 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:51 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:51 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:51 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp43843614
    [junit] 08/12/30 17:13:51 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:13:51 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:13:51 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:13:51 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1393.xml
    [junit] 08/12/30 17:13:52 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:13:53 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/29669155/208267024.10002
    [junit] 08/12/30 17:13:53 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:53 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:53 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:53 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:53 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:53 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:53 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:53 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:53 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:53 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:53 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:53 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/29669155/208267024.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/29669155/208267024.10002/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:13:53 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:53 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:53 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:53 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:53 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:53 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:53 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:53 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:53 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:53 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/29669155/208267024.10002/attempt_local_0001_r_000000_0:0+124
    [junit] 08/12/30 17:13:53 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:53 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:53 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:13:53 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:53 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 24 bytes
    [junit] 08/12/30 17:13:53 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:53 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:53 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:53 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:13:53 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:53 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:13:53 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:13:53 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1419766912
    [junit] 08/12/30 17:13:53 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:13:53 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:13:54 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:13:54 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/inputddl7.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/inputddl7.q.out
    [junit] Done query: inputddl7.q
    [junit] Begin query: notable_alias1.q
    [junit] plan = /tmp/plan1394.xml
    [junit] 08/12/30 17:13:58 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:13:59 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:13:59 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:13:59 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:13:59 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:13:59 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:59 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:13:59 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:13:59 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:13:59 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:13:59 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:13:59 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:13:59 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:13:59 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:13:59 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:13:59 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:13:59 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:13:59 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:13:59 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:13:59 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:13:59 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:13:59 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:13:59 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:13:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:13:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:13:59 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:13:59 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:13:59 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:13:59 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:13:59 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:13:59 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:13:59 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:13:59 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:13:59 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:13:59 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:13:59 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:13:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:13:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:13:59 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:13:59 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 1840 bytes
    [junit] 08/12/30 17:13:59 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:13:59 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:13:59 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:00 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:14:00 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:00 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:14:00 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:14:00 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1280394025
    [junit] 08/12/30 17:14:00 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:14:00 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:14:00 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:14:00 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1395.xml
    [junit] 08/12/30 17:14:02 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:14:02 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/228213277/101160528.10001
    [junit] 08/12/30 17:14:02 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:14:02 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:14:02 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:14:02 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:02 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:02 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:02 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:14:02 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:14:02 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:14:02 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:14:02 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:14:02 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/228213277/101160528.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/228213277/101160528.10001/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:14:03 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:14:03 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:03 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:14:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:03 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:14:03 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:14:03 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:14:03 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:03 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/228213277/101160528.10001/attempt_local_0001_r_000000_0:0+2219
    [junit] 08/12/30 17:14:03 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:14:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:03 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:14:03 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 1478 bytes
    [junit] 08/12/30 17:14:03 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:14:03 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:14:03 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:03 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:03 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:03 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:03 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:03 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:03 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:03 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:14:03 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:03 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:14:03 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:14:03 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1058300962
    [junit] 08/12/30 17:14:03 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:14:03 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:14:03 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:14:03 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/notable_alias1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/notable_alias1.q.out
    [junit] Done query: notable_alias1.q
    [junit] Begin query: input0.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input0(TestCliDriver.java:828)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: join1.q
    [junit] plan = /tmp/plan1396.xml
    [junit] 08/12/30 17:14:07 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:14:07 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:14:07 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:14:07 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:14:07 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:14:08 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:08 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:08 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:08 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:14:08 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:14:08 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:14:08 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:14:08 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:14:08 INFO exec.MapOperator: Adding alias src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:14:08 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:14:08 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:14:08 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:08 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:08 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:08 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:14:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:08 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:08 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:08 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:08 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:08 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:08 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:08 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:14:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:08 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:08 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:08 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:14:08 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:14:08 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:14:08 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:08 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:14:08 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:14:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:08 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:14:09 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 33532 bytes
    [junit] 08/12/30 17:14:09 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:14:09 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:14:09 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:09 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:09 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:09 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:09 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:09 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:09 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:09 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 17:14:09 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:09 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:14:09 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:14:09 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1391802431
    [junit] 08/12/30 17:14:09 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:14:09 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:14:09 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:14:09 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/join1.q.out
    [junit] Done query: join1.q
    [junit] Begin query: input2.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input2.q.out
    [junit] Done query: input2.q
    [junit] Begin query: join3.q
    [junit] plan = /tmp/plan1397.xml
    [junit] 08/12/30 17:14:15 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:14:16 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:14:16 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:14:16 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:14:16 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:14:16 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:16 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:16 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:16 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:14:16 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:14:16 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:14:16 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:14:16 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:14:16 INFO exec.MapOperator: Adding alias src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:14:16 INFO exec.MapOperator: Adding alias src3 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:14:16 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:14:16 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:14:16 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:16 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:16 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:16 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:16 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:16 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:14:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:16 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:16 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:16 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:16 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:16 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:16 INFO exec.ReduceSinkOperator: Using tag = 2
    [junit] 08/12/30 17:14:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:16 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:16 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:16 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:16 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:16 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:16 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:16 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:14:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:16 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:16 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:17 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:14:17 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:14:17 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:14:17 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:17 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:14:17 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:14:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:17 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:14:17 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 46844 bytes
    [junit] 08/12/30 17:14:17 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:14:17 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:14:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:17 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:17 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:17 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:17 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:17 INFO exec.JoinOperator: Initialization Done
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 17:14:17 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:17 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:14:17 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:14:17 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:14:17 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-712268395
    [junit] 08/12/30 17:14:17 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:14:17 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:14:18 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:14:18 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join3.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/join3.q.out
    [junit] Done query: join3.q
    [junit] Begin query: input4.q
    [junit] plan = /tmp/plan1398.xml
    [junit] 08/12/30 17:14:22 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:14:22 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input4
    [junit] 08/12/30 17:14:22 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:14:22 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:14:22 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:14:22 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:22 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:22 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:23 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:14:23 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:14:23 INFO exec.MapOperator: Adding alias input4 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input4/kv1.txt
    [junit] 08/12/30 17:14:23 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:14:23 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:23 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:23 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:23 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:23 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:23 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:23 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:23 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:14:23 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:23 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:14:23 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:14:23 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp2128467165
    [junit] 08/12/30 17:14:23 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input4/kv1.txt:0+5812
    [junit] 08/12/30 17:14:23 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 17:14:23 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:14:23 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
    [junit] Exception: Client Execution failed with error code = 9
    [junit] 	at junit.framework.Assert.fail(Assert.java:47)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input4(TestCliDriver.java:931)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: describe_xpath.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_describe_xpath(TestCliDriver.java:953)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: join5.q
    [junit] plan = /tmp/plan1399.xml
    [junit] 08/12/30 17:14:27 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:14:28 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:14:28 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:14:28 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:14:28 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:14:28 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:28 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:28 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:28 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:14:28 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:14:28 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:14:28 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:14:28 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:14:28 INFO exec.MapOperator: Adding alias c:a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:14:28 INFO exec.MapOperator: Adding alias c:b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:14:28 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:14:28 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:28 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:28 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:14:28 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:14:28 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:28 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:28 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:28 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:14:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:28 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:28 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:14:28 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:28 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:28 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:28 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:14:28 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:14:28 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:28 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:28 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:28 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:14:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:28 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:28 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:28 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:14:28 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:29 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:14:29 INFO exec.FilterOperator: FILTERED:491
    [junit] 08/12/30 17:14:29 INFO exec.FilterOperator: PASSED:9
    [junit] 08/12/30 17:14:29 INFO exec.FilterOperator: FILTERED:493
    [junit] 08/12/30 17:14:29 INFO exec.FilterOperator: PASSED:7
    [junit] 08/12/30 17:14:29 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:14:29 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:14:29 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:29 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:14:29 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:14:29 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:29 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:29 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:14:29 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 610 bytes
    [junit] 08/12/30 17:14:29 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:14:29 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:14:29 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:29 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:29 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:29 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:29 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:29 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:29 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:29 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:29 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:29 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:29 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 17:14:29 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:29 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:14:29 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:14:29 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1888463889
    [junit] 08/12/30 17:14:29 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:14:29 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:14:29 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:14:29 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join5.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/join5.q.out
    [junit] Done query: join5.q
    [junit] Begin query: input_testxpath2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_testxpath2(TestCliDriver.java:1003)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input6.q
    [junit] plan = /tmp/plan1400.xml
    [junit] 08/12/30 17:14:33 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:14:33 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1
    [junit] 08/12/30 17:14:33 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:14:33 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:14:33 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:14:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:33 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:33 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:14:34 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:14:34 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1/kv3.txt
    [junit] 08/12/30 17:14:34 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:14:34 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:34 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:34 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:14:34 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:14:34 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:34 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:34 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:34 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:34 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:14:34 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:34 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:14:34 INFO exec.FilterOperator: FILTERED:25
    [junit] 08/12/30 17:14:34 INFO exec.FilterOperator: PASSED:0
    [junit] 08/12/30 17:14:34 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:34 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:14:34 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:14:34 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1849943556
    [junit] 08/12/30 17:14:34 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1/kv3.txt:0+216
    [junit] 08/12/30 17:14:34 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:14:34 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:14:34 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input6.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input6.q.out
    [junit] Done query: input6.q
    [junit] Begin query: join7.q
    [junit] plan = /tmp/plan1401.xml
    [junit] 08/12/30 17:14:39 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:14:39 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:14:39 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:14:39 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:14:39 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:14:39 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:39 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:39 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:39 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:14:39 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:14:40 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:14:40 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:14:40 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.MapOperator: Adding alias c:a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:14:40 INFO exec.MapOperator: Adding alias c:b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:14:40 INFO exec.MapOperator: Adding alias c:c:src3 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:14:40 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:14:40 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:14:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:14:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.ReduceSinkOperator: Using tag = 2
    [junit] 08/12/30 17:14:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: FILTERED:491
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: PASSED:9
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: FILTERED:493
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: PASSED:7
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: FILTERED:498
    [junit] 08/12/30 17:14:40 INFO exec.FilterOperator: PASSED:2
    [junit] 08/12/30 17:14:40 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:14:40 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:14:40 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:40 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:14:40 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:14:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:40 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:14:40 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:14:40 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 686 bytes
    [junit] 08/12/30 17:14:40 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:40 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 17:14:40 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:40 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:14:40 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:14:40 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1451874228
    [junit] 08/12/30 17:14:40 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:14:40 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:14:40 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:14:40 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join7.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/join7.q.out
    [junit] Done query: join7.q
    [junit] Begin query: input8.q
    [junit] plan = /tmp/plan1402.xml
    [junit] 08/12/30 17:14:45 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:14:45 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1
    [junit] 08/12/30 17:14:45 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:14:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:14:45 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:14:45 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:45 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:45 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:45 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:14:45 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:14:45 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1/kv3.txt
    [junit] 08/12/30 17:14:45 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:14:45 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:45 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:45 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:45 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:45 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:45 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:45 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:45 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:45 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:46 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:46 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:14:46 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:46 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:14:46 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:14:46 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1843268160
    [junit] 08/12/30 17:14:46 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1/kv3.txt:0+216
    [junit] 08/12/30 17:14:46 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:14:46 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:14:46 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input8.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input8.q.out
    [junit] Done query: input8.q
    [junit] Begin query: union.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_union(TestCliDriver.java:1103)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: join9.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join9(TestCliDriver.java:1128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: udf2.q
    [junit] plan = /tmp/plan1403.xml
    [junit] 08/12/30 17:14:51 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:14:51 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:14:51 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:14:51 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:14:51 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:14:51 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:51 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:51 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:52 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:14:52 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:14:52 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:14:52 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:14:52 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:52 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:52 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:52 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:52 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:14:52 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:14:52 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:52 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:52 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:52 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:52 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:14:52 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:52 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:52 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:14:52 INFO exec.FilterOperator: FILTERED:499
    [junit] 08/12/30 17:14:52 INFO exec.FilterOperator: PASSED:1
    [junit] 08/12/30 17:14:52 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:52 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:14:52 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:14:52 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp253238294
    [junit] 08/12/30 17:14:52 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:14:52 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:14:52 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:14:52 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1404.xml
    [junit] 08/12/30 17:14:54 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:14:54 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/dest1
    [junit] 08/12/30 17:14:54 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:14:54 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:14:54 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:14:55 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:14:55 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:55 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:14:55 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:14:55 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:14:55 INFO exec.MapOperator: Adding alias dest1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/dest1/attempt_local_0001_m_000000_0
    [junit] 08/12/30 17:14:55 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:14:55 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:14:55 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:14:55 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:14:55 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:14:55 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:14:55 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:14:55 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:14:55 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:14:55 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:14:55 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:14:55 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:14:55 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1190048255
    [junit] 08/12/30 17:14:55 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/dest1/attempt_local_0001_m_000000_0:0+8
    [junit] 08/12/30 17:14:55 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:14:56 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:14:56 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/udf2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/udf2.q.out
    [junit] Done query: udf2.q
    [junit] Begin query: input10.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input10.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input10.q.out
    [junit] Done query: input10.q
    [junit] Begin query: join11.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join11(TestCliDriver.java:1203)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input4_cb_delim.q
    [junit] plan = /tmp/plan1405.xml
    [junit] 08/12/30 17:15:04 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:15:04 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input4_cb
    [junit] 08/12/30 17:15:04 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:04 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:04 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:04 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:04 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:04 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:04 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:15:04 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:15:04 INFO exec.MapOperator: Adding alias input4_cb to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input4_cb/kv1_cb.txt
    [junit] 08/12/30 17:15:04 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:15:04 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:15:04 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:15:04 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:04 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:04 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:04 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:04 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:15:04 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:05 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:05 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:15:05 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:15:05 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-907594402
    [junit] 08/12/30 17:15:05 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input4_cb/kv1_cb.txt:0+5812
    [junit] 08/12/30 17:15:05 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 17:15:05 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:15:05 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] Exception: Client Execution failed with error code = 9
    [junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
    [junit] 	at junit.framework.Assert.fail(Assert.java:47)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input4_cb_delim(TestCliDriver.java:1231)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: udf4.q
    [junit] plan = /tmp/plan1406.xml
    [junit] 08/12/30 17:15:09 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:15:09 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:15:09 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:09 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:09 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:09 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:10 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:10 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:10 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:15:10 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:15:10 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:15:10 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:15:10 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:15:10 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:15:10 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:10 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:10 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:15:10 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:15:10 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:10 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:10 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:10 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:10 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:15:10 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:10 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:15:10 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:10 INFO exec.FilterOperator: PASSED:1
    [junit] 08/12/30 17:15:10 INFO exec.FilterOperator: FILTERED:499
    [junit] 08/12/30 17:15:10 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:10 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:15:10 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:15:10 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1300868364
    [junit] 08/12/30 17:15:10 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:15:10 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:15:11 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:15:11 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1407.xml
    [junit] 08/12/30 17:15:12 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:15:13 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/dest1
    [junit] 08/12/30 17:15:13 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:13 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:13 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:13 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:13 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:13 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:13 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:15:13 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:15:13 INFO exec.MapOperator: Adding alias dest1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/dest1/attempt_local_0001_m_000000_0
    [junit] 08/12/30 17:15:13 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:15:13 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:15:13 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:15:13 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:13 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:13 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:13 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:13 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:13 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:13 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:13 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:15:13 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:13 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:13 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:15:13 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:15:13 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1485715670
    [junit] 08/12/30 17:15:13 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/dest1/attempt_local_0001_m_000000_0:0+8
    [junit] 08/12/30 17:15:13 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:15:14 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:15:14 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/udf4.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/udf4.q.out
    [junit] Done query: udf4.q
    [junit] Begin query: input12.q
    [junit] plan = /tmp/plan1408.xml
    [junit] 08/12/30 17:15:18 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:15:18 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:15:18 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:18 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:18 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:18 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:19 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:19 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:19 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:15:19 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:15:19 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:15:19 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:19 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:19 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:19 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:15:19 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:15:19 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: PASSED:105
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: FILTERED:395
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: PASSED:311
    [junit] 08/12/30 17:15:19 INFO exec.FilterOperator: FILTERED:189
    [junit] 08/12/30 17:15:19 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:19 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:15:19 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:15:19 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1880629879
    [junit] 08/12/30 17:15:19 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:15:19 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 17:15:20 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:15:20 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input12.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input12.q.out
    [junit] Done query: input12.q
    [junit] Begin query: join13.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join13(TestCliDriver.java:1303)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input14.q
    [junit] plan = /tmp/plan1409.xml
    [junit] 08/12/30 17:15:24 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:15:24 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:15:24 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:24 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:24 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:24 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:25 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:25 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:25 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:15:25 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:15:25 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:15:25 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:15:25 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:15:25 INFO exec.MapOperator: Adding alias tmap:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:15:25 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:15:25 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:15:25 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:15:25 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:25 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:25 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:15:25 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:15:25 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:25 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:15:25 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:25 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:25 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:15:25 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 17:15:25 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 17:15:25 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 17:15:25 INFO exec.ScriptOperator: alias=tmap:src
    [junit] 08/12/30 17:15:25 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:25 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:15:25 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:25 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:15:25 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:15:25 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:25 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:25 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:15:25 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:15:25 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:25 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:15:25 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:15:25 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:25 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:25 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:15:25 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 19720 bytes
    [junit] 08/12/30 17:15:25 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:15:25 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:15:25 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:15:25 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:15:25 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:25 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:25 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:25 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:25 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:25 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:25 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:25 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:15:25 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:15:25 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:15:25 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:15:25 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:25 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:15:25 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:15:25 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp778973955
    [junit] 08/12/30 17:15:25 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:15:25 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:15:26 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:15:26 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input14.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input14.q.out
    [junit] Done query: input14.q
    [junit] Begin query: join15.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join15(TestCliDriver.java:1353)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input16.q
    [junit] plan = /tmp/plan1410.xml
    [junit] 08/12/30 17:15:29 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:15:30 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input16
    [junit] 08/12/30 17:15:30 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:30 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:30 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:30 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:30 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:30 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:30 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:15:30 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:15:30 INFO exec.MapOperator: Adding alias input16 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input16/kv1_cb.txt
    [junit] 08/12/30 17:15:30 INFO serde2.TestSerDe: org.apache.hadoop.hive.serde2.TestSerDe: initialized with columnNames: [key, value]
    [junit] 08/12/30 17:15:30 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:15:30 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:15:30 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:15:30 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:30 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:30 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:30 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:30 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:15:30 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:30 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:30 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:15:30 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:15:30 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp148046935
    [junit] 08/12/30 17:15:30 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input16/kv1_cb.txt:0+5812
    [junit] 08/12/30 17:15:30 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 17:15:31 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:15:31 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input16.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input16.q.out
    [junit] Done query: input16.q
    [junit] Begin query: input_part1.q
    [junit] plan = /tmp/plan1411.xml
    [junit] 08/12/30 17:15:35 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:15:35 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12
    [junit] 08/12/30 17:15:35 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:35 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:35 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:35 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:35 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:35 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:35 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:15:35 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:15:35 INFO exec.MapOperator: Adding alias srcpart to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt
    [junit] 08/12/30 17:15:35 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 17:15:35 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:15:35 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:15:35 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:15:35 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:15:35 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:35 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:35 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:35 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:35 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:35 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:35 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:35 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:15:36 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:15:36 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:36 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:15:36 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:15:36 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:36 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:15:36 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:15:36 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-876067368
    [junit] 08/12/30 17:15:36 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt:0+5812
    [junit] 08/12/30 17:15:36 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 17:15:36 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:15:36 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_part1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input_part1.q.out
    [junit] Done query: input_part1.q
    [junit] Begin query: join17.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join17(TestCliDriver.java:1428)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input18.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input18(TestCliDriver.java:1453)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input_part3.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_part3.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input_part3.q.out
    [junit] Done query: input_part3.q
    [junit] Begin query: groupby2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby2(TestCliDriver.java:1503)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: show_tables.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/show_tables.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/show_tables.q.out
    [junit] Done query: show_tables.q
    [junit] Begin query: input_part5.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_part5(TestCliDriver.java:1553)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: groupby4.q
    [junit] plan = /tmp/plan1412.xml
    [junit] 08/12/30 17:15:43 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:15:43 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:15:43 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:43 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:43 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:44 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:44 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:44 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:15:44 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:15:44 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:15:44 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:15:44 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:15:44 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:15:44 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:15:44 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:15:44 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:15:44 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:44 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:44 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:44 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:15:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:44 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:44 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:15:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:44 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:15:44 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:15:44 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:44 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:15:44 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:15:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:44 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:15:44 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 7002 bytes
    [junit] 08/12/30 17:15:44 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:15:44 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:15:44 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:44 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:15:44 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:44 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:15:44 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:15:44 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1666029652
    [junit] 08/12/30 17:15:44 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:15:44 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:15:45 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:15:45 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1413.xml
    [junit] 08/12/30 17:15:46 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:15:46 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/211681700/379119233.10001
    [junit] 08/12/30 17:15:46 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:46 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:46 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:46 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:46 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:46 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:46 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:15:46 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:15:47 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:15:47 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:15:47 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:15:47 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/211681700/379119233.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/211681700/379119233.10001/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:15:47 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:15:47 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:47 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:15:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:47 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:47 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:15:47 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:15:47 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:47 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/211681700/379119233.10001/attempt_local_0001_r_000000_0:0+346
    [junit] 08/12/30 17:15:47 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:15:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:15:47 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:15:47 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 142 bytes
    [junit] 08/12/30 17:15:47 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:15:47 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:15:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:47 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:47 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:15:47 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:47 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:15:47 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:15:47 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp538489696
    [junit] 08/12/30 17:15:47 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:15:47 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:15:47 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:15:47 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby4.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/groupby4.q.out
    [junit] Done query: groupby4.q
    [junit] Begin query: groupby6.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby6(TestCliDriver.java:1603)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input1_limit.q
    [junit] plan = /tmp/plan1414.xml
    [junit] 08/12/30 17:15:51 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 17:15:51 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:15:51 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:51 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:51 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:51 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:51 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:51 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:51 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:15:51 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:15:51 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:15:51 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:15:51 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:15:51 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:15:51 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:15:51 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:15:51 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:15:51 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:15:51 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:15:51 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:51 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:51 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:15:51 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:15:51 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:51 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:15:51 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:15:51 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:15:52 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:15:52 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:52 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:15:52 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:15:52 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:15:52 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:52 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:52 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:15:52 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:15:52 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:52 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:15:52 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:52 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:15:52 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:15:52 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:52 INFO exec.FilterOperator: FILTERED:90
    [junit] 08/12/30 17:15:52 INFO exec.FilterOperator: PASSED:13
    [junit] 08/12/30 17:15:52 INFO exec.FilterOperator: FILTERED:90
    [junit] 08/12/30 17:15:52 INFO exec.FilterOperator: PASSED:13
    [junit] 08/12/30 17:15:52 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:15:52 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:15:52 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:52 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:15:52 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:15:52 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:15:52 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:15:52 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:15:52 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 330 bytes
    [junit] 08/12/30 17:15:52 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:15:52 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:15:52 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:15:52 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:15:52 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:52 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:52 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:52 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:52 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:15:52 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:15:52 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:52 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:15:52 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:15:52 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp320134382
    [junit] 08/12/30 17:15:52 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:15:52 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:15:52 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:15:52 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1415.xml
    [junit] 08/12/30 17:15:54 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 17:15:54 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/496078988/649667774.10002
    [junit] 08/12/30 17:15:54 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:54 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:54 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:54 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:54 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:54 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:54 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:15:54 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:15:54 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:15:54 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:15:54 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:15:54 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/496078988/649667774.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/496078988/649667774.10002/attempt_local_0001_m_000000_0
    [junit] 08/12/30 17:15:54 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:15:54 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:54 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:15:54 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:15:54 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:15:54 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:15:54 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:15:55 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:15:55 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:55 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/496078988/649667774.10002/attempt_local_0001_m_000000_0:0+291
    [junit] 08/12/30 17:15:55 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:15:55 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:15:55 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:15:55 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:15:55 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 167 bytes
    [junit] 08/12/30 17:15:55 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:15:55 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:15:55 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:15:55 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:15:55 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:15:55 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:15:55 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:15:55 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:15:55 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:15:55 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:15:55 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:15:55 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:15:55 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:15:55 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp510905853
    [junit] 08/12/30 17:15:55 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:15:55 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:15:55 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:15:55 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input1_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input1_limit.q.out
    [junit] Done query: input1_limit.q
    [junit] Begin query: groupby8.q
    [junit] plan = /tmp/plan1416.xml
    [junit] 08/12/30 17:15:59 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:15:59 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:15:59 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:15:59 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:15:59 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:15:59 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:59 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:15:59 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:15:59 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:15:59 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:00 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:00 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:00 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:00 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:16:00 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:00 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:00 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:00 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:00 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:00 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:00 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:00 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:00 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:00 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:00 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:00 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:00 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:00 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:16:00 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:00 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:00 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:00 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:16:00 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 10314 bytes
    [junit] 08/12/30 17:16:00 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:16:00 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:16:00 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:00 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:16:00 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:00 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:00 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:00 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1300714499
    [junit] 08/12/30 17:16:00 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:00 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:16:00 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:16:00 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1417.xml
    [junit] 08/12/30 17:16:02 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:16:02 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10002
    [junit] 08/12/30 17:16:02 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:02 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:02 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:02 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:03 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:03 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:03 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:03 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:03 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:03 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:03 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:03 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10002/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:16:03 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:03 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:03 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:03 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:03 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:03 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:03 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:03 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10002/attempt_local_0001_r_000000_0:0+11875
    [junit] 08/12/30 17:16:03 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:03 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:16:03 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 8282 bytes
    [junit] 08/12/30 17:16:03 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:16:03 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:16:03 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:03 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:03 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:03 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:03 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:04 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:04 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:04 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:16:04 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:04 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:04 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 17:16:04 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:16:04 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1680453079
    [junit] 08/12/30 17:16:04 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:04 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:16:05 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:16:05 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1418.xml
    [junit] 08/12/30 17:16:06 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:16:07 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10003
    [junit] 08/12/30 17:16:07 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:07 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:07 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:07 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:07 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:07 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:07 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:07 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:07 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:07 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:07 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:07 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10003 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10003/attempt_local_0001_m_000000_0
    [junit] 08/12/30 17:16:07 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:07 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:07 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:07 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:07 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:07 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:07 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:07 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:07 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:07 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10003/attempt_local_0001_m_000000_0:0+20608
    [junit] 08/12/30 17:16:07 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:07 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:07 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:08 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:16:08 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 10314 bytes
    [junit] 08/12/30 17:16:08 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:16:08 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:16:08 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:08 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:16:08 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:08 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:08 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:08 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-609718161
    [junit] 08/12/30 17:16:08 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:08 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:16:08 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:16:08 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1419.xml
    [junit] 08/12/30 17:16:09 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:16:10 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10004
    [junit] 08/12/30 17:16:10 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:10 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:10 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:10 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:10 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:10 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:10 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:10 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:10 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:10 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:10 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:10 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10004 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10004/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:16:10 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:10 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:10 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:10 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:10 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:10 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:10 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:10 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/431193243/44780248.10004/attempt_local_0001_r_000000_0:0+11875
    [junit] 08/12/30 17:16:10 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:10 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:16:10 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 8282 bytes
    [junit] 08/12/30 17:16:10 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:16:10 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:16:10 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:10 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:10 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:10 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:10 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:10 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:10 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:10 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:16:10 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:10 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:10 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:10 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1953217512
    [junit] 08/12/30 17:16:10 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:10 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:16:11 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:16:11 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby8.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/groupby8.q.out
    [junit] Done query: groupby8.q
    [junit] Begin query: input2_limit.q
    [junit] plan = /tmp/plan1420.xml
    [junit] 08/12/30 17:16:14 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:16:14 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:16:14 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:14 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:14 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:14 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:14 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:14 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:14 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:16:14 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:14 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:16:14 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:14 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:14 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:14 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:16:14 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:16:14 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:14 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:14 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:16:14 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:16:14 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:14 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:16:14 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:14 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:16:14 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:14 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:14 INFO exec.FilterOperator: PASSED:8
    [junit] 08/12/30 17:16:14 INFO exec.FilterOperator: FILTERED:3
    [junit] 08/12/30 17:16:14 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:14 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:14 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:14 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp2083036561
    [junit] 08/12/30 17:16:14 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:16:14 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:15 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:16:15 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input2_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input2_limit.q.out
    [junit] Done query: input2_limit.q
    [junit] Begin query: input3_limit.q
    [junit] plan = /tmp/plan1421.xml
    [junit] 08/12/30 17:16:18 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 17:16:19 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t1
    [junit] 08/12/30 17:16:19 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:19 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:19 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:19 INFO mapred.FileInputFormat: Total input paths to process : 2
    [junit] 08/12/30 17:16:19 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:19 INFO mapred.FileInputFormat: Total input paths to process : 2
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:19 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:19 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:19 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:19 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:19 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:19 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t1/kv1.txt
    [junit] 08/12/30 17:16:19 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:19 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:19 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:19 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:16:19 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:16:19 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:19 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:19 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:16:19 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:16:19 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:16:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:19 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:19 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:19 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:19 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:19 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:19 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t1/kv1.txt:0+5812
    [junit] 08/12/30 17:16:19 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:19 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:19 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:20 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:20 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:20 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:20 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t1/kv2.txt
    [junit] 08/12/30 17:16:20 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:20 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:20 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:20 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:20 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:20 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:16:20 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:16:20 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:20 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:20 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:16:20 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:16:20 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:16:20 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:20 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:20 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:20 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:20 INFO mapred.MapTask: Index: (0, 2, 6)
    [junit] 08/12/30 17:16:20 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:20 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t1/kv2.txt:0+5791
    [junit] 08/12/30 17:16:20 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000001_0' done.
    [junit] 08/12/30 17:16:20 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:16:20 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:16:20 INFO mapred.Merger: Merging 2 sorted segments
    [junit] 08/12/30 17:16:20 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 694 bytes
    [junit] 08/12/30 17:16:20 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:16:20 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:16:20 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:16:20 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:16:20 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:20 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:16:20 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:16:20 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:20 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:20 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:20 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1363107851
    [junit] 08/12/30 17:16:20 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:20 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:16:20 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:16:20 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1422.xml
    [junit] 08/12/30 17:16:21 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:16:22 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t2
    [junit] 08/12/30 17:16:22 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:22 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:22 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:22 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:22 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:22 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:22 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:22 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:22 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:22 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:22 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:22 INFO exec.MapOperator: Adding alias t:t2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t2/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:16:22 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:22 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:22 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:22 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:22 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:22 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:22 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:22 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:22 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:22 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:22 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:22 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:22 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:22 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:22 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:22 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/t2/attempt_local_0001_r_000000_0:0+232
    [junit] 08/12/30 17:16:22 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:22 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:22 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:16:22 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:16:22 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 966 bytes
    [junit] 08/12/30 17:16:22 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:16:22 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:16:22 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:22 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:22 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:22 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:22 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:16:22 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:22 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:22 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:22 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1489795987
    [junit] 08/12/30 17:16:22 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:22 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:16:23 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:16:23 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input3_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input3_limit.q.out
    [junit] Done query: input3_limit.q
    [junit] Begin query: create_1.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/create_1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/create_1.q.out
    [junit] Done query: create_1.q
    [junit] Begin query: scriptfile1.q
    [junit] plan = /tmp/plan1423.xml
    [junit] 08/12/30 17:16:28 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:16:28 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:16:28 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:28 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:28 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:28 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:28 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:28 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:28 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:28 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:29 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:29 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:29 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:29 INFO exec.MapOperator: Adding alias tmap:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:16:29 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:29 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:29 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:29 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:29 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:29 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:16:29 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:16:29 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:29 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:29 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:29 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:29 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:16:29 INFO exec.ScriptOperator: Executing [/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/./src/test/scripts/testgrep]
    [junit] 08/12/30 17:16:29 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 17:16:29 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 17:16:29 INFO exec.ScriptOperator: alias=tmap:src
    [junit] 08/12/30 17:16:29 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:29 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:29 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:29 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:16:29 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:16:29 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:29 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:29 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:29 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:29 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:29 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:16:29 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:29 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:29 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:29 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:16:29 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 359 bytes
    [junit] 08/12/30 17:16:29 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:16:29 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:16:29 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:29 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:29 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:29 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:29 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:29 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:29 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:29 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:16:29 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:29 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:29 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:29 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-348054238
    [junit] 08/12/30 17:16:29 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:29 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:16:29 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:16:29 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/scriptfile1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/scriptfile1.q.out
    [junit] Done query: scriptfile1.q
    [junit] Begin query: case_sensitivity.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_case_sensitivity(TestCliDriver.java:1778)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: mapreduce2.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_mapreduce2(TestCliDriver.java:1803)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] Begin query: mapreduce4.q
    [junit] plan = /tmp/plan1424.xml
    [junit] 08/12/30 17:16:33 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:16:33 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:16:33 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:33 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:33 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:33 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:33 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:33 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:33 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:33 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:33 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:33 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:16:33 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:33 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:33 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:33 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:33 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:33 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:16:33 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:16:33 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:33 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:33 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:16:33 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:16:33 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:16:33 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 17:16:33 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 17:16:33 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 17:16:33 INFO exec.ScriptOperator: alias=src
    [junit] 08/12/30 17:16:33 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:33 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:34 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:34 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:16:34 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:16:34 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:34 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:34 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:34 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:34 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:34 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:16:34 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:34 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:16:34 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:16:34 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:16:34 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 29146 bytes
    [junit] 08/12/30 17:16:34 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:16:34 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:16:34 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:34 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:34 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:34 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:34 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:16:34 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:34 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:34 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:34 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1171100960
    [junit] 08/12/30 17:16:34 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:34 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:16:34 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:16:34 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/mapreduce4.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/mapreduce4.q.out
    [junit] Done query: mapreduce4.q
    [junit] Begin query: nullinput.q
    [junit] plan = /tmp/plan1425.xml
    [junit] 08/12/30 17:16:37 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:16:37 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/tstnullinut
    [junit] 08/12/30 17:16:37 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] Job need not be submitted: no output: Success
    [junit] 08/12/30 17:16:37 INFO exec.ExecDriver: Job need not be submitted: no output: Success
    [junit] plan = /tmp/plan1426.xml
    [junit] 08/12/30 17:16:38 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:16:39 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/231029896/48376510.10002
    [junit] 08/12/30 17:16:39 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] Job need not be submitted: no output: Success
    [junit] 08/12/30 17:16:39 INFO exec.ExecDriver: Job need not be submitted: no output: Success
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/nullinput.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/nullinput.q.out
    [junit] Done query: nullinput.q
    [junit] Begin query: mapreduce6.q
    [junit] plan = /tmp/plan1427.xml
    [junit] 08/12/30 17:16:42 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:16:42 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:16:42 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:42 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:42 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:42 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:42 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:42 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:42 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:42 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:42 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:42 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:42 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:42 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:16:42 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:42 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:42 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:42 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:42 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:42 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:42 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:43 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:16:43 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:16:43 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:43 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:43 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:43 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:43 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:43 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:43 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:16:43 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:43 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:16:43 INFO thrift.TBinarySortableProtocol: Sort order is "-+"
    [junit] 08/12/30 17:16:43 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:16:43 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 29314 bytes
    [junit] 08/12/30 17:16:43 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:16:43 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:16:43 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:43 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:43 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:43 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:43 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:16:43 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:43 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:43 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:43 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1207982076
    [junit] 08/12/30 17:16:43 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:43 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:16:43 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:16:43 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/mapreduce6.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/mapreduce6.q.out
    [junit] Done query: mapreduce6.q
    [junit] Begin query: sample1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample1(TestCliDriver.java:1903)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: sample3.q
    [junit] plan = /tmp/plan1428.xml
    [junit] 08/12/30 17:16:46 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:16:46 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket
    [junit] 08/12/30 17:16:46 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:46 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:46 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:47 INFO mapred.FileInputFormat: Total input paths to process : 2
    [junit] 08/12/30 17:16:47 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:47 INFO mapred.FileInputFormat: Total input paths to process : 2
    [junit] 08/12/30 17:16:47 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:16:47 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv1.txt
    [junit] 08/12/30 17:16:47 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:47 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:47 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:47 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:47 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:47 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:47 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:47 INFO exec.FilterOperator: FILTERED:402
    [junit] 08/12/30 17:16:47 INFO exec.FilterOperator: PASSED:98
    [junit] 08/12/30 17:16:47 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:47 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:47 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:47 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp374558463
    [junit] 08/12/30 17:16:47 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv1.txt:0+5812
    [junit] 08/12/30 17:16:47 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:47 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:16:47 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.MapOperator: Adding alias s to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv2.txt
    [junit] 08/12/30 17:16:47 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:47 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:47 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:47 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:47 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:16:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:47 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:47 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:47 INFO exec.FilterOperator: FILTERED:793
    [junit] 08/12/30 17:16:47 INFO exec.FilterOperator: PASSED:207
    [junit] 08/12/30 17:16:47 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:47 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:47 INFO mapred.TaskRunner: Task attempt_local_0001_m_000001_0 is allowed to commit now
    [junit] 08/12/30 17:16:47 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000001_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp374558463
    [junit] 08/12/30 17:16:47 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket/kv2.txt:0+5791
    [junit] 08/12/30 17:16:47 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000001_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:16:48 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:16:48 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample3.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/sample3.q.out
    [junit] Done query: sample3.q
    [junit] Begin query: groupby1_map.q
    [junit] plan = /tmp/plan1429.xml
    [junit] 08/12/30 17:16:51 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:16:51 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:16:51 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:51 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:51 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:51 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:51 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:51 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:51 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:51 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:51 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:51 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:51 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:51 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:16:51 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:51 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:51 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:51 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:16:51 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:16:51 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:51 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:51 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:51 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:16:51 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:52 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:52 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:52 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:52 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:52 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:16:52 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:52 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:52 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:52 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:16:52 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 13408 bytes
    [junit] 08/12/30 17:16:52 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:16:52 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:16:52 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:52 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:16:52 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:52 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:52 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:52 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-861031885
    [junit] 08/12/30 17:16:52 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:52 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:16:52 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:16:52 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1430.xml
    [junit] 08/12/30 17:16:53 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:16:54 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/804960335/84310169.10001
    [junit] 08/12/30 17:16:54 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:54 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:54 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:54 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:54 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:54 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:54 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:54 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:54 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:54 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:54 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:54 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/804960335/84310169.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/804960335/84310169.10001/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:16:54 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:54 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:54 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:54 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:54 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:54 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:54 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:54 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/804960335/84310169.10001/attempt_local_0001_r_000000_0:0+11875
    [junit] 08/12/30 17:16:54 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:54 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:16:54 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 8282 bytes
    [junit] 08/12/30 17:16:54 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:16:54 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:16:54 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:54 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:54 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:16:54 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:16:54 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:55 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:55 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:16:55 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:16:55 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:55 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:55 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:55 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp368307071
    [junit] 08/12/30 17:16:55 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:55 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:16:55 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:16:55 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby1_map.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/groupby1_map.q.out
    [junit] Done query: groupby1_map.q
    [junit] Begin query: inputddl2.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl2(TestCliDriver.java:1978)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
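The failure above comes from `Warehouse.deleteDir` being unable to remove the `warehouse/src` directory during `QTestUtil.cleanUp`, most plausibly because the directory still contained files when the drop ran (leftover output from a prior or concurrent test). A minimal standalone sketch (not Hive code; class and file names are illustrative) of why a directory delete fails in that situation:

```java
import java.io.File;
import java.io.IOException;

public class DeleteDirDemo {
    public static void main(String[] args) throws IOException {
        // Hypothetical stand-in for the test warehouse's src directory.
        File dir = new File(System.getProperty("java.io.tmpdir"), "warehouse_src_demo");
        dir.mkdirs();
        File leftover = new File(dir, "kv1.txt");
        leftover.createNewFile();

        // File.delete() on a non-empty directory fails and returns false,
        // analogous to the "Unable to delete directory" surfaced above.
        System.out.println("delete non-empty dir: " + dir.delete());

        // Once the contents are removed first, the delete succeeds.
        leftover.delete();
        System.out.println("delete empty dir: " + dir.delete());
    }
}
```

If this diagnosis is right, the `inputddl2.q` and `sample5.q` failures are environmental (stale warehouse state on the build host) rather than regressions in the revision under test.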
    [junit] Begin query: groupby1_limit.q
    [junit] plan = /tmp/plan1431.xml
    [junit] 08/12/30 17:16:58 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 31
    [junit] 08/12/30 17:16:58 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:16:58 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:16:58 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:16:58 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:16:58 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:16:59 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:59 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:16:59 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:16:59 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:16:59 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:16:59 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:16:59 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:16:59 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:16:59 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:16:59 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:16:59 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:16:59 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:59 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:16:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:59 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:16:59 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:16:59 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:16:59 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:16:59 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:59 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:16:59 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:16:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:59 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:16:59 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:16:59 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 12814 bytes
    [junit] 08/12/30 17:16:59 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:16:59 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:16:59 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:16:59 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:16:59 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:16:59 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:16:59 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:16:59 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1645525061
    [junit] 08/12/30 17:16:59 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:16:59 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:00 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:00 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1432.xml
    [junit] 08/12/30 17:17:01 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 31
    [junit] 08/12/30 17:17:01 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/524590074/115172128.10001
    [junit] 08/12/30 17:17:01 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:01 WARN exec.ExecDriver: Number of reduce tasks inferred based on input size to : 1
    [junit] 08/12/30 17:17:01 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:01 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:01 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:01 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:01 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:01 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:17:01 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:17:02 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:17:02 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:17:02 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:02 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/524590074/115172128.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/524590074/115172128.10001/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:17:02 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:02 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:02 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:17:02 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:02 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:02 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:02 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:17:02 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:17:02 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:02 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/524590074/115172128.10001/attempt_local_0001_r_000000_0:0+11875
    [junit] 08/12/30 17:17:02 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:02 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:02 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:02 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:17:02 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 8282 bytes
    [junit] 08/12/30 17:17:02 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:17:02 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:17:02 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:02 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:02 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:17:02 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:17:02 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:02 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:17:02 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:02 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:17:02 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:02 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:02 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:02 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1205456750
    [junit] 08/12/30 17:17:02 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:17:02 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:02 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:02 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1433.xml
    [junit] 08/12/30 17:17:04 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 17:17:04 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/524590074/115172128.10002
    [junit] 08/12/30 17:17:04 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:04 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:04 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:04 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:04 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:04 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:04 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:17:04 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:17:05 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:17:05 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:17:05 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:05 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/524590074/115172128.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/524590074/115172128.10002/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:17:05 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:05 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:05 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:17:05 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:05 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:05 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:05 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:17:05 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:17:05 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:05 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/524590074/115172128.10002/attempt_local_0001_r_000000_0:0+283
    [junit] 08/12/30 17:17:05 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:05 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:05 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:05 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:17:05 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 159 bytes
    [junit] 08/12/30 17:17:05 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:17:05 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:17:05 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:17:05 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:17:05 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:05 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:05 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:05 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:05 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:17:05 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:17:05 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:05 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:05 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:05 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp2037769549
    [junit] 08/12/30 17:17:05 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:17:05 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:17:05 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:05 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby1_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/groupby1_limit.q.out
    [junit] Done query: groupby1_limit.q
    [junit] Begin query: sample5.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample5(TestCliDriver.java:2028)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: groupby3_map.q
    [junit] plan = /tmp/plan1434.xml
    [junit] 08/12/30 17:17:09 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:17:09 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:17:09 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:09 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:09 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:09 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:09 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:09 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:09 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:17:09 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:17:09 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:17:09 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:17:09 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:09 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:17:09 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:09 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:17:09 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:17:09 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:09 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:09 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:17:09 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:17:09 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:09 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:17:09 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:09 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:10 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:17:10 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:10 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:17:10 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:10 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:17:10 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:17:10 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:10 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:17:10 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:10 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:10 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:17:10 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 38220 bytes
    [junit] 08/12/30 17:17:10 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:17:10 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:17:10 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:10 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:17:10 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:10 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:10 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:10 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp673369920
    [junit] 08/12/30 17:17:10 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:17:10 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:10 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:17:10 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1435.xml
    [junit] 08/12/30 17:17:12 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:17:12 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/363664/47945238.10001
    [junit] 08/12/30 17:17:12 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:12 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:12 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:12 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:12 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:12 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:12 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:17:12 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:17:12 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:17:12 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:17:13 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:13 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/363664/47945238.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/363664/47945238.10001/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:17:13 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:13 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:13 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:17:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:13 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:13 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:17:13 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:17:13 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:13 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/363664/47945238.10001/attempt_local_0001_r_000000_0:0+183
    [junit] 08/12/30 17:17:13 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:13 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:13 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:17:13 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 83 bytes
    [junit] 08/12/30 17:17:13 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:17:13 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:17:13 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:13 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:13 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:13 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:13 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:17:13 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:13 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:13 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:13 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp288368028
    [junit] 08/12/30 17:17:13 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:17:13 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:17:13 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:13 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby3_map.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/groupby3_map.q.out
    [junit] Done query: groupby3_map.q
    [junit] Begin query: groupby2_limit.q
    [junit] plan = /tmp/plan1436.xml
    [junit] 08/12/30 17:17:16 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 31
    [junit] 08/12/30 17:17:16 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:17:16 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:16 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:16 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:17 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:17 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:17 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:17 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:17:17 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:17:17 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:17:17 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:17:17 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:17 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:17:17 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:17 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:17:17 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:17:17 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:17 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:17:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:17 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:17:17 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:17 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:17:17 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:17:17 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:17 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:17:17 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:17 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:17 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:17:17 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 12814 bytes
    [junit] 08/12/30 17:17:17 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:17:17 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:17:17 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:17 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:17:17 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:17 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:17 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:17 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-255644626
    [junit] 08/12/30 17:17:17 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:17:17 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:17:18 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:18 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1437.xml
    [junit] 08/12/30 17:17:19 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 31
    [junit] 08/12/30 17:17:19 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/918819307/320080722.10002
    [junit] 08/12/30 17:17:19 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:19 WARN exec.ExecDriver: Number of reduce tasks inferred based on input size to : 1
    [junit] 08/12/30 17:17:19 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:19 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:19 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:20 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:20 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:20 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:17:20 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:17:20 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:17:20 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:17:20 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:20 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/918819307/320080722.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/918819307/320080722.10002/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:17:20 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:20 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:20 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:17:20 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:20 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:20 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:20 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:17:20 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:17:20 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:20 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/918819307/320080722.10002/attempt_local_0001_r_000000_0:0+11875
    [junit] 08/12/30 17:17:20 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:20 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:20 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:20 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:17:20 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 8282 bytes
    [junit] 08/12/30 17:17:20 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:17:20 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:17:20 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:20 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:20 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:17:20 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:17:20 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:20 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:17:20 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:20 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:17:20 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:20 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:20 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:20 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-537189748
    [junit] 08/12/30 17:17:20 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:17:20 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:21 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:17:21 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby2_limit.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/groupby2_limit.q.out
    [junit] Done query: groupby2_limit.q
    [junit] Begin query: inputddl4.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/inputddl4.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/inputddl4.q.out
    [junit] Done query: inputddl4.q
    [junit] Begin query: sample7.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_sample7(TestCliDriver.java:2128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: groupby5_map.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby5_map(TestCliDriver.java:2153)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: inputddl6.q
    [junit] Exception: Client Execution failed with error code = 9
    [junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
    [junit] 	at junit.framework.Assert.fail(Assert.java:47)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl6(TestCliDriver.java:2181)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: input16_cc.q
    [junit] plan = /tmp/plan1438.xml
    [junit] 08/12/30 17:17:28 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:17:28 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input16_cc
    [junit] 08/12/30 17:17:28 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:28 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:28 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:28 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:28 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:28 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:28 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:17:28 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:28 INFO exec.MapOperator: Adding alias input16_cc to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input16_cc/kv1_cc.txt
    [junit] 08/12/30 17:17:28 INFO serde2.TestSerDe: org.apache.hadoop.hive.serde2.TestSerDe: initialized with columnNames: [key, value]
    [junit] 08/12/30 17:17:28 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:28 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:17:28 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:17:28 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:28 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:28 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:29 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:29 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:17:29 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:29 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:29 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:29 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:29 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-181852077
    [junit] 08/12/30 17:17:29 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/input16_cc/kv1_cc.txt:0+5812
    [junit] 08/12/30 17:17:29 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 17:17:29 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:17:29 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input16_cc.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input16_cc.q.out
    [junit] Done query: input16_cc.q
    [junit] Begin query: cast1.q
    [junit] plan = /tmp/plan1439.xml
    [junit] 08/12/30 17:17:33 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:17:33 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:17:33 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:33 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:33 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:33 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:33 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:17:33 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:33 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:17:33 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:33 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:17:33 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:17:33 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:33 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:33 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:17:33 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:17:33 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:33 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:33 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:33 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:33 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:33 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:33 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:33 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:17:33 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:33 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:17:33 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:33 INFO exec.FilterOperator: PASSED:1
    [junit] 08/12/30 17:17:33 INFO exec.FilterOperator: FILTERED:499
    [junit] 08/12/30 17:17:33 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:33 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:33 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:33 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp225028478
    [junit] 08/12/30 17:17:33 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:17:33 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:34 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:17:34 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/cast1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/cast1.q.out
    [junit] Done query: cast1.q
    [junit] Begin query: inputddl8.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_inputddl8(TestCliDriver.java:2253)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: quote1.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_quote1(TestCliDriver.java:2278)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] Begin query: join0.q
    [junit] plan = /tmp/plan1440.xml
    [junit] 08/12/30 17:17:37 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:17:38 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:17:38 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:38 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:38 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:38 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:38 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:38 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:38 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:17:38 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:17:38 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:17:38 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:17:38 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:38 INFO exec.MapOperator: Adding alias src2:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:17:38 INFO exec.MapOperator: Adding alias src1:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:17:38 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:38 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:17:38 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:17:38 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:17:38 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:17:38 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:38 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:38 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:38 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:17:38 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:38 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:38 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:38 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:17:38 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:17:38 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:17:38 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:17:38 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:17:38 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:17:38 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:38 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:38 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:38 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:17:38 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:38 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:38 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:38 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:17:38 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:17:39 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:39 INFO exec.FilterOperator: FILTERED:490
    [junit] 08/12/30 17:17:39 INFO exec.FilterOperator: PASSED:10
    [junit] 08/12/30 17:17:39 INFO exec.FilterOperator: FILTERED:490
    [junit] 08/12/30 17:17:39 INFO exec.FilterOperator: PASSED:10
    [junit] 08/12/30 17:17:39 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:17:39 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:17:39 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:39 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:17:39 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:39 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:39 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:17:39 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:17:39 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 642 bytes
    [junit] 08/12/30 17:17:39 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:17:39 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:17:39 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:39 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:39 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:39 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:39 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 17:17:39 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:39 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:39 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:39 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-750829966
    [junit] 08/12/30 17:17:39 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:17:39 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:17:39 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:39 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1441.xml
    [junit] 08/12/30 17:17:41 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:17:41 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/346734168/497249140.10002
    [junit] 08/12/30 17:17:41 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:41 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:41 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:41 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:41 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:41 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:41 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:17:41 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:17:41 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:17:41 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:17:41 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:41 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/346734168/497249140.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/346734168/497249140.10002/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:17:41 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:41 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:41 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:17:41 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
    [junit] 08/12/30 17:17:41 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
    [junit] 08/12/30 17:17:41 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:41 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:17:41 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:17:41 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:41 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/346734168/497249140.10002/attempt_local_0001_r_000000_0:0+5836
    [junit] 08/12/30 17:17:41 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:41 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
    [junit] 08/12/30 17:17:41 INFO thrift.TBinarySortableProtocol: Sort order is "++++"
    [junit] 08/12/30 17:17:41 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:17:41 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 7102 bytes
    [junit] 08/12/30 17:17:41 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:17:41 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:17:41 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:42 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:17:42 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:42 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:42 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:42 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1357218109
    [junit] 08/12/30 17:17:42 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:17:42 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:42 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:17:42 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join0.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/join0.q.out
    [junit] Done query: join0.q
    [junit] Begin query: notable_alias2.q
    [junit] plan = /tmp/plan1442.xml
    [junit] 08/12/30 17:17:45 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:17:45 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:17:45 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:45 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:45 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:45 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:46 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:46 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:46 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:17:46 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:17:46 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:17:46 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:17:46 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:46 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:17:46 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:46 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:17:46 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:17:46 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:46 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:46 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:17:46 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:17:46 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:46 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:17:46 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:46 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:46 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:17:46 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:46 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:17:46 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:46 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:17:46 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:17:46 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:17:46 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:17:46 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:46 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:17:46 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:46 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:46 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:46 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:17:46 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 1840 bytes
    [junit] 08/12/30 17:17:46 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:17:46 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:17:46 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:46 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:17:46 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:46 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:46 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:46 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-643569259
    [junit] 08/12/30 17:17:46 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:17:46 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:47 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:17:47 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1443.xml
    [junit] 08/12/30 17:17:48 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:17:48 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/612917680/89071149.10001
    [junit] 08/12/30 17:17:48 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:48 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:48 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:48 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:48 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:48 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:48 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:17:48 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:17:49 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:17:49 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:17:49 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:49 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/612917680/89071149.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/612917680/89071149.10001/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:17:49 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:49 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:49 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:17:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:49 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:49 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:17:49 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:17:49 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:49 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/612917680/89071149.10001/attempt_local_0001_r_000000_0:0+2219
    [junit] 08/12/30 17:17:49 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:49 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:49 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:17:49 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 1478 bytes
    [junit] 08/12/30 17:17:49 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:17:49 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:17:49 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:49 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:49 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:49 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:49 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:49 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:49 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:49 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:17:49 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:49 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:49 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:49 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1735690336
    [junit] 08/12/30 17:17:49 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:17:49 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:49 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:17:49 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/notable_alias2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/notable_alias2.q.out
    [junit] Done query: notable_alias2.q
    [junit] Begin query: input1.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input1.q.out
    [junit] Done query: input1.q
    [junit] Begin query: join2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join2(TestCliDriver.java:2378)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input3(TestCliDriver.java:2403)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: join4.q
    [junit] plan = /tmp/plan1444.xml
    [junit] 08/12/30 17:17:54 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:17:55 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:17:55 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:17:55 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:17:55 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:17:55 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:55 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:17:55 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:17:55 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:17:55 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:17:55 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:17:55 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:17:55 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:17:55 INFO exec.MapOperator: Adding alias c:a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:17:55 INFO exec.MapOperator: Adding alias c:b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:17:55 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:17:55 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:17:55 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:17:55 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:17:55 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:17:55 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:55 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:55 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:55 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:17:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:55 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:55 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:17:55 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:17:55 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:17:55 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:17:55 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:17:55 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:17:55 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:55 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:55 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:55 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:17:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:55 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:55 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:17:55 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:17:55 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:17:55 INFO exec.FilterOperator: PASSED:9
    [junit] 08/12/30 17:17:55 INFO exec.FilterOperator: FILTERED:491
    [junit] 08/12/30 17:17:55 INFO exec.FilterOperator: PASSED:7
    [junit] 08/12/30 17:17:55 INFO exec.FilterOperator: FILTERED:493
    [junit] 08/12/30 17:17:55 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:17:55 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:17:55 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:55 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:17:55 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:17:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:17:55 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:17:55 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 610 bytes
    [junit] 08/12/30 17:17:56 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:17:56 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:17:56 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:56 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:56 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:56 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:56 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:17:56 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:17:56 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:17:56 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:56 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:56 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:17:56 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 17:17:56 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:17:56 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:17:56 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:17:56 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-337447732
    [junit] 08/12/30 17:17:56 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:17:56 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:17:56 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:17:56 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join4.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/join4.q.out
    [junit] Done query: join4.q
    [junit] Begin query: input5.q
    [junit] plan = /tmp/plan1445.xml
    [junit] 08/12/30 17:17:59 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:17:59 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift
    [junit] 08/12/30 17:17:59 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:00 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:00 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:00 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:00 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:00 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:00 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:18:00 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:18:00 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:18:00 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:18:00 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:00 INFO exec.MapOperator: Adding alias tmap:src_thrift to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift/complex.seq
    [junit] 08/12/30 17:18:00 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:00 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:00 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:00 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:00 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:00 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:00 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:00 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:18:00 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:18:00 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:00 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:18:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:00 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:18:00 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 17:18:00 INFO exec.ScriptOperator: tablename=src_thrift
    [junit] 08/12/30 17:18:00 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 17:18:00 INFO exec.ScriptOperator: alias=tmap:src_thrift
    [junit] 08/12/30 17:18:00 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:00 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:00 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:00 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:00 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:18:00 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:18:00 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:00 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:00 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:18:00 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:18:00 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:00 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift/complex.seq:0+1491
    [junit] 08/12/30 17:18:00 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:18:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:00 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:18:00 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 732 bytes
    [junit] 08/12/30 17:18:00 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:18:00 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:18:00 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:00 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:00 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:00 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:00 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:18:00 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:00 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:00 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:00 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1329569918
    [junit] 08/12/30 17:18:00 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:18:00 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:18:01 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:18:01 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input5.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input5.q.out
    [junit] Done query: input5.q
    [junit] Begin query: join6.q
    [junit] plan = /tmp/plan1446.xml
    [junit] 08/12/30 17:18:04 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:18:04 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:18:04 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:04 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:04 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:04 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:05 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:05 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:05 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:18:05 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:18:05 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:18:05 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:18:05 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.MapOperator: Adding alias c:a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:18:05 INFO exec.MapOperator: Adding alias c:b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:18:05 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:05 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:05 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:05 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:18:05 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:05 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:05 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:18:05 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:05 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:05 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:05 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:18:05 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:05 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:05 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:18:05 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:05 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:05 INFO exec.FilterOperator: PASSED:9
    [junit] 08/12/30 17:18:05 INFO exec.FilterOperator: FILTERED:491
    [junit] 08/12/30 17:18:05 INFO exec.FilterOperator: PASSED:7
    [junit] 08/12/30 17:18:05 INFO exec.FilterOperator: FILTERED:493
    [junit] 08/12/30 17:18:05 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:18:05 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:18:05 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:05 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:18:05 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:18:05 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:05 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:05 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:18:05 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 610 bytes
    [junit] 08/12/30 17:18:05 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:05 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:05 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:05 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 17:18:05 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:05 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:05 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:05 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1325509851
    [junit] 08/12/30 17:18:05 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:18:05 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:18:06 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:18:06 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join6.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/join6.q.out
    [junit] Done query: join6.q
    [junit] Begin query: input_testxpath3.q
    [junit] plan = /tmp/plan1447.xml
    [junit] 08/12/30 17:18:09 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:18:09 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift
    [junit] 08/12/30 17:18:09 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:09 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:09 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:09 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:09 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:09 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:09 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:18:09 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:09 INFO exec.MapOperator: Adding alias src_thrift to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift/complex.seq
    [junit] 08/12/30 17:18:10 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:10 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:10 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:10 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:10 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:10 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:10 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:10 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:10 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:10 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:10 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:10 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:10 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:10 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:10 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:10 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-89471178
    [junit] 08/12/30 17:18:10 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift/complex.seq:0+1491
    [junit] 08/12/30 17:18:10 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 17:18:10 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:18:10 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_testxpath3.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input_testxpath3.q.out
    [junit] Done query: input_testxpath3.q
    [junit] Begin query: input_dynamicserde.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_dynamicserde(TestCliDriver.java:2528)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input7.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input7(TestCliDriver.java:2553)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input_testsequencefile.q
    [junit] plan = /tmp/plan1448.xml
    [junit] 08/12/30 17:18:14 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:18:14 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:18:14 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:14 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:14 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:14 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:14 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:14 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:14 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:18:14 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 17:18:14 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 17:18:14 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:14 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:18:14 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:14 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:14 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:14 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:14 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:14 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:14 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:14 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:14 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:14 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:14 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:14 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:14 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:14 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:14 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:14 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-2105734367
    [junit] 08/12/30 17:18:14 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:18:14 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:18:15 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:18:15 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_testsequencefile.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input_testsequencefile.q.out
    [junit] Done query: input_testsequencefile.q
    [junit] Begin query: join8.q
    [junit] plan = /tmp/plan1449.xml
    [junit] 08/12/30 17:18:19 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:18:19 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:18:19 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:19 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:19 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:19 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:19 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:19 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:19 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:18:19 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:18:19 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:18:19 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:18:19 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:19 INFO exec.MapOperator: Adding alias c:a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:18:19 INFO exec.MapOperator: Adding alias c:b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:18:19 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:19 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:19 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:19 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:18:19 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:18:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:19 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:19 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:18:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:19 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:18:19 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:19 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:19 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:19 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:18:19 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:18:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:19 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:19 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:18:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:19 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:18:19 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:19 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:19 INFO exec.FilterOperator: PASSED:9
    [junit] 08/12/30 17:18:19 INFO exec.FilterOperator: FILTERED:491
    [junit] 08/12/30 17:18:19 INFO exec.FilterOperator: PASSED:7
    [junit] 08/12/30 17:18:19 INFO exec.FilterOperator: FILTERED:493
    [junit] 08/12/30 17:18:19 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:18:20 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:18:20 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:20 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:18:20 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:18:20 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:20 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:20 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:18:20 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 610 bytes
    [junit] 08/12/30 17:18:20 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:18:20 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:18:20 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:20 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:20 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:18:20 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:18:20 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:20 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:20 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:20 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:20 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:20 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:20 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:20 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:18:20 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:20 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 17:18:20 INFO exec.FilterOperator: PASSED:5
    [junit] 08/12/30 17:18:20 INFO exec.FilterOperator: FILTERED:6
    [junit] 08/12/30 17:18:20 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:20 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:20 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:20 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-759191445
    [junit] 08/12/30 17:18:20 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:18:20 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:18:20 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:18:20 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join8.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/join8.q.out
    [junit] Done query: join8.q
    [junit] Begin query: input9.q
    [junit] plan = /tmp/plan1450.xml
    [junit] 08/12/30 17:18:23 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:18:24 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1
    [junit] 08/12/30 17:18:24 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:24 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:24 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:24 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:24 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:24 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:24 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:18:24 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:24 INFO exec.MapOperator: Adding alias src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1/kv3.txt
    [junit] 08/12/30 17:18:24 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:24 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:24 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:24 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:24 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:24 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:18:24 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:18:24 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:24 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:24 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:24 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:24 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:24 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:24 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:24 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:18:24 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:24 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:24 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:24 INFO exec.FilterOperator: PASSED:0
    [junit] 08/12/30 17:18:24 INFO exec.FilterOperator: FILTERED:25
    [junit] 08/12/30 17:18:24 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:24 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:24 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:24 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-773182084
    [junit] 08/12/30 17:18:24 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1/kv3.txt:0+216
    [junit] 08/12/30 17:18:24 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:18:25 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:18:25 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input9.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input9.q.out
    [junit] Done query: input9.q
    [junit] Begin query: input_dfs.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_dfs.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input_dfs.q.out
    [junit] Done query: input_dfs.q
    [junit] Begin query: input.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input(TestCliDriver.java:2678)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: udf1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udf1(TestCliDriver.java:2703)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: join10.q
    [junit] plan = /tmp/plan1451.xml
    [junit] 08/12/30 17:18:30 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:18:30 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:18:30 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:30 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:30 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:30 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:30 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:30 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:30 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:18:30 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:18:31 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:18:31 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:18:31 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:31 INFO exec.MapOperator: Adding alias x:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:18:31 INFO exec.MapOperator: Adding alias y:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:18:31 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:31 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:31 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:31 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:31 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:18:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:31 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:31 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:31 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:31 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:31 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:18:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:31 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:31 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:31 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:18:31 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:18:31 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:31 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:18:31 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:18:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:31 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:31 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:18:31 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 33532 bytes
    [junit] 08/12/30 17:18:31 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:18:31 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:31 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:31 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:31 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 17:18:31 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:31 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:31 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:31 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1475202659
    [junit] 08/12/30 17:18:31 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:18:31 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:18:31 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:18:31 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join10.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/join10.q.out
    [junit] Done query: join10.q
    [junit] Begin query: input11.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input11(TestCliDriver.java:2753)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: noalias_subq1.q
    [junit] plan = /tmp/plan1452.xml
    [junit] 08/12/30 17:18:35 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:18:35 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:18:35 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:35 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:35 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:35 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:35 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:35 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:35 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:18:35 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:35 INFO exec.MapOperator: Adding alias x:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:18:35 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:35 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:35 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:35 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:35 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:35 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:18:35 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:18:35 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:35 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:35 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:35 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:35 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:18:35 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:35 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:35 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:35 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:18:35 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:18:35 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:35 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:35 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:35 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp59003636
    [junit] 08/12/30 17:18:35 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:18:35 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:18:36 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:18:36 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/noalias_subq1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/noalias_subq1.q.out
    [junit] Done query: noalias_subq1.q
    [junit] Begin query: udf3.q
    [junit] plan = /tmp/plan1453.xml
    [junit] 08/12/30 17:18:39 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:18:39 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:18:39 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:39 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:39 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:40 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:40 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:40 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:40 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:18:40 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:18:40 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:18:40 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:18:40 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:40 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:18:40 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:40 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:40 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:40 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:40 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:40 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:40 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:18:40 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:18:40 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:18:40 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:40 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:40 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:40 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:18:40 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:18:40 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:40 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:18:40 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:18:40 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:18:40 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:18:40 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:18:40 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 5502 bytes
    [junit] 08/12/30 17:18:40 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:18:40 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:18:40 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:40 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:18:40 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:40 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:40 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:40 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-818963105
    [junit] 08/12/30 17:18:40 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:18:40 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:18:41 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:18:41 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1454.xml
    [junit] 08/12/30 17:18:42 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:18:42 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/79979961/230490875.10001
    [junit] 08/12/30 17:18:42 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:42 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:42 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:42 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:43 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:43 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:43 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:18:43 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:18:43 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:18:43 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:18:43 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:43 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/79979961/230490875.10001 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/79979961/230490875.10001/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:18:43 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:43 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:43 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:18:43 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:18:43 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:18:43 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:43 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:18:43 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:18:43 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:43 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/79979961/230490875.10001/attempt_local_0001_r_000000_0:0+124
    [junit] 08/12/30 17:18:43 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:18:43 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:18:43 INFO thrift.TBinarySortableProtocol: Sort order is ""
    [junit] 08/12/30 17:18:43 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:18:43 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 24 bytes
    [junit] 08/12/30 17:18:43 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:18:43 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:18:43 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:43 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:43 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:43 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:43 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:18:43 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:43 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:43 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:43 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1081945831
    [junit] 08/12/30 17:18:43 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:18:43 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:18:44 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:18:44 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/udf3.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/udf3.q.out
    [junit] Done query: udf3.q
    [junit] Begin query: join12.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join12(TestCliDriver.java:2828)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input_testxpath.q
    [junit] plan = /tmp/plan1455.xml
    [junit] 08/12/30 17:18:47 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:18:47 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift
    [junit] 08/12/30 17:18:47 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:47 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:47 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:47 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:47 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:47 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:47 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:18:48 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:48 INFO exec.MapOperator: Adding alias src_thrift to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift/complex.seq
    [junit] 08/12/30 17:18:48 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:48 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:48 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:48 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:48 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:48 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:48 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:48 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:48 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:48 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:48 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:48 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:48 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:48 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:48 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:48 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-480264013
    [junit] 08/12/30 17:18:48 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift/complex.seq:0+1491
    [junit] 08/12/30 17:18:48 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:18:48 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:18:48 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_testxpath.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input_testxpath.q.out
    [junit] Done query: input_testxpath.q
    [junit] Begin query: input13.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input13(TestCliDriver.java:2878)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: udf5.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_udf5(TestCliDriver.java:2903)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: join14.q
    [junit] plan = /tmp/plan1456.xml
    [junit] 08/12/30 17:18:52 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:18:52 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11
    [junit] 08/12/30 17:18:52 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12
    [junit] 08/12/30 17:18:52 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:18:52 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:18:52 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:18:52 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:18:53 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:53 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:53 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:53 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:18:53 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:53 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:53 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:18:53 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:53 INFO exec.MapOperator: Adding alias srcpart to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt
    [junit] 08/12/30 17:18:53 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 17:18:53 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:53 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:53 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:18:53 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:18:53 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:53 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:18:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:53 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:18:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:53 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:53 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:53 INFO exec.FilterOperator: PASSED:500
    [junit] 08/12/30 17:18:53 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:18:53 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:53 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt:0+5812
    [junit] 08/12/30 17:18:53 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:18:53 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:53 INFO exec.MapOperator: Adding alias srcpart to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt
    [junit] 08/12/30 17:18:53 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 17:18:53 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:53 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:53 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:53 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:53 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:18:53 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:18:53 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:53 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:18:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:53 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:53 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:18:53 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:53 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:18:53 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:53 INFO exec.FilterOperator: PASSED:1000
    [junit] 08/12/30 17:18:53 INFO exec.FilterOperator: FILTERED:0
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:18:53 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:18:53 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:53 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt:0+5812
    [junit] 08/12/30 17:18:53 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000001_0' done.
    [junit] 08/12/30 17:18:54 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:18:54 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:18:54 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:18:54 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:18:54 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:18:54 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:18:54 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:18:54 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:18:54 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:18:54 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:54 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:54 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:18:54 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:18:54 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:54 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:18:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:54 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:18:54 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:54 INFO exec.TableScanOperator: Initialization Done
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 17:18:54 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:18:54 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:18:54 INFO exec.FilterOperator: PASSED:414
    [junit] 08/12/30 17:18:54 INFO exec.FilterOperator: FILTERED:86
    [junit] 08/12/30 17:18:54 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:18:54 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:18:54 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000002_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:54 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:18:54 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000002_0' done.
    [junit] 08/12/30 17:18:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:18:54 INFO mapred.Merger: Merging 3 sorted segments
    [junit] 08/12/30 17:18:54 INFO mapred.Merger: Down to the last merge-pass, with 3 segments left of total size: 68620 bytes
    [junit] 08/12/30 17:18:54 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:18:54 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:18:54 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:54 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:54 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:18:54 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:18:54 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:18:54 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:54 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:18:54 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 17:18:54 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:18:54 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:18:54 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:18:54 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp2080010575
    [junit] 08/12/30 17:18:54 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:18:54 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:18:55 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:18:55 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join14.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/join14.q.out
    [junit] Done query: join14.q
    [junit] Begin query: input_part0.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_part0(TestCliDriver.java:2953)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input15.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input15.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input15.q.out
    [junit] Done query: input15.q
    [junit] Begin query: join16.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_join16(TestCliDriver.java:3003)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] Begin query: input_part2.q
    [junit] plan = /tmp/plan1457.xml
    [junit] 08/12/30 17:19:00 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:19:00 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12
    [junit] 08/12/30 17:19:00 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12
    [junit] 08/12/30 17:19:00 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:00 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:00 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:01 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:01 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:01 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:01 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:01 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:01 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:19:01 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.MapOperator: Adding alias srcpart to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt
    [junit] 08/12/30 17:19:01 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 17:19:01 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: PASSED:0
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: FILTERED:500
    [junit] 08/12/30 17:19:01 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:01 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:01 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:01 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp423277260
    [junit] 08/12/30 17:19:01 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt:0+5812
    [junit] 08/12/30 17:19:01 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:01 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:19:01 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.MapOperator: Adding alias srcpart to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12/kv1.txt
    [junit] 08/12/30 17:19:01 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 17:19:01 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:01 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:01 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: FILTERED:916
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:19:01 INFO exec.FilterOperator: FILTERED:916
    [junit] 08/12/30 17:19:01 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:01 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:01 INFO mapred.TaskRunner: Task attempt_local_0001_m_000001_0 is allowed to commit now
    [junit] 08/12/30 17:19:01 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000001_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp423277260
    [junit] 08/12/30 17:19:01 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12/kv1.txt:0+5812
    [junit] 08/12/30 17:19:01 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000001_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:19:02 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:19:02 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_part2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input_part2.q.out
    [junit] Done query: input_part2.q
    [junit] Begin query: input17.q
    [junit] plan = /tmp/plan1458.xml
    [junit] 08/12/30 17:19:05 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:19:05 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift
    [junit] 08/12/30 17:19:05 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:05 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:05 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:06 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:06 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:06 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:06 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:06 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:06 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:06 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:06 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:06 INFO exec.MapOperator: Adding alias tmap:src_thrift to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift/complex.seq
    [junit] 08/12/30 17:19:06 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:06 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:06 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:06 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:19:06 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:19:06 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:06 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:19:06 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:06 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:06 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:19:06 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 17:19:06 INFO exec.ScriptOperator: tablename=src_thrift
    [junit] 08/12/30 17:19:06 INFO exec.ScriptOperator: partname={}
    [junit] 08/12/30 17:19:06 INFO exec.ScriptOperator: alias=tmap:src_thrift
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:06 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:06 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:06 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:19:06 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:19:06 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:06 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:06 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:06 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:06 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:06 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src_thrift/complex.seq:0+1491
    [junit] 08/12/30 17:19:06 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:06 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:06 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:06 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:19:06 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 764 bytes
    [junit] 08/12/30 17:19:06 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:19:06 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:06 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:06 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:06 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:19:06 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:06 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:06 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:06 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp107377823
    [junit] 08/12/30 17:19:06 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:19:06 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:19:07 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:07 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input17.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input17.q.out
    [junit] Done query: input17.q
    [junit] Begin query: groupby1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby1(TestCliDriver.java:3078)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
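    [junit] (The "Unable to delete directory" MetaException above is raised when the warehouse's directory delete reports failure. As an illustrative sketch only — not the actual Warehouse.deleteDir code — the snippet below shows the underlying java.io behavior that produces this failure mode: File.delete() returns false on a non-empty directory, so a table directory left holding stale files cannot be removed without a recursive delete. The class and method names here are hypothetical.)

```java
import java.io.File;
import java.io.IOException;

public class DeleteDirDemo {
    // Hypothetical helper: delete children first, then the directory itself.
    // A metastore-style "deleteDir" that skips this step (or hits files it
    // cannot remove) sees delete() return false and reports the error.
    static boolean deleteRecursive(File f) {
        File[] children = f.listFiles();
        if (children != null) {
            for (File c : children) {
                if (!deleteRecursive(c)) {
                    return false; // a child could not be removed
                }
            }
        }
        return f.delete();
    }

    public static void main(String[] args) throws IOException {
        // Build a directory containing one file, mimicking a warehouse
        // table dir (e.g. .../warehouse/src/kv1.txt).
        File dir = File.createTempFile("warehouse", null);
        dir.delete();
        dir.mkdir();
        new File(dir, "kv1.txt").createNewFile();

        // Plain delete() fails while the directory is non-empty...
        System.out.println(dir.delete());          // false
        // ...whereas the recursive variant succeeds.
        System.out.println(deleteRecursive(dir));  // true
    }
}
```

    [junit] (In the test run above the cleanup failure then cascades: groupby1.q aborts in QTestUtil.cleanUp before the query ever executes.)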
    [junit] Begin query: join18.q
    [junit] plan = /tmp/plan1459.xml
    [junit] 08/12/30 17:19:10 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:19:10 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1
    [junit] 08/12/30 17:19:10 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:10 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:10 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:10 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:10 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:10 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:10 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:10 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:11 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:11 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:11 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:11 INFO exec.MapOperator: Adding alias b:src2 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1/kv3.txt
    [junit] 08/12/30 17:19:11 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:11 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:11 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:11 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:11 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:19:11 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:19:11 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:19:11 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:11 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:11 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:11 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:11 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:11 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src1/kv3.txt:0+216
    [junit] 08/12/30 17:19:11 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:11 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:19:11 INFO thrift.TBinarySortableProtocol: Sort order is "++"
    [junit] 08/12/30 17:19:11 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:19:11 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 543 bytes
    [junit] 08/12/30 17:19:11 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:19:11 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:19:11 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:11 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:19:11 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:11 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:11 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:11 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1243404375
    [junit] 08/12/30 17:19:11 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:19:11 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:11 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:11 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1460.xml
    [junit] 08/12/30 17:19:13 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:19:13 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:19:13 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:13 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:13 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:13 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:13 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:13 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:13 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:13 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:13 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:13 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:13 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:13 INFO exec.MapOperator: Adding alias a:src1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:19:13 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:13 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:13 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:13 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:13 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:19:13 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:13 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:13 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:14 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:14 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:14 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:14 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:14 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:19:14 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:14 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:14 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:14 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:19:14 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 14814 bytes
    [junit] 08/12/30 17:19:14 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:19:14 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:19:14 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:14 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:19:14 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:14 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:14 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:14 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-722448119
    [junit] 08/12/30 17:19:14 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:19:14 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:19:14 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:14 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1461.xml
    [junit] 08/12/30 17:19:15 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:19:16 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10002
    [junit] 08/12/30 17:19:16 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:16 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:16 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:16 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:16 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:16 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:16 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:16 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:16 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:16 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:16 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:16 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10002/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:19:16 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:16 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:16 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:19:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:16 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:16 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:16 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:16 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:16 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10002/attempt_local_0001_r_000000_0:0+699
    [junit] 08/12/30 17:19:16 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:16 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:16 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:19:16 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 429 bytes
    [junit] 08/12/30 17:19:16 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:19:16 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:19:16 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:16 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:16 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:16 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:16 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:19:16 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:16 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:16 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:17 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-609596459
    [junit] 08/12/30 17:19:17 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:19:17 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:19:17 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:17 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1462.xml
    [junit] 08/12/30 17:19:18 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:19:18 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10004
    [junit] 08/12/30 17:19:18 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:18 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:19 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:19 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:19 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:19 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:19 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:19 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:19 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:19 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:19 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:19 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10004 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10004/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:19:19 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:19 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:19 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:19:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:19 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:19 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:19 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:19 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:19 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10004/attempt_local_0001_r_000000_0:0+11875
    [junit] 08/12/30 17:19:19 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:19 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:19 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:19:19 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 8282 bytes
    [junit] 08/12/30 17:19:19 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:19:19 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:19:19 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:19 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:19 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:19 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:19 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:19:19 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:19 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:19 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:19 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-363375571
    [junit] 08/12/30 17:19:19 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:19:19 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:19:20 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:20 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1463.xml
    [junit] 08/12/30 17:19:21 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:19:22 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10003
    [junit] 08/12/30 17:19:22 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10005
    [junit] 08/12/30 17:19:22 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:22 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:22 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:22 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:22 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:22 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:22 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:22 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:22 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:22 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:22 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:22 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:22 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:22 INFO exec.MapOperator: Adding alias $INTNAME to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10003/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:19:22 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:22 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:22 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:19:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:22 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:22 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:22 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:22 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:22 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:22 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10003/attempt_local_0001_r_000000_0:0+699
    [junit] 08/12/30 17:19:22 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:22 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:22 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:22 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:22 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:22 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:22 INFO exec.MapOperator: Adding alias $INTNAME1 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10005/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:19:22 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:22 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:22 INFO exec.ReduceSinkOperator: Initializing children:
    [junit] 08/12/30 17:19:22 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:19:22 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:19:22 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:22 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:22 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:23 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:23 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 17:19:23 INFO exec.ReduceSinkOperator: Initialization Done
    [junit] 08/12/30 17:19:23 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:19:23 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:23 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:23 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:23 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:23 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:23 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:23 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/6233267/118583309.10005/attempt_local_0001_r_000000_0:0+11875
    [junit] 08/12/30 17:19:23 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000001_0' done.
    [junit] 08/12/30 17:19:23 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:23 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:23 INFO mapred.Merger: Merging 2 sorted segments
    [junit] 08/12/30 17:19:23 INFO mapred.Merger: Down to the last merge-pass, with 2 segments left of total size: 12218 bytes
    [junit] 08/12/30 17:19:23 INFO exec.JoinOperator: Initializing Self
    [junit] 08/12/30 17:19:23 INFO exec.JoinOperator: Initializing children:
    [junit] 08/12/30 17:19:23 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:23 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:23 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:23 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:23 INFO exec.JoinOperator: Initialization Done
    [junit] 08/12/30 17:19:23 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:23 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:23 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:23 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1148258804
    [junit] 08/12/30 17:19:23 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:19:23 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:19:23 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:23 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join18.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/join18.q.out
    [junit] Done query: join18.q
    [junit] Begin query: input_part4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input_part4(TestCliDriver.java:3128)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input19.q
    [junit] Exception: Client Execution failed with error code = 9
    [junit] junit.framework.AssertionFailedError: Client Execution failed with error code = 9
    [junit] 	at junit.framework.Assert.fail(Assert.java:47)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_input19(TestCliDriver.java:3156)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: groupby3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby3(TestCliDriver.java:3178)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: subq.q
    [junit] plan = /tmp/plan1464.xml
    [junit] 08/12/30 17:19:28 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:19:28 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:19:28 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:28 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:28 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:28 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:28 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:28 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:28 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:19:29 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:29 INFO exec.MapOperator: Adding alias unioninput:src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:19:29 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:29 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:29 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:29 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:19:29 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:19:29 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:29 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:29 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:29 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:29 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:29 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:29 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:29 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:19:29 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:29 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:29 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:19:29 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:19:29 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:29 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:29 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:29 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1064677399
    [junit] 08/12/30 17:19:29 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:19:29 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:19:29 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:19:29 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/subq.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/subq.q.out
    [junit] Done query: subq.q
    [junit] Begin query: union2.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_union2(TestCliDriver.java:3228)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] Begin query: input_part6.q
    [junit] plan = /tmp/plan1465.xml
    [junit] 08/12/30 17:19:33 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:19:33 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11
    [junit] 08/12/30 17:19:33 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12
    [junit] 08/12/30 17:19:33 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11
    [junit] 08/12/30 17:19:33 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12
    [junit] 08/12/30 17:19:33 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:33 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:33 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:33 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:33 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:33 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:19:33 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt
    [junit] 08/12/30 17:19:33 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 17:19:33 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:33 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:19:33 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:33 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:19:33 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:19:33 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:33 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:19:33 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:33 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:33 INFO exec.FilterOperator: FILTERED:500
    [junit] 08/12/30 17:19:33 INFO exec.FilterOperator: PASSED:0
    [junit] 08/12/30 17:19:33 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:33 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:33 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:33 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp151738983
    [junit] 08/12/30 17:19:33 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=11/kv1.txt:0+5812
    [junit] 08/12/30 17:19:33 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:33 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:19:33 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt
    [junit] 08/12/30 17:19:33 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 17:19:33 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:33 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:19:33 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:33 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:19:33 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:33 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:19:33 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:33 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:19:33 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:33 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:33 INFO exec.FilterOperator: FILTERED:1000
    [junit] 08/12/30 17:19:33 INFO exec.FilterOperator: PASSED:0
    [junit] 08/12/30 17:19:33 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:33 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:33 INFO mapred.TaskRunner: Task attempt_local_0001_m_000001_0 is allowed to commit now
    [junit] 08/12/30 17:19:33 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000001_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp151738983
    [junit] 08/12/30 17:19:33 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-08/hr=12/kv1.txt:0+5812
    [junit] 08/12/30 17:19:33 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000001_0' done.
    [junit] 08/12/30 17:19:34 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:19:34 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11/kv1.txt
    [junit] 08/12/30 17:19:34 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 17:19:34 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:34 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:19:34 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:34 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:19:34 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:19:34 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:34 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:19:34 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:34 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:34 INFO exec.FilterOperator: FILTERED:1500
    [junit] 08/12/30 17:19:34 INFO exec.FilterOperator: PASSED:0
    [junit] 08/12/30 17:19:34 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000002_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:34 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:34 INFO mapred.TaskRunner: Task attempt_local_0001_m_000002_0 is allowed to commit now
    [junit] 08/12/30 17:19:34 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000002_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp151738983
    [junit] 08/12/30 17:19:34 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=11/kv1.txt:0+5812
    [junit] 08/12/30 17:19:34 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000002_0' done.
    [junit] 08/12/30 17:19:34 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:19:34 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12/kv1.txt
    [junit] 08/12/30 17:19:34 INFO exec.MapOperator: Got partitions: ds/hr
    [junit] 08/12/30 17:19:34 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:34 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:19:34 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:34 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:19:34 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:34 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:19:34 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:34 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:19:34 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:34 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:34 INFO exec.FilterOperator: FILTERED:2000
    [junit] 08/12/30 17:19:34 INFO exec.FilterOperator: PASSED:0
    [junit] 08/12/30 17:19:34 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000003_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:34 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:34 INFO mapred.TaskRunner: Task attempt_local_0001_m_000003_0 is allowed to commit now
    [junit] 08/12/30 17:19:34 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000003_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp151738983
    [junit] 08/12/30 17:19:34 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart/ds=2008-04-09/hr=12/kv1.txt:0+5812
    [junit] 08/12/30 17:19:34 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000003_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] 08/12/30 17:19:34 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:19:34 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_part6.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/input_part6.q.out
    [junit] Done query: input_part6.q
    [junit] Begin query: groupby5.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_groupby5(TestCliDriver.java:3278)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: groupby7.q
    [junit] plan = /tmp/plan1466.xml
    [junit] 08/12/30 17:19:37 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:19:37 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:19:37 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:37 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:37 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:38 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:38 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:38 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:38 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:38 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:38 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:38 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:38 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:38 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:19:38 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:38 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:38 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:38 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:38 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:19:38 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:38 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:38 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:38 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 17:19:38 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 17:19:38 INFO compress.CodecPool: Got brand-new compressor
    [junit] 08/12/30 17:19:38 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:38 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:38 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:38 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:38 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:38 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:19:38 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:38 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:38 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:38 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:19:38 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 12814 bytes
    [junit] 08/12/30 17:19:38 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:19:38 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:19:38 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:38 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:19:38 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:38 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:38 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:38 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1261262043
    [junit] 08/12/30 17:19:38 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:19:38 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:19:39 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:39 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1467.xml
    [junit] 08/12/30 17:19:40 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:19:40 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10002
    [junit] 08/12/30 17:19:40 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:40 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:40 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:40 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:41 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:41 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:41 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:41 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:41 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:41 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:41 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 17:19:41 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 17:19:41 INFO compress.CodecPool: Got brand-new decompressor
    [junit] 08/12/30 17:19:41 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:41 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10002 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10002/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:19:41 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:41 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:41 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:19:41 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:41 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:41 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:41 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:41 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:41 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:41 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10002/attempt_local_0001_r_000000_0:0+13424
    [junit] 08/12/30 17:19:41 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:41 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:41 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:41 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:19:41 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 8282 bytes
    [junit] 08/12/30 17:19:41 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:19:41 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:19:41 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:41 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:41 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:41 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:41 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:41 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:41 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:41 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:19:41 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:41 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:41 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:41 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp612699998
    [junit] 08/12/30 17:19:41 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:19:41 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:19:42 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:42 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1468.xml
    [junit] 08/12/30 17:19:43 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:19:43 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10003
    [junit] 08/12/30 17:19:43 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:43 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:43 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:43 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:43 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:43 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:43 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:43 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:44 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:44 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:44 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 17:19:44 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 17:19:44 INFO compress.CodecPool: Got brand-new decompressor
    [junit] 08/12/30 17:19:44 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:44 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10003 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10003/attempt_local_0001_m_000000_0
    [junit] 08/12/30 17:19:44 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:44 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:44 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:19:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:44 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:44 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:44 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:44 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:44 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10003/attempt_local_0001_m_000000_0:0+23755
    [junit] 08/12/30 17:19:44 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:44 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:44 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:19:44 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 12814 bytes
    [junit] 08/12/30 17:19:44 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:19:44 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:19:44 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:44 INFO compress.CodecPool: Got brand-new compressor
    [junit] 08/12/30 17:19:44 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:19:44 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:44 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:44 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:44 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1761005757
    [junit] 08/12/30 17:19:44 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:19:44 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:19:44 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:44 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1469.xml
    [junit] 08/12/30 17:19:46 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:19:46 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10004
    [junit] 08/12/30 17:19:46 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:46 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:46 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:46 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:46 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:46 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:46 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:46 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:47 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:47 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:47 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 17:19:47 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 17:19:47 INFO compress.CodecPool: Got brand-new decompressor
    [junit] 08/12/30 17:19:47 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:47 INFO exec.MapOperator: Adding alias /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10004 to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10004/attempt_local_0001_r_000000_0
    [junit] 08/12/30 17:19:47 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:47 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:47 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:19:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:47 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:47 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:47 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:47 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:47 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/415838160/144769402.10004/attempt_local_0001_r_000000_0:0+13424
    [junit] 08/12/30 17:19:47 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:47 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:47 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:19:47 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 8282 bytes
    [junit] 08/12/30 17:19:47 INFO exec.GroupByOperator: Initializing Self
    [junit] 08/12/30 17:19:47 INFO exec.GroupByOperator: Initializing children:
    [junit] 08/12/30 17:19:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:47 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:47 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:47 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:47 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:47 INFO exec.GroupByOperator: Initialization Done
    [junit] 08/12/30 17:19:47 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:47 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:47 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:47 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp2125453385
    [junit] 08/12/30 17:19:47 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:19:47 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:19:47 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:47 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby7.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/groupby7.q.out
    [junit] Done query: groupby7.q
    [junit] Begin query: udf_testlength.q
    [junit] plan = /tmp/plan1470.xml
    [junit] 08/12/30 17:19:51 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:19:51 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:19:51 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:51 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:51 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:51 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:51 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:51 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:51 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:19:51 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:51 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:19:51 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:51 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:51 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:51 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:51 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:51 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:51 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:51 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:51 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:51 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:51 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:51 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:51 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:51 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:51 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:51 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-2020189018
    [junit] 08/12/30 17:19:51 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:19:51 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:19:52 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:19:52 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/udf_testlength.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/udf_testlength.q.out
    [junit] Done query: udf_testlength.q
    [junit] Begin query: sort.q
    [junit] plan = /tmp/plan1471.xml
    [junit] 08/12/30 17:19:55 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 08/12/30 17:19:55 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:19:55 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:19:55 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:19:55 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:19:56 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:19:56 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:56 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:19:56 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:19:56 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:19:56 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:19:56 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:19:56 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:19:56 INFO exec.MapOperator: Adding alias x to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:19:56 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:19:56 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:19:56 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:19:56 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:19:56 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:19:56 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:56 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:19:56 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:56 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:56 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:19:56 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:19:56 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:19:56 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:19:56 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:19:56 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:56 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:19:56 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:19:56 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:56 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:19:56 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:19:56 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 19720 bytes
    [junit] 08/12/30 17:19:56 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:19:56 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:19:56 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:19:56 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:19:56 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:19:56 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:19:56 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:19:56 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp722335852
    [junit] 08/12/30 17:19:56 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:19:56 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:19:57 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:19:57 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sort.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/sort.q.out
    [junit] Done query: sort.q
    [junit] Begin query: cluster.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_cluster(TestCliDriver.java:3378)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: fileformat_text.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_fileformat_text(TestCliDriver.java:3403)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: fileformat_void.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/fileformat_void.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/fileformat_void.q.out
    [junit] Done query: fileformat_void.q
    [junit] Begin query: fileformat_sequencefile.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestCliDriver.testCliDriver_fileformat_sequencefile(TestCliDriver.java:3453)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] Begin query: udf_round.q
    [junit] plan = /tmp/plan1472.xml
    [junit] 08/12/30 17:20:02 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:20:02 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:20:02 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:20:02 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:20:02 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:20:02 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:03 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:03 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:03 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:20:03 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:20:03 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:20:03 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:20:03 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:20:03 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:20:03 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:20:03 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:20:03 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:20:03 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:20:03 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:20:03 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:20:03 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:20:03 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:20:03 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:20:03 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:20:03 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:20:03 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:20:03 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:20:03 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:20:03 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:20:03 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp874382187
    [junit] 08/12/30 17:20:03 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:20:03 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:20:04 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:20:04 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1473.xml
    [junit] 08/12/30 17:20:05 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:20:05 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:20:05 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:20:05 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:20:05 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:20:06 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:06 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:06 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:06 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:20:06 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:20:06 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:20:06 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:20:06 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:20:06 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:20:06 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:20:06 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:20:06 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:20:06 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:20:06 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:20:06 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:20:06 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:20:06 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:20:06 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:20:06 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:20:06 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:20:06 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:20:06 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:20:06 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:20:06 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:20:06 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-987170896
    [junit] 08/12/30 17:20:06 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:20:06 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:20:07 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:20:07 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1474.xml
    [junit] 08/12/30 17:20:08 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:20:09 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:20:09 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:20:09 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:20:09 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:20:09 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:09 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:09 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:09 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:20:09 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:20:09 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:20:09 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:20:09 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:20:09 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:20:09 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:20:09 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:20:09 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:20:09 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:20:09 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:20:09 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:20:09 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:20:09 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:20:09 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:20:09 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:20:09 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:20:09 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:20:09 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:20:09 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:20:09 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:20:09 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1069450885
    [junit] 08/12/30 17:20:09 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:20:09 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:20:10 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:20:10 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] plan = /tmp/plan1475.xml
    [junit] 08/12/30 17:20:12 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:20:12 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:20:12 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:20:12 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:20:12 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:20:12 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:12 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:12 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:12 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:20:12 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:20:12 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:20:12 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:20:12 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:20:12 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:20:12 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:20:12 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:20:12 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:20:12 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:20:12 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:20:12 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:20:12 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:20:12 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:20:12 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:20:12 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:20:12 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:20:12 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:20:12 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:20:12 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:20:12 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:20:12 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1772640410
    [junit] 08/12/30 17:20:12 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:20:12 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:20:13 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:20:13 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] plan = /tmp/plan1476.xml
    [junit] 08/12/30 17:20:14 WARN exec.ExecDriver: Number of reduce tasks not specified. Defaulting to 0 since there's no reduce operator
    [junit] 08/12/30 17:20:15 INFO exec.ExecDriver: Adding input file file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:20:15 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:20:15 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:20:15 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:20:15 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:15 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:15 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:15 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:20:15 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:20:15 INFO exec.MapOperator: Adding alias src to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:20:15 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:20:15 INFO exec.TableScanOperator: Initializing Self
    [junit] 08/12/30 17:20:15 INFO exec.TableScanOperator: Initializing children:
    [junit] 08/12/30 17:20:15 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:20:15 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:20:15 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:20:15 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:20:15 INFO exec.LimitOperator: Initializing Self
    [junit] 08/12/30 17:20:15 INFO exec.LimitOperator: Initializing children:
    [junit] 08/12/30 17:20:15 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:20:15 INFO exec.LimitOperator: Initialization Done
    [junit] 08/12/30 17:20:15 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:20:15 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:20:15 INFO exec.TableScanOperator: Initialization Done
    [junit] 08/12/30 17:20:15 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:20:15 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:20:15 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:20:15 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:20:15 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-550260335
    [junit] 08/12/30 17:20:15 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:20:15 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:20:16 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:20:16 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/udf_round.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientpositive/udf_round.q.out
    [junit] Done query: udf_round.q
    [junit] Tests run: 128, Failures: 54, Errors: 0, Time elapsed: 473.043 sec
    [junit] Test org.apache.hadoop.hive.cli.TestCliDriver FAILED
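Editor's note: all 54 failures above share one root cause. Each test's `QTestUtil.cleanUp` drops the `src`/`srcpart`/`srcbucket` tables, and `Warehouse.deleteDir` fails to remove the corresponding warehouse directory, so the metastore wraps the failure in `MetaException(message:Unable to delete directory: ...)` and every subsequent query that needs a clean warehouse fails the same way. The sketch below is a hypothetical stand-in for that code path (not Hive's actual implementation): like Hadoop's `FileSystem.delete`, a recursive delete reports failure through a boolean, and one undeletable entry (stale file handle, permissions, a leftover temp file) fails the whole drop and gets surfaced as an exception.

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

// Hypothetical illustration of the failure mode seen in this log.
// Names (deleteRecursively, DeleteDirSketch) are invented for the sketch;
// only the shape -- boolean-returning delete escalated to an exception --
// mirrors the Warehouse.deleteDir / HMSHandler.drop_table frames above.
public class DeleteDirSketch {

    // Recursive delete in the style of FileSystem.delete(path, true):
    // returns false instead of throwing when any entry cannot be removed.
    static boolean deleteRecursively(File dir) {
        File[] children = dir.listFiles();
        if (children != null) {
            for (File child : children) {
                if (!deleteRecursively(child)) {
                    return false; // one stuck entry fails the whole drop
                }
            }
        }
        return dir.delete();
    }

    public static void main(String[] args) throws IOException {
        // Simulate a warehouse table directory containing one data file.
        File root = Files.createTempDirectory("warehouse-src").toFile();
        new File(root, "kv1.txt").createNewFile();

        if (!deleteRecursively(root)) {
            // The point where the metastore would raise
            // MetaException(message:Unable to delete directory: ...)
            throw new RuntimeException("Unable to delete directory: " + root);
        }
        System.out.println("dropped " + root.getName());
    }
}
```

Because the first failed delete leaves the directory behind, every later test that calls `cleanUp` on the same table hits the identical exception, which is why the trace repeats verbatim for `cluster.q`, `fileformat_text.q`, `notable_alias3.q`, and so on. For diagnosis, the Hive developer documentation of this period describes rerunning a single query test in isolation, e.g. `ant test -Dtestcase=TestCliDriver -Dqfile=cluster.q`, once the stale warehouse directory under `build/ql/test/data/warehouse` has been removed by hand.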
    [junit] Running org.apache.hadoop.hive.cli.TestNegativeCliDriver
    [junit] Begin query: input1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientnegative/input1.q.out
    [junit] Done query: input1.q
    [junit] Begin query: notable_alias3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias3(TestNegativeCliDriver.java:117)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: notable_alias4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_notable_alias4(TestNegativeCliDriver.java:142)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: input2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_input2(TestNegativeCliDriver.java:167)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: bad_sample_clause.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/bad_sample_clause.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientnegative/bad_sample_clause.q.out
    [junit] Done query: bad_sample_clause.q
    [junit] Begin query: input_testxpath4.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_testxpath4.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientnegative/input_testxpath4.q.out
    [junit] Done query: input_testxpath4.q
    [junit] Begin query: invalid_tbl_name.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_invalid_tbl_name(TestNegativeCliDriver.java:242)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: union.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_union(TestNegativeCliDriver.java:267)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: joinneg.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/joinneg.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientnegative/joinneg.q.out
    [junit] Done query: joinneg.q
    [junit] Begin query: invalid_create_tbl1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_invalid_create_tbl1(TestNegativeCliDriver.java:317)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: invalid_create_tbl2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_invalid_create_tbl2(TestNegativeCliDriver.java:342)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: subq_insert.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_subq_insert(TestNegativeCliDriver.java:367)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: load_wrong_fileformat.q
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-09',hr='12') failed with exit code= 9
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_load_wrong_fileformat(TestNegativeCliDriver.java:392)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: describe_xpath1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_describe_xpath1(TestNegativeCliDriver.java:417)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: clusterbydistributeby.q
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='12') failed with exit code= 9
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_clusterbydistributeby(TestNegativeCliDriver.java:442)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: describe_xpath2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_describe_xpath2(TestNegativeCliDriver.java:467)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: describe_xpath3.q
    [junit] Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
    [junit] java.lang.Exception: load command: LOAD DATA INPATH '/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp/kv1.txt' INTO TABLE srcpart PARTITION (ds='2008-04-08',hr='11') failed with exit code= 9
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:250)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:362)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_describe_xpath3(TestNegativeCliDriver.java:492)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Begin query: describe_xpath4.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/describe_xpath4.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientnegative/describe_xpath4.q.out
    [junit] Done query: describe_xpath4.q
    [junit] Begin query: strict_pruning.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_strict_pruning(TestNegativeCliDriver.java:542)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] Begin query: clusterbysortby.q
    [junit] diff -I \(file:\)\|\(/tmp/.*\) /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/clusterbysortby.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/clientnegative/clusterbysortby.q.out
    [junit] Done query: clusterbysortby.q
    [junit] Begin query: fileformat_bad_class.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_bad_class(TestNegativeCliDriver.java:592)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: fileformat_void_input.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_input(TestNegativeCliDriver.java:617)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Begin query: fileformat_void_output.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:361)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cliInit(QTestUtil.java:356)
    [junit] 	at org.apache.hadoop.hive.cli.TestNegativeCliDriver.testNegativeCliDriver_fileformat_void_output(TestNegativeCliDriver.java:642)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 21 more
    [junit] Tests run: 23, Failures: 17, Errors: 0, Time elapsed: 24.332 sec
    [junit] Test org.apache.hadoop.hive.cli.TestNegativeCliDriver FAILED
    [junit] Running org.apache.hadoop.hive.ql.exec.TestExecDriver
    [junit] Beginning testMapPlan1
    [junit] Generating plan file /tmp/plan60893.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore/hadoop-0.19.0/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60893.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fhadoopcore%2Fhadoop-0.19.0%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F251589052 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-883432777
    [junit] plan = /tmp/plan60893.xml
    [junit] 08/12/30 17:20:48 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 0
    [junit] 08/12/30 17:20:48 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:20:48 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:20:48 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:20:48 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:20:48 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:49 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:49 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:49 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:20:49 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:20:49 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:20:49 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:20:49 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:20:49 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:20:49 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:20:49 INFO util.NativeCodeLoader: Loaded the native-hadoop library
    [junit] 08/12/30 17:20:49 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
    [junit] 08/12/30 17:20:49 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:20:49 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:20:49 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:20:49 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:20:49 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:20:49 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:20:49 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:20:49 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-68193666
    [junit] 08/12/30 17:20:49 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:20:49 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:20:50 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:20:50 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] testMapPlan1 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../data/files
    [junit] Beginning testMapPlan2
    [junit] Generating plan file /tmp/plan60894.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore/hadoop-0.19.0/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60894.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fhadoopcore%2Fhadoop-0.19.0%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F1601570458 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F1713136076
    [junit] plan = /tmp/plan60894.xml
    [junit] 08/12/30 17:20:51 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 0
    [junit] 08/12/30 17:20:51 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:20:51 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:20:51 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:20:51 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:20:51 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:51 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:51 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:51 INFO mapred.MapTask: numReduceTasks: 0
    [junit] 08/12/30 17:20:51 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:20:51 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:20:51 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:20:51 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:20:51 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:20:51 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:20:51 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:20:51 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:20:51 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:20:52 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 17:20:52 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 17:20:52 INFO exec.ScriptOperator: partname=null
    [junit] 08/12/30 17:20:52 INFO exec.ScriptOperator: alias=a
    [junit] 08/12/30 17:20:52 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:20:52 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:20:52 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:20:52 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:20:52 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:20:52 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:20:52 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:20:52 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:20:52 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:20:52 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:20:52 INFO mapred.TaskRunner: Task attempt_local_0001_m_000000_0 is allowed to commit now
    [junit] 08/12/30 17:20:52 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_m_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp880095420
    [junit] 08/12/30 17:20:52 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:20:52 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit]  map = 100%,  reduce =0%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:20:52 INFO exec.ExecDriver:  map = 100%,  reduce =0%
    [junit] 08/12/30 17:20:52 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] testMapPlan2 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../data/files
    [junit] Beginning testMapRedPlan1
    [junit] Generating plan file /tmp/plan60895.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore/hadoop-0.19.0/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60895.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fhadoopcore%2Fhadoop-0.19.0%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F29053934 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F70362449
    [junit] plan = /tmp/plan60895.xml
    [junit] 08/12/30 17:20:54 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 17:20:54 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:20:54 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:20:54 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:20:54 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:20:54 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:54 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:54 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:54 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:20:54 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:20:54 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:20:54 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:20:54 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:20:54 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:20:54 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:20:54 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:20:54 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:20:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:20:54 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:20:54 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:20:54 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:20:55 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:20:55 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:20:55 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:20:55 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:20:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:20:55 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:20:55 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:20:55 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 14814 bytes
    [junit] 08/12/30 17:20:55 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:20:55 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:20:55 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:20:55 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:20:55 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:20:55 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:20:55 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:20:55 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-878819521
    [junit] 08/12/30 17:20:55 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:20:55 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:20:55 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:20:55 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] testMapRedPlan1 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../data/files
    [junit] Beginning testMapPlan2
    [junit] Generating plan file /tmp/plan60896.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore/hadoop-0.19.0/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60896.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fhadoopcore%2Fhadoop-0.19.0%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F-93693462 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-1275054956
    [junit] plan = /tmp/plan60896.xml
    [junit] 08/12/30 17:20:56 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 17:20:56 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:20:56 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:20:56 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:20:57 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:20:57 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:20:57 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:57 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:20:57 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:20:57 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:20:57 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:20:57 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:20:57 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:20:57 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:20:57 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:20:57 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:20:57 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:20:57 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:20:57 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:20:57 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:20:57 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:20:57 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:20:57 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:20:57 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:20:57 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:20:57 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:20:57 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:20:57 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:20:57 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 19720 bytes
    [junit] 08/12/30 17:20:57 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:20:57 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:20:57 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:20:57 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:20:57 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:20:57 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:20:57 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:20:57 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:20:57 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:20:57 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:20:57 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:20:57 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:20:57 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-281317359
    [junit] 08/12/30 17:20:57 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:20:57 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:20:58 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:20:58 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] testMapRedPlan2 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../data/files
    [junit] Beginning testMapPlan3
    [junit] Generating plan file /tmp/plan60897.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore/hadoop-0.19.0/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60897.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fhadoopcore%2Fhadoop-0.19.0%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F-1869528422 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-1760590139
    [junit] plan = /tmp/plan60897.xml
    [junit] 08/12/30 17:20:59 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 5
    [junit] 08/12/30 17:20:59 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:20:59 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src2
    [junit] 08/12/30 17:20:59 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:20:59 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:20:59 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:21:00 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:21:00 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:21:00 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:21:00 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:21:00 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:21:00 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:21:00 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:21:00 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:21:00 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:21:00 INFO exec.ReduceSinkOperator: Using tag = 0
    [junit] 08/12/30 17:21:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:00 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:21:00 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:21:00 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:21:00 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:21:00 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:21:00 INFO exec.MapOperator: Adding alias b to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src2/kv2.txt
    [junit] 08/12/30 17:21:00 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:21:00 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:21:00 INFO exec.ReduceSinkOperator: Using tag = 1
    [junit] 08/12/30 17:21:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:00 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:21:00 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:21:00 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000001_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:21:00 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src2/kv2.txt:0+5791
    [junit] 08/12/30 17:21:00 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000001_0' done.
    [junit] 08/12/30 17:21:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:00 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:00 INFO mapred.Merger: Merging 2 sorted segments
    [junit] 08/12/30 17:21:00 INFO mapred.Merger: Down to the last merge-pass, with 2 segments left of total size: 28606 bytes
    [junit] 08/12/30 17:21:00 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:21:00 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:21:00 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:21:00 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:21:01 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:21:01 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:21:01 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:21:01 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp-1168763096
    [junit] 08/12/30 17:21:01 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:21:01 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] 08/12/30 17:21:01 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:21:01 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit] testMapRedPlan3 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../data/files
    [junit] Beginning testMapPlan4
    [junit] Generating plan file /tmp/plan60898.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore/hadoop-0.19.0/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60898.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fhadoopcore%2Fhadoop-0.19.0%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F296747340 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F-270019237
    [junit] plan = /tmp/plan60898.xml
    [junit] 08/12/30 17:21:02 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 17:21:02 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:21:02 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:21:02 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:21:02 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:21:02 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:21:03 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:21:03 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:21:03 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:21:03 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:21:03 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:21:03 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:21:03 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:21:03 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:21:03 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:21:03 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:21:03 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:21:03 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:21:03 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:21:03 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:21:03 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:21:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:03 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:21:03 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 17:21:03 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 17:21:03 INFO exec.ScriptOperator: partname=null
    [junit] 08/12/30 17:21:03 INFO exec.ScriptOperator: alias=a
    [junit] 08/12/30 17:21:03 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:21:03 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:21:03 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:21:03 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:21:03 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:21:03 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:21:03 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:21:03 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:21:03 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:21:03 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:21:03 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:21:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:03 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:03 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:21:03 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 19720 bytes
    [junit] 08/12/30 17:21:03 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:21:03 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:21:03 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:21:03 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:21:03 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:21:03 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:21:03 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:21:03 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1762985149
    [junit] 08/12/30 17:21:03 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:21:03 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:21:04 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:21:04 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] testMapRedPlan4 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../data/files
    [junit] Beginning testMapPlan5
    [junit] Generating plan file /tmp/plan60899.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore/hadoop-0.19.0/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60899.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fhadoopcore%2Fhadoop-0.19.0%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F1487018976 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F788870745
    [junit] plan = /tmp/plan60899.xml
    [junit] 08/12/30 17:21:05 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 17:21:05 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:21:05 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:21:05 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:21:05 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:21:05 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:21:05 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:21:05 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:21:05 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:21:05 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:21:05 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:21:05 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:21:05 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:21:05 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:21:06 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:21:06 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:21:06 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:21:06 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:21:06 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:21:06 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:06 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:06 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:21:06 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:21:06 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:21:06 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:21:06 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:21:06 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:21:06 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:21:06 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:06 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:06 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:21:06 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 19720 bytes
    [junit] 08/12/30 17:21:06 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:21:06 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:21:06 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:21:06 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:21:06 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:21:06 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:21:06 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:21:06 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp809007142
    [junit] 08/12/30 17:21:06 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:21:06 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:21:06 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:21:06 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] testMapRedPlan5 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../data/files
    [junit] Beginning testMapPlan6
    [junit] Generating plan file /tmp/plan60900.xml
    [junit] Executing: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hadoopcore/hadoop-0.19.0/bin/hadoop jar /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/hive_exec.jar org.apache.hadoop.hive.ql.exec.ExecDriver -plan /tmp/plan60900.xml -jobconf fs.scheme.class=dfs -jobconf hive.exec.scratchdir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftmp -jobconf test.query.file1=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest%2Forg%2Fapache%2Fhadoop%2Fhive%2Fql%2Finput2.q -jobconf hive.metastore.connect.retries=5 -jobconf hive.metastore.rawstore.impl=org.apache.hadoop.hive.metastore.ObjectStore -jobconf hive.metastore.metadb.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fmetadb%2F -jobconf javax.jdo.option.ConnectionPassword=mine -jobconf hive.metastore.uris=file%3A%2F%2F%2Fvar%2Fmetastore%2Fmetadb%2F -jobconf hive.metastore.warehouse.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Fdata%2Fwarehouse%2F -jobconf hive.aux.jars.path=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fql%2Ftest%2Ftest-udfs.jar%2C%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fdata%2Ffiles%2FTestSerDe.jar -jobconf hive.metastore.local=true -jobconf test.data.files=%24%7Buser.dir%7D%2F..%2Fdata%2Ffiles -jobconf javax.jdo.option.ConnectionURL=jdbc%3Aderby%3A%3BdatabaseName%3D..%2Fbuild%2Ftest%2Fjunit_metastore_db%3Bcreate%3Dtrue -jobconf javax.jdo.option.ConnectionDriverName=org.apache.derby.jdbc.EmbeddedDriver -jobconf hive.exec.script.maxerrsize=100000 -jobconf javax.jdo.option.ConnectionUserName=APP -jobconf hive.jar.path=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Fhive_exec.jar -jobconf hadoop.config.dir=%2Fusr%2Ffbtools%2Fcontinuous_builds%2Fhiveopensource-trunk%2Fhiveopensource_trunk%2Fbuild%2Fhadoopcore%2Fhadoop-0.19.0%2Fconf -jobconf hive.join.emit.interval=1000 -jobconf hive.exec.compress.output=false -jobconf test.src.dir=file%3A%2F%2F%24%7Buser.dir%7D%2F..%2Fql%2Fsrc%2Ftest -jobconf test.log.dir=%24%7Buser.dir%7D%2F..%2Fbuild%2Fql%2Ftest%2Flogs -jobconf hive.map.aggr=false -jobconf hive.exec.compress.intermediate=false -jobconf hive.default.fileformat=TextFile -jobconf mapred.system.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Fsystem%2F-833918898 -jobconf mapred.local.dir=%2Ftmp%2Fhadoop-mvaradachari%2Fmapred%2Flocal%2F370167599
    [junit] plan = /tmp/plan60900.xml
    [junit] 08/12/30 17:21:07 INFO exec.ExecDriver: Number of reduce tasks determined at compile : 1
    [junit] 08/12/30 17:21:08 INFO exec.ExecDriver: Adding input file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src
    [junit] 08/12/30 17:21:08 INFO exec.ExecDriver: adding libjars: /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/test-udfs.jar,/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/data/files/TestSerDe.jar
    [junit] 08/12/30 17:21:08 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] 08/12/30 17:21:08 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
    [junit] 08/12/30 17:21:08 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:21:08 INFO exec.ExecDriver: Job running in-process (local Hadoop)
    [junit] Job running in-process (local Hadoop)
    [junit] 08/12/30 17:21:08 INFO mapred.FileInputFormat: Total input paths to process : 1
    [junit] 08/12/30 17:21:08 INFO mapred.MapTask: numReduceTasks: 1
    [junit] 08/12/30 17:21:08 INFO mapred.MapTask: io.sort.mb = 100
    [junit] 08/12/30 17:21:08 INFO mapred.MapTask: data buffer = 79691776/99614720
    [junit] 08/12/30 17:21:08 INFO mapred.MapTask: record buffer = 262144/327680
    [junit] 08/12/30 17:21:08 INFO exec.MapOperator: Initializing Self
    [junit] 08/12/30 17:21:08 INFO exec.MapOperator: Adding alias a to work list for file /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt
    [junit] 08/12/30 17:21:08 INFO exec.MapOperator: Got partitions: null
    [junit] 08/12/30 17:21:08 INFO exec.SelectOperator: Initializing Self
    [junit] 08/12/30 17:21:08 INFO exec.SelectOperator: Initializing children:
    [junit] 08/12/30 17:21:08 INFO exec.ScriptOperator: Initializing Self
    [junit] 08/12/30 17:21:08 INFO exec.ScriptOperator: Initializing children:
    [junit] 08/12/30 17:21:08 INFO exec.ReduceSinkOperator: Initializing Self
    [junit] 08/12/30 17:21:08 INFO exec.ReduceSinkOperator: Using tag = -1
    [junit] 08/12/30 17:21:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:08 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:08 INFO exec.ScriptOperator: Initialization Done
    [junit] 08/12/30 17:21:08 INFO exec.ScriptOperator: Executing [/bin/cat]
    [junit] 08/12/30 17:21:08 INFO exec.ScriptOperator: tablename=src
    [junit] 08/12/30 17:21:08 INFO exec.ScriptOperator: partname=null
    [junit] 08/12/30 17:21:08 INFO exec.ScriptOperator: alias=a
    [junit] 08/12/30 17:21:08 INFO exec.SelectOperator: Initialization Done
    [junit] 08/12/30 17:21:08 INFO exec.MapOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:21:08 INFO exec.ScriptOperator: StreamThread ErrorProcessor done
    [junit] 08/12/30 17:21:08 INFO exec.ScriptOperator: StreamThread OutputProcessor done
    [junit] 08/12/30 17:21:08 INFO exec.ScriptOperator: SERIALIZE_ERRORS:0
    [junit] 08/12/30 17:21:08 INFO exec.ScriptOperator: DESERIALIZE_ERRORS:0
    [junit] 08/12/30 17:21:09 INFO mapred.MapTask: Starting flush of map output
    [junit] 08/12/30 17:21:09 INFO mapred.MapTask: Finished spill 0
    [junit] 08/12/30 17:21:09 INFO mapred.TaskRunner: Task:attempt_local_0001_m_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:21:09 INFO mapred.LocalJobRunner: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src/kv1.txt:0+5812
    [junit] 08/12/30 17:21:09 INFO mapred.TaskRunner: Task 'attempt_local_0001_m_000000_0' done.
    [junit] 08/12/30 17:21:09 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:09 INFO thrift.TBinarySortableProtocol: Sort order is "+"
    [junit] 08/12/30 17:21:09 INFO mapred.Merger: Merging 1 sorted segments
    [junit] 08/12/30 17:21:09 INFO mapred.Merger: Down to the last merge-pass, with 1 segments left of total size: 19720 bytes
    [junit] 08/12/30 17:21:09 INFO exec.ExtractOperator: Initializing Self
    [junit] 08/12/30 17:21:09 INFO exec.ExtractOperator: Initializing children:
    [junit] 08/12/30 17:21:09 INFO exec.FilterOperator: Initializing Self
    [junit] 08/12/30 17:21:09 INFO exec.FilterOperator: Initializing children:
    [junit] 08/12/30 17:21:09 INFO exec.FileSinkOperator: Initializing Self
    [junit] 08/12/30 17:21:09 INFO exec.FilterOperator: Initialization Done
    [junit] 08/12/30 17:21:09 INFO exec.ExtractOperator: Initialization Done
    [junit] 08/12/30 17:21:09 INFO exec.FilterOperator: PASSED:84
    [junit] 08/12/30 17:21:09 INFO exec.FilterOperator: FILTERED:416
    [junit] 08/12/30 17:21:09 INFO mapred.TaskRunner: Task:attempt_local_0001_r_000000_0 is done. And is in the process of commiting
    [junit] 08/12/30 17:21:09 INFO mapred.LocalJobRunner: 
    [junit] 08/12/30 17:21:09 INFO mapred.TaskRunner: Task attempt_local_0001_r_000000_0 is allowed to commit now
    [junit] 08/12/30 17:21:09 INFO mapred.FileOutputCommitter: Saved output of task 'attempt_local_0001_r_000000_0' to file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/tmp1611358640
    [junit] 08/12/30 17:21:09 INFO mapred.LocalJobRunner: reduce > reduce
    [junit] 08/12/30 17:21:09 INFO mapred.TaskRunner: Task 'attempt_local_0001_r_000000_0' done.
    [junit] 08/12/30 17:21:09 INFO exec.ExecDriver:  map = 100%,  reduce =100%
    [junit] 08/12/30 17:21:09 INFO exec.ExecDriver: Ended Job = job_local_0001
    [junit]  map = 100%,  reduce =100%
    [junit] Ended Job = job_local_0001
    [junit] testMapRedPlan6 execution completed successfully
    [junit] /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../data/files
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 22.367 sec
    [junit] Running org.apache.hadoop.hive.ql.exec.TestExpressionEvaluator
    [junit] ExprNodeColumnEvaluator ok
    [junit] ExprNodeFuncEvaluator ok
    [junit] testExprNodeConversionEvaluator ok
    [junit] Evaluating 1 + 2 for 10000000 times
    [junit] Evaluation finished: 0.570 seconds, 0.057 seconds/million call.
    [junit] Evaluating 1 + 2 - 3 for 10000000 times
    [junit] Evaluation finished: 1.519 seconds, 0.152 seconds/million call.
    [junit] Evaluating 1 + 2 - 3 + 4 for 10000000 times
    [junit] Evaluation finished: 2.020 seconds, 0.202 seconds/million call.
    [junit] Evaluating concat("1", "2") for 10000000 times
    [junit] Evaluation finished: 1.889 seconds, 0.189 seconds/million call.
    [junit] Evaluating concat(concat("1", "2"), "3") for 10000000 times
    [junit] Evaluation finished: 3.394 seconds, 0.339 seconds/million call.
    [junit] Evaluating concat(concat(concat("1", "2"), "3"), "4") for 10000000 times
    [junit] Evaluation finished: 4.991 seconds, 0.499 seconds/million call.
    [junit] Evaluating concat(col1[1], cola[1]) for 1000000 times
    [junit] Evaluation finished: 0.309 seconds, 0.309 seconds/million call.
    [junit] Evaluating concat(concat(col1[1], cola[1]), col1[2]) for 1000000 times
    [junit] Evaluation finished: 0.515 seconds, 0.515 seconds/million call.
    [junit] Evaluating concat(concat(concat(col1[1], cola[1]), col1[2]), cola[2]) for 1000000 times
    [junit] Evaluation finished: 0.754 seconds, 0.754 seconds/million call.
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 16.109 sec
    [junit] Running org.apache.hadoop.hive.ql.exec.TestJEXL
    [junit] JEXL library test ok
    [junit] Evaluating 1 + 2 for 10000000 times
    [junit] Evaluation finished: 0.780 seconds, 0.078 seconds/million call.
    [junit] Evaluating __udf__concat.evaluate("1", "2") for 1000000 times
    [junit] Evaluation finished: 1.476 seconds, 1.476 seconds/million call.
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 2.558 sec
    [junit] Running org.apache.hadoop.hive.ql.exec.TestOperators
    [junit] Testing Filter Operator
    [junit] filtered = 4
    [junit] passed = 1
    [junit] Filter Operator ok
    [junit] Testing FileSink Operator
    [junit] FileSink Operator ok
    [junit] Testing Script Operator
    [junit] [0] io.o=[1, 01]
    [junit] [0] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@5d61dfb5
    [junit] [1] io.o=[2, 11]
    [junit] [1] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@5d61dfb5
    [junit] [2] io.o=[3, 21]
    [junit] [2] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@5d61dfb5
    [junit] [3] io.o=[4, 31]
    [junit] [3] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@5d61dfb5
    [junit] [4] io.o=[5, 41]
    [junit] [4] io.oi=org.apache.hadoop.hive.serde2.objectinspector.StandardStructObjectInspector@5d61dfb5
    [junit] Script Operator ok
    [junit] Testing Map Operator
    [junit] io1.o.toString() = [[0, 1, 2]]
    [junit] io2.o.toString() = [[0, 1, 2]]
    [junit] answer.toString() = [[0, 1, 2]]
    [junit] io1.o.toString() = [[1, 2, 3]]
    [junit] io2.o.toString() = [[1, 2, 3]]
    [junit] answer.toString() = [[1, 2, 3]]
    [junit] io1.o.toString() = [[2, 3, 4]]
    [junit] io2.o.toString() = [[2, 3, 4]]
    [junit] answer.toString() = [[2, 3, 4]]
    [junit] io1.o.toString() = [[3, 4, 5]]
    [junit] io2.o.toString() = [[3, 4, 5]]
    [junit] answer.toString() = [[3, 4, 5]]
    [junit] io1.o.toString() = [[4, 5, 6]]
    [junit] io2.o.toString() = [[4, 5, 6]]
    [junit] answer.toString() = [[4, 5, 6]]
    [junit] Map Operator ok
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.928 sec
    [junit] Running org.apache.hadoop.hive.ql.exec.TestPlan
    [junit] Serialization/Deserialization of plan successful
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.699 sec
    [junit] Running org.apache.hadoop.hive.ql.io.TestFlatFileInputFormat
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.778 sec
    [junit] Running org.apache.hadoop.hive.ql.metadata.TestHive
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 7.502 sec
    [junit] Running org.apache.hadoop.hive.ql.metadata.TestPartition
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 0.241 sec
    [junit] Running org.apache.hadoop.hive.ql.parse.TestParse
    [junit] Begin query: case_sensitivity.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/case_sensitivity.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/case_sensitivity.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/case_sensitivity.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/case_sensitivity.q.xml
    [junit] Done query: case_sensitivity.q
    [junit] Begin query: input20.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input20.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input20.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input20.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input20.q.xml
    [junit] Done query: input20.q
    [junit] Begin query: sample1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/sample1.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample1.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/sample1.q.xml
    [junit] Done query: sample1.q
    [junit] Begin query: sample2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample2(TestParse.java:204)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: sample3.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample3.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/sample3.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample3.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/sample3.q.xml
    [junit] Done query: sample3.q
    [junit] Begin query: sample4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample4(TestParse.java:256)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: sample5.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample5.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/sample5.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample5.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/sample5.q.xml
    [junit] Done query: sample5.q
    [junit] Begin query: sample6.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_sample6(TestParse.java:308)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: sample7.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample7.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/sample7.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/sample7.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/sample7.q.xml
    [junit] Done query: sample7.q
    [junit] Begin query: cast1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/cast1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/cast1.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/cast1.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/cast1.q.xml
    [junit] Done query: cast1.q
    [junit] Begin query: join1.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join1(TestParse.java:386)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] Begin query: input1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input1(TestParse.java:412)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: join2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join2(TestParse.java:438)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: input2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input2.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input2.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input2.q.xml
    [junit] Done query: input2.q
    [junit] Begin query: join3.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join3.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/join3.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/join3.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/join3.q.xml
    [junit] Done query: join3.q
    [junit] Begin query: input3.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input3.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input3.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input3.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input3.q.xml
    [junit] Done query: input3.q
    [junit] Begin query: input4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_input4(TestParse.java:542)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: join4.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join4(TestParse.java:568)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: input5.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input5.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input5.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input5.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input5.q.xml
    [junit] Done query: input5.q
    [junit] Begin query: join5.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join5(TestParse.java:620)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: input6.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input6.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input6.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input6.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input6.q.xml
    [junit] Done query: input6.q
    [junit] Begin query: input_testxpath2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_testxpath2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input_testxpath2.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_testxpath2.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input_testxpath2.q.xml
    [junit] Done query: input_testxpath2.q
    [junit] Begin query: join6.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join6(TestParse.java:698)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: input7.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input7.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input7.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input7.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input7.q.xml
    [junit] Done query: input7.q
    [junit] Begin query: join7.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join7(TestParse.java:750)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: input8.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input8.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input8.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input8.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input8.q.xml
    [junit] Done query: input8.q
    [junit] Begin query: input_testsequencefile.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_testsequencefile.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input_testsequencefile.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_testsequencefile.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input_testsequencefile.q.xml
    [junit] Done query: input_testsequencefile.q
    [junit] Begin query: join8.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_join8(TestParse.java:828)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: union.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_union(TestParse.java:854)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: input9.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input9.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input9.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input9.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input9.q.xml
    [junit] Done query: input9.q
    [junit] Begin query: udf1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf1(TestParse.java:906)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: udf4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_udf4(TestParse.java:932)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: input_testxpath.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_testxpath.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input_testxpath.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_testxpath.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input_testxpath.q.xml
    [junit] Done query: input_testxpath.q
    [junit] Begin query: input_part1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_part1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/input_part1.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/input_part1.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/input_part1.q.xml
    [junit] Done query: input_part1.q
    [junit] Begin query: groupby1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_groupby1(TestParse.java:1010)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: groupby2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/groupby2.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby2.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/groupby2.q.xml
    [junit] Done query: groupby2.q
    [junit] Begin query: groupby3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_groupby3(TestParse.java:1062)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: subq.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_subq(TestParse.java:1088)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: groupby4.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby4.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/groupby4.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby4.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/groupby4.q.xml
    [junit] Done query: groupby4.q
    [junit] Begin query: groupby5.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParse.testParse_groupby5(TestParse.java:1140)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: groupby6.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby6.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/parse/groupby6.q.out
    [junit] diff -b -I'\(\(<java version=".*" class="java.beans.XMLDecoder">\)\|\(<string>.*/tmp/.*</string>\)\|\(<string>file:.*</string>\)\|\(<string>/.*/warehouse/.*</string>\)\)' /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/groupby6.q.xml /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/plan/groupby6.q.xml
    [junit] Done query: groupby6.q
    [junit] Tests run: 41, Failures: 19, Errors: 0, Time elapsed: 66.551 sec
    [junit] Test org.apache.hadoop.hive.ql.parse.TestParse FAILED
    [junit] Running org.apache.hadoop.hive.ql.parse.TestParseNegative
    [junit] Begin query: insert_wrong_number_columns.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/insert_wrong_number_columns.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/insert_wrong_number_columns.q.out
    [junit] Done query: insert_wrong_number_columns.q
    [junit] Begin query: duplicate_alias.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_duplicate_alias(TestParseNegative.java:133)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: unknown_function1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/unknown_function1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/unknown_function1.q.out
    [junit] Done query: unknown_function1.q
    [junit] Begin query: unknown_function2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/unknown_function2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/unknown_function2.q.out
    [junit] Done query: unknown_function2.q
    [junit] Begin query: unknown_table1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_table1(TestParseNegative.java:226)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: unknown_function3.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_function3(TestParseNegative.java:257)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: quoted_string.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/quoted_string.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/quoted_string.q.out
    [junit] Done query: quoted_string.q
    [junit] Begin query: unknown_table2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_table2(TestParseNegative.java:319)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: unknown_function4.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/unknown_function4.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/unknown_function4.q.out
    [junit] Done query: unknown_function4.q
    [junit] Begin query: garbage.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_garbage(TestParseNegative.java:381)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: unknown_function5.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/unknown_function5.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/unknown_function5.q.out
    [junit] Done query: unknown_function5.q
    [junit] Begin query: invalid_list_index2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_list_index2(TestParseNegative.java:443)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: invalid_dot.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_dot(TestParseNegative.java:474)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: invalid_function_param1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/invalid_function_param1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/invalid_function_param1.q.out
    [junit] Done query: invalid_function_param1.q
    [junit] Begin query: invalid_map_index2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/invalid_map_index2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/invalid_map_index2.q.out
    [junit] Done query: invalid_map_index2.q
    [junit] Begin query: unknown_column1.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_column1(TestParseNegative.java:567)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: invalid_function_param2.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_function_param2(TestParseNegative.java:598)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: unknown_column2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/unknown_column2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/unknown_column2.q.out
    [junit] Done query: unknown_column2.q
    [junit] Begin query: unknown_column3.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/unknown_column3.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/unknown_column3.q.out
    [junit] Done query: unknown_column3.q
    [junit] Begin query: unknown_column4.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_column4(TestParseNegative.java:691)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: unknown_column5.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_unknown_column5(TestParseNegative.java:722)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: unknown_column6.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/unknown_column6.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/unknown_column6.q.out
    [junit] Done query: unknown_column6.q
    [junit] Begin query: invalid_list_index.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/invalid_list_index.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/invalid_list_index.q.out
    [junit] Done query: invalid_list_index.q
    [junit] Begin query: nonkey_groupby.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_nonkey_groupby(TestParseNegative.java:815)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: invalid_map_index.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_map_index(TestParseNegative.java:846)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcpart)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: invalid_index.q
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_invalid_index(TestParseNegative.java:877)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/srcbucket)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: wrong_distinct1.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/wrong_distinct1.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/wrong_distinct1.q.out
    [junit] Done query: wrong_distinct1.q
    [junit] Begin query: missing_overwrite.q
    [junit] Exception: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:268)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:246)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:219)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.cleanUp(QTestUtil.java:208)
    [junit] 	at org.apache.hadoop.hive.ql.QTestUtil.init(QTestUtil.java:329)
    [junit] 	at org.apache.hadoop.hive.ql.parse.TestParseNegative.testParseNegative_missing_overwrite(TestParseNegative.java:939)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at junit.framework.TestCase.runTest(TestCase.java:164)
    [junit] 	at junit.framework.TestCase.runBare(TestCase.java:130)
    [junit] 	at junit.framework.TestResult$1.protect(TestResult.java:106)
    [junit] 	at junit.framework.TestResult.runProtected(TestResult.java:124)
    [junit] 	at junit.framework.TestResult.run(TestResult.java:109)
    [junit] 	at junit.framework.TestCase.run(TestCase.java:120)
    [junit] 	at junit.framework.TestSuite.runTest(TestSuite.java:230)
    [junit] 	at junit.framework.TestSuite.run(TestSuite.java:225)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:297)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:672)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:567)
    [junit] Caused by: MetaException(message:Unable to delete directory: file:/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build/ql/test/data/warehouse/src)
    [junit] 	at org.apache.hadoop.hive.metastore.Warehouse.deleteDir(Warehouse.java:102)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.drop_table(HiveMetaStore.java:337)
    [junit] 	at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.dropTable(HiveMetaStoreClient.java:340)
    [junit] 	at org.apache.hadoop.hive.ql.metadata.Hive.dropTable(Hive.java:262)
    [junit] 	... 20 more
    [junit] Begin query: wrong_distinct2.q
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-08, hr=12}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=11}
    [junit] OK
    [junit] Loading data to table srcpart partition {ds=2008-04-09, hr=12}
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table srcbucket
    [junit] OK
    [junit] Loading data to table src
    [junit] OK
    [junit] diff /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/../build/ql/test/logs/wrong_distinct2.q.out /usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/ql/src/test/results/compiler/errors/wrong_distinct2.q.out
    [junit] Done query: wrong_distinct2.q
    [junit] Tests run: 29, Failures: 15, Errors: 0, Time elapsed: 45.574 sec
    [junit] Test org.apache.hadoop.hive.ql.parse.TestParseNegative FAILED
    [junit] Running org.apache.hadoop.hive.ql.tool.TestLineageInfo
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.408 sec

BUILD FAILED
/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build.xml:104: The following error occurred while executing this line:
/usr/fbtools/continuous_builds/hiveopensource-trunk/hiveopensource_trunk/build-common.xml:261: Tests failed!

Total time: 11 minutes 38 seconds
EXIT VALUE IS 1 for runtests

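Each of the 15 failures above follows the same pattern: test setup (`QTestUtil.cleanUp`) drops the `src`/`srcbucket`/`srcpart` tables, and `Warehouse.deleteDir` fails to remove the warehouse directory, which then fails the whole test before the query under test even runs. As a hedged illustration (not Hive's actual code — `deleteDirWithRetry` is a hypothetical helper), a cleanup that retries with backoff tolerates transiently held files on a local filesystem:

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.Comparator;
import java.util.stream.Stream;

public class CleanupSketch {
    // Recursively delete a directory, retrying a few times in case another
    // process or a slow filesystem briefly blocks deletion -- the apparent
    // cause of the repeated "Unable to delete directory" MetaException above.
    static boolean deleteDirWithRetry(Path dir, int attempts) throws InterruptedException {
        for (int i = 0; i < attempts; i++) {
            try (Stream<Path> walk = Files.walk(dir)) {
                // Delete children before parents (deepest paths first).
                walk.sorted(Comparator.reverseOrder())
                    .forEach(p -> p.toFile().delete());
            } catch (IOException ignored) {
                // Directory may already be gone or mid-mutation; recheck below.
            }
            if (!Files.exists(dir)) {
                return true;
            }
            Thread.sleep(100L * (i + 1)); // simple linear backoff before retrying
        }
        return !Files.exists(dir);
    }

    public static void main(String[] args) throws Exception {
        Path dir = Files.createTempDirectory("warehouse-src");
        Files.writeString(dir.resolve("part-0000"), "238\tval_238\n");
        System.out.println(deleteDirWithRetry(dir, 3));
    }
}
```

Since the same directories (`build/ql/test/data/warehouse/src`, `srcbucket`, `srcpart`) fail repeatedly within one run, a stale external process holding those paths on the build host is also a plausible cause, in which case no in-test retry would help.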