Return-Path:
Delivered-To: apmail-hadoop-hive-dev-archive@minotaur.apache.org
Received: (qmail 53308 invoked from network); 31 Mar 2009 13:02:58 -0000
Received: from hermes.apache.org (HELO mail.apache.org) (140.211.11.3)
    by minotaur.apache.org with SMTP; 31 Mar 2009 13:02:58 -0000
Received: (qmail 2805 invoked by uid 500); 31 Mar 2009 13:02:58 -0000
Delivered-To: apmail-hadoop-hive-dev-archive@hadoop.apache.org
Received: (qmail 2767 invoked by uid 500); 31 Mar 2009 13:02:57 -0000
Mailing-List: contact hive-dev-help@hadoop.apache.org; run by ezmlm
Precedence: bulk
List-Help:
List-Unsubscribe:
List-Post:
List-Id:
Reply-To: hive-dev@hadoop.apache.org
Delivered-To: mailing list hive-dev@hadoop.apache.org
Received: (qmail 2750 invoked by uid 99); 31 Mar 2009 13:02:57 -0000
Received: from nike.apache.org (HELO nike.apache.org) (192.87.106.230)
    by apache.org (qpsmtpd/0.29) with ESMTP; Tue, 31 Mar 2009 13:02:57 +0000
X-ASF-Spam-Status: No, hits=-2000.0 required=10.0 tests=ALL_TRUSTED
X-Spam-Check-By: apache.org
Received: from [140.211.11.106] (HELO hudson.zones.apache.org) (140.211.11.106)
    by apache.org (qpsmtpd/0.29) with ESMTP; Tue, 31 Mar 2009 13:02:55 +0000
Received: from hudson.zones.apache.org (localhost [127.0.0.1])
    by hudson.zones.apache.org (8.13.8+Sun/8.13.8) with ESMTP id n2VD2Xsg001159
    for ; Tue, 31 Mar 2009 09:02:33 -0400 (EDT)
Date: Tue, 31 Mar 2009 13:02:33 +0000 (UTC)
From: Apache Hudson Server
To: hive-dev@hadoop.apache.org
Message-ID: <31360977.2811238504553445.JavaMail.hudson@hudson.zones.apache.org>
In-Reply-To: <20690438.2661238418147055.JavaMail.hudson@hudson.zones.apache.org>
References: <20690438.2661238418147055.JavaMail.hudson@hudson.zones.apache.org>
Subject: Build failed in Hudson: Hive-trunk-h0.17 #48
MIME-Version: 1.0
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
X-Virus-Checked: Checked by ClamAV on apache.org

See http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/48/
------------------------------------------
[...truncated 572 lines...]
     [echo] hive: anttasks
configure:
compile:
     [echo] Compiling: ql
jar:
     [echo] Jar: ql
    [unzip] Expanding: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/ql/lib/commons-jexl-1.1.jar into http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/jexl/classes
    [unzip] Expanding: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/lib/libthrift.jar into http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/thrift/classes
    [unzip] Expanding: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/lib/commons-lang-2.4.jar into http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/commons-lang/classes
    [unzip] Expanding: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/lib/json.jar into http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/json/classes
deploy:
     [echo] hive: ql
init:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/cli/test
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/cli/test/src
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/cli/test/classes
download-ivy:
init-ivy:
settings-ivy:
resolve:
:: loading settings :: file = http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/ivy/ivysettings.xml
[ivy:retrieve] :: resolving dependencies :: org.apache.hadoop.hive#cli;working@minerva.apache.org
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	found hadoop#core;0.17.2.1 in hadoop-resolver
[ivy:retrieve] :: resolution report :: resolve 13ms :: artifacts dl 1ms
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   1   |   0   |   0   |   0   ||   1   |   0   |
	---------------------------------------------------------------------
[ivy:retrieve] :: retrieving :: org.apache.hadoop.hive#cli
[ivy:retrieve] 	confs: [default]
[ivy:retrieve] 	0 artifacts copied, 1 already retrieved (0kB/1ms)
install-hadoopcore:
compile:
     [echo] Compiling: cli
jar:
     [echo] Jar: cli
deploy:
     [echo] hive: cli
init:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/service/test
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/service/test/src
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/service/test/classes
core-compile:
compile:
jar:
     [echo] Jar: service
deploy:
     [echo] hive: service
init:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/jdbc/test
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/jdbc/test/src
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/jdbc/test/classes
core-compile:
compile:
jar:
     [echo] Jar: jdbc
deploy:
     [echo] hive: jdbc
hwi-init:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/classes
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/src
compile:
     [echo] Compiling: hwi
    [javac] Compiling 7 source files to http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/classes
jar:
     [echo] Jar: hwi
      [jar] Building jar: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/hive_hwi.jar
deploy:
     [echo] hive: hwi
     [copy] Copying 1 file to http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build
test:
test:
     [echo] Nothing to do!
test:
     [echo] Nothing to do!
test-conditions:
gen-test:
hwi-init:
compile:
     [echo] Compiling: hwi
    [javac] Compiling 7 source files to http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/classes
compile-test:
    [javac] Compiling 1 source file to http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/classes
test-jar:
      [jar] Building MANIFEST-only jar: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/test-udfs.jar
test-init:
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/data
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/logs/clientpositive
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/logs/clientnegative
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/logs/positive
    [mkdir] Created dir: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/logs/negative
     [copy] Copying 24 files to http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/data
     [copy] Copied 6 empty directories to 2 empty directories under http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/data
test:
    [junit] Running org.apache.hadoop.hive.hwi.TestHWISessionManager
    [junit] Hive history file=http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/hwi/../build/ql/tmp/hive_job_log_hudson_200903310617_532032663.txt
    [junit] Hive history file=http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/hwi/../build/ql/tmp/hive_job_log_hudson_200903310617_606824081.txt
    [junit] Hive history file=http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/hwi/../build/ql/tmp/hive_job_log_hudson_200903310617_2069376018.txt
    [junit] OK
    [junit] Hive history file=http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/hwi/../build/ql/tmp/hive_job_log_hudson_200903310617_594155454.txt
    [junit] Copying data from http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/data/files/kv1.txt
    [junit] Loading data to table test_hwi_table
    [junit] OK
    [junit] plan = /tmp/plan8506048898232763157.xml
    [junit] plan = /tmp/plan190117070373517557.xml
    [junit] Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] In order to change the average load for a reducer (in bytes):
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: In order to change the average load for a reducer (in bytes):
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver:   set hive.exec.reducers.bytes.per.reducer=
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: In order to limit the maximum number of reducers:
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver:   set hive.exec.reducers.max=
    [junit]   set hive.exec.reducers.bytes.per.reducer=
    [junit] In order to limit the maximum number of reducers:
    [junit]   set hive.exec.reducers.max=
    [junit] In order to set a constant number of reducers:
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: In order to set a constant number of reducers:
    [junit]   set mapred.reduce.tasks=
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver:   set mapred.reduce.tasks=
    [junit] Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: Number of reduce tasks not specified. Defaulting to jobconf value of: 1
    [junit] In order to change the average load for a reducer (in bytes):
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: In order to change the average load for a reducer (in bytes):
    [junit]   set hive.exec.reducers.bytes.per.reducer=
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver:   set hive.exec.reducers.bytes.per.reducer=
    [junit] In order to limit the maximum number of reducers:
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: In order to limit the maximum number of reducers:
    [junit]   set hive.exec.reducers.max=
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver:   set hive.exec.reducers.max=
    [junit] In order to set a constant number of reducers:
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: In order to set a constant number of reducers:
    [junit]   set mapred.reduce.tasks=
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver:   set mapred.reduce.tasks=
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: Adding input file http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/ql/test/data/warehouse/test_hwi_table
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: adding libjars: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/test-udfs.jar,/home/hudson/hudson-slave/workspace/Hive-trunk-h0.17/hive/data/files/TestSerDe.jar
    [junit] 09/03/31 06:17:11 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] Job Submission failed with exception 'org.apache.hadoop.util.Shell$ExitCodeException(chmod: cannot access `/tmp/hadoop-hudson/mapred/system/1423997103/job_local_1': No such file or directory
    [junit] )'
    [junit] 09/03/31 06:17:11 ERROR exec.ExecDriver: Job Submission failed with exception 'org.apache.hadoop.util.Shell$ExitCodeException(chmod: cannot access `/tmp/hadoop-hudson/mapred/system/1423997103/job_local_1': No such file or directory
    [junit] )'
    [junit] org.apache.hadoop.util.Shell$ExitCodeException: chmod: cannot access `/tmp/hadoop-hudson/mapred/system/1423997103/job_local_1': No such file or directory
    [junit]
    [junit] 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:195)
    [junit] 	at org.apache.hadoop.util.Shell.run(Shell.java:134)
    [junit] 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:286)
    [junit] 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:317)
    [junit] 	at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:522)
    [junit] 	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:514)
    [junit] 	at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:267)
    [junit] 	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:273)
    [junit] 	at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:549)
    [junit] 	at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:700)
    [junit] 	at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:393)
    [junit] 	at org.apache.hadoop.hive.ql.exec.ExecDriver.main(ExecDriver.java:556)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
    [junit] 	at org.apache.hadoop.mapred.JobShell.run(JobShell.java:194)
    [junit] 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    [junit] 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
    [junit] 	at org.apache.hadoop.mapred.JobShell.main(JobShell.java:220)
    [junit]
    [junit] Job Failed
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: Adding input file http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/ql/test/data/warehouse/test_hwi_table
    [junit] 09/03/31 06:17:11 INFO exec.ExecDriver: adding libjars: http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build/hwi/test/test-udfs.jar,/home/hudson/hudson-slave/workspace/Hive-trunk-h0.17/hive/data/files/TestSerDe.jar
    [junit] 09/03/31 06:17:11 INFO jvm.JvmMetrics: Initializing JVM Metrics with processName=JobTracker, sessionId=
    [junit] Job Submission failed with exception 'org.apache.hadoop.util.Shell$ExitCodeException(chmod: cannot access `/tmp/hadoop-hudson/mapred/system/1085677376/job_local_1': No such file or directory
    [junit] )'
    [junit] 09/03/31 06:17:11 ERROR exec.ExecDriver: Job Submission failed with exception 'org.apache.hadoop.util.Shell$ExitCodeException(chmod: cannot access `/tmp/hadoop-hudson/mapred/system/1085677376/job_local_1': No such file or directory
    [junit] )'
    [junit] org.apache.hadoop.util.Shell$ExitCodeException: chmod: cannot access `/tmp/hadoop-hudson/mapred/system/1085677376/job_local_1': No such file or directory
    [junit]
    [junit] 	at org.apache.hadoop.util.Shell.runCommand(Shell.java:195)
    [junit] 	at org.apache.hadoop.util.Shell.run(Shell.java:134)
    [junit] 	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:286)
    [junit] 	at org.apache.hadoop.util.Shell.execCommand(Shell.java:317)
    [junit] 	at org.apache.hadoop.fs.RawLocalFileSystem.execCommand(RawLocalFileSystem.java:522)
    [junit] 	at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:514)
    [junit] 	at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:267)
    [junit] 	at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:273)
    [junit] 	at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:549)
    [junit] 	at org.apache.hadoop.mapred.JobClient.submitJob(JobClient.java:700)
    [junit] 	at org.apache.hadoop.hive.ql.exec.ExecDriver.execute(ExecDriver.java:393)
    [junit] 	at org.apache.hadoop.hive.ql.exec.ExecDriver.main(ExecDriver.java:556)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.apache.hadoop.util.RunJar.main(RunJar.java:155)
    [junit] 	at org.apache.hadoop.mapred.JobShell.run(JobShell.java:194)
    [junit] 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
    [junit] 	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
    [junit] 	at org.apache.hadoop.mapred.JobShell.main(JobShell.java:220)
    [junit]
    [junit] Job Failed
    [junit] FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
    [junit] FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
    [junit] Tests run: 1, Failures: 1, Errors: 0, Time elapsed: 5.616 sec
    [junit] Test org.apache.hadoop.hive.hwi.TestHWISessionManager FAILED

BUILD FAILED
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build.xml:117: The following error occurred while executing this line:
http://hudson.zones.apache.org/hudson/job/Hive-trunk-h0.17/ws/hive/build-common.xml:273: Tests failed!

Total time: 26 seconds
Recording test results