hadoop-common-dev mailing list archives

From Apache Hudson Server <hud...@hudson.apache.org>
Subject Build failed in Hudson: Hadoop-Common-trunk-Commit #408
Date Fri, 29 Oct 2010 23:18:01 GMT
See <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/408/changes>

Changes:

[cutting] HADOOP-7011.  Fix KerberosName.main() to not throw an NPE.  Contributed by Aaron T. Myers.

------------------------------------------
[...truncated 20208 lines...]
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
    [junit] 	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:76)
    [junit] 	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    [junit] checkDir success: false
    [junit] 	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
    [junit] 	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
    [junit] 	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
    [junit] 	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
    [junit] 	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] org.apache.hadoop.util.DiskChecker$DiskErrorException: directory is not listable: Mock for Path, hashCode: 26443833
    [junit] 	at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:166)
    [junit] 	at org.apache.hadoop.util.TestDiskChecker._checkDirs(TestDiskChecker.java:114)
    [junit] 	at org.apache.hadoop.util.TestDiskChecker.__CLR3_0_27686uizt2(TestDiskChecker.java:98)
    [junit] 	at org.apache.hadoop.util.TestDiskChecker.testCheckDir_notListable(TestDiskChecker.java:97)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    [junit] 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    [junit] 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    [junit] 	at java.lang.reflect.Method.invoke(Method.java:597)
    [junit] 	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:44)
    [junit] 	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:15)
    [junit] 	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:41)
    [junit] 	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:20)
    [junit] 	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:76)
    [junit] 	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:50)
    [junit] 	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:193)
    [junit] 	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:52)
    [junit] checkDir success: false
    [junit] 	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:191)
    [junit] 	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:42)
    [junit] 	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:184)
    [junit] 	at org.junit.runners.ParentRunner.run(ParentRunner.java:236)
    [junit] 	at junit.framework.JUnit4TestAdapter.run(JUnit4TestAdapter.java:39)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.run(JUnitTestRunner.java:420)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.launch(JUnitTestRunner.java:911)
    [junit] 	at org.apache.tools.ant.taskdefs.optional.junit.JUnitTestRunner.main(JUnitTestRunner.java:768)
    [junit] Tests run: 9, Failures: 0, Errors: 0, Time elapsed: 0.573 sec
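The DiskErrorException above ("directory is not listable") is the expected outcome of TestDiskChecker.testCheckDir_notListable, which feeds a mocked, unlistable directory into DiskChecker.checkDir. A minimal sketch of that style of check (not Hadoop's actual implementation; class and message strings here are illustrative) is:

```java
import java.io.File;
import java.io.IOException;

public class DiskCheckSketch {
    // Stand-in for org.apache.hadoop.util.DiskChecker$DiskErrorException
    static class DiskErrorException extends IOException {
        DiskErrorException(String msg) { super(msg); }
    }

    // Sketch of a checkDir-style validation: existence, readability,
    // listability, writability.
    static void checkDir(File dir) throws DiskErrorException {
        if (!dir.isDirectory())
            throw new DiskErrorException("not a directory: " + dir);
        if (!dir.canRead())
            throw new DiskErrorException("directory is not readable: " + dir);
        // File.list() returns null when the directory cannot be listed
        if (dir.list() == null)
            throw new DiskErrorException("directory is not listable: " + dir);
        if (!dir.canWrite())
            throw new DiskErrorException("directory is not writable: " + dir);
    }

    public static void main(String[] args) throws Exception {
        File tmp = new File(System.getProperty("java.io.tmpdir"));
        checkDir(tmp);  // a normal temp dir passes every check
        boolean threw = false;
        try {
            checkDir(new File(tmp, "no-such-dir-" + System.nanoTime()));
        } catch (DiskErrorException e) {
            threw = true;  // missing dir fails the first check
        }
        if (!threw) throw new AssertionError("expected DiskErrorException");
        System.out.println("ok");
    }
}
```

This also explains why the run still reports Failures: 0 and Errors: 0: the test deliberately provokes the exception and treats it as success ("checkDir success: false" is the test's own diagnostic output, interleaved with the printed trace).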
    [junit] Running org.apache.hadoop.util.TestGenericOptionsParser
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.673 sec
    [junit] Running org.apache.hadoop.util.TestGenericsUtil
    [junit] 2010-10-29 23:17:27,819 WARN  util.GenericOptionsParser (GenericOptionsParser.java:parseGeneralOptions(417)) - options parsing failed: Missing argument for option: jt
    [junit] usage: general options are:
    [junit]  -archives <paths>              comma separated archives to be unarchived
    [junit]                                 on the compute machines.
    [junit]  -conf <configuration file>     specify an application configuration file
    [junit]  -D <property=value>            use value for given property
    [junit]  -files <paths>                 comma separated files to be copied to the
    [junit]                                 map reduce cluster
    [junit]  -fs <local|namenode:port>      specify a namenode
    [junit]  -jt <local|jobtracker:port>    specify a job tracker
    [junit]  -libjars <paths>               comma separated jar files to include in
    [junit]                                 the classpath.
    [junit]  -tokenCacheFile <tokensFile>   name of the file with the tokens
    [junit] Tests run: 6, Failures: 0, Errors: 0, Time elapsed: 0.312 sec
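The warning above comes from a test that passes -jt with no argument. Every generic option in the usage text takes exactly one value, which is why a bare -jt fails. A hedged sketch of that splitting behavior (GenericOptionsParser itself uses commons-cli; the names below are illustrative, not Hadoop's API):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class GenericOptsSketch {
    // Options that take exactly one argument, per the usage text above.
    // Note: the real parser also accepts -Dprop=value as a single token.
    static final Set<String> TAKES_ARG = new HashSet<>(Arrays.asList(
        "-conf", "-D", "-fs", "-jt", "-files", "-libjars",
        "-archives", "-tokenCacheFile"));

    // Returns [genericOptions, remainingToolArgs]; throws when an option
    // is missing its value (cf. "Missing argument for option: jt").
    static List<List<String>> split(String[] args) {
        List<String> generic = new ArrayList<>();
        List<String> rest = new ArrayList<>();
        for (int i = 0; i < args.length; i++) {
            if (TAKES_ARG.contains(args[i])) {
                if (i + 1 >= args.length)
                    throw new IllegalArgumentException(
                        "Missing argument for option: " + args[i].substring(1));
                generic.add(args[i]);
                generic.add(args[++i]);  // consume the option's value
            } else {
                rest.add(args[i]);       // left for the tool itself
            }
        }
        return Arrays.asList(generic, rest);
    }

    public static void main(String[] args) {
        List<List<String>> r =
            split(new String[] {"-fs", "local", "input", "output"});
        System.out.println(r.get(0) + " / " + r.get(1));
    }
}
```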
    [junit] Running org.apache.hadoop.util.TestHostsFileReader
    [junit] 2010-10-29 23:17:28,509 INFO  util.HostsFileReader (HostsFileReader.java:refresh(85)) - Refreshing hosts (include/exclude) list
    [junit] 2010-10-29 23:17:28,512 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost1 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,512 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost2 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,513 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost3 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,514 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost4 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,515 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost4 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,515 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost5 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,516 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost1 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,517 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost2 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,517 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost3 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,518 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost4 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,518 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost4 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,519 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost5 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,527 INFO  util.HostsFileReader (HostsFileReader.java:refresh(85)) - Refreshing hosts (include/exclude) list
    [junit] 2010-10-29 23:17:28,534 INFO  util.HostsFileReader (HostsFileReader.java:refresh(85)) - Refreshing hosts (include/exclude) list
    [junit] 2010-10-29 23:17:28,538 INFO  util.HostsFileReader (HostsFileReader.java:refresh(85)) - Refreshing hosts (include/exclude) list
    [junit] 2010-10-29 23:17:28,538 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,539 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost2 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,539 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost3 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,540 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,540 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost2 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,541 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost3 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,544 INFO  util.HostsFileReader (HostsFileReader.java:refresh(85)) - Refreshing hosts (include/exclude) list
    [junit] 2010-10-29 23:17:28,544 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,545 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost2 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,545 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost4 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,546 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost3 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.include>
    [junit] 2010-10-29 23:17:28,546 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,547 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost2 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,547 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost4 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] 2010-10-29 23:17:28,548 INFO  util.HostsFileReader (HostsFileReader.java:readFileToSet(70)) - Adding somehost3 to the list of hosts from <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build/test/data/dfs.exclude>
    [junit] Tests run: 5, Failures: 0, Errors: 0, Time elapsed: 0.255 sec
    [junit] Running org.apache.hadoop.util.TestIndexedSort
    [junit] sortRandom seed: -5167897124795842351(org.apache.hadoop.util.QuickSort)
    [junit] testSorted seed: -3093620812982128053(org.apache.hadoop.util.QuickSort)
    [junit] testAllEqual setting min/max at 285/289(org.apache.hadoop.util.QuickSort)
    [junit] sortWritable seed: -1553852819516805776(org.apache.hadoop.util.QuickSort)
    [junit] QuickSort degen cmp/swp: 23252/3713(org.apache.hadoop.util.QuickSort)
    [junit] sortRandom seed: 3729974861615156858(org.apache.hadoop.util.HeapSort)
    [junit] testSorted seed: -3987341760111958517(org.apache.hadoop.util.HeapSort)
    [junit] testAllEqual setting min/max at 160/254(org.apache.hadoop.util.HeapSort)
    [junit] sortWritable seed: 8477618673802300546(org.apache.hadoop.util.HeapSort)
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 1.549 sec
    [junit] Running org.apache.hadoop.util.TestOptions
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.147 sec
    [junit] Running org.apache.hadoop.util.TestPureJavaCrc32
    [junit] Tests run: 1, Failures: 0, Errors: 0, Time elapsed: 1.549 sec
    [junit] Running org.apache.hadoop.util.TestReflectionUtils
    [junit] Tests run: 4, Failures: 0, Errors: 0, Time elapsed: 0.631 sec
    [junit] Running org.apache.hadoop.util.TestRunJar
    [junit] Tests run: 2, Failures: 0, Errors: 0, Time elapsed: 0.233 sec
    [junit] Running org.apache.hadoop.util.TestShell
    [junit] Tests run: 3, Failures: 0, Errors: 0, Time elapsed: 4.267 sec
    [junit] Running org.apache.hadoop.util.TestStringUtils
    [junit] Tests run: 8, Failures: 0, Errors: 0, Time elapsed: 0.189 sec

checkfailure:

injectfaults:
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi>

ivy-download:
      [get] Getting: http://repo2.maven.org/maven2/org/apache/ivy/ivy/2.1.0/ivy-2.1.0.jar
      [get] To: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/ivy/ivy-2.1.0.jar>
      [get] Not modified - so not downloaded

ivy-init-dirs:
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/ivy>
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/ivy/lib>
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/ivy/report>
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/ivy/maven>

ivy-probe-antlib:

ivy-init-antlib:

ivy-init:
[ivy:configure] :: loading settings :: file = <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/ivy/ivysettings.xml>

ivy-resolve-common:

ivy-retrieve-common:
[ivy:cachepath] DEPRECATED: 'ivy.conf.file' is deprecated, use 'ivy.settings.file' instead
[ivy:cachepath] :: loading settings :: file = <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/ivy/ivysettings.xml>

init:
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/classes>
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/src>
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/webapps>
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/test>
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/test/classes>
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/test/extraconf>
    [touch] Creating /tmp/null1104197170
   [delete] Deleting: /tmp/null1104197170
    [mkdir] Created dir: <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/test/conf>
     [copy] Copying 5 files to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/test/conf>
     [copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/conf/core-site.xml.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/test/conf/core-site.xml>
     [copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/conf/masters.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/test/conf/masters>
     [copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/conf/hadoop-env.sh.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/test/conf/hadoop-env.sh>
     [copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/conf/slaves.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/test/conf/slaves>
     [copy] Copying <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/conf/hadoop-policy.xml.template> to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/test/conf/hadoop-policy.xml>

record-parser:

compile-rcc-compiler:
    [javac] Compiling 29 source files to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/classes>
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
Trying to override old definition of task recordcc

compile-core-classes:
    [javac] Compiling 392 source files to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/classes>
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:31: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
    [javac] import sun.security.krb5.Config;
    [javac]                         ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:32: warning: sun.security.krb5.KrbException is Sun proprietary API and may be removed in a future release
    [javac] import sun.security.krb5.KrbException;
    [javac]                         ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:81: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
    [javac]   private static Config kerbConf;
    [javac]                  ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:39: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
    [javac] import sun.security.jgss.krb5.Krb5Util;
    [javac]                              ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:40: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
    [javac] import sun.security.krb5.Credentials;
    [javac]                         ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:41: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
    [javac] import sun.security.krb5.PrincipalName;
    [javac]                         ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:85: warning: sun.security.krb5.Config is Sun proprietary API and may be removed in a future release
    [javac]       kerbConf = Config.getInstance();
    [javac]                  ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/KerberosName.java>:87: warning: sun.security.krb5.KrbException is Sun proprietary API and may be removed in a future release
    [javac]     } catch (KrbException ke) {
    [javac]              ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:120: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
    [javac]     Credentials serviceCred = null;
    [javac]     ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:122: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
    [javac]       PrincipalName principal = new PrincipalName(serviceName,
    [javac]       ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:122: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
    [javac]       PrincipalName principal = new PrincipalName(serviceName,
    [javac]                                     ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:123: warning: sun.security.krb5.PrincipalName is Sun proprietary API and may be removed in a future release
    [javac]           PrincipalName.KRB_NT_SRV_HST);
    [javac]           ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:125: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
    [javac]           .toString(), Krb5Util.ticketToCreds(getTgtFromSubject()));
    [javac]                        ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:124: warning: sun.security.krb5.Credentials is Sun proprietary API and may be removed in a future release
    [javac]       serviceCred = Credentials.acquireServiceCreds(principal
    [javac]                     ^
    [javac] <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/java/org/apache/hadoop/security/SecurityUtil.java>:134: warning: sun.security.jgss.krb5.Krb5Util is Sun proprietary API and may be removed in a future release
    [javac]         .add(Krb5Util.credsToTicket(serviceCred));
    [javac]              ^
    [javac] Note: Some input files use or override a deprecated API.
    [javac] Note: Recompile with -Xlint:deprecation for details.
    [javac] 15 warnings
     [copy] Copying 1 file to <https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build-fi/classes>

ivy-resolve-test:

ivy-retrieve-test:

generate-test-records:

generate-avro-records:

BUILD FAILED
<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build.xml>:756: The following error occurred while executing this line:
<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/test/aop/build/aop.xml>:119: The following error occurred while executing this line:
<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/src/test/aop/build/aop.xml>:147: The following error occurred while executing this line:
<https://hudson.apache.org/hudson/job/Hadoop-Common-trunk-Commit/ws/trunk/build.xml>:466: taskdef class org.apache.avro.specific.SchemaTask cannot be found
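
This is a classpath failure, not a compile error: Ant's taskdef cannot load the Avro class, which indicates the jar providing org.apache.avro.specific.SchemaTask was absent from the taskdef's classpath when the fault-injection build (generate-avro-records) ran. The declaration at build.xml line 466 is not shown in the log; a hypothetical sketch of the shape that fails this way (task name and classpathref are illustrative, not taken from the real build.xml):

    <!-- hypothetical sketch; the real build.xml declaration is not in the log -->
    <taskdef name="schema"
             classname="org.apache.avro.specific.SchemaTask"
             classpathref="ivy.common.classpath"/>

The error fires as soon as Ant tries to define the task, before any schema is processed, so fixing it means ensuring the Avro jar is resolved onto that classpath rather than changing any schema file.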

Total time: 12 minutes 59 seconds
Publishing Javadoc
Archiving artifacts
Recording test results
Recording fingerprints
Publishing Clover coverage report...
No Clover report will be published due to a Build Failure

