accumulo-notifications mailing list archives

From "John Vines (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (ACCUMULO-3100) Accumulo fails to test against recent Hadoop 2.6.0-SNAPSHOT
Date Fri, 05 Sep 2014 16:51:29 GMT

    [ https://issues.apache.org/jira/browse/ACCUMULO-3100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14123154#comment-14123154 ]

John Vines commented on ACCUMULO-3100:
--------------------------------------

The issue with MAC was that the Maven plugins were using a fork of Guava, hence the ordering issues.

Mucking with Guava versions is not so simple. If we pull back the version, we break compatibility
for all users who may be relying on Guava features introduced after whatever version Hadoop is
using. That causes problems in the long term for client usability, but it's also an issue for our
compatibility guarantees.

Furthermore, when we discussed 2215, I believe in the IRC room, it was also noted that there
were bug fixes between the version of Guava Hadoop was using and our previous Guava version
that DID affect us, so that's another reason not to blindly roll back.
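For context on the kind of API break at play here: Guava deprecated LimitInputStream in 14.0 in
favor of ByteStreams.limit and removed it in 15.0, so code compiled against the older Guava that
Hadoop uses can fail with NoClassDefFoundError when only Guava 15 is on the classpath. A minimal
sketch of the replacement API (illustrative only, not code from Accumulo or Hadoop):

{noformat}
import java.io.ByteArrayInputStream;
import java.io.InputStream;

import com.google.common.io.ByteStreams;

// Illustrative sketch (not Accumulo or Hadoop code): the older Guava that
// Hadoop uses still ships com.google.common.io.LimitInputStream; Guava 15.0
// removed it in favor of ByteStreams.limit(InputStream, long).
public class GuavaLimitExample {
    public static void main(String[] args) throws Exception {
        InputStream source = new ByteArrayInputStream(new byte[1024]);

        // Guava 14+ replacement for the removed LimitInputStream:
        InputStream limited = ByteStreams.limit(source, 512);

        // Pre-15 equivalent, which no longer loads against Guava 15.0:
        //   InputStream limited = new LimitInputStream(source, 512);

        System.out.println("bytes readable: " + ByteStreams.toByteArray(limited).length);
    }
}
{noformat}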

> Accumulo fails to test against recent Hadoop 2.6.0-SNAPSHOT
> -----------------------------------------------------------
>
>                 Key: ACCUMULO-3100
>                 URL: https://issues.apache.org/jira/browse/ACCUMULO-3100
>             Project: Accumulo
>          Issue Type: Bug
>          Components: test
>    Affects Versions: 1.6.0
>            Reporter: Josh Elser
>            Assignee: Josh Elser
>             Fix For: 1.6.1, 1.7.0
>
>
> JobSubmitter makes a call out to CryptoUtils to test for encrypted shuffle support that
> was recently added to branch-2 (specifically HDFS-6134 and HADOOP-10150, looking at the blame)
> {noformat}
> java.lang.NoClassDefFoundError: com/google/common/io/LimitInputStream
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> 	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:380)
> 	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1294)
> 	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1291)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
> 	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1291)
> 	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1312)
> {noformat}
> Because of this, we can't run the test: LimitInputStream can't be loaded, since we depend
> on Guava 15.0, which no longer contains it.
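
A quick way to see which Guava ends up on the test classpath and whether the class Hadoop needs is
resolvable, purely as an illustrative diagnostic (this class is hypothetical, not part of the
Accumulo test suite):

{noformat}
// Hypothetical diagnostic, not part of Accumulo: report which jar supplies
// Guava and whether the class Hadoop's JobSubmitter path needs is present.
import java.security.CodeSource;

public class GuavaClasspathCheck {
    public static void main(String[] args) {
        // Locate the jar that provides a known Guava class.
        CodeSource src = com.google.common.io.ByteStreams.class
                .getProtectionDomain().getCodeSource();
        System.out.println("Guava loaded from: "
                + (src == null ? "<unknown>" : src.getLocation()));

        // Check for the class referenced in the stack trace above.
        try {
            Class.forName("com.google.common.io.LimitInputStream");
            System.out.println("LimitInputStream present (Guava 14.x or earlier)");
        } catch (ClassNotFoundException e) {
            System.out.println("LimitInputStream missing (Guava 15.0+),"
                    + " so JobSubmitter.submitJobInternal will fail as shown above");
        }
    }
}
{noformat}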



