accumulo-notifications mailing list archives

From "Bill Havanki (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (ACCUMULO-3100) Accumulo fails to test against recent Hadoop 2.6.0-SNAPSHOT
Date Fri, 05 Sep 2014 21:04:29 GMT

    [ https://issues.apache.org/jira/browse/ACCUMULO-3100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14123570#comment-14123570 ]

Bill Havanki commented on ACCUMULO-3100:
----------------------------------------

How bad would it be to just include {{LimitInputStream}} from 14.0.1 in the Accumulo codebase,
until Hadoop moves up to 15.0 or later? Yes, maybe a slippery slope and all that, but the gap
between 14.0.1 and 15.0 is probably not so great, and if the policy going forward is to match
Hadoop's Guava version, it doesn't seem like it would get too unwieldy.

And there's always the Maven shade plugin ...
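
For a rough sense of what the first suggestion would pull in: a vendored copy amounts to a small {{FilterInputStream}} that stops reading at a fixed byte limit. The sketch below is not the actual Guava 14.0.1 source; the names and details are only assumptions about what such a class would look like.

{code:java}
import java.io.FilterInputStream;
import java.io.IOException;
import java.io.InputStream;

// Hypothetical sketch of a vendored LimitInputStream-style wrapper, not the
// actual Guava 14.0.1 source: an InputStream that caps how many bytes callers
// may read from the wrapped stream.
public final class LimitInputStream extends FilterInputStream {
  private long left;       // bytes remaining before the limit is reached
  private long mark = -1;  // value of 'left' when mark() was last called

  public LimitInputStream(InputStream in, long limit) {
    super(in);
    if (limit < 0) {
      throw new IllegalArgumentException("limit must be non-negative: " + limit);
    }
    this.left = limit;
  }

  @Override
  public int available() throws IOException {
    return (int) Math.min(in.available(), left);
  }

  @Override
  public synchronized void mark(int readLimit) {
    in.mark(readLimit);
    mark = left;
  }

  @Override
  public int read() throws IOException {
    if (left == 0) {
      return -1; // limit reached: report end of stream
    }
    int result = in.read();
    if (result != -1) {
      --left;
    }
    return result;
  }

  @Override
  public int read(byte[] b, int off, int len) throws IOException {
    if (left == 0) {
      return -1;
    }
    len = (int) Math.min(len, left);
    int result = in.read(b, off, len);
    if (result != -1) {
      left -= result;
    }
    return result;
  }

  @Override
  public synchronized void reset() throws IOException {
    if (!in.markSupported()) {
      throw new IOException("Mark not supported");
    }
    if (mark == -1) {
      throw new IOException("Mark not set");
    }
    in.reset();
    left = mark;
  }

  @Override
  public long skip(long n) throws IOException {
    n = Math.min(n, left);
    long skipped = in.skip(n);
    left -= skipped;
    return skipped;
  }
}
{code}

Shading/relocating Guava instead would avoid carrying such a copy at all, at the cost of a larger artifact.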

> Accumulo fails to test against recent Hadoop 2.6.0-SNAPSHOT
> -----------------------------------------------------------
>
>                 Key: ACCUMULO-3100
>                 URL: https://issues.apache.org/jira/browse/ACCUMULO-3100
>             Project: Accumulo
>          Issue Type: Bug
>          Components: test
>    Affects Versions: 1.6.0
>            Reporter: Josh Elser
>            Assignee: Josh Elser
>             Fix For: 1.6.1, 1.7.0
>
>
> JobSubmitter makes a call out to CryptoUtils to test for encrypted shuffle support that was recently added to branch-2 (specifically HDFS-6134 and HADOOP-10150, looking at the blame)
> {noformat}
> java.lang.NoClassDefFoundError: com/google/common/io/LimitInputStream
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> 	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> 	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:380)
> 	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1294)
> 	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1291)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.security.auth.Subject.doAs(Subject.java:415)
> 	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
> 	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1291)
> 	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1312)
> {noformat}
> Because of this, we can't run the test: LimitInputStream can't be loaded, since we depend on Guava 15.0, which no longer contains it.
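
A minimal, hypothetical probe (not part of the ticket) shows the mismatch directly: with Guava 14.0.1 on the classpath the class resolves, while with Guava 15.0 the lookup fails, which is what surfaces from Hadoop's JobSubmitter as the NoClassDefFoundError above.

{code:java}
import java.io.ByteArrayInputStream;
import java.io.InputStream;

// Hypothetical classpath probe, assuming only that LimitInputStream's
// (InputStream, long) constructor exists in Guava 14.0.1 and earlier.
public class LimitInputStreamProbe {
  public static void main(String[] args) {
    try {
      Class<?> clazz = Class.forName("com.google.common.io.LimitInputStream");
      InputStream in = new ByteArrayInputStream(new byte[16]);
      Object limited = clazz.getConstructor(InputStream.class, long.class)
          .newInstance(in, 8L);
      System.out.println("Loaded " + limited.getClass().getName());
    } catch (ReflectiveOperationException e) {
      // With Guava 15.0 on the classpath this is the outcome: the class is gone.
      System.out.println("LimitInputStream not available: " + e);
    }
  }
}
{code}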



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
