hadoop-common-dev mailing list archives

From Ewan Higgs <Ewan.Hi...@wdc.com>
Subject Re: About supporting upcoming java versions
Date Thu, 08 Nov 2018 23:34:46 GMT
Hi all,
Reporting bugs to Java is a bit weird/nontrivial, so I'm sympathetic to Owen's situation. The
OpenJDK bug tracker requires users to be committers, so no one can comment unless they're
already contributing.

Their tracker is here:
https://bugs.openjdk.java.net/projects/JDK/issues

To actually file a bug, the form is here:
https://bugreport.java.com/bugreport

Yours,
Ewan

On 07/11/2018, 11:18, "Steve Loughran" <stevel@hortonworks.com> wrote:

    
    If there are problems with JDK 11 then we should be talking to Oracle about them to have
    them fixed. Is there an ASF JIRA on this issue yet?
    
    As usual, the large physical clusters will be slow to upgrade, but the smaller cloud ones
    can get away with being agile, and since YARN lets you run code against a different JVM
    path, people can mix things. This makes it possible for people to run Java 11+ apps even if
    Hadoop itself is on Java 8.
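    As a rough sketch of what that can look like (assuming submission through the YARN client
    API; the /opt/jdk-11 location is purely illustrative, not something from this thread), an
    application can override JAVA_HOME for its own containers:

        import java.util.Collections;

        import org.apache.hadoop.yarn.api.records.ApplicationSubmissionContext;
        import org.apache.hadoop.yarn.api.records.ContainerLaunchContext;
        import org.apache.hadoop.yarn.util.Records;

        /** Rough sketch: give the AM container its own JAVA_HOME, independent of the cluster JVM. */
        public class Java11AppSubmitter {

          public static ApplicationSubmissionContext buildContext() {
            ContainerLaunchContext amContainer = Records.newRecord(ContainerLaunchContext.class);
            // Assumed JDK location on the nodes; illustration only.
            amContainer.setEnvironment(Collections.singletonMap("JAVA_HOME", "/opt/jdk-11"));

            ApplicationSubmissionContext appContext =
                Records.newRecord(ApplicationSubmissionContext.class);
            appContext.setAMContainerSpec(amContainer);
            return appContext;
          }
        }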
    
    And this time we may want to think about which release we declare "ready for Java 11",
    being proactive rather than lagging behind the public releases by many years (6=>7, 7=>8).
    Of course, we'll have to stay with the Java 8 language for a while, but there's a lot more
    we can do there in our code. I'm currently (HADOOP-14556) embracing Optional, as it makes
    explicit when things are potentially null, and while it's crippled by the Java language
    itself (http://steveloughran.blogspot.com/2018/10/javas-use-of-checked-exceptions.html),
    it's still something we can embrace (*)
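    To illustrate the idea (a minimal sketch with made-up names, not the HADOOP-14556 code),
    returning Optional forces callers to handle the "maybe absent" case:

        import java.util.Optional;

        /** Minimal sketch: Optional makes "this may be absent" explicit at the call site. */
        public class EncryptionLookup {

          /** Returns the encryption algorithm configured for a path, if any. */
          public Optional<String> getEncryptionAlgorithm(String path) {
            // Returning null here would be easy for callers to forget to check;
            // Optional forces them to decide what to do when nothing is configured.
            if (path.startsWith("/secure/")) {
              return Optional.of("AES256");
            }
            return Optional.empty();
          }

          public String describe(String path) {
            return getEncryptionAlgorithm(path)
                .map(algo -> "encrypted with " + algo)
                .orElse("unencrypted");
          }
        }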
    
    
    Takanobu,
    
    I've been watching the work you, Akira and others have been putting in for Java 9+ support,
    and it's wonderful. If we had an annual award for "persevering in the presence of extreme
    suffering", it'd be the top candidate for this year's work.
    
    It means we are lined up to let people run Hadoop on Java 11 if they want, and gives us the
    option of moving to Java 11 sooner rather than later. I'm also looking at JUnit 5, wondering
    when I can embrace it fully (i.e. not worry about cherry-picking code into JUnit 4 tests).
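    For reference, this is the kind of thing JUnit 5 gives us out of the box (a hypothetical
    test, not code from the branch): lambda-based exception assertions with no JUnit 4 rules or
    helper classes.

        import static org.junit.jupiter.api.Assertions.assertEquals;
        import static org.junit.jupiter.api.Assertions.assertThrows;

        import java.nio.file.Files;
        import java.nio.file.NoSuchFileException;
        import java.nio.file.Paths;

        import org.junit.jupiter.api.Test;

        public class TestJUnit5Style {

          @Test
          public void testMissingFileRaises() {
            // assertThrows returns the exception, so follow-up assertions are easy.
            NoSuchFileException ex = assertThrows(NoSuchFileException.class,
                () -> Files.size(Paths.get("/no/such/file")));
            assertEquals("/no/such/file", ex.getFile());
          }
        }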
    
    Thanks for all your work
    
    -Steve
    
    (*) I also have in the test code of that branch a binding of UGI.doAs which takes closures:
    
    https://github.com/steveloughran/hadoop/blob/s3/HADOOP-14556-delegation-token/hadoop-common-project/hadoop-common/src/test/java/org/apache/hadoop/test/LambdaTestUtils.java#L865
    
    
    That lets me do things like:
    
        assertEquals("FS username in doAs()",
            ALICE,
            doAs(bobUser, () -> fs.getUsername()));
    
    If someone wants to actually pull this support into UGI itself, I'm happy to review it, as
    moving our doAs code to things like bobUser.doAs(() -> fs.create(path)) will transform all
    those UGI code users.
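    For anyone curious, the shape of such a binding is roughly this (a minimal sketch, not the
    actual LambdaTestUtils implementation, and with no error translation):

        import java.security.PrivilegedExceptionAction;

        import org.apache.hadoop.security.UserGroupInformation;

        /** Minimal sketch of a lambda-friendly doAs helper. */
        public final class DoAsUtils {

          private DoAsUtils() {
          }

          /**
           * Run the action as the given user and return its result.
           * PrivilegedExceptionAction has a single abstract method, so a lambda fits directly.
           */
          public static <T> T doAs(UserGroupInformation user, PrivilegedExceptionAction<T> action)
              throws Exception {
            return user.doAs(action);
          }
        }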
    
    On 6 Nov 2018, at 05:57, Takanobu Asanuma <tasanuma@apache.org> wrote:
    
    Thanks for your reply, Owen.
    
    > That said, I’d be surprised if the work items for JDK 9 and 10 aren’t a
    > strict subset of the issues getting to JDK 11.
    
    Most of the issues that we have fixed are a subset of the ones for JDK 11, but there seem
    to be some exceptions. HADOOP-15905 is a bug in JDK 9/10 which has been fixed in JDK 11. It
    is difficult to fix since JDK 9/10 are already EOL. I wonder how we should treat that kind
    of error going forward.
    
    > I've hit at least one pretty serious JVM bug in JDK 11
    Could you please share the details?
    
    In any case, we should be careful about which version of Hadoop is ready for JDK 11. It
    will take some time yet. And we also need to keep supporting JDK 8 for a while.
    
    Regards,
    - Takanobu
    
    
    
    
