flink-issues mailing list archives

From zentol <...@git.apache.org>
Subject [GitHub] flink pull request: [FLINK-3491] Prevent failure of HDFSCopyUtilit...
Date Wed, 24 Feb 2016 10:28:44 GMT
GitHub user zentol opened a pull request:

    https://github.com/apache/flink/pull/1703

    [FLINK-3491] Prevent failure of HDFSCopyUtilitiesTest on Windows

    This PR contains two commits that prevent this test from failing on Windows.
    
    The first commit resolves the problem raised in the JIRA: it changes how the URI is generated, using `new Path(file).toUri()` instead of `file.toUri()`, since the latter fails for Windows paths.
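
    The underlying parsing problem can be illustrated with plain `java.net.URI` (a hypothetical demo, not code from this PR): a bare Windows-style path string is a syntactically valid URI in which the drive letter is mistaken for the URI scheme.
    ```java
    import java.net.URI;

    public class DriveLetterDemo {
        public static void main(String[] args) {
            // A raw Windows-style path string parses as a URI whose scheme
            // is the drive letter, which breaks code expecting a "file"
            // scheme or a plain filesystem path.
            URI u = URI.create("C:/tmp/test.txt");
            System.out.println(u.getScheme()); // prints "C"
            System.out.println(u.getPath());   // prints "/tmp/test.txt"
        }
    }
    ```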
    
    After this change, I got a new exception when running the test:
    ```
    testCopyFromLocal(org.apache.flink.streaming.util.HDFSCopyUtilitiesTest)  Time elapsed: 1.892 sec  <<< ERROR!
    java.lang.NullPointerException: null
            at java.lang.ProcessBuilder.start(ProcessBuilder.java:1012)
            at org.apache.hadoop.util.Shell.runCommand(Shell.java:445)
            at org.apache.hadoop.util.Shell.run(Shell.java:418)
            at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
            at org.apache.hadoop.util.Shell.execCommand(Shell.java:739)
            at org.apache.hadoop.util.Shell.execCommand(Shell.java:722)
            at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:631)
            at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:468)
            at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:456)
            at org.apache.hadoop.fs.ChecksumFileSystem.create(ChecksumFileSystem.java:424)
            at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:907)
            at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:888)
            at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:785)
            at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:365)
            at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
            at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:289)
            at org.apache.hadoop.fs.LocalFileSystem.copyFromLocalFile(LocalFileSystem.java:82)
            at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1837)
            at org.apache.flink.streaming.util.HDFSCopyFromLocal$1.run(HDFSCopyFromLocal.java:49)
    
    ```
    
    After googling a bit, it appears that several Hadoop versions can't deal with Windows paths unless they have access to native libraries/DLLs (see https://issues.apache.org/jira/browse/SPARK-6961). As such I added a second commit that disables the test when run on Windows.
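
    The second commit itself isn't shown in this message, but an OS guard for a JUnit test typically rests on a check like the following sketch (class and method names here are illustrative assumptions, not the PR's actual code):
    ```java
    public class OsCheck {
        // Returns true when the given os.name value identifies Windows.
        static boolean isWindows(String osName) {
            return osName.toLowerCase().startsWith("windows");
        }

        public static void main(String[] args) {
            // In a JUnit test this result would typically be passed to
            // org.junit.Assume.assumeTrue(...) so the test is skipped
            // (not failed) on Windows.
            boolean windows = isWindows(System.getProperty("os.name"));
            System.out.println(windows ? "skipping test on Windows" : "running test");
        }
    }
    ```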

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/zentol/flink 3491_test_hdfscopy

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/flink/pull/1703.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #1703
    
----
commit 124da4220968ce331ebd9c45cdf35e4a4074848b
Author: zentol <s.motsu@web.de>
Date:   2016-02-24T10:14:23Z

    [FLINK-3491] Prevent URIException in HDFSCopyTest

commit c458f305ad23f9f075993133ffcf5ed5dd606eb4
Author: zentol <s.motsu@web.de>
Date:   2016-02-24T10:14:52Z

    Disable HDFSCopyUtilitiesTest on Windows

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---
