spark-dev mailing list archives

From hsaputra <...@git.apache.org>
Subject [GitHub] incubator-spark pull request: SPARK-1111: URL Validation Throws Er...
Date Fri, 21 Feb 2014 18:31:57 GMT
Github user hsaputra commented on a diff in the pull request:

    https://github.com/apache/incubator-spark/pull/625#discussion_r9955979
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/ClientArguments.scala ---
    @@ -115,3 +110,7 @@ private[spark] class ClientArguments(args: Array[String]) {
         System.exit(exitCode)
       }
     }
    +
    +object ClientArguments {
    +  def isValidJarUrl(s: String) = s.matches("(.+):(.+)jar")
    --- End diff --
    
    Ah, I did not know Hadoop is OK with the file:/ pattern; I always use file://
    
    Thanks for the response, @pwendell
    
    But I suppose we do want a URL rather than a URI for the JAR path location.
    So this PR is just to "relax" the java.net.URL requirement of having the double slashes?
    If that is the case, then maybe we could update the comment to also include the forms without "//":
    
    e.g. hdfs://XX.jar, file:/XX.jar, file:XX.jar
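    
    For illustration, here is a minimal standalone sketch (not part of the PR; the object name and test strings are made up) showing how the regex from the quoted diff treats the example forms above:
    
        object JarUrlCheckSketch {
          // Same pattern as in the quoted diff: something, a colon, something, ending in "jar".
          // It does not require "//" after the scheme, which is the relaxation being discussed.
          def isValidJarUrl(s: String): Boolean = s.matches("(.+):(.+)jar")
    
          def main(args: Array[String]): Unit = {
            // All three example forms from the comment above are accepted:
            Seq("hdfs://XX.jar", "file:/XX.jar", "file:XX.jar")
              .foreach(u => println(s"$u -> ${isValidJarUrl(u)}"))  // all print true
            // A bare path with no scheme (no colon) is rejected:
            println(s"XX.jar -> ${isValidJarUrl("XX.jar")}")        // prints false
          }
        }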
    
    Sorry about all the questions on such a small PR; I just wanted to understand the reasoning.
    I am always nervous whenever we add custom URL/URI validation.


