spark-issues mailing list archives

From "Pritpal Singh (JIRA)" <j...@apache.org>
Subject [jira] [Created] (SPARK-24448) File not found on the address SparkFiles.get returns on standalone cluster
Date Fri, 01 Jun 2018 02:19:00 GMT
Pritpal Singh created SPARK-24448:
-------------------------------------

             Summary: File not found on the address SparkFiles.get returns on standalone cluster
                 Key: SPARK-24448
                 URL: https://issues.apache.org/jira/browse/SPARK-24448
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.2.1
            Reporter: Pritpal Singh


I want to upload a file to all worker nodes in a standalone cluster and retrieve the location of the file. Here is my code:

import java.io.File
import org.apache.spark.SparkFiles

// Stage the keystore so Spark ships it to every node running this job.
val tempKeyStoreLoc = System.getProperty("java.io.tmpdir") + "/keystore.jks"
val file = new File(tempKeyStoreLoc)
sparkContext.addFile(file.getAbsolutePath)

// Ask Spark where the distributed copy of the file lives.
val keyLoc = SparkFiles.get("keystore.jks")

SparkFiles.get returns a seemingly random location where keystore.jks does not exist. I submit the job in cluster mode. In fact, the location SparkFiles.get returns does not exist on any of the worker nodes (including the driver node).

I observed that Spark does copy keystore.jks to the worker nodes, at <SPARK_HOME>/work/<app_id>/<partition_id>/keystore.jks. The <partition_id> component changes from one worker node to another.
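
For context, a minimal sketch (assuming the keystore was added with sparkContext.addFile as above; the partition count of 4 is arbitrary) of resolving the path from inside a task instead of on the driver. SparkFiles.get is evaluated in each executor, so it reports that executor's local copy rather than one common path:

import org.apache.spark.SparkFiles

// Resolve the staged file's path on each executor by calling SparkFiles.get
// inside the task, then collect the per-executor paths back to the driver.
val pathsSeenByExecutors = sparkContext
  .parallelize(1 to 4, numSlices = 4)
  .map(_ => SparkFiles.get("keystore.jks"))
  .collect()

pathsSeenByExecutors.foreach(println)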

My requirement is to upload a file to all nodes of a cluster and retrieve its location. I expect the location to be the same across all worker nodes.
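
For illustration only, a sketch of one alternative that avoids depending on a shared path (this is not the addFile mechanism itself, and it assumes the keystore is readable on the driver): broadcast the file's bytes and let each executor materialize its own local copy.

import java.nio.file.{Files, Paths}

// Read the keystore once on the driver and broadcast its bytes to all executors.
val keystoreBytes = Files.readAllBytes(Paths.get(tempKeyStoreLoc))
val keystoreBroadcast = sparkContext.broadcast(keystoreBytes)

sparkContext.parallelize(1 to 4, numSlices = 4).foreach { _ =>
  // Write the broadcast bytes to an executor-local temp file before use.
  val localCopy = Files.createTempFile("keystore", ".jks")
  Files.write(localCopy, keystoreBroadcast.value)
}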

 

 



