spark-issues mailing list archives

From "Christophe Boivin (JIRA)" <j...@apache.org>
Subject [jira] [Comment Edited] (SPARK-13284) Cannot submit app from Windows java.io.FileNotFoundException: /C:
Date Tue, 26 Jul 2016 09:37:20 GMT

    [ https://issues.apache.org/jira/browse/SPARK-13284?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15393453#comment-15393453 ]

Christophe Boivin edited comment on SPARK-13284 at 7/26/16 9:37 AM:
--------------------------------------------------------------------

-Same problem here with Spark 1.6.2 on Windows-
-The problem occurs with --deploy-mode cluster but not --deploy-mode client-

My bad, I didn't read the docs correctly: in cluster mode your file needs to be accessible on every worker.
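
For what it's worth, a minimal sketch of what "accessible on every worker" could look like: stage the jar somewhere every worker can read it and pass that URI to spark-submit instead of a local C:/ path. The HDFS location and namenode address below are made-up placeholders, not taken from this issue, and the staging step assumes a Hadoop client is available on the Windows machine:

{code}
REM Hypothetical staging step: copy the jar to a location every worker can read.
C:\>hdfs dfs -put C:/spark-app-1.0-SNAPSHOT.jar /apps/spark-app-1.0-SNAPSHOT.jar

REM Cluster-mode submission pointing at the shared location (namenode host/port are placeholders).
C:\>"C:/spark-1.6.0-bin-hadoop2.6/bin/spark-submit.cmd" --master spark://10.134.41.16:6066 --deploy-mode cluster --class com.company.app.JavaDirectKafkaWordCount hdfs://namenode:8020/apps/spark-app-1.0-SNAPSHOT.jar kafka-server:9092 dessie
{code}

Copying the jar to the same local path on every worker node should work as well, since each worker then resolves the file URI against its own filesystem.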


was (Author: cboivin):
Same problem here with spark 1.6.2 on windows
The problem occurs with --deploy-mode cluster but not --deploy-mode client

> Cannot submit app from Windows java.io.FileNotFoundException: /C:
> -----------------------------------------------------------------
>
>                 Key: SPARK-13284
>                 URL: https://issues.apache.org/jira/browse/SPARK-13284
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.6.0
>            Reporter: DK
>
> I've tried multiple different ways to submit my Spark Java application from my Windows machine to a remote Spark master running on Linux.
> I'm consistently hitting a {{java.io.FileNotFoundException: /C:}} error.
> I've tried the jar path both with and without surrounding quotes.
> I've also tried including {{file:///}} and {{file:/}}.
> The file has full permissions:
> {code}
> C:\>Icacls C:/spark-app-1.0-SNAPSHOT.jar
> C:/spark-app-1.0-SNAPSHOT.jar BUILTIN\Administrators:(M,WDAC,WO)
>                               GLOBAL\Domain Users:(RX,W)
>                               Everyone:(RX,W)
>                               NT AUTHORITY\Authenticated Users:(RX,W)
>                               NT AUTHORITY\SYSTEM:(I)(RX,W)
>                               BUILTIN\Users:(I)(RX)
>                               NT AUTHORITY\Authenticated Users:(I)(RX,W)
>                               Mandatory Label\High Mandatory Level:(I)(NW)
> {code}
> Submit failure:
> {code}
> C:\>"C:/spark-1.6.0-bin-hadoop2.6/bin/spark-submit.cmd" --master spark://10.134.41.16:6066 --deploy-mode cluster --class com.company.app.JavaDirectKafkaWordCount
> file:///C:/spark-app-1.0-SNAPSHOT.jar kafka-server:9092 dessie
> Running Spark using the REST application submission protocol.
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> 16/02/11 16:00:58 INFO RestSubmissionClient: Submitting a request to launch an application in spark://10.134.41.16:6066.
> 16/02/11 16:00:58 INFO RestSubmissionClient: Submission successfully created as driver-20160211155905-0013. Polling submission state...
> 16/02/11 16:00:58 INFO RestSubmissionClient: Submitting a request for the status of submission driver-20160211155905-0013 in spark://10.134.41.16:6066.
> 16/02/11 16:00:58 INFO RestSubmissionClient: State of driver driver-20160211155905-0013 is now ERROR.
> 16/02/11 16:00:58 INFO RestSubmissionClient: Driver is running on worker worker-20160211152455-172.18.0.8-55808 at 172.18.0.8:55808.
> 16/02/11 16:00:58 ERROR RestSubmissionClient: Exception from the cluster:
> java.io.FileNotFoundException: /C:/spark-app-1.0-SNAPSHOT.jar (No such file or directory)
>         java.io.FileInputStream.open0(Native Method)
>         java.io.FileInputStream.open(FileInputStream.java:195)
>         java.io.FileInputStream.<init>(FileInputStream.java:138)
>         org.spark-project.guava.io.Files$FileByteSource.openStream(Files.java:124)
>         org.spark-project.guava.io.Files$FileByteSource.openStream(Files.java:114)
>         org.spark-project.guava.io.ByteSource.copyTo(ByteSource.java:202)
>         org.spark-project.guava.io.Files.copy(Files.java:436)
>         org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$copyRecursive(Utils.scala:539)
>         org.apache.spark.util.Utils$.copyFile(Utils.scala:510)
>         org.apache.spark.util.Utils$.doFetchFile(Utils.scala:595)
>         org.apache.spark.util.Utils$.fetchFile(Utils.scala:394)
>         org.apache.spark.deploy.worker.DriverRunner.org$apache$spark$deploy$worker$DriverRunner$$downloadUserJar(DriverRunner.scala:150)
>         org.apache.spark.deploy.worker.DriverRunner$$anon$1.run(DriverRunner.scala:79)
> 16/02/11 16:00:58 INFO RestSubmissionClient: Server responded with CreateSubmissionResponse:
> {
>   "action" : "CreateSubmissionResponse",
>   "message" : "Driver successfully submitted as driver-20160211155905-0013",
>   "serverSparkVersion" : "1.6.0",
>   "submissionId" : "driver-20160211155905-0013",
>   "success" : true
> }
> {code}
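
The stack trace above shows why the path comes out as {{/C:}}: in cluster mode the jar is fetched by the worker ({{DriverRunner.downloadUserJar}} -> {{Utils.fetchFile}}), so {{file:///C:/spark-app-1.0-SNAPSHOT.jar}} is resolved as the local path {{/C:/spark-app-1.0-SNAPSHOT.jar}} on the Linux worker, where no such file exists. As the comment above notes, {{--deploy-mode client}} does not hit this because the driver runs on the submitting Windows machine and reads the jar locally. A hedged sketch of such a submission; the 7077 master port is the standalone default and an assumption here, and the application arguments are simply reused from the report:

{code}
REM Assumption: spark://host:7077 is the regular standalone master URL (6066 is the REST endpoint used for cluster-mode submission).
C:\>"C:/spark-1.6.0-bin-hadoop2.6/bin/spark-submit.cmd" --master spark://10.134.41.16:7077 --deploy-mode client --class com.company.app.JavaDirectKafkaWordCount C:/spark-app-1.0-SNAPSHOT.jar kafka-server:9092 dessie
{code}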




