spark-user mailing list archives

From "Sun, Rui" <rui....@intel.com>
Subject RE: sparkR 1.5.1 batch yarn-client mode failing on daemon.R not found
Date Mon, 02 Nov 2015 07:01:15 GMT
Tom,

Have you set the “MASTER” environment variable on your machine? If so, what is its value?
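(For reference, this can be checked from inside R itself with base-R calls; a minimal sketch, not part of the original exchange:

Sys.getenv("MASTER")    # returns "" if the variable is unset
Sys.unsetenv("MASTER")  # clears a stale value before calling sparkR.init()
)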

From: Tom Stewart [mailto:stewartthomasj@yahoo.com.INVALID]
Sent: Friday, October 30, 2015 10:11 PM
To: user@spark.apache.org
Subject: sparkR 1.5.1 batch yarn-client mode failing on daemon.R not found

I have the following script in a file named test.R:

library(SparkR)
sc <- sparkR.init(master="yarn-client")      # start SparkR against YARN in client mode
sqlContext <- sparkRSQL.init(sc)             # create the SQL context
df <- createDataFrame(sqlContext, faithful)  # distribute the built-in faithful data.frame
showDF(df)                                   # print the first rows of the DataFrame
sparkR.stop()                                # shut down the Spark context
q(save="no")                                 # quit R without saving the workspace

If I submit this with "sparkR test.R", "R CMD BATCH test.R", or "Rscript test.R", it fails
with this error:
15/10/29 08:08:49 INFO r.BufferedStreamThread: Fatal error: cannot open file '/mnt/hdfs9/yarn/nm-local-dir/usercache/hadoop/appcache/application_1446058618330_0171/container_e805_1446058618330_0171_01_000005/sparkr/SparkR/worker/daemon.R': No such file or directory
15/10/29 08:08:59 ERROR executor.Executor: Exception in task 0.0 in stage 1.0 (TID 1)
java.net.SocketTimeoutException: Accept timed out


However, if I launch an interactive sparkR shell and paste those same commands, it runs fine.
It also runs fine on the same Hadoop cluster with Spark 1.4.1, and it runs fine in batch mode
if I call sparkR.init() with no arguments instead of sparkR.init(master="yarn-client").
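
(One way to act on the question above is to print the relevant environment variables from both
the working interactive shell and the failing batch run, then compare the output; a minimal
sketch using base-R calls. SPARK_HOME is included here as a likely suspect, not something named
in the thread:

# Run in both the interactive sparkR shell and the batch script, then diff:
cat("MASTER:", Sys.getenv("MASTER"), "\n")
cat("SPARK_HOME:", Sys.getenv("SPARK_HOME"), "\n")
)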