spark-issues mailing list archives

From "Jason Pan (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-18353) spark.rpc.askTimeout default value is not 120s
Date Thu, 10 Nov 2016 01:11:59 GMT

    [ https://issues.apache.org/jira/browse/SPARK-18353?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15652585#comment-15652585 ]

Jason Pan commented on SPARK-18353:
-----------------------------------

No matter what the default value ends up being, I think we need a way to configure it.
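For reference, the property is already settable per application today; the ask here is that whatever default applies actually be honored and configurable. A minimal sketch of pinning it in code (the property name is from the Spark docs; the object name and the trivial job are mine, purely illustrative):

    import org.apache.spark.{SparkConf, SparkContext}

    object AskTimeoutExample {
      def main(args: Array[String]): Unit = {
        // Pin the RPC ask timeout explicitly instead of relying on the default.
        val conf = new SparkConf()
          .setAppName("AskTimeoutExample")
          .set("spark.rpc.askTimeout", "120s")
        val sc = new SparkContext(conf)
        sc.parallelize(1 to 1000).count() // trivial job, just to exercise RPC
        sc.stop()
      }
    }

The same value can be supplied at submit time with --conf spark.rpc.askTimeout=120s; the report below is that standalone cluster mode appears to inject its own -D value into the driver launch command instead.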

> spark.rpc.askTimeout default value is not 120s
> ----------------------------------------------
>
>                 Key: SPARK-18353
>                 URL: https://issues.apache.org/jira/browse/SPARK-18353
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.1, 2.0.1
>         Environment: Linux zzz 3.10.0-327.el7.x86_64 #1 SMP Thu Oct 29 17:29:29 EDT 2015
> x86_64 x86_64 x86_64 GNU/Linux
>            Reporter: Jason Pan
>            Priority: Critical
>
> In http://spark.apache.org/docs/latest/configuration.html:
> spark.rpc.askTimeout  120s  Duration for an RPC ask operation to wait before timing out
> So the default value is documented as 120s.
> However, when I run "spark-submit" in standalone cluster mode, the launch command is:
> Launch Command: "/opt/jdk1.8.0_102/bin/java" "-cp" "/opt/spark-2.0.1-bin-hadoop2.7/conf/:/opt/spark-2.0.1-bin-hadoop2.7/jars/*"
"-Xmx1024M" "-Dspark.eventLog.enabled=true" "-Dspark.master=spark://9.111.159.127:7101" "-Dspark.driver.supervise=false"
"-Dspark.app.name=org.apache.spark.examples.SparkPi" "-Dspark.submit.deployMode=cluster" "-Dspark.jars=file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-examples-1.6.1-hadoop2.6.0.jar"
"-Dspark.history.ui.port=18087" "-Dspark.rpc.askTimeout=10" "-Dspark.history.fs.logDirectory=file:/opt/tmp/spark-event"
"-Dspark.eventLog.dir=file:///opt/tmp/spark-event" "org.apache.spark.deploy.worker.DriverWrapper"
"spark://Worker@9.111.159.127:7103" "/opt/spark-2.0.1-bin-hadoop2.7/work/driver-20161109031939-0002/spark-examples-1.6.1-hadoop2.6.0.jar"
"org.apache.spark.examples.SparkPi" "1000"
> -Dspark.rpc.askTimeout=10
> The value is 10, which does not match the documented default.
> Note: when I submit to the REST URL, this issue does not occur.
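For context on why the documented 120s may not be what takes effect: the effective ask timeout is resolved with a fallback chain, so any explicitly supplied value (such as the -Dspark.rpc.askTimeout=10 injected above) wins over the documented default. A minimal sketch of that resolution, mirroring my reading of Spark's RpcUtils fallback rather than verbatim source (the helper name is mine):

    import org.apache.spark.SparkConf

    object AskTimeoutResolution {
      // Effective ask timeout: an explicit spark.rpc.askTimeout wins,
      // then spark.network.timeout, and only then the documented 120s default.
      def effectiveAskTimeout(conf: SparkConf): String =
        conf.getOption("spark.rpc.askTimeout")
          .orElse(conf.getOption("spark.network.timeout"))
          .getOrElse("120s")

      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
        println(effectiveAskTimeout(conf)) // "120s" when nothing is set
        conf.set("spark.rpc.askTimeout", "10")
        println(effectiveAskTimeout(conf)) // "10", matching the launch command above
      }
    }

So the documented 120s is only a last-resort default; whatever injects the -D flag into the driver launch command effectively overrides it.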



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

