spark-dev mailing list archives

From Jean-Baptiste Onofré <...@nanthrax.net>
Subject Re: SPARK_MASTER_IP actually expects a DNS name, not IP address
Date Fri, 16 Oct 2015 16:05:18 GMT
Hi Nick,

the Spark master address can be defined in conf/spark-defaults.conf, and 
there's also the -h option that you can pass to the sbin/start-master.sh script.
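
In conf/spark-defaults.conf that would be a line of this shape (a sketch 
with a placeholder address):

spark.master    spark://xxx.xxx.xxx.xxx:7077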

Did you try:

sbin/start-master.sh -h xxx.xxx.xxx.xxx

and then use the IP when you start the slaves:

sbin/start-slave.sh spark://xxx.xxx.xxx.xxx:7077

?

Regards
JB

On 10/16/2015 06:01 PM, Nicholas Chammas wrote:
> I'd look into tracing a possible bug here, but I'm not sure where to
> look. Searching the codebase for `SPARK_MASTER_IP`, amazingly, does not
> show it being used in any place directly by Spark
> <https://github.com/apache/spark/search?utf8=%E2%9C%93&q=SPARK_MASTER_IP>.
>
> Clearly, Spark is using this environment variable (otherwise I wouldn't
> see the behavior described in my first email), but I can't see where.
>
> Can someone give me a pointer?
>
> Nick
>
> On Thu, Oct 15, 2015 at 12:37 AM Ted Yu <yuzhihong@gmail.com
> <mailto:yuzhihong@gmail.com>> wrote:
>
>     Some old bits:
>
>     http://stackoverflow.com/questions/28162991/cant-run-spark-1-2-in-standalone-mode-on-mac
>     http://stackoverflow.com/questions/29412157/passing-hostname-to-netty
>
>     FYI
>
>     On Wed, Oct 14, 2015 at 7:10 PM, Nicholas Chammas
>     <nicholas.chammas@gmail.com <mailto:nicholas.chammas@gmail.com>> wrote:
>
>         I’m setting the Spark master address via the |SPARK_MASTER_IP|
>         environment variable in |spark-env.sh|, like spark-ec2 does
>         <https://github.com/amplab/spark-ec2/blob/a990752575cd8b0ab25731d7820a55c714798ec3/templates/root/spark/conf/spark-env.sh#L13>.
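>
>         Concretely, the line in question has this shape (a sketch with
>         a placeholder value; spark-ec2 substitutes the real master
>         address at launch time):
>
>         |export SPARK_MASTER_IP=54.210.XX.XX|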
>
>         The funny thing is that Spark seems to accept this only if the
>         value of |SPARK_MASTER_IP| is a DNS name and not an IP address.
>
>         When I provide an IP address, I get errors in the log when
>         starting the master:
>
>         |15/10/15 01:47:31 ERROR NettyTransport: failed to bind to
>         /54.210.XX.XX:7077, shutting down Netty transport |
>
>         (XX is my redaction of the full IP address.)
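>
>         Two quick checks on the master machine can narrow this down (a
>         diagnostic sketch, not from the original report; the hostname
>         below is a placeholder):
>
>         # addresses actually assigned to local interfaces; on EC2 the
>         # public IP is NAT'd and usually won't show up here
>         ip addr show | grep 'inet '
>
>         # what a name resolves to from inside the instance; an EC2
>         # public DNS name typically resolves to the private, bindable
>         # address
>         getent hosts ec2-54-210-XX-XX.compute-1.amazonaws.com
>
>         If the address the master tries to bind isn't in the first
>         list, a bind failure like the one above is expected.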
>
>         Am I misunderstanding something about how to use this
>         environment variable?
>
>         The spark-env.sh template indicates that either an IP address or
>         a hostname should work
>         <https://github.com/apache/spark/blob/4ace4f8a9c91beb21a0077e12b75637a4560a542/conf/spark-env.sh.template#L49>,
>         but my testing shows that only hostnames work.
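>
>         For reference, the template's relevant line reads roughly as
>         follows (paraphrased from that revision):
>
>         # - SPARK_MASTER_IP, to bind the master to a different IP
>         #   address or hostname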
>
>         Nick
>

-- 
Jean-Baptiste Onofré
jbonofre@apache.org
http://blog.nanthrax.net
Talend - http://www.talend.com


