spark-user mailing list archives

From James King <jakwebin...@gmail.com>
Subject Re: spark-defaults.conf
Date Tue, 28 Apr 2015 11:13:21 GMT
So, no takers on why spark-defaults.conf is not being picked up.
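(For anyone hitting the same problem: the Worker's own usage output, quoted further down in this thread, advertises a --properties-file flag, so the file can be passed explicitly rather than relying on conf-dir discovery. A sketch, assuming the stock 1.3.0 binary layout and that start-slave.sh forwards extra arguments through to the Worker:)

```shell
# Point the worker at the properties file explicitly instead of
# relying on conf/ resolution. Paths assume the default download layout.
export SPARK_CONF_DIR="$PWD/spark-1.3.0-bin-hadoop2.4/conf"

spark-1.3.0-bin-hadoop2.4/sbin/start-slave.sh 1 spark://somemaster:7077 \
  --properties-file "$SPARK_CONF_DIR/spark-defaults.conf"
```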

Here is another one:

If ZooKeeper is configured in Spark, why do we need to start a slave like
this:

spark-1.3.0-bin-hadoop2.4/sbin/start-slave.sh 1 spark://somemaster:7077

i.e. why do we need to specify the master URL explicitly?

Shouldn't Spark just consult ZK and use the active master?

Or is ZK only used during failover?
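(My understanding, hedged, and worth checking against the standalone-mode HA docs: ZooKeeper handles master election and failover, not discovery. Workers are still given the candidate masters up front, typically as a comma-separated list, and they register with whichever one is currently active. The recovery mode itself is set through SPARK_DAEMON_JAVA_OPTS. A sketch with placeholder hostnames zk1–zk3, master1, master2:)

```shell
# conf/spark-env.sh on each master and worker -- enables ZooKeeper-based
# recovery for the standalone master. Hostnames are placeholders.
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181,zk3:2181 \
  -Dspark.deploy.zookeeper.dir=/spark"

# A worker lists all candidate masters; it registers with the active one,
# and fails over to the other if the active master dies.
spark-1.3.0-bin-hadoop2.4/sbin/start-slave.sh 1 \
  spark://master1:7077,master2:7077
```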


On Mon, Apr 27, 2015 at 1:53 PM, James King <jakwebinbox@gmail.com> wrote:

> Thanks.
>
> I've set SPARK_HOME and SPARK_CONF_DIR appropriately in .bash_profile
>
> But when I start the worker like this:
>
> spark-1.3.0-bin-hadoop2.4/sbin/start-slave.sh
>
> I still get
>
> failed to launch org.apache.spark.deploy.worker.Worker:
>                              Default is conf/spark-defaults.conf.
>   15/04/27 11:51:33 DEBUG Utils: Shutdown hook called
>
>
>
>
>
> On Mon, Apr 27, 2015 at 1:15 PM, Zoltán Zvara <zoltan.zvara@gmail.com>
> wrote:
>
>> You should distribute your configuration file to workers and set the
>> appropriate environment variables, like HADOOP_HOME, SPARK_HOME,
>> HADOOP_CONF_DIR, SPARK_CONF_DIR.
>>
>> On Mon, Apr 27, 2015 at 12:56 PM James King <jakwebinbox@gmail.com>
>> wrote:
>>
>>> I renamed spark-defaults.conf.template to spark-defaults.conf
>>> and invoked
>>>
>>> spark-1.3.0-bin-hadoop2.4/sbin/start-slave.sh
>>>
>>> But I still get
>>>
>>> failed to launch org.apache.spark.deploy.worker.Worker:
>>>     --properties-file FILE   Path to a custom Spark properties file.
>>>                              Default is conf/spark-defaults.conf.
>>>
>>> But I'm thinking it should pick up the default spark-defaults.conf from
>>> the conf dir.
>>>
>>> Am I expecting or doing something wrong?
>>>
>>> Regards
>>> jk
>>>
>>>
>>>
>
