livy-user mailing list archives

From kant kodali <kanth...@gmail.com>
Subject Re: What happens if Livy server crashes ? All the spark jobs are gone?
Date Wed, 21 Mar 2018 08:55:36 GMT
I have the following but I am unable to successfully submit a job through
Livy in cluster mode.

Here are my settings:

# spark-defaults.conf

spark.master yarn


# livy.conf

livy.spark.master = yarn
livy.spark.deploy-mode = cluster
livy.server.recovery.mode = recovery
livy.server.recovery.state-store = zookeeper
livy.server.recovery.state-store.url = localhost:2181


Anything wrong with this conf?


Thanks!
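For reference, once Livy is up, a minimal batch submission against its REST
API can be used to check whether cluster-mode submission works at all. This
is only a sketch: the jar path and class name below are hypothetical
placeholders, and 8998 is Livy's default port.

import requests  # third-party HTTP client, assumed installed

# Placeholder application jar and entry class; substitute your own.
payload = {
    "file": "hdfs:///jars/my-streaming-app.jar",
    "className": "com.example.MyStreamingApp",
}

# POST /batches asks Livy to spark-submit the job; a 201 response
# carries the new batch session's id and state.
resp = requests.post("http://localhost:8998/batches", json=payload)
print(resp.status_code, resp.json())

The batch can then be polled at /batches/<id> to see whether it reaches the
running state.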

On Tue, Mar 20, 2018 at 5:38 PM, kant kodali <kanth909@gmail.com> wrote:

> got it! Is it livy.spark.deploy-mode = yarn-cluster or livy.spark.deploy-mode
> = cluster? Sorry to ask this question. I couldn't find it in the docs or the
> comments in livy.conf, and I am using Livy 0.4.0
>
> On Tue, Mar 20, 2018 at 5:01 PM, Meisam Fathi <meisam.fathi@gmail.com>
> wrote:
>
>> If you are running in cluster mode, the application should keep running
>> on YARN.
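For what it's worth, the Livy 0.4 livy.conf template appears to treat the
master and the deploy mode as two separate settings, so yarn-cluster would
be spelled as below (a sketch based on that template; the older combined
form livy.spark.master = yarn-cluster may also be accepted):

# livy.conf -- master and deploy mode as separate settings
livy.spark.master = yarn
livy.spark.deploy-mode = cluster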
>>
>> On Tue, Mar 20, 2018 at 3:34 PM kant kodali <kanth909@gmail.com> wrote:
>>
>>> @Meisam Fathi I am running with YARN and ZooKeeper as a state store. I
>>> spawned a job via Livy that reads from Kafka and writes to Kafka,
>>> but the moment I kill the Livy server the job is also getting killed.
>>> Not sure why. I believe once the Livy server crashes the Spark context
>>> also gets killed, so do I need to set livy.spark.deploy-mode? If so,
>>> what value should I set it to?
>>>
>>>
>>> On Mon, Mar 12, 2018 at 12:30 PM, Meisam Fathi <meisam.fathi@gmail.com>
>>> wrote:
>>>
>>>> On YARN, your application keeps running even if the launcher fails. So
>>>> after recovery, Livy reconnects to the application. On Spark standalone,
>>>> I am not sure what happens to the application if the launcher fails.
>>>>
>>>> Thanks,
>>>> Meisam
>>>>
>>>> On Mon, Mar 12, 2018 at 10:34 AM kant kodali <kanth909@gmail.com>
>>>> wrote:
>>>>
>>>>> Can someone please explain how YARN helps here? And why not the Spark
>>>>> master?
>>>>>
>>>>> On Mon, Mar 12, 2018 at 3:41 AM, Matteo Durighetto <
>>>>> m.durighetto@miriade.it> wrote:
>>>>>
>>>>>>
>>>>>>
>>>>>> 2018-03-12 9:58 GMT+01:00 kant kodali <kanth909@gmail.com>:
>>>>>>
>>>>>>> Sorry, I see there is a recovery mode and also I can set the state
>>>>>>> store to zookeeper, but it looks like I need YARN, because I get the
>>>>>>> error message below:
>>>>>>>
>>>>>>> "requirement failed: Session recovery requires YARN"
>>>>>>>
>>>>>>> I am using Spark standalone and I don't use YARN anywhere in my
>>>>>>> cluster. Is there any other option for recovery in this case?
>>>>>>>
>>>>>>>
>>>>>>> On Sun, Mar 11, 2018 at 11:57 AM, kant kodali <kanth909@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> Hi All,
>>>>>>>>
>>>>>>>> When my Livy server crashes it looks like all my Spark jobs are
>>>>>>>> gone. I am trying to see how I can make it more resilient. In other
>>>>>>>> words, I would like Spark jobs that were spawned by Livy to keep
>>>>>>>> running even if my Livy server crashes, because in theory the Livy
>>>>>>>> server can crash anytime and Spark jobs should run for weeks or
>>>>>>>> months in my case. How can I achieve this?
>>>>>>>>
>>>>>>>> Thanks!
>>>>>>>>
>>>>>>>>
>>>>>> Hello,
>>>>>> to enable recovery in Livy you need Spark on YARN
>>>>>>
>>>>>> ( https://spark.apache.org/docs/latest/running-on-yarn.html )
>>>>>>
>>>>>>
>>>>>>
>>>>>> Kind Regards
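
To make that prerequisite concrete: Spark only finds the YARN
ResourceManager through the Hadoop client configuration, so on the Livy
host something like the following is needed before the recovery settings
can take effect (the config path below is a placeholder for your
environment):

# spark-env.sh (or the Livy server's environment)
# Directory containing yarn-site.xml and core-site.xml; placeholder path.
export HADOOP_CONF_DIR=/etc/hadoop/conf

# spark-defaults.conf
spark.master yarn

With that in place, livy.server.recovery.mode = recovery plus the ZooKeeper
state store from earlier in the thread should survive a Livy restart, since
in cluster mode the driver lives in a YARN container rather than in the
Livy process.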
>>>>>>
>>>>>
>>>>>
>>>
>
