livy-user mailing list archives

From Dimosthenis Masouros <demo.masou...@gmail.com>
Subject Livy Automatically Deletes Log Files
Date Tue, 04 Jul 2017 20:09:39 GMT
Good evening,

Before I state my problem, for reference reasons, my livy.conf file looks
like this:

livy.spark.master = yarn
livy.spark.deployMode = cluster
livy.server.recovery.mode = recovery
livy.server.recovery.state-store = filesystem
livy.server.recovery.state-store.url = file:///root/hdfs/livy
livy.server.request-log-retain.days = 5

I have set up a Livy server on a Hadoop namenode. When I submit a new job,
everything operates normally and the job is submitted successfully.

curl -X POST --data '{"file": "/pi.py", "args": ["10"], "name": "test
livy"}' -H "Content-Type: application/json" namenode:8998/batches

After the job is submitted, I am able to check the status of the application
by running:

curl -X GET namenode:8998/batches/id

which returns a JSON document with the id, state, etc. of the job. If I list
the files in the /root/hdfs/livy folder on the namenode during the job's
execution, I can see a file with the exact same content as the curl response.
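For reference, a minimal sketch of pulling the state and the YARN application
id out of that response. The JSON below is a made-up sample shaped like what
Livy's GET /batches/{id} returns (field names per Livy's REST API; the
values are placeholders):

```python
import json

# Sample body shaped like a Livy batch-session object; all values are
# illustrative, not taken from a real cluster.
response_body = """
{
  "id": 0,
  "name": "test livy",
  "state": "running",
  "appId": "application_1499200000000_0001",
  "appInfo": {"driverLogUrl": null, "sparkUiUrl": null},
  "log": []
}
"""

batch = json.loads(response_body)
print(batch["state"])   # the batch state, e.g. "running"
print(batch["appId"])   # the YARN application id behind this batch
```

The appId field is worth saving: it identifies the job to YARN even after
Livy's own record of the batch is gone.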
However, the problem I am experiencing is that when the job finishes its
execution, after some period of time (~10-15 mins) the log file of the job
is deleted automatically from the host. Is there any way I can keep this
log file? Or alternatively, is there any way to find the status of an
application that was submitted with Livy, using a curl request (or anything
similar)?
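To illustrate the kind of lookup I mean: once the job's YARN application id
is known, its status should remain queryable from the YARN ResourceManager
REST API even after Livy forgets the batch. A minimal sketch, assuming the
usual response shape of GET http://namenode:8088/ws/v1/cluster/apps/{appId}
(the values below are made-up placeholders, not real output):

```python
import json

# Sample body shaped like the ResourceManager's Cluster Application API
# response; the RM retains finished applications independently of Livy's
# retention settings. Values are illustrative only.
rm_response = """
{
  "app": {
    "id": "application_1499200000000_0001",
    "state": "FINISHED",
    "finalStatus": "SUCCEEDED"
  }
}
"""

app = json.loads(rm_response)["app"]
print(app["state"], app["finalStatus"])  # lifecycle state and final outcome
```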


Thanks for your support.
