airflow-dev mailing list archives

From Frank Maritato <fmarit...@opentable.com.INVALID>
Subject Re: [External] Re: scheduler logging
Date Tue, 23 Apr 2019 21:49:17 GMT
OK, yes, it looks like 1.10.3 resolves this. Thanks for the help!


On Tue, Apr 23, 2019 at 2:32 PM Kaxil Naik <kaxilnaik@gmail.com> wrote:

> Yes, please update us if 1.10.3 resolves that for you or if you face the
> same problem.
>
> On Tue, Apr 23, 2019 at 10:30 PM Frank Maritato
> <fmaritato@opentable.com.invalid> wrote:
>
> > Ok, I'll update to 1.10.3 and see if I see the same behavior.
> >
> > On Tue, Apr 23, 2019 at 2:28 PM Kaxil Naik <kaxilnaik@gmail.com> wrote:
> >
> > > It works for me, i.e. in *1.10.3*, if I change "logging_level = WARN"
> > > in the *airflow.cfg* file. I no longer get *info* logs in
> > > logs/scheduler or in the UI.
> > >
> > >
> > >
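For reference, the setting being toggled here is a plain entry in airflow.cfg; in the 1.10.x layout it lives under the [core] section (which matches the conf.get('core', 'logging_level') call quoted further down the thread):

    [core]
    logging_level = WARN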
> > > On Tue, Apr 23, 2019 at 10:08 PM Kaxil Naik <kaxilnaik@gmail.com> wrote:
> > >
> > > > On having a closer look, only cli.py was using the LOGGING_LEVEL from
> > > > settings.py.
> > > >
> > > > The logger is configured at
> > > > https://github.com/apache/airflow/blob/master/airflow/config_templates/airflow_local_settings.py
> > > >
> > > > Check the following note in UPDATING.md:
> > > > https://github.com/apache/airflow/blob/85899b3aee1ffcd633f101f0fff154bd2d4370c1/UPDATING.md#changes-in-airflow-logging
> > > >
> > > > I will try this out myself and see if it works for me or not.
> > > >
> > > >
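The airflow_local_settings template linked above is also the natural place to override the scheduler's log level without patching Airflow itself. Below is a minimal sketch of that approach under stated assumptions: Airflow 1.10.x, a module placed on PYTHONPATH (the name log_config and the $AIRFLOW_HOME/config location are illustrative, not from this thread), and whatever logger names the installed template actually defines.

    # log_config.py -- hypothetical module placed on PYTHONPATH
    # (e.g. $AIRFLOW_HOME/config/log_config.py)
    from copy import deepcopy

    # Start from the dictConfig template Airflow ships and adjust it.
    from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

    LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)

    # Raise the level on every logger the template defines, plus the root
    # logger if present, so the scheduler stops emitting INFO records.
    for logger_cfg in LOGGING_CONFIG.get('loggers', {}).values():
        logger_cfg['level'] = 'WARN'
    if 'root' in LOGGING_CONFIG:
        LOGGING_CONFIG['root']['level'] = 'WARN'

    # Then point Airflow at it in airflow.cfg:
    #   [core]
    #   logging_config_class = log_config.LOGGING_CONFIG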
> > > > On Tue, Apr 23, 2019 at 9:58 PM Frank Maritato
> > > > <fmaritato@opentable.com.invalid> wrote:
> > > >
> > > >> Thank you!!
> > > >>
> > > >> On Tue, Apr 23, 2019 at 1:54 PM Kaxil Naik <kaxilnaik@gmail.com> wrote:
> > > >>
> > > >> > @Frank - It is still the case. I will fix this in master and it
> > > >> > should be fixed in 1.10.4.
> > > >> >
> > > >> > On Tue, Apr 23, 2019 at 9:38 PM Frank Maritato
> > > >> > <fmaritato@opentable.com.invalid> wrote:
> > > >> >
> > > >> > > sorry, it's line 93, not 35.
> > > >> > >
> > > >> > > On Tue, Apr 23, 2019 at 1:34 PM Frank Maritato
> > > >> > > <fmaritato@opentable.com> wrote:
> > > >> > >
> > > >> > > > Ok, I figured out why this is happening. settings.py:35 is
> > > >> > > > hard coding the logging level to INFO. I hacked it locally to:
> > > >> > > >
> > > >> > > > LOGGING_LEVEL = conf.get('core', 'logging_level')
> > > >> > > >
> > > >> > > > and now all the INFO logs are gone. Is this fixed in a later
> > > >> > > > version of airflow?
> > > >> > > >
> > > >> > > >
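For anyone on 1.10.0 hitting the same thing, here is a sketch of the kind of local patch described above. The getLevelName conversion is an addition here (conf.get returns a string such as "WARN", while logging levels elsewhere are numeric constants); the rest of settings.py is assumed unchanged.

    # airflow/settings.py -- sketch of the local change, not the shipped code
    import logging

    from airflow.configuration import conf

    # Was (roughly): LOGGING_LEVEL = logging.INFO
    # Read the level name from [core] logging_level and map it back to the
    # numeric constant that the logging module understands.
    LOGGING_LEVEL = logging.getLevelName(conf.get('core', 'logging_level').upper())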
> > > >> > > > On Tue, Apr 23, 2019 at 1:09 PM Frank Maritato
> > > >> > > > <fmaritato@opentable.com> wrote:
> > > >> > > >
> > > >> > > >> Those two issues don't seem to be present in 1.10.0. I only
> > > >> > > >> see one message for "db connection invalidated" and I don't
> > > >> > > >> see anything about harvesting.
> > > >> > > >>
> > > >> > > >> I have tried using a custom logger to see if that affects the
> > > >> > > >> output to airflow-scheduler.{log|out}, but so far I haven't
> > > >> > > >> had any luck. The logs that go to /var/log/airflow/scheduler
> > > >> > > >> are affected, but AIRFLOW_HOME/airflow-scheduler.log is
> > > >> > > >> always printing out INFO or below.
> > > >> > > >>
> > > >> > > >> Is there any documentation on how to do this? Or examples
> > > >> > > >> somewhere?
> > > >> > > >>
> > > >> > > >> On Tue, Apr 23, 2019 at 11:39 AM Daniel Standish
> > > >> > > >> <dpstandish@gmail.com> wrote:
> > > >> > > >>
> > > >> > > >>> I noticed two issues in 1.10.2, one of which has been
> > > >> > > >>> resolved, but I am not sure if they were present in 1.10.0,
> > > >> > > >>> or if they are the messages that are bothering you.
> > > >> > > >>>
> > > >> > > >>> 1. "Harvesting DAG parsing results" was printed every 2
> > > >> > > >>> seconds or so. This was resolved in commit [AIRFLOW-3911]
> > > >> > > >>> Change Harvesting DAG parsing results to DEBUG log level
> > > >> > > >>> (#4729), which I believe is in 1.10.3 now. It just changes
> > > >> > > >>> the log level to DEBUG for that message.
> > > >> > > >>>
> > > >> > > >>> 2. Frequent "db connection invalidated" warning
> > > >> > > >>> This one is unsolved. I created a ticket here:
> > > >> > > >>> https://issues.apache.org/jira/browse/AIRFLOW-4134
> > > >> > > >>> It seems like every 5 seconds the warning "db connection
> > > >> > > >>> invalidated" is logged. It happens in a connection reconnect
> > > >> > > >>> retry loop. It always seems to be able to reconnect on the
> > > >> > > >>> first try, so one idea is we could just set it to only warn
> > > >> > > >>> if the first retry fails (debug on first reconnect). But it
> > > >> > > >>> would be more satisfying to figure out why this connection
> > > >> > > >>> always seems to be invalidated and fix the root cause.
> > > >> > > >>> Alas, I am not sure how to proceed...
> > > >> > > >>>
> > > >> > > >>>
> > > >> > > >>>
> > > >> > > >>> On Tue, Apr 23, 2019 at 9:54 AM Bolke de Bruin
> > > >> > > >>> <bdbruin@gmail.com> wrote:
> > > >> > > >>>
> > > >> > > >>> > It's probably better to create a custom logging.conf and
> > > >> > > >>> > use that instead.
> > > >> > > >>> >
> > > >> > > >>> > B.
> > > >> > > >>> >
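A note on how this suggestion maps onto 1.10.x: rather than an ini-style logging.conf, the hook the 1.10 config appears to expose is the logging_config_class entry in airflow.cfg, which names a dictConfig-style dict such as the LOGGING_CONFIG sketched earlier in this thread (module and attribute names are illustrative):

    [core]
    logging_config_class = log_config.LOGGING_CONFIG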
> > > >> > > >>> > On Tue, 23 Apr 2019 at 18:13, Frank Maritato
> > > >> > > >>> > <fmaritato@opentable.com.invalid> wrote:
> > > >> > > >>> >
> > > >> > > >>> > > No one else has this issue? Or no one has a solution?
> > > >> > > >>> > >
> > > >> > > >>> > >
> > > >> > > >>> > > On Wed, Apr 17, 2019 at 5:49 PM Frank Maritato
> > > >> > > >>> > > <fmaritato@opentable.com> wrote:
> > > >> > > >>> > >
> > > >> > > >>> > > > Hi All,
> > > >> > > >>> > > >
> > > >> > > >>> > > > We are running airflow 1.10.0 and I'm wondering how I
> > > >> > > >>> > > > can turn off or turn down the logging for the
> > > >> > > >>> > > > scheduler? I tried setting logging_level=WARN in
> > > >> > > >>> > > > airflow.cfg and restarting the process, but I'm still
> > > >> > > >>> > > > seeing a ton of info logging to .out and .log.
> > > >> > > >>> > > >
> > > >> > > >>> > > > Thanks!
> > > >> > > >>> > > > --
> > > >> > > >>> > > > Frank Maritato
> > > >> > > >>> > > >
> > > >> > > >>> > >
> > > >> > > >>> > >
> > > >> > > >>> > > --
> > > >> > > >>> > > Frank Maritato
> > > >> > > >>> > >
> > > >> > > >>> >
> > > >> > > >>>
> > > >> > > >>
> > > >> > > >>
> > > >> > > >> --
> > > >> > > >> Frank Maritato
> > > >> > > >>
> > > >> > > >
> > > >> > > >
> > > >> > > > --
> > > >> > > > Frank Maritato
> > > >> > > >
> > > >> > >
> > > >> > >
> > > >> > > --
> > > >> > > Frank Maritato
> > > >> > >
> > > >> >
> > > >> >
> > > >> > --
> > > >> > *Kaxil Naik*
> > > >> > *Big Data Consultant *@ *Data Reply UK*
> > > >> > *Certified *Google Cloud Data Engineer | *Certified* Apache Spark
> > > >> > & Neo4j Developer
> > > >> > *LinkedIn*: https://www.linkedin.com/in/kaxil
> > > >> >
> > > >>
> > > >>
> > > >> --
> > > >> Frank Maritato
> > > >>
> > > >
> > > >
> > > > --
> > > > *Kaxil Naik*
> > > > *Big Data Consultant *@ *Data Reply UK*
> > > > *Certified *Google Cloud Data Engineer | *Certified* Apache Spark &
> > > > Neo4j Developer
> > > > *LinkedIn*: https://www.linkedin.com/in/kaxil
> > > >
> > >
> > >
> > > --
> > > *Kaxil Naik*
> > > *Big Data Consultant *@ *Data Reply UK*
> > > *Certified *Google Cloud Data Engineer | *Certified* Apache Spark &
> > > Neo4j Developer
> > > *LinkedIn*: https://www.linkedin.com/in/kaxil
> > >
> >
> >
> > --
> > Frank Maritato
> >
>
>
> --
> *Kaxil Naik*
> *Big Data Consultant *@ *Data Reply UK*
> *Certified *Google Cloud Data Engineer | *Certified* Apache Spark & Neo4j
> Developer
> *LinkedIn*: https://www.linkedin.com/in/kaxil
>


-- 
Frank Maritato
