flink-user mailing list archives

From Scott Sue <scott....@celer-tech.com>
Subject Re: Logging Kafka during exceptions
Date Thu, 22 Nov 2018 16:17:17 GMT
Hi Till,

Yeah, I think that would work, especially knowing this isn't something that is available out of the box at the moment. Do you think it's worth raising this as a feature request? One thing I've found in my experience with Flink is that it's quite hard to debug what is going on when there is an unexpected exception.


Regards,
Scott

SCOTT SUE
CHIEF TECHNOLOGY OFFICER

Support Line : +44(0) 2031 371 603
Mobile : +852 9611 3969

9/F, 33 Lockhart Road, Wan Chai, Hong Kong
www.celer-tech.com






> On 23 Nov 2018, at 00:12, Till Rohrmann <trohrmann@apache.org> wrote:
> 
> Hi Scott,
> 
> I think you could write some wrappers for the different user function types which could
contain the logging logic. That way you would still need to wrap your actual business logic,
but you wouldn't have to duplicate the logging logic over and over again.
> 
> If you also want to log the state, then you would need to wrap the RuntimeContext to
intercept all state-registering calls so that you can keep track of them.
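A minimal sketch of intercepting state registrations as described above. The `StateRegistry` interface here is a hypothetical stand-in for just the state-registering part of Flink's `RuntimeContext`; a real wrapper would implement and delegate to the actual `RuntimeContext`:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for the state-registering part of Flink's
// RuntimeContext (not the real API).
interface StateRegistry {
    <T> List<T> getListState(String name);
}

// Records every state name registered through it, so the registered
// states can later be dumped alongside a failing element.
class TrackingStateRegistry implements StateRegistry {
    private final StateRegistry delegate;
    final List<String> registeredStates = new ArrayList<>();

    TrackingStateRegistry(StateRegistry delegate) {
        this.delegate = delegate;
    }

    @Override
    public <T> List<T> getListState(String name) {
        registeredStates.add(name); // remember what was registered
        return delegate.getListState(name);
    }
}
```

On failure, the wrapper's `registeredStates` list tells you which pieces of state to log.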
> 
> Would that work for you?
> 
> Cheers,
> Till
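The function-wrapping idea could be sketched as follows. The `MapFn` interface here is a simplified stand-in for Flink's `MapFunction`; a real wrapper would implement the actual Flink function interfaces:

```java
// Hypothetical stand-in for Flink's MapFunction interface.
interface MapFn<I, O> {
    O map(I value) throws Exception;
}

// Wraps the business logic so the offending input element is logged
// before the exception propagates and fails the job.
class LoggingMapFn<I, O> implements MapFn<I, O> {
    private final MapFn<I, O> delegate;

    LoggingMapFn(MapFn<I, O> delegate) {
        this.delegate = delegate;
    }

    @Override
    public O map(I value) throws Exception {
        try {
            return delegate.map(value);
        } catch (Exception e) {
            // In a real job, use a proper logger instead of System.err.
            System.err.println("Function failed on element: " + value);
            throw e;
        }
    }
}
```

The same pattern repeats for the other user function types (FlatMapFunction, ProcessFunction, and so on), which is the duplication the wrappers avoid in the business logic itself.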
> 
> On Thu, Nov 22, 2018 at 8:44 AM Scott Sue <scott.sue@celer-tech.com> wrote:
> Yeah, I think that would work for malformed data being consumed, but not for the case where
deserialization succeeds and one of my custom functions downstream of deserialization raises an error.
> 
> 
> Regards,
> Scott
> 
> SCOTT SUE
> CHIEF TECHNOLOGY OFFICER
> 
> Support Line : +44(0) 2031 371 603
> Mobile : +852 9611 3969
> 
> 9/F, 33 Lockhart Road, Wan Chai, Hong Kong
> www.celer-tech.com
> 
> 
> 
> 
> 
> 
>> On 22 Nov 2018, at 15:15, miki haiat <miko5054@gmail.com> wrote:
>> 
>> If so, then you can implement your own deserializer [1] with custom logic and error handling.
>> 
>> 
>> 
>> 1. https://ci.apache.org/projects/flink/flink-docs-stable/api/java/org/apache/flink/streaming/util/serialization/KeyedDeserializationSchemaWrapper.html
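A rough sketch of such a deserializer wrapper. The `DeserSchema` interface here is a simplified stand-in for Flink's deserialization schema interfaces; the real class linked in [1] has a different signature (it also receives the key, topic, partition, and offset):

```java
import java.nio.charset.StandardCharsets;

// Hypothetical stand-in for Flink's deserialization schema interface.
interface DeserSchema<T> {
    T deserialize(byte[] message) throws Exception;
}

// Wraps an inner schema so malformed records are logged with their raw
// payload and dropped, instead of failing the whole job.
class LoggingDeserSchema<T> implements DeserSchema<T> {
    private final DeserSchema<T> inner;

    LoggingDeserSchema(DeserSchema<T> inner) {
        this.inner = inner;
    }

    @Override
    public T deserialize(byte[] message) {
        try {
            return inner.deserialize(message);
        } catch (Exception e) {
            // Log the raw bytes so the offending Kafka record can be inspected.
            System.err.println("Bad record: " + new String(message, StandardCharsets.UTF_8));
            return null;
        }
    }
}
```

Returning null from the schema is how Flink's Kafka consumer is told to skip a record rather than fail.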
>> 
>> 
>> On Thu, Nov 22, 2018 at 8:57 AM Scott Sue <scott.sue@celer-tech.com> wrote:
>> JSON is sent into Kafka.
>> 
>> 
>> Regards,
>> Scott
>> 
>> SCOTT SUE
>> CHIEF TECHNOLOGY OFFICER
>> 
>> Support Line : +44(0) 2031 371 603
>> Mobile : +852 9611 3969
>> 
>> 9/F, 33 Lockhart Road, Wan Chai, Hong Kong
>> www.celer-tech.com
>> 
>> 
>> 
>> 
>> 
>> 
>>> On 22 Nov 2018, at 14:55, miki haiat <miko5054@gmail.com> wrote:
>>> 
>>> Which data format is sent to Kafka?
>>> JSON, Avro, or other?
>>> 
>>> 
>>> 
>>> On Thu, Nov 22, 2018 at 7:36 AM Scott Sue <scott.sue@celer-tech.com> wrote:
>>> Unexpected data meaning business-level data that I didn't expect to receive; that is,
business-level data that doesn't quite conform.
>>> 
>>> On Thu, 22 Nov 2018 at 13:30, miki haiat <miko5054@gmail.com> wrote:
>>> By unexpected data, do you mean a parsing error?
>>> Which format is sent to Kafka?
>>> 
>>> 
>>> 
>>> On Thu, 22 Nov 2018, 6:59 Scott Sue <scott.sue@celer-tech.com> wrote:
>>> Hi all,
>>> 
>>> When I'm running my jobs I am consuming data from Kafka to process in my
>>> job.  Unfortunately my job receives unexpected data from time to time, and
>>> I'm trying to find the root cause of the issue.
>>> 
>>> Ideally, I want a way to know when the job has failed due to an exception,
>>> and then log to file the last message it was consuming at the time, to help
>>> track down the offending message.  How is this possible within Flink?
>>> 
>>> Thinking about this more, it may not be a consumed message that killed the
>>> job, but maybe a transformation within the job itself and it died in a
>>> downstream Operator.  In this case, is there a way to log to file the
>>> message that an Operator was processing at the time that caused the
>>> exception?
>>> 
>>> 
>>> Thanks in advance!
>>> 
>>> 
>>> 
>>> --
>>> Sent from: http://apache-flink-user-mailing-list-archive.2336050.n4.nabble.com/
>>> -- 
>>> 
>>> 
>>> Regards,
>>> Scott
>>> 
>>> SCOTT SUE
>>> CHIEF TECHNOLOGY OFFICER
>>> 
>>> Support Line : +44(0) 2031 371 603
>>> Mobile : +852 9611 3969
>>> 
>>> 9/F, 33 Lockhart Road, Wanchai, Hong Kong
>>> www.celer-tech.com
>>> This message, including any attachments, may include private, privileged and
confidential information and is intended only for the personal and confidential use of the
intended recipient(s). If the reader of this message is not an intended recipient, you are
hereby notified that any review, use, dissemination, distribution, printing or copying of
this message or its contents is strictly prohibited and may be unlawful. If you are not an
intended recipient or have received this communication in error, please immediately notify
the sender by telephone and/or a reply email and permanently delete the original message,
including any attachments, without making a copy.
>> 
>> 
> 
> 


