flink-user mailing list archives

From Aljoscha Krettek <aljos...@apache.org>
Subject Re: Ability to partition logs per pipeline
Date Sun, 31 Jul 2016 22:41:11 GMT
Hi,
I'm afraid that's not possible right now. The preferred way of running
would be to start a YARN cluster per job; that way you can isolate the logs.
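For reference, a dedicated per-job YARN cluster can be started directly from the Flink CLI with `-m yarn-cluster`; a minimal sketch, where the jar path and resource sizes are hypothetical placeholders:

```shell
# Start a dedicated, ephemeral YARN cluster for this single job
# (jar path and memory sizes below are hypothetical placeholders).
# -yn:  number of TaskManager containers
# -yjm: JobManager container memory (MB)
# -ytm: TaskManager container memory (MB)
./bin/flink run -m yarn-cluster -yn 2 -yjm 1024 -ytm 2048 ./path/to/my-job.jar
```

YARN then keeps each application's container logs under that application's own directory, so the logs of different jobs never mix and can be fetched per job, e.g. with `yarn logs -applicationId <appId>`.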

Cheers,
Aljoscha

On Thu, 14 Jul 2016 at 09:49 Chawla,Sumit <sumitkchawla@gmail.com> wrote:

> Hi Robert
>
> I actually mean both: scenarios where multiple jobs are running on the
> cluster, and where the same job could be running on multiple task managers.
> How can we make sure that each job logs to a different file, so that logs
> are not mixed and it's easy to debug a particular job?  Something like
> Hadoop YARN, where each attempt of a task produces a different log file.
>
> Regards
>
> Sumit Chawla
>
>
> On Thu, Jul 14, 2016 at 6:11 AM, Robert Metzger <rmetzger@apache.org>
> wrote:
>
>> Hi Sumit,
>>
>> What exactly do you mean by pipeline?
>> Are you talking about cases were multiple jobs are running concurrently
>> on the same TaskManager, or are you referring to parallel instances of a
>> Flink job?
>>
>> On Wed, Jul 13, 2016 at 9:49 PM, Chawla,Sumit <sumitkchawla@gmail.com>
>> wrote:
>>
>>> Hi All
>>>
>>> Does Flink provide any ability to streamline the logs generated by a
>>> pipeline?  How can we keep the logs from two pipelines separate so that
>>> it's easy to debug a pipeline's execution (something dynamic to
>>> automatically partition the logs per pipeline)?
>>> Regards
>>> Sumit Chawla
>>>
>>>
>>
>
