From: "Hailu, Andreas" <andreas.hailu@gs.com>
To: Chesnay Schepler, user@flink.apache.org
Subject: RE: History Server Not Showing Any Jobs - File Not Found?
Date: Thu, 28 May 2020 22:43:26 +0000
May I also ask what version of flink-hadoop you're using and the number of jobs you're storing the history for? As of writing we have roughly 101,000 application history files. I'm curious to know if we're encountering some kind of resource problem.

// ah

From: Hailu, Andreas [Engineering]
Sent: Thursday, May 28, 2020 12:18 PM
To: 'Chesnay Schepler'; user@flink.apache.org
Subject: RE: History Server Not Showing Any Jobs - File Not Found?

Okay, I will look further to see if we're mistakenly using a version that's pre-2.6.0. However, I don't see flink-shaded-hadoop in my /lib directory for flink-1.9.1. These are the files within /lib:

flink-dist_2.11-1.9.1.jar
flink-table-blink_2.11-1.9.1.jar
flink-table_2.11-1.9.1.jar
log4j-1.2.17.jar
slf4j-log4j12-1.7.15.jar

// ah

From: Chesnay Schepler
Sent: Thursday, May 28, 2020 11:00 AM
To: Hailu, Andreas [Engineering]; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

Looks like it is indeed stuck on downloading the archive.

I searched a bit in the Hadoop JIRA and found several similar instances:
https://issues.apache.org/jira/browse/HDFS-6999
https://issues.apache.org/jira/browse/HDFS-7005
https://issues.apache.org/jira/browse/HDFS-7145

It is supposed to be fixed in 2.6.0 though :/

If Hadoop is available from the HADOOP_CLASSPATH and flink-shaded-hadoop is in /lib, then you basically don't know which Hadoop version is actually being used, which could lead to incompatibilities and dependency clashes. If flink-shaded-hadoop 2.4/2.5 is on the classpath, maybe that is being used and runs into HDFS-7005.
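[Editor's note: a quick way to tell which case applies is to look for a shaded Hadoop jar in Flink's lib directory. The following is a sketch only; the temp directory and the staged jar names (taken from the listing earlier in this thread) are illustrative, and on a real install you would simply list `$FLINK_HOME/lib`.]

```shell
# Demo sketch: detect a flink-shaded-hadoop jar in Flink's lib directory.
# On a real install: ls "$FLINK_HOME/lib" | grep shaded-hadoop
# Here we stage a temp directory with the jar names reported in this thread.
FLINK_LIB="$(mktemp -d)"
touch "$FLINK_LIB/flink-dist_2.11-1.9.1.jar" \
      "$FLINK_LIB/flink-table_2.11-1.9.1.jar" \
      "$FLINK_LIB/log4j-1.2.17.jar"

if ls "$FLINK_LIB"/flink-shaded-hadoop*.jar >/dev/null 2>&1; then
    # both /lib and HADOOP_CLASSPATH provide Hadoop classes
    echo "flink-shaded-hadoop present: effective Hadoop version is ambiguous"
else
    # only HADOOP_CLASSPATH provides Hadoop classes
    echo "no flink-shaded-hadoop in lib: HADOOP_CLASSPATH Hadoop is used"
fi
```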
On 28/05/2020 16:27, Hailu, Andreas wrote:

Just created a dump, here's what I see:

"Flink-HistoryServer-ArchiveFetcher-thread-1" #19 daemon prio=5 os_prio=0 tid=0x00007f93a5a2c000 nid=0x5692 runnable [0x00007f934a0d3000]
   java.lang.Thread.State: RUNNABLE
        at sun.nio.ch.EPollArrayWrapper.epollWait(Native Method)
        at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper.java:269)
        at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelectorImpl.java:79)
        at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImpl.java:86)
        - locked <0x00000005df986960> (a sun.nio.ch.Util$2)
        - locked <0x00000005df986948> (a java.util.Collections$UnmodifiableSet)
        - locked <0x00000005df928390> (a sun.nio.ch.EPollSelectorImpl)
        at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97)
        at org.apache.hadoop.net.SocketIOWithTimeout$SelectorPool.select(SocketIOWithTimeout.java:335)
        at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:157)
        at org.apache.hadoop.net.SocketInputStream.read(SocketInputStream.java:161)
        at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.readChannelFully(PacketReceiver.java:258)
        at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doReadFully(PacketReceiver.java:209)
        at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.doRead(PacketReceiver.java:171)
        at org.apache.hadoop.hdfs.protocol.datatransfer.PacketReceiver.receiveNextPacket(PacketReceiver.java:102)
        at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextPacket(RemoteBlockReader2.java:201)
        at org.apache.hadoop.hdfs.RemoteBlockReader2.read(RemoteBlockReader2.java:152)
        - locked <0x00000005ceade5e0> (a org.apache.hadoop.hdfs.RemoteBlockReader2)
        at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStrategy.doRead(DFSInputStream.java:781)
        at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DFSInputStream.java:837)
        - eliminated <0x00000005cead3688> (a org.apache.hadoop.hdfs.DFSInputStream)
        at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:897)
        - locked <0x00000005cead3688> (a org.apache.hadoop.hdfs.DFSInputStream)
        at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:945)
        - locked <0x00000005cead3688> (a org.apache.hadoop.hdfs.DFSInputStream)
        at java.io.DataInputStream.read(DataInputStream.java:149)
        at org.apache.flink.runtime.fs.hdfs.HadoopDataInputStream.read(HadoopDataInputStream.java:94)
        at java.io.InputStream.read(InputStream.java:101)
        at org.apache.flink.util.IOUtils.copyBytes(IOUtils.java:69)
        at org.apache.flink.util.IOUtils.copyBytes(IOUtils.java:91)
        at org.apache.flink.runtime.history.FsJobArchivist.getArchivedJsons(FsJobArchivist.java:110)
        at org.apache.flink.runtime.webmonitor.history.HistoryServerArchiveFetcher$JobArchiveFetcherTask.run(HistoryServerArchiveFetcher.java:169)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)

What problems could the flink-shaded-hadoop jar being included introduce?

// ah

From: Chesnay Schepler
Sent: Thursday, May 28, 2020 9:26 AM
To: Hailu, Andreas [Engineering]; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

If it were a class-loading issue I would think that we'd see an exception of some kind. Maybe double-check that flink-shaded-hadoop is not in the lib directory.
(usually I would ask for the full classpath that the HS is started with, but as it turns out this isn't getting logged :( (FLINK-18008))

The fact that overview.json and jobs/overview.json are missing indicates that something goes wrong directly on startup. What is supposed to happen is that the HS starts, fetches all currently available archives and then creates these files. So it seems like the download gets stuck for some reason.

Can you use jstack to create a thread dump, and see what the Flink-HistoryServer-ArchiveFetcher is doing?

I will also file a JIRA for adding more logging statements, like when fetching starts/stops.

On 27/05/2020 20:57, Hailu, Andreas wrote:

Hi Chesnay, apologies for not getting back to you sooner here. So I did what you suggested - I downloaded a few files from my jobmanager.archive.fs.dir HDFS directory to a locally available directory named /local/scratch/hailua_p2epdlsuat/historyserver/archived/. I then changed my historyserver.archive.fs.dir to file:///local/scratch/hailua_p2epdlsuat/historyserver/archived/ and that seemed to work. I'm able to see the history of the applications I downloaded. So this points to a problem with sourcing the history from HDFS.

Do you think this could be classpath related?
This is what we use for our HADOOP_CLASSPATH var:

/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop/lib/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-hdfs/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-hdfs/lib/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-mapreduce/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-mapreduce/lib/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-yarn/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-yarn/lib/*:/gns/software/ep/da/dataproc/dataproc-prod/lakeRmProxy.jar:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop/bin::/gns/mw/dbclient/postgres/jdbc/pg-jdbc-9.3.v01/postgresql-9.3-1100-jdbc4.jar

You can see we have references to Hadoop mapred/yarn/hdfs libs in there.

// ah

From: Chesnay Schepler
Sent: Sunday, May 3, 2020 6:00 PM
To: Hailu, Andreas [Engineering]; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

yes, exactly; I want to rule out that (somehow) HDFS is the problem.

I couldn't reproduce the issue locally myself so far.

On 01/05/2020 22:31, Hailu, Andreas wrote:

Hi Chesnay, yes - they were created using Flink 1.9.1, as we've only just started to archive them in the past couple weeks. Could you clarify how you want to try local filesystem archives? As in changing jobmanager.archive.fs.dir and historyserver.web.tmpdir to the same local directory?

// ah

From: Chesnay Schepler
Sent: Wednesday, April 29, 2020 8:26 AM
To: Hailu, Andreas [Engineering]; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

hmm... let's see if I can reproduce the issue locally.

Are the archives from the same version the history server runs on? (Which I suppose would be 1.9.1?)

Just for the sake of narrowing things down, it would also be interesting to check if it works with the archives residing in the local filesystem.
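[Editor's note: the local-filesystem test suggested here - and later carried out in the thread - can be sketched as below. The paths and the placeholder archive content are illustrative assumptions; on the real cluster the staging step is an `hdfs dfs -copyToLocal` of genuine archive files.]

```shell
# Sketch: stage job archives on the local filesystem and point the history
# server at them. On the real cluster the staging step would be, e.g.:
#   hdfs dfs -copyToLocal /user/p2epda/lake/delp_qa/flink_hs/* "$ARCHIVE_DIR"
# Here we create a placeholder file (NOT a real Flink archive) so the steps
# can be followed end to end.
ARCHIVE_DIR="$(mktemp -d)"
echo '{"archive":[]}' > "$ARCHIVE_DIR/000144dba9dc0f235768a46b2f26e936"

# flink-conf.yaml entries for the test (file:// scheme instead of hdfs://):
cat <<EOF
historyserver.archive.fs.dir: file://$ARCHIVE_DIR/
historyserver.web.tmpdir: $(mktemp -d)
EOF
# then restart: bin/historyserver.sh stop && bin/historyserver.sh start
```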
On 27/04/2020 18:35, Hailu, Andreas wrote:

bash-4.1$ ls -l /local/scratch/flink_historyserver_tmpdir/
total 8
drwxrwxr-x 3 p2epdlsuat p2epdlsuat 4096 Apr 21 10:43 flink-web-history-7fbb97cc-9f38-4844-9bcf-6272fe6828e9
drwxrwxr-x 3 p2epdlsuat p2epdlsuat 4096 Apr 21 10:22 flink-web-history-95b3f928-c60f-4351-9926-766c6ad3ee76

There are just two directories in here. I don't see cache directories from my attempts today, which is interesting. Looking a little deeper into them:

bash-4.1$ ls -lr /local/scratch/flink_historyserver_tmpdir/flink-web-history-7fbb97cc-9f38-4844-9bcf-6272fe6828e9
total 1756
drwxrwxr-x 2 p2epdlsuat p2epdlsuat 1789952 Apr 21 10:44 jobs
bash-4.1$ ls -lr /local/scratch/flink_historyserver_tmpdir/flink-web-history-7fbb97cc-9f38-4844-9bcf-6272fe6828e9/jobs
total 0
-rw-rw-r-- 1 p2epdlsuat p2epdlsuat 0 Apr 21 10:43 overview.json

There are indeed archives already in HDFS - I've included some in my initial mail, but here they are again just for reference:

-bash-4.1$ hdfs dfs -ls /user/p2epda/lake/delp_qa/flink_hs
Found 44282 items
-rw-r-----   3 delp datalake_admin_dev      50569 2020-03-21 23:17 /user/p2epda/lake/delp_qa/flink_hs/000144dba9dc0f235768a46b2f26e936
-rw-r-----   3 delp datalake_admin_dev      49578 2020-03-03 08:45 /user/p2epda/lake/delp_qa/flink_hs/000347625d8128ee3fd0b672018e38a5
-rw-r-----   3 delp datalake_admin_dev      50842 2020-03-24 15:19 /user/p2epda/lake/delp_qa/flink_hs/0004be6ce01ba9677d1eb619ad0fa757
...

// ah

From: Chesnay Schepler
Sent: Monday, April 27, 2020 10:28 AM
To: Hailu, Andreas [Engineering]; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

If historyserver.web.tmpdir is not set then java.io.tmpdir is used, so that should be fine.

What are the contents of /local/scratch/flink_historyserver_tmpdir?

I assume there are already archives in HDFS?
On 27/04/2020 16:02, Hailu, Andreas wrote:

My machine's /tmp directory is not large enough to support the archived files, so I changed my java.io.tmpdir to be in some other location which is significantly larger. I hadn't set anything for historyserver.web.tmpdir, so I suspect it was still pointing at /tmp. I just tried setting historyserver.web.tmpdir to the same location as my java.io.tmpdir location, but I'm afraid I'm still seeing the following issue:

2020-04-27 09:37:42,904 [nioEventLoopGroup-3-4] DEBUG HistoryServerStaticFileServerHandler - Unable to load requested file /overview.json from classloader
2020-04-27 09:37:42,906 [nioEventLoopGroup-3-6] DEBUG HistoryServerStaticFileServerHandler - Unable to load requested file /jobs/overview.json from classloader

flink-conf.yaml for reference:

jobmanager.archive.fs.dir: hdfs:///user/p2epda/lake/delp_qa/flink_hs/
historyserver.archive.fs.dir: hdfs:///user/p2epda/lake/delp_qa/flink_hs/
historyserver.web.tmpdir: /local/scratch/flink_historyserver_tmpdir/

Did you have anything else in mind when you said pointing somewhere funny?

// ah

From: Chesnay Schepler
Sent: Monday, April 27, 2020 5:56 AM
To: Hailu, Andreas [Engineering]; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

overview.json is a generated file that is placed in the local directory controlled by historyserver.web.tmpdir.

Have you configured this option to point to some non-local filesystem? (Or if not, is the java.io.tmpdir property pointing somewhere funny?)

On 24/04/2020 18:24, Hailu, Andreas wrote:

I'm having a further look at the code in HistoryServerStaticFileServerHandler - is there an assumption about where overview.json is supposed to be located?

// ah

From: Hailu, Andreas [Engineering]
Sent: Wednesday, April 22, 2020 1:32 PM
To: 'Chesnay Schepler'; Hailu, Andreas [Engineering]; user@flink.apache.org
Subject: RE: History Server Not Showing Any Jobs - File Not Found?
Hi Chesnay, thanks for responding. We're using Flink 1.9.1. I enabled DEBUG-level logging and this is something relevant I see:

2020-04-22 13:25:52,566 [Flink-HistoryServer-ArchiveFetcher-thread-1] DEBUG DFSInputStream - Connecting to datanode 10.79.252.101:1019
2020-04-22 13:25:52,567 [Flink-HistoryServer-ArchiveFetcher-thread-1] DEBUG SaslDataTransferClient - SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-04-22 13:25:52,567 [Flink-HistoryServer-ArchiveFetcher-thread-1] DEBUG SaslDataTransferClient - SASL client skipping handshake in secured configuration with privileged port for addr = /10.79.252.101, datanodeId = DatanodeInfoWithStorage[10.79.252.101:1019,DS-7f4ec55d-7c5f-4a0e-b817-d9e635480b21,DISK]
2020-04-22 13:25:52,571 [Flink-HistoryServer-ArchiveFetcher-thread-1] DEBUG DFSInputStream - DFSInputStream has been closed already
2020-04-22 13:25:52,573 [nioEventLoopGroup-3-6] DEBUG HistoryServerStaticFileServerHandler - Unable to load requested file /jobs/overview.json from classloader
2020-04-22 13:25:52,576 [IPC Parameter Sending Thread #0] DEBUG Client$Connection$3 - IPC Client (1578587450) connection to d279536-002.dc.gs.com/10.59.61.87:8020 from delp@GS.COM sending #1391

Aside from that, it looks like a lot of logging around datanodes and block location metadata. Did I miss something in my classpath, perhaps? If so, do you have a suggestion on what I could try?

// ah

From: Chesnay Schepler
Sent: Wednesday, April 22, 2020 2:16 AM
To: Hailu, Andreas [Engineering]; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

Which Flink version are you using?
Have you checked the history server logs after enabling debug logging?

On 21/04/2020 17:16, Hailu, Andreas [Engineering] wrote:

Hi,

I'm trying to set up the History Server, but none of my applications are showing up in the Web UI.
Looking at the console, I see that all of the calls to /overview return the following 404 response: {"errors":["File not found."]}.

I've set up my configuration as follows:

JobManager archive directory:

jobmanager.archive.fs.dir: hdfs:///user/p2epda/lake/delp_qa/flink_hs/

-bash-4.1$ hdfs dfs -ls /user/p2epda/lake/delp_qa/flink_hs
Found 44282 items
-rw-r-----   3 delp datalake_admin_dev      50569 2020-03-21 23:17 /user/p2epda/lake/delp_qa/flink_hs/000144dba9dc0f235768a46b2f26e936
-rw-r-----   3 delp datalake_admin_dev      49578 2020-03-03 08:45 /user/p2epda/lake/delp_qa/flink_hs/000347625d8128ee3fd0b672018e38a5
-rw-r-----   3 delp datalake_admin_dev      50842 2020-03-24 15:19 /user/p2epda/lake/delp_qa/flink_hs/0004be6ce01ba9677d1eb619ad0fa757
...

The History Server will fetch the archived jobs from the same location:

historyserver.archive.fs.dir: hdfs:///user/p2epda/lake/delp_qa/flink_hs/

So I'm able to confirm that there are indeed archived applications that I should be able to view in the history server. I'm not able to find out what file the overview service is looking for from the repository - any suggestions as to what I could look into next?

Best,
Andreas

________________________________
Your Personal Data: We may collect and process information about you that may be subject to data protection laws. For more information about how we use and disclose your personal data, how we protect your information, our legal basis to use your information, your rights and who you can contact, please refer to: www.gs.com/privacy-notices

May I also ask what ve= rsion of flink-hadoop you’re using and the number of jobs you’r= e storing the history for? As of writing we have roughly 101,000 applicatio= n history files. I’m curious to know if we’re encountering some kind of resource problem.

 

// ah

 

From: Hai= lu, Andreas [Engineering]
Sent: Thursday, May 28, 2020 12:18 PM
To: 'Chesnay Schepler' <chesnay@apache.org>; user@flink.apache= .org
Subject: RE: History Server Not Showing Any Jobs - File Not Found?

 

Okay, I will look furt= her to see if we’re mistakenly using a version that’s pre-2.6.0= . However, I don’t see flink-shaded-hadoop in my /lib directory for f= link-1.9.1.

 

flink-dist_2.11-1.9.1.= jar

flink-table-blink_2.11= -1.9.1.jar

flink-table_2.11-1.9.1= .jar

log4j-1.2.17.jar<= /o:p>

slf4j-log4j12-1.7.15.j= ar

 

Are the files within /= lib.

 

// ah

 

From:= Chesnay Schepler <chesnay@apache.org>
Sent: Thursday, May 28, 2020 11:00 AM
To: Hailu, Andreas [Engineering] <Andreas.Hailu@ny.email.gs.com>; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

 

Looks like it is indeed stuck on downloading the arc= hive.

 

I searched a bit in the Hadoop JIRA and found severa= l similar instances:

 

It is supposed to be fixed in 2.6.0 though :/

 

If hadoop is available from the HADOOP_CLASSPATH and= flink-shaded-hadoop in /lib then you basically don't know what Hadoop vers= ion is actually being used,

which could lead to incompatibilities and dependency= clashes.

If flink-shaded-hadoop 2.4/2.5 is on the classpath, = maybe that is being used and runs into HDFS-7005.

 

On 28/05/2020 16:27, Hailu, Andreas wrote:

Just created a dump, h= ere’s what I see:

 

"Flink-HistorySer= ver-ArchiveFetcher-thread-1" #19 daemon prio=3D5 os_prio=3D0 tid=3D0x0= 0007f93a5a2c000 nid=3D0x5692 runnable [0x00007f934a0d3000]

   java.lang= .Thread.State: RUNNABLE

   &nbs= p;    at sun.nio.ch.EPollArrayWrapper.epollWait(Native Metho= d)

   &nbs= p;    at sun.nio.ch.EPollArrayWrapper.poll(EPollArrayWrapper= .java:269)

   &nbs= p;    at sun.nio.ch.EPollSelectorImpl.doSelect(EPollSelector= Impl.java:79)

    &nb= sp;   at sun.nio.ch.SelectorImpl.lockAndDoSelect(SelectorImp= l.java:86)

   &nbs= p;    - locked <0x00000005df986960> (a sun.nio.ch.Util= $2)

   &nbs= p;    - locked <0x00000005df986948> (a java.util.Colle= ctions$UnmodifiableSet)

   &nbs= p;    - locked <0x00000005df928390> (a sun.nio.ch.EPol= lSelectorImpl)

   &nbs= p;    at sun.nio.ch.SelectorImpl.select(SelectorImpl.java:97= )

   &nbs= p;    at org.apache.hadoop.net.SocketIOWithTimeout$SelectorP= ool.select(SocketIOWithTimeout.java:335)

   &nbs= p;    at org.apache.hadoop.net.SocketIOWithTimeout.doIO(Sock= etIOWithTimeout.java:157)

   &nbs= p;    at org.apache.hadoop.net.SocketInputStream.read(Socket= InputStream.java:161)

   &nbs= p;    at org.apache.hadoop.hdfs.protocol.datatransfer.Packet= Receiver.readChannelFully(PacketReceiver.java:258)

   &nbs= p;    at org.apache.hadoop.hdfs.protocol.datatransfer.Packet= Receiver.doReadFully(PacketReceiver.java:209)

   &nbs= p;    at org.apache.hadoop.hdfs.protocol.datatransfer.Packet= Receiver.doRead(PacketReceiver.java:171)

   &nbs= p;    at org.apache.hadoop.hdfs.protocol.datatransfer.Packet= Receiver.receiveNextPacket(PacketReceiver.java:102)

   &nbs= p;    at org.apache.hadoop.hdfs.RemoteBlockReader2.readNextP= acket(RemoteBlockReader2.java:201)

   &nbs= p;    at org.apache.hadoop.hdfs.RemoteBlockReader2.read(Remo= teBlockReader2.java:152)

   &nbs= p;    - locked <0x00000005ceade5e0> (a org.apache.hado= op.hdfs.RemoteBlockReader2)

   &nbs= p;    at org.apache.hadoop.hdfs.DFSInputStream$ByteArrayStra= tegy.doRead(DFSInputStream.java:781)

   &nbs= p;    at org.apache.hadoop.hdfs.DFSInputStream.readBuffer(DF= SInputStream.java:837)

   &nbs= p;    - eliminated <0x00000005cead3688> (a org.apache.= hadoop.hdfs.DFSInputStream)

   &nbs= p;    at org.apache.hadoop.hdfs.DFSInputStream.readWithStrat= egy(DFSInputStream.java:897)

   &nbs= p;    - locked <0x00000005cead3688> (a org.apache.hado= op.hdfs.DFSInputStream)

   &nbs= p;   at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputS= tream.java:945)

   &nbs= p;    - locked <0x00000005cead3688> (a org.apache.hado= op.hdfs.DFSInputStream)

   &nbs= p;    at java.io.DataInputStream.read(DataInputStream.java:1= 49)

   &nbs= p;    at org.apache.flink.runtime.fs.hdfs.HadoopDataInputStr= eam.read(HadoopDataInputStream.java:94)

   &nbs= p;    at java.io.InputStream.read(InputStream.java:101)

   &nbs= p;    at org.apache.flink.util.IOUtils.copyBytes(IOUtils.jav= a:69)

   &nbs= p;    at org.apache.flink.util.IOUtils.copyBytes(IOUtils.jav= a:91)

   &nbs= p;    at org.apache.flink.runtime.history.FsJobArchivist.get= ArchivedJsons(FsJobArchivist.java:110)

   &nbs= p;    at org.apache.flink.runtime.webmonitor.history.History= ServerArchiveFetcher$JobArchiveFetcherTask.run(HistoryServerArchiveFetcher.= java:169)

   &nbs= p;    at java.util.concurrent.Executors$RunnableAdapter.call= (Executors.java:511)

   &nbs= p;    at java.util.concurrent.FutureTask.runAndReset(FutureT= ask.java:308)

   &nbs= p;    at java.util.concurrent.ScheduledThreadPoolExecutor$Sc= heduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:180)

   &nbs= p;    at java.util.concurrent.ScheduledThreadPoolExecutor$Sc= heduledFutureTask.run(ScheduledThreadPoolExecutor.java:294)

   &nbs= p;    at java.util.concurrent.ThreadPoolExecutor.runWorker(T= hreadPoolExecutor.java:1142)

   &nbs= p;    at java.util.concurrent.ThreadPoolExecutor$Worker.run(= ThreadPoolExecutor.java:617)

   &nbs= p;    at java.lang.Thread.run(Thread.java:745)

 

What problems could th= e flink-shaded-hadoop jar being included introduce?

 

// ah

 

From:= Chesnay Schepler <chesnay@apache.org>
Sent: Thursday, May 28, 2020 9:26 AM
To: Hailu, Andreas [Engineering] <Andreas.Hailu@ny.email.gs.com>; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

 

If it were a class-loading issue I would think that = we'd see an exception of some kind. Maybe double-check that flink-shaded-ha= doop is not in the lib directory. (usually I would ask for the full classpa= th that the HS is started with, but as it turns out this isn't getting logged :( (FLINK-18008))

 

The fact that overview.json and jobs/overview.json a= re missing indicates that something goes wrong directly on startup. What is= supposed to happens is that the HS starts, fetches all currently available= archives and then creates these files.

So it seems like the download gets stuck for some re= ason.

 

Can you use jstack to create a thread dump, and see = what the Flink-HistoryServer-ArchiveFetcher is doing?

 

I will also file a JIRA for adding more logging stat= ements, like when fetching starts/stops.

 

On 27/05/2020 20:57, Hailu, Andreas wrote:

Hi Chesnay, apologies for not getting back to you sooner here. So I did what you suggested - I downloaded a few files from my jobmanager.archive.fs.dir HDFS directory to a locally available directory named /local/scratch/hailua_p2epdlsuat/historyserver/archived/. I then changed my historyserver.archive.fs.dir to file:///local/scratch/hailua_p2epdlsuat/historyserver/archived/ and that seemed to work. I'm able to see the history of the applications I downloaded. So this points to a problem with sourcing the history from HDFS.

 

Do you think this could be classpath related? This is what we use for our HADOOP_CLASSPATH var:

/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop/lib/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-hdfs/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-hdfs/lib/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-mapreduce/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-mapreduce/lib/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-yarn/*:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop-yarn/lib/*:/gns/software/ep/da/dataproc/dataproc-prod/lakeRmProxy.jar:/gns/software/infra/big-data/hadoop/hdp-2.6.5.0/hadoop/bin::/gns/mw/dbclient/postgres/jdbc/pg-jdbc-9.3.v01/postgresql-9.3-1100-jdbc4.jar

 

You can see we have references to Hadoop mapred/yarn/hdfs libs in there.

 

// ah

 

From: Chesnay Schepler <chesnay@apache.org>
Sent: Sunday, May 3, 2020 6:00 PM
To: Hailu, Andreas [Engineering] <Andreas.Hailu@ny.email.gs.com>; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

 

yes, exactly; I want to rule out that (somehow) HDFS is the problem.

 

I couldn't reproduce the issue locally myself so far.

 

On 01/05/2020 22:31, Hailu, Andreas wrote:

Hi Chesnay, yes – they were created using Flink 1.9.1, as we've only just started to archive them in the past couple of weeks. Could you clarify how you want me to try local filesystem archives? As in changing jobmanager.archive.fs.dir and historyserver.web.tmpdir to the same local directory?

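For illustration, the local-filesystem variant could look roughly like this in flink-conf.yaml (a sketch with a placeholder path; the file:/// form for historyserver.archive.fs.dir is the approach described in the newer reply further up this thread):

```yaml
# Sketch: point the history server at locally downloaded archives.
# /local/scratch/historyserver/archived/ is a placeholder directory.
historyserver.archive.fs.dir: file:///local/scratch/historyserver/archived/
historyserver.web.tmpdir: /local/scratch/flink_historyserver_tmpdir/
```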
 

// ah

 

From: Chesnay Schepler <chesnay@apache.org>
Sent: Wednesday, April 29, 2020 8:26 AM
To: Hailu, Andreas [Engineering] <Andreas.Hailu@ny.email.gs.com>; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

 

hmm...let's see if I can reproduce the issue locally.

 

Are the archives from the same version the history server runs on? (Which I suppose would be 1.9.1?)

 

Just for the sake of narrowing things down, it would also be interesting to check if it works with the archives residing in the local filesystem.

 

On 27/04/2020 18:35, Hailu, Andreas wrote:

bash-4.1$ ls -l /local/scratch/flink_historyserver_tmpdir/

total 8

drwxrwxr-x 3 p2epdlsuat p2epdlsuat 4096 Apr 21 10:43 flink-web-history-7fbb97cc-9f38-4844-9bcf-6272fe6828e9
drwxrwxr-x 3 p2epdlsuat p2epdlsuat 4096 Apr 21 10:22 flink-web-history-95b3f928-c60f-4351-9926-766c6ad3ee76

 

There are just two directories in here. I don't see cache directories from my attempts today, which is interesting. Looking a little deeper into them:

 

bash-4.1$ ls -lr /local/scratch/flink_historyserver_tmpdir/flink-web-history-7fbb97cc-9f38-4844-9bcf-6272fe6828e9
total 1756
drwxrwxr-x 2 p2epdlsuat p2epdlsuat 1789952 Apr 21 10:44 jobs
bash-4.1$ ls -lr /local/scratch/flink_historyserver_tmpdir/flink-web-history-7fbb97cc-9f38-4844-9bcf-6272fe6828e9/jobs
total 0
-rw-rw-r-- 1 p2epdlsuat p2epdlsuat 0 Apr 21 10:43 overview.json

There are indeed archives already in HDFS – I've included some in my initial mail, but here they are again just for reference:

-bash-4.1$ hdfs dfs -ls /user/p2epda/lake/delp_qa/flink_hs
Found 44282 items
-rw-r-----   3 delp datalake_admin_dev      50569 2020-03-21 23:17 /user/p2epda/lake/delp_qa/flink_hs/000144dba9dc0f235768a46b2f26e936
-rw-r-----   3 delp datalake_admin_dev      49578 2020-03-03 08:45 /user/p2epda/lake/delp_qa/flink_hs/000347625d8128ee3fd0b672018e38a5
-rw-r-----   3 delp datalake_admin_dev      50842 2020-03-24 15:19 /user/p2epda/lake/delp_qa/flink_hs/0004be6ce01ba9677d1eb619ad0fa757
...

// ah

 = ;

From: Chesnay Schepler <chesnay@apache.org>
Sent: Monday, April 27, 2020 10:28 AM
To: Hailu, Andreas [Engineering] <Andreas.Hailu@ny.email.gs.com>; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

 

If historyserver.web.tmpdir is not set then java.io.tmpdir is used, so that should be fine.

 

What are the contents of /local/scratch/flink_historyserver_tmpdir?

I assume there are already archives in HDFS?

 

On 27/04/2020 16:02, Hailu, Andreas wrote:

My machine's /tmp directory is not large enough to support the archived files, so I changed my java.io.tmpdir to be in some other location which is significantly larger. I hadn't set anything for historyserver.web.tmpdir, so I suspect it was still pointing at /tmp. I just tried setting historyserver.web.tmpdir to the same location as my java.io.tmpdir, but I'm afraid I'm still seeing the following issue:

 

2020-04-27 09:37:42,904 [nioEventLoopGroup-3-4] DEBUG HistoryServerStaticFileServerHandler - Unable to load requested file /overview.json from classloader
2020-04-27 09:37:42,906 [nioEventLoopGroup-3-6] DEBUG HistoryServerStaticFileServerHandler - Unable to load requested file /jobs/overview.json from classloader

 

flink-conf.yaml for reference:

jobmanager.archive.fs.dir: hdfs:///user/p2epda/lake/delp_qa/flink_hs/
historyserver.archive.fs.dir: hdfs:///user/p2epda/lake/delp_qa/flink_hs/
historyserver.web.tmpdir: /local/scratch/flink_historyserver_tmpdir/

 

Did you have anything else in mind when you said pointing somewhere funny?

 

// ah

 

From: Chesnay Schepler <chesnay@apache.org>
Sent: Monday, April 27, 2020 5:56 AM
To: Hailu, Andreas [Engineering] <Andreas.Hailu@ny.email.gs.com>; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

 

overview.json is a generated file that is placed in the local directory controlled by historyserver.web.tmpdir.

Have you configured this option to point to some non-local filesystem? (Or if not, is the java.io.tmpdir property pointing somewhere funny?)

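As a quick sanity check for that last question (a sketch, not from the original thread; assumes a JDK on the PATH), the JVM's default temp dir can be printed with:

```shell
# Print java.io.tmpdir, the fallback used when historyserver.web.tmpdir
# is unset. Guarded so it degrades cleanly without a JDK.
if command -v java >/dev/null 2>&1; then
  java -XshowSettings:properties -version 2>&1 | grep 'java.io.tmpdir' || true
else
  echo "java not on PATH"
fi
```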
On 24/04/2020 18:24, Hailu, Andreas wrote:

I'm having a further look at the code in HistoryServerStaticFileServerHandler - is there an assumption about where overview.json is supposed to be located?

 

// ah

 

From: Hailu, Andreas [Engineering]
Sent: Wednesday, April 22, 2020 1:32 PM
To: 'Chesnay Schepler' <chesnay@apache.org>; Hailu, Andreas [Engineering] <Andreas.Hailu@ny.email.gs.com>; user@flink.apache.org
Subject: RE: History Server Not Showing Any Jobs - File Not Found?

 

Hi Chesnay, thanks for responding. We're using Flink 1.9.1. I enabled DEBUG level logging and this is something relevant I see:

 

2020-04-22 13:25:52,566 [Flink-HistoryServer-ArchiveFetcher-thread-1] DEBUG DFSInputStream - Connecting to datanode 10.79.252.101:1019
2020-04-22 13:25:52,567 [Flink-HistoryServer-ArchiveFetcher-thread-1] DEBUG SaslDataTransferClient - SASL encryption trust check: localHostTrusted = false, remoteHostTrusted = false
2020-04-22 13:25:52,567 [Flink-HistoryServer-ArchiveFetcher-thread-1] DEBUG SaslDataTransferClient - SASL client skipping handshake in secured configuration with privileged port for addr = /10.79.252.101, datanodeId = DatanodeInfoWithStorage[10.79.252.101:1019,DS-7f4ec55d-7c5f-4a0e-b817-d9e635480b21,DISK]
2020-04-22 13:25:52,571 [Flink-HistoryServer-ArchiveFetcher-thread-1] DEBUG DFSInputStream - DFSInputStream has been closed already
2020-04-22 13:25:52,573 [nioEventLoopGroup-3-6] DEBUG HistoryServerStaticFileServerHandler - Unable to load requested file /jobs/overview.json from classloader
2020-04-22 13:25:52,576 [IPC Parameter Sending Thread #0] DEBUG Client$Connection$3 - IPC Client (1578587450) connection to d279536-002.dc.gs.com/10.59.61.87:8020 from delp@GS.COM sending #1391

 

Aside from that, it looks like a lot of logging around datanodes and block location metadata. Did I miss something in my classpath, perhaps? If so, do you have a suggestion on what I could try?

 

// ah

 

From: Chesnay Schepler <chesnay@apache.org>
Sent: Wednesday, April 22, 2020 2:16 AM
To: Hailu, Andreas [Engineering] <Andreas.Hailu@ny.email.gs.com>; user@flink.apache.org
Subject: Re: History Server Not Showing Any Jobs - File Not Found?

 

Which Flink version are you using?

Have you checked the history server logs after enabling debug logging?

 

On 21/04/2020 17:16, Hailu, Andreas [Engineering] wrote:

Hi,

 

I'm trying to set up the History Server, but none of my applications are showing up in the Web UI. Looking at the console, I see that all of the calls to /overview return the following 404 response: {"errors":["File not found."]}.

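The failing call can also be reproduced from the command line (a sketch, not from the original report; localhost:8082 is an assumption based on the default historyserver.web.port, so adjust to the actual host and port):

```shell
# Query the history server's overview endpoint and print the HTTP status;
# the symptom described above is a 404 with {"errors":["File not found."]}.
curl -s -o /dev/null -w '%{http_code}\n' http://localhost:8082/overview \
  || echo "history server not reachable"
```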
 

I've set up my configuration as follows:

 

JobManager Archive directory:

jobmanager.archive.fs.dir: hdfs:///user/p2epda/lake/delp_qa/flink_hs/

-bash-4.1$ hdfs dfs -ls /user/p2epda/lake/delp_qa/flink_hs

Found 44282 items

-rw-r-----   3 delp datalake_admin_dev      50569 2020-03-21 23:17 /user/p2epda/lake/delp_qa/flink_hs/000144dba9dc0f235768a46b2f26e936
-rw-r-----   3 delp datalake_admin_dev      49578 2020-03-03 08:45 /user/p2epda/lake/delp_qa/flink_hs/000347625d8128ee3fd0b672018e38a5
-rw-r-----   3 delp datalake_admin_dev      50842 2020-03-24 15:19 /user/p2epda/lake/delp_qa/flink_hs/0004be6ce01ba9677d1eb619ad0fa757

...

...

 

History Server will fetch the archived jobs from the same location:

historyserver.archive.fs.dir: hdfs:///user/p2epda/lake/delp_qa/flink_hs/

 

So I'm able to confirm that there are indeed archived applications that I should be able to view in the History Server. I'm not able to find out what file the overview service is looking for from the repository – any suggestions as to what I could look into next?

 

Best,

Andreas

 



Your Personal Data: We may collect and process information about you that may be subject to data protection laws. For more information about how we use and disclose your personal data, how we protect your information, our legal basis to use your information, your rights and who you can contact, please refer to: www.gs.com/privacy-notices

 

 


