From: Saisai Shao
Date: Wed, 11 Apr 2018 14:47:50 +0800
Subject: Re: Livy running into OOM after several hours. What's the best way to diagnose and fix?
To: user@livy.incubator.apache.org

This is most likely a Spark issue, not a Livy issue
(https://issues.apache.org/jira/browse/SPARK-23682).

2018-04-11 9:53 GMT+08:00 kant kodali <kanth909@gmail.com>:

> Hi All,
>
> Livy has been running into OOM after running a few long-running streaming
> queries (<10 queries) for a while. It happens after several hours. I am
> trying to figure out why it happens before I tweak any parameters.
>
> Currently, I set spark.executor.memory = 3g and spark.driver.memory = 3g,
> and I wonder if I still need to set these, given that the Spark
> documentation also mentions spark.memory.fraction and
> spark.dynamicAllocation.enabled?
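>
> (For reference, those settings can be passed when the session is created
> through Livy's POST /sessions REST call; a minimal sketch, with the host,
> port, and values below as placeholders rather than my actual setup:)
>
>     import json
>     import requests
>
>     # Hypothetical Livy endpoint; adjust host/port for the deployment.
>     payload = {
>         "kind": "spark",
>         "driverMemory": "3g",     # equivalent to spark.driver.memory
>         "executorMemory": "3g",   # equivalent to spark.executor.memory
>         "conf": {
>             "spark.dynamicAllocation.enabled": "true",
>             "spark.memory.fraction": "0.6",  # Spark 2.x default
>         },
>     }
>     r = requests.post(
>         "http://localhost:8998/sessions",
>         data=json.dumps(payload),
>         headers={"Content-Type": "application/json"},
>     )
>     print(r.json())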
>
> I am using Spark 2.3.0 and running Livy in client mode.
>
> *Also, how do I scale Livy? Should I use one session for every
> long-running streaming query, or should I have multiple sessions?*
>
> Stacktrace #1
>
> {
>     "id": 0,
>     "from": 102,
>     "total": 202,
>     "log": [
>         "18/04/08 19:13:40 ERROR MicroBatchExecution: Query [id = 7aedaf72-41e0-4be5-8ea6-e374bfbf0ae7, runId = b85bacee-d54e-421e-b453-450591e128c9] terminated with error",
>         "java.lang.OutOfMemoryError: GC overhead limit exceeded",
>         "\tat java.lang.StringCoding$StringEncoder.encode(StringCoding.java:300)",
>         "\tat java.lang.StringCoding.encode(StringCoding.java:344)",
>         "\tat java.lang.String.getBytes(String.java:918)",
>         "\tat java.io.UnixFileSystem.getLength(Native Method)",
>         "\tat java.io.File.length(File.java:974)",
>         "\tat org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.<init>(RawLocalFileSystem.java:626)",
>         "\tat org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:609)",
>         "\tat org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)",
>         "\tat org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)",
>         "\tat org.apache.hadoop.fs.FileSystem.rename(FileSystem.java:1309)",
>         "\tat org.apache.hadoop.fs.DelegateToFileSystem.renameInternal(DelegateToFileSystem.java:197)",
>         "\tat org.apache.hadoop.fs.AbstractFileSystem.renameInternal(AbstractFileSystem.java:748)",
>         "\tat org.apache.hadoop.fs.FilterFs.renameInternal(FilterFs.java:236)",
>         "\tat org.apache.hadoop.fs.AbstractFileSystem.rename(AbstractFileSystem.java:678)",
>         "\tat org.apache.hadoop.fs.FileContext.rename(FileContext.java:958)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog$FileContextManager.rename(HDFSMetadataLog.scala:374)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog.org$apache$spark$sql$execution$streaming$HDFSMetadataLog$$writeBatch(HDFSMetadataLog.scala:160)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog$$anonfun$add$1.apply$mcZ$sp(HDFSMetadataLog.scala:112)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog$$anonfun$add$1.apply(HDFSMetadataLog.scala:110)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog$$anonfun$add$1.apply(HDFSMetadataLog.scala:110)",
>         "\tat scala.Option.getOrElse(Option.scala:121)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog.add(HDFSMetadataLog.scala:110)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$org$apache$spark$sql$execution$streaming$MicroBatchExecution$$constructNextBatch$1.apply$mcV$sp(MicroBatchExecution.scala:339)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$org$apache$spark$sql$execution$streaming$MicroBatchExecution$$constructNextBatch$1.apply(MicroBatchExecution.scala:338)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$org$apache$spark$sql$execution$streaming$MicroBatchExecution$$constructNextBatch$1.apply(MicroBatchExecution.scala:338)",
>         "\tat org.apache.spark.sql.execution.streaming.ProgressReporter$class.reportTimeTaken(ProgressReporter.scala:271)",
>         "\tat org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution.org$apache$spark$sql$execution$streaming$MicroBatchExecution$$constructNextBatch(MicroBatchExecution.scala:338)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$runActivatedStream$1$$anonfun$apply$mcZ$sp$1.apply$mcV$sp(MicroBatchExecution.scala:128)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$runActivatedStream$1$$anonfun$apply$mcZ$sp$1.apply(MicroBatchExecution.scala:121)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$runActivatedStream$1$$anonfun$apply$mcZ$sp$1.apply(MicroBatchExecution.scala:121)",
>         "\tat org.apache.spark.sql.execution.streaming.ProgressReporter$class.reportTimeTaken(ProgressReporter.scala:271)",
>         "Exception in thread \"stream execution thread for [id = 7aedaf72-41e0-4be5-8ea6-e374bfbf0ae7, runId = b85bacee-d54e-421e-b453-450591e128c9]\" java.lang.OutOfMemoryError: GC overhead limit exceeded",
>         "\tat java.lang.StringCoding$StringEncoder.encode(StringCoding.java:300)",
>         "\tat java.lang.StringCoding.encode(StringCoding.java:344)",
>         "\tat java.lang.String.getBytes(String.java:918)",
>         "\tat java.io.UnixFileSystem.getLength(Native Method)",
>         "\tat java.io.File.length(File.java:974)",
>         "\tat org.apache.hadoop.fs.RawLocalFileSystem$DeprecatedRawLocalFileStatus.<init>(RawLocalFileSystem.java:626)",
>         "\tat org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:609)",
>         "\tat org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)",
>         "\tat org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)",
>         "\tat org.apache.hadoop.fs.FileSystem.rename(FileSystem.java:1309)",
>         "\tat org.apache.hadoop.fs.DelegateToFileSystem.renameInternal(DelegateToFileSystem.java:197)",
>         "\tat org.apache.hadoop.fs.AbstractFileSystem.renameInternal(AbstractFileSystem.java:748)",
>         "\tat org.apache.hadoop.fs.FilterFs.renameInternal(FilterFs.java:236)",
>         "\tat org.apache.hadoop.fs.AbstractFileSystem.rename(AbstractFileSystem.java:678)",
>         "\tat org.apache.hadoop.fs.FileContext.rename(FileContext.java:958)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog$FileContextManager.rename(HDFSMetadataLog.scala:374)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog.org$apache$spark$sql$execution$streaming$HDFSMetadataLog$$writeBatch(HDFSMetadataLog.scala:160)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog$$anonfun$add$1.apply$mcZ$sp(HDFSMetadataLog.scala:112)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog$$anonfun$add$1.apply(HDFSMetadataLog.scala:110)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog$$anonfun$add$1.apply(HDFSMetadataLog.scala:110)",
>         "\tat scala.Option.getOrElse(Option.scala:121)",
>         "\tat org.apache.spark.sql.execution.streaming.HDFSMetadataLog.add(HDFSMetadataLog.scala:110)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$org$apache$spark$sql$execution$streaming$MicroBatchExecution$$constructNextBatch$1.apply$mcV$sp(MicroBatchExecution.scala:339)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$org$apache$spark$sql$execution$streaming$MicroBatchExecution$$constructNextBatch$1.apply(MicroBatchExecution.scala:338)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$org$apache$spark$sql$execution$streaming$MicroBatchExecution$$constructNextBatch$1.apply(MicroBatchExecution.scala:338)",
>         "\tat org.apache.spark.sql.execution.streaming.ProgressReporter$class.reportTimeTaken(ProgressReporter.scala:271)",
>         "\tat org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:58)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution.org$apache$spark$sql$execution$streaming$MicroBatchExecution$$constructNextBatch(MicroBatchExecution.scala:338)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$runActivatedStream$1$$anonfun$apply$mcZ$sp$1.apply$mcV$sp(MicroBatchExecution.scala:128)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$runActivatedStream$1$$anonfun$apply$mcZ$sp$1.apply(MicroBatchExecution.scala:121)",
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution$$anonfun$runActivatedStream$1$$anonfun$apply$mcZ$sp$1.apply(MicroBatchExecution.scala:121)",
>         "\tat org.apache.spark.sql.execution.streaming.ProgressReporter$class.reportTimeTaken(ProgressReporter.scala:271)",
>         "Exception in thread \"dag-scheduler-event-loop\" java.lang.OutOfMemoryError: GC overhead limit exceeded",
>         "\tat java.lang.Class.getDeclaredMethods0(Native Method)",
>         "\tat java.lang.Class.privateGetDeclaredMethods(Class.java:2701)",
>         "\tat java.lang.Class.getDeclaredMethod(Class.java:2128)",
>         "\tat java.io.ObjectStreamClass.getPrivateMethod(ObjectStreamClass.java:1575)",
>         "\tat java.io.ObjectStreamClass.access$1700(ObjectStreamClass.java:79)",
>         "\tat java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:508)",
>         "\tat java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:482)",
>         "\tat java.security.AccessController.doPrivileged(Native Method)",
>         "\tat java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:482)",
>         "\tat java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:379)",
>         "\tat java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1134)",
>         "\tat java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)",
>         "\tat java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)",
>         "\tat java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)",
>         "\tat java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)",
>         "\tat java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)",
>         "\tat java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)",
>         "\tat java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)",
>         "\tat java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)",
>         "\tat java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)",
>         "\tat java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)",
>         "\tat java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)",
>         "\tat java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)",
>         "\tat java.io.ObjectOutputStream.writeArray(ObjectOutputStream.java:1378)",
>         "\tat java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1174)",
>         "\tat java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)",
>         "\tat java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)",
>         "\tat java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)",
>         "\tat java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)",
>         "\tat java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)",
>         "\tat java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)",
>         "\tat java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)"
>     ]
> }
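>
> (For context, both of these payloads are what Livy returns from its
> session log endpoint. A minimal way to pull such a slice, assuming the
> default port and session id 0 as in the payloads; the host and offsets
> are placeholders:)
>
>     import requests
>
>     # Fetch a window of the driver log for Livy session 0. The "from"
>     # offset and window size line up with the "from"/"total" fields
>     # visible in the JSON payloads here.
>     resp = requests.get(
>         "http://localhost:8998/sessions/0/log",
>         params={"from": 102, "size": 100},
>     )
>     for line in resp.json()["log"]:
>         print(line)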
>
> StackTrace2
>
> {
>     "id": 0,
>     "from": 102,
>     "total": 202,
>     "log": [
>         "\tat org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:117)",
>         "\tat org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:279)",
>         "\tat org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:189)",
>         "Caused by: java.lang.IllegalStateException: Error reading delta file file:/tmp/7a32968b1cea96f54c771da72784ae21/state/0/1/1.delta of HDFSStateStoreProvider[id = (op=0,part=1),dir = file:/tmp/7a32968b1cea96f54c771da72784ae21/state/0/1]: file:/tmp/7a32968b1cea96f54c771da72784ae21/state/0/1/1.delta does not exist",
>         "\tat org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.org$apache$spark$sql$execution$streaming$state$HDFSBackedStateStoreProvider$$updateFromDeltaFile(HDFSBackedStateStoreProvider.scala:371)",
>         "\tat org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$$anonfun$loadMap$1.apply$mcVJ$sp(HDFSBackedStateStoreProvider.scala:333)",
>         "\tat org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$$anonfun$loadMap$1.apply(HDFSBackedStateStoreProvider.scala:332)",
>         "\tat org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider$$anonfun$loadMap$1.apply(HDFSBackedStateStoreProvider.scala:332)",
>         "\tat scala.collection.immutable.NumericRange.foreach(NumericRange.scala:73)",
>         "\tat org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.loadMap(HDFSBackedStateStoreProvider.scala:332)",
>         "\tat org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.getStore(HDFSBackedStateStoreProvider.scala:196)",
>         "\tat org.apache.spark.sql.execution.streaming.state.StateStore$.get(StateStore.scala:369)",
>         "\tat org.apache.spark.sql.execution.streaming.state.StateStoreRDD.compute(StateStoreRDD.scala:74)",
>         "\tat org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)",
>         "\tat org.apache.spark.rdd.RDD.iterator(RDD.scala:288)",
>         "\tat org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)",
>         "\tat org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)",
>         "\tat org.apache.spark.rdd.RDD.iterator(RDD.scala:288)",
>         "\tat org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)",
>         "\tat org.apache.spark.scheduler.Task.run(Task.scala:109)",
>         "\tat org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:345)",
>         "\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)",
>         "\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)",
>         "\tat java.lang.Thread.run(Thread.java:748)",
>         "Caused by: java.io.FileNotFoundException: File file:/tmp/7a32968b1cea96f54c771da72784ae21/state/0/1/1.delta does not exist",
>         "\tat org.apache.hadoop.fs.RawLocalFileSystem.deprecatedGetFileStatus(RawLocalFileSystem.java:611)",
>         "\tat org.apache.hadoop.fs.RawLocalFileSystem.getFileLinkStatusInternal(RawLocalFileSystem.java:824)",
>         "\tat org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:601)",
>         "\tat org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:421)",
>         "\tat org.apache.hadoop.fs.ChecksumFileSystem$ChecksumFSInputChecker.<init>(ChecksumFileSystem.java:142)",
>         "\tat org.apache.hadoop.fs.ChecksumFileSystem.open(ChecksumFileSystem.java:346)",
>         "\tat org.apache.hadoop.fs.FileSystem.open(FileSystem.java:769)",
>         "\tat org.apache.spark.sql.execution.streaming.state.HDFSBackedStateStoreProvider.org$apache$spark$sql$execution$streaming$state$HDFSBackedStateStoreProvider$$updateFromDeltaFile(HDFSBackedStateStoreProvider.scala:368)",
>         "\t... 19 more",
>         "Exception in thread \"dispatcher-event-loop-3\" java.lang.OutOfMemoryError: GC overhead limit exceeded",
>         "\tat java.lang.Class.getDeclaredMethods0(Native Method)",
>         "\tat java.lang.Class.privateGetDeclaredMethods(Class.java:2701)",
>         "\tat java.lang.Class.getDeclaredMethod(Class.java:2128)",
>         "\tat java.io.ObjectStreamClass.getPrivateMethod(ObjectStreamClass.java:1475)",
>         "\tat java.io.ObjectStreamClass.access$1700(ObjectStreamClass.java:72)",
>         "\tat java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:498)",
>         "\tat java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:472)",
>         "\tat java.security.AccessController.doPrivileged(Native Method)",
>         "\tat java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:472)",
>         "\tat java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:369)",
>         "\tat java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1134)",
>         "\tat java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)",
>         "\tat java.io.ObjectOutputStream.defaultWriteObject(ObjectOutputStream.java:441)",
>         "\tat org.apache.spark.broadcast.TorrentBroadcast$$anonfun$writeObject$1.apply$mcV$sp(TorrentBroadcast.scala:204)",
>         "\tat org.apache.spark.broadcast.TorrentBroadcast$$anonfun$writeObject$1.apply(TorrentBroadcast.scala:202)",
>         "\tat org.apache.spark.broadcast.TorrentBroadcast$$anonfun$writeObject$1.apply(TorrentBroadcast.scala:202)",
>         "\tat org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1343)",
>         "\tat org.apache.spark.broadcast.TorrentBroadcast.writeObject(TorrentBroadcast.scala:202)",
>         "\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)",
>         "\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)",
>         "\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)",
>         "\tat java.lang.reflect.Method.invoke(Method.java:498)",
>         "\tat java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:1028)",
>         "\tat java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)",
>         "\tat java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)",
>         "\tat java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)",
>         "\tat java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)",
>         "\tat java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)",
>         "\tat java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)",
>         "\tat java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)",
>         "\tat java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)",
>         "\tat org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:43)",
>         "18/04/10 03:21:09 ERROR Utils: Uncaught exception in thread element-tracking-store-worker",
>         "java.lang.OutOfMemoryError: GC overhead limit exceeded",
>         "\tat org.apache.spark.util.kvstore.KVTypeInfo$MethodAccessor.get(KVTypeInfo.java:154)",
>         "\tat org.apache.spark.util.kvstore.InMemoryStore$InMemoryView.compare(InMemoryStore.java:248)",
>         "\tat org.apache.spark.util.kvstore.InMemoryStore$InMemoryView.lambda$iterator$0(InMemoryStore.java:203)",
>         "\tat org.apache.spark.util.kvstore.InMemoryStore$InMemoryView$$Lambda$24/1059294725.compare(Unknown Source)",
>         "\tat java.util.TimSort.binarySort(TimSort.java:296)",
>         "\tat java.util.TimSort.sort(TimSort.java:239)",
>         "\tat java.util.Arrays.sort(Arrays.java:1512)",
>         "\tat java.util.ArrayList.sort(ArrayList.java:1454)",
>         "\tat java.util.Collections.sort(Collections.java:175)",
>         "\tat org.apache.spark.util.kvstore.InMemoryStore$InMemoryView.iterator(InMemoryStore.java:203)",
>         "\tat scala.collection.convert.Wrappers$JIterableWrapper.iterator(Wrappers.scala:54)",
>         "\tat scala.collection.IterableLike$class.foreach(IterableLike.scala:72)",
>         "\tat scala.collection.AbstractIterable.foreach(Iterable.scala:54)",
>         "\tat org.apache.spark.status.AppStatusListener$$anonfun$org$apache$spark$status$AppStatusListener$$cleanupStages$1.apply(AppStatusListener.scala:891)",
>         "\tat org.apache.spark.status.AppStatusListener$$anonfun$org$apache$spark$status$AppStatusListener$$cleanupStages$1.apply(AppStatusListener.scala:871)",
>         "\tat scala.collection.immutable.List.foreach(List.scala:381)",
>         "\tat org.apache.spark.status.AppStatusListener.org$apache$spark$status$AppStatusListener$$cleanupStages(AppStatusListener.scala:871)",
>         "\tat org.apache.spark.status.AppStatusListener$$anonfun$3.apply$mcVJ$sp(AppStatusListener.scala:84)",
>         "\tat org.apache.spark.status.ElementTrackingStore$$anonfun$write$1$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(ElementTrackingStore.scala:109)",
>         "\tat org.apache.spark.status.ElementTrackingStore$$anonfun$write$1$$anonfun$apply$1$$anonfun$apply$mcV$sp$1.apply(ElementTrackingStore.scala:107)",
>         "\tat scala.collection.immutable.List.foreach(List.scala:381)",
>         "\tat org.apache.spark.status.ElementTrackingStore$$anonfun$write$1$$anonfun$apply$1.apply$mcV$sp(ElementTrackingStore.scala:107)",
>         "\tat org.apache.spark.status.ElementTrackingStore$$anonfun$write$1$$anonfun$apply$1.apply(ElementTrackingStore.scala:105)",
>         "\tat org.apache.spark.status.ElementTrackingStore$$anonfun$write$1$$anonfun$apply$1.apply(ElementTrackingStore.scala:105)",
>         "\tat org.apache.spark.util.Utils$.tryLog(Utils.scala:2001)",
>         "\tat org.apache.spark.status.ElementTrackingStore$$anon$1.run(ElementTrackingStore.scala:91)",
>         "\tat java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)",
>         "\tat java.util.concurrent.FutureTask.run(FutureTask.java:266)",
>         "\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)",
>         "\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)",
>         "\tat java.lang.Thread.run(Thread.java:748)"
>     ]
> }
>
> Thanks!