From: pwendell
To: dev@spark.apache.org
Reply-To: dev@spark.apache.org
Subject: [GitHub] spark pull request: [SPARK-1132] Persisting Web UI through refacto...
Date: Fri, 14 Mar 2014 05:50:14 +0000 (UTC)

Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/42#discussion_r10597094

--- Diff: core/src/main/scala/org/apache/spark/ui/SparkUI.scala ---
@@ -68,19 +105,53 @@ private[spark] class SparkUI(sc: SparkContext) extends Logging {

   /** Initialize all components of the server */
   def start() {
-    // NOTE: This is decoupled from bind() because of the following dependency cycle:
-    // DAGScheduler() requires that the port of this server is known
-    // This server must register all handlers, including JobProgressUI, before binding
-    // JobProgressUI registers a listener with SparkContext, which requires sc to initialize
+    storage.start()
     jobs.start()
+    env.start()
     exec.start()
+
+    // Listen for events from the SparkContext if it exists, otherwise from persisted storage
+    val eventBus = if (live) {
+      val loggingEnabled = conf.getBoolean("spark.eventLog.enabled", false)
--- End diff --

I'm referring to lines 115-120.

---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastructure@apache.org or file a JIRA ticket with INFRA.
---
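For context, the pattern under discussion in the diff is selecting an event source depending on whether the UI is attached to a live SparkContext or is rendering persisted event logs. The sketch below illustrates that branching with hypothetical placeholder types (`EventBus`, `LiveBus`, `ReplayBus`) and a plain `Map` in place of `SparkConf`; only the `spark.eventLog.enabled` key and the `live` flag come from the diff itself, everything else is an assumption for illustration.

```scala
// Hedged sketch, not Spark's actual classes: choose an event source based on
// whether the UI is live or replaying persisted logs.
trait EventBus { def describe: String }

// Live mode: events would stream from the running SparkContext.
case class LiveBus(loggingEnabled: Boolean) extends EventBus {
  def describe: String = s"live bus (event logging enabled: $loggingEnabled)"
}

// Replay mode: events would be re-read from persisted event logs.
case class ReplayBus(logDir: String) extends EventBus {
  def describe: String = s"replay bus reading from $logDir"
}

// Mirrors the diff's `if (live) { val loggingEnabled = conf.getBoolean(...) ... }`
// branch; `conf` here is a plain Map standing in for SparkConf.
def selectEventBus(live: Boolean, conf: Map[String, String]): EventBus =
  if (live) {
    val loggingEnabled = conf.getOrElse("spark.eventLog.enabled", "false").toBoolean
    LiveBus(loggingEnabled)
  } else {
    // "spark.eventLog.dir" and its default are hypothetical placeholders.
    ReplayBus(conf.getOrElse("spark.eventLog.dir", "/tmp/spark-events"))
  }
```

The point of the branch is that the same set of UI tabs (`storage`, `jobs`, `env`, `exec`) can be fed from either source once they are started.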