Date: Wed, 30 Aug 2017 18:57:01 +0000 (UTC)
From: "Jacek Laskowski (JIRA)"
To: issues@spark.apache.org
Subject: [jira] [Commented] (SPARK-21728) Allow SparkSubmit to use logging

[ https://issues.apache.org/jira/browse/SPARK-21728?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16147840#comment-16147840 ]

Jacek Laskowski commented on SPARK-21728:
-----------------------------------------

The idea behind the custom {{conf/log4j.properties}} is to disable all logging and, for now, enable only {{org.apache.spark.sql.execution.streaming}}.
{code}
$ cat conf/log4j.properties
# Turn all logging off by default; the console appender below is used
# only by the loggers explicitly enabled further down.
log4j.rootCategory=OFF, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Set the default spark-shell log level to WARN. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=WARN

# Settings to quiet third-party logs that are too verbose
log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR

# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR

#log4j.logger.org.apache.spark=OFF
log4j.logger.org.apache.spark.metrics.MetricsSystem=WARN

# Structured Streaming
log4j.logger.org.apache.spark.sql.execution.streaming.StreamExecution=DEBUG
log4j.logger.org.apache.spark.sql.execution.streaming.ProgressReporter=INFO
log4j.logger.org.apache.spark.sql.execution.streaming.RateStreamSource=DEBUG
log4j.logger.org.apache.spark.sql.kafka010.KafkaSource=DEBUG
log4j.logger.org.apache.spark.sql.kafka010.KafkaOffsetReader=DEBUG
log4j.logger.org.apache.spark.sql.execution.streaming.FlatMapGroupsWithStateExec=INFO
{code}

> Allow SparkSubmit to use logging
> --------------------------------
>
>                 Key: SPARK-21728
>                 URL: https://issues.apache.org/jira/browse/SPARK-21728
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Marcelo Vanzin
>            Assignee: Marcelo Vanzin
>            Priority: Minor
>             Fix For: 2.3.0
>
>
> Currently, code in {{SparkSubmit}} cannot call classes or methods that initialize the Spark {{Logging}} framework. That is because at that point {{SparkSubmit}} does not yet know which application will run, and logging is initialized differently for certain special applications (notably, the shells).
> It would be better if {{SparkSubmit}} either did the logging initialization earlier, based on the application to be run, or did it in a way that could be overridden later, when the app initializes.
> Without this, a few parts of {{SparkSubmit}} currently duplicate code from other parts of Spark just to avoid logging. For example:
> * [downloadFiles|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala#L860] replicates code from Utils.scala
> * [createTempDir|https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/deploy/DependencyUtils.scala#L54] replicates code from Utils.scala and installs its own shutdown hook
> * a few parts of the code could use {{SparkConf}} but currently can't because of the logging issue.
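
For illustration only (not part of the original message, and not Spark's actual implementation): a minimal Scala sketch of the kind of one-shot, overridable logging initialization the issue asks for. The object and method names ({{LoggingInit}}, {{initializeIfNeeded}}, {{reset}}) are hypothetical; only the log4j 1.x calls are real API.

{code}
import java.net.URL
import org.apache.log4j.{Level, LogManager, PropertyConfigurator}

object LoggingInit {
  @volatile private var initialized = false

  /** One-shot log4j setup that a later caller (e.g. a shell) can override. */
  def initializeIfNeeded(isShell: Boolean): Unit = synchronized {
    if (!initialized) {
      // Load log4j.properties from the classpath, if present
      // (Spark's conf/ directory is on the driver classpath).
      val url: URL = Thread.currentThread().getContextClassLoader
        .getResource("log4j.properties")
      if (url != null) {
        PropertyConfigurator.configure(url)
      }
      // Shells get quieter defaults so the REPL prompt is not drowned out.
      if (isShell) {
        LogManager.getRootLogger.setLevel(Level.WARN)
      }
      initialized = true
    }
  }

  /** Let the application redo the initialization with its own settings. */
  def reset(): Unit = synchronized {
    initialized = false
  }
}
{code}

With something along these lines, {{SparkSubmit}} could call {{initializeIfNeeded}} early without freezing the configuration, and a shell could later call {{reset}} and re-initialize with its own defaults, which is what a custom {{conf/log4j.properties}} like the one above relies on.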