Date: Sat, 12 May 2018 02:41:00 +0000 (UTC)
From: "Nikolay Sokolov (JIRA)"
To: issues@spark.apache.org
Subject: [jira] [Updated] (SPARK-24174) Expose Hadoop config as part of /environment API

[ https://issues.apache.org/jira/browse/SPARK-24174?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nikolay Sokolov updated SPARK-24174:
------------------------------------

Description:

Currently, the /environment API call exposes only system properties and SparkConf. However, in some cases when Spark is used in conjunction with Hadoop, it is useful to know the Hadoop configuration properties as well: for example, HDFS or GS buffer sizes, Hive metastore settings, and so on.

So it would be good to have Hadoop properties exposed in the /environment API, for example:

{code:none}
GET .../application_1525395994996_5/environment
{
  "runtime": {"javaVersion": "1.8.0_131 (Oracle Corporation)", ...},
  "sparkProperties": [["spark.yarn.jars", "local:/usr/lib/spark/jars/*"], ...],
  "systemProperties": [["java.io.tmpdir", "/tmp"], ...],
  "classpathEntries": [["/usr/lib/hadoop/hadoop-annotations.jar", "System Classpath"], ...],
  "hadoopProperties": [["dfs.stream-buffer-size", 4096], ...]
}
{code}

was:

Currently, the /environment API call exposes only system properties and SparkConf. However, in some cases when Spark is used in conjunction with Hadoop, it is useful to know the Hadoop configuration properties as well: for example, HDFS or GS buffer sizes, Hive metastore settings, and so on.
So it would be good to have Hadoop properties exposed in the /environment API, for example:

{code:none}
GET .../application_1525395994996_5/environment
{
  "runtime": {"javaVersion": "1.8.0_131 (Oracle Corporation)", ...},
  "sparkProperties": [["spark.yarn.jars", "local:/usr/lib/spark/jars/*"], ...],
  "systemProperties": [["java.io.tmpdir", "/tmp"], ...],
  "classpathEntries": [["/usr/lib/hadoop/hadoop-annotations.jar", "System Classpath"], ...],
  "hadoopProperties": [["dfs.stream-buffer-size": 4096], ...]
}
{code}

> Expose Hadoop config as part of /environment API
> ------------------------------------------------
>
>                 Key: SPARK-24174
>                 URL: https://issues.apache.org/jira/browse/SPARK-24174
>             Project: Spark
>          Issue Type: Wish
>          Components: Spark Core
>    Affects Versions: 2.1.0
>            Reporter: Nikolay Sokolov
>            Priority: Minor
>              Labels: features, usability

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org
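For illustration, a client could consume the proposed `hadoopProperties` field like this. This is a sketch assuming the response shape proposed above (the payload below is an inline sample, not real API output; the `[name, value]` pair convention is borrowed from the existing `sparkProperties` and `systemProperties` fields):

```python
import json

# Sample payload in the proposed /environment response shape
# (illustrative values only; a real client would fetch this over HTTP)
payload = json.loads("""
{
  "runtime": {"javaVersion": "1.8.0_131 (Oracle Corporation)"},
  "sparkProperties": [["spark.yarn.jars", "local:/usr/lib/spark/jars/*"]],
  "systemProperties": [["java.io.tmpdir", "/tmp"]],
  "classpathEntries": [["/usr/lib/hadoop/hadoop-annotations.jar", "System Classpath"]],
  "hadoopProperties": [["dfs.stream-buffer-size", 4096]]
}
""")

# hadoopProperties is a list of [name, value] pairs, like the other
# property lists, so it converts directly to a dict for lookup
hadoop_conf = dict(payload["hadoopProperties"])
print(hadoop_conf["dfs.stream-buffer-size"])  # → 4096
```

Keeping the same pair-list encoding as the existing fields would let existing consumers of `sparkProperties` and `systemProperties` reuse their parsing code unchanged.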