From: Apache Wiki <wikidiffs@apache.org>
To: common-commits@hadoop.apache.org
Date: Fri, 15 Aug 2014 13:59:59 -0000
Message-ID: <20140815135959.28606.57339@eos.apache.org>
Subject: [Hadoop Wiki] Update of "ShellScriptProgrammingGuide" by SomeOtherAccount

Dear Wiki user,

You have subscribed to a wiki page or wiki category on "Hadoop
Wiki" for change notification.

The "ShellScriptProgrammingGuide" page has been changed by SomeOtherAccount:

https://wiki.apache.org/hadoop/ShellScriptProgrammingGuide?action=diff&rev1=12&rev2=13

   5. At this point, this is where the majority of your code goes. Programs should process the rest of the arguments and do whatever their script is supposed to do.

- 6. Before executing a Java program or giving user output, call `hadoop_finalize`. This finishes up the configuration details: adds the user class path, fixes up any missing Java properties, configures library paths, etc.
+ 6. Before executing a Java program (preferably via hadoop_java_exec) or giving user output, call `hadoop_finalize`. This finishes up the configuration details: adds the user class path, fixes up any missing Java properties, configures library paths, etc.

   7. Either an `exit` or an `exec`. This should return 0 for success and 1 or higher for failure.

@@ -36, +36 @@

   c. For methods that can also be daemons, set `daemon=true`. This will allow for the `--daemon` option to work.

- d. For HDFS daemons, if it supports security, set `secure_service=true` and `secure_user` equal to the user that should run the daemon.
+ d. If it supports security, set `secure_service=true` and `secure_user` equal to the user that should run the daemon.
+
+ 3. If a new subcommand needs one or more extra environment variables:
+
+ a. Add documentation and a '''commented-out''' example that shows the default setting.
+
+ b. Add the default(s) to that subproject's hadoop_subproject_init, or hadoop_basic_init for common, using the current shell vars as a guide. (Specifically, it should allow overriding!)

= Better Practices =

- * Avoid adding more globals or project specific globals and/or entries in *-env.sh. In a lot of cases, there is pre-existing functionality that already does what you might need to do.
Additionally, every configuration option makes it that much harder for end users. If you do need to add a new global variable for additional functionality, start it with HADOOP_ for common, HDFS_ for HDFS, YARN_ for YARN, and MAPRED_ for MapReduce. It should be documented in either *-env.sh (for user overridable parts) or hadoop-functions.sh (for internal-only globals). This helps prevents our variables from clobbering other people.
+ * Avoid adding more globals or project-specific globals and/or entries in *-env.sh, and/or add a comment at the bottom here. In a lot of cases, there is pre-existing functionality that already does what you might need to do. Additionally, every configuration option makes it that much harder for end users. If you do need to add a new global variable for additional functionality, start it with HADOOP_ for common, HDFS_ for HDFS, YARN_ for YARN, and MAPRED_ for MapReduce. It should be documented in either *-env.sh (for user-overridable parts) or hadoop-functions.sh (for internal-only globals). This helps prevent our variables from clobbering other people's.

- * Remember that abc_xyz_OPTS can and should act as a catch-all for Java daemon options. Custom heap environment variables add unnecessary complexity for both the user and us.
+ * Remember that abc_xyz_OPTS can and should act as a catch-all for Java daemon options. Custom heap environment variables add unnecessary complexity for both the user and us. They should be avoided.

   * Avoid multi-level `if`'s where the comparisons are static strings. Use case statements instead, as they are easier to read.

- * BSDisms, GNUisms, or SysVisms. Double check your esoteric command and parameters on multiple operating systems. (Usually a quick Google search will pull up man pages for other OSes.)
+ * BSDisms, GNUisms, or SysVisms. Double check your command, parameters, and/or reading of OS files on multiple operating systems.
(Usually a quick Google search will pull up man pages for other OSes.) In particular, check out Linux, OS X, FreeBSD, Solaris, and AIX. It's reasonable to expect code to work for approximately three to five years. Also take note of hadoop_os_tricks, where OS-specific startup stuff can go, so long as a user would never want to change it...

   * Output to the screen, especially for daemons, should be avoided. No one wants to see a multitude of messages during startup. Errors should go to STDERR instead of STDOUT. Use the `hadoop_error` function to make it clear in the code.

@@ -56, +63 @@

   * The [[http://wiki.bash-hackers.org/scripting/style|Bash Hackers website]] and [[https://google-styleguide.googlecode.com/svn/trunk/shell.xml|Google]] have great general advice for style guidelines in bash. Additionally, Paul Lutus's [[http://www.arachnoid.com/python/beautify_bash_program.html|Beautify Bash]] does a tremendously good job reformatting bash.

- * A decent shell lint is available at http://www.shellcheck.net . Mac users can `brew install shellcheck` to install it locally. Like lint, however, be aware that it will sometimes flag things that are legitimate.
+ * A decent shell lint is available at http://www.shellcheck.net . Mac users can `brew install shellcheck` to install it locally. Like lint, however, be aware that it will sometimes flag things that are legitimate. These can be marked using a 'shellcheck disable' comment. (Usually, the flag for $HADOOP_OPTS being called without quotes is our biggest offense that shellcheck flags. Our usage without quotes is correct for the current code base. It is, however, a bad practice and shellcheck is correct for telling us about it.)

= Standard Environment Variables =
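The practices in the diff above (an overridable default, errors to STDERR via `hadoop_error`, `hadoop_finalize` before doing real work, a case statement instead of nested ifs, and exit 0/1 semantics) can be sketched together in one small script. Note this is only an illustrative sketch: `hadoop_error` and `hadoop_finalize` are simplified stand-ins here (the real implementations live in hadoop-functions.sh), and the `MYPROJ_*` names and `logdir` subcommand are hypothetical.

```shell
#!/usr/bin/env bash
# Sketch of a Hadoop-style subcommand script, following the guide above.
# NOTE: hadoop_error and hadoop_finalize are simplified stand-ins; the
# real versions come from hadoop-functions.sh. MYPROJ_* is hypothetical.

# An overridable default, per the "it should allow overriding!" advice:
# a value the user has already exported wins over the shipped default.
MYPROJ_LOG_DIR="${MYPROJ_LOG_DIR:-/tmp/myproj-logs}"

# Errors go to STDERR, never STDOUT.
hadoop_error() {
  echo "$*" 1>&2
}

# Stand-in for hadoop_finalize: last-minute configuration details
# (user classpath, Java properties, library paths) would go here.
hadoop_finalize() {
  :
}

main() {
  local subcmd=$1
  shift

  # A case statement, not multi-level ifs, for static-string comparisons.
  case ${subcmd} in
    logdir)
      hadoop_finalize
      echo "${MYPROJ_LOG_DIR}"
      exit 0
    ;;
    *)
      hadoop_error "ERROR: unknown subcommand '${subcmd}'"
      exit 1
    ;;
  esac
}
```

A caller would finish the script with `main "$@"`; per step 7 above, every path through `main` ends in an `exit` that returns 0 for success and 1 or higher for failure.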