hadoop-common-commits mailing list archives

From a.@apache.org
Subject hadoop git commit: HADOOP-12935. API documentation for dynamic subcommands (aw)
Date Tue, 26 Apr 2016 00:54:00 GMT
Repository: hadoop
Updated Branches:
  refs/heads/HADOOP-12930 8eadd7145 -> a86b99c82

HADOOP-12935. API documentation for dynamic subcommands (aw)

Project: http://git-wip-us.apache.org/repos/asf/hadoop/repo
Commit: http://git-wip-us.apache.org/repos/asf/hadoop/commit/a86b99c8
Tree: http://git-wip-us.apache.org/repos/asf/hadoop/tree/a86b99c8
Diff: http://git-wip-us.apache.org/repos/asf/hadoop/diff/a86b99c8

Branch: refs/heads/HADOOP-12930
Commit: a86b99c82f337a8e9214737fafb944c7e0b97569
Parents: 8eadd71
Author: Allen Wittenauer <aw@apache.org>
Authored: Mon Mar 28 09:00:07 2016 -0700
Committer: Allen Wittenauer <aw@apache.org>
Committed: Mon Apr 25 17:53:47 2016 -0700

 .../src/site/markdown/UnixShellGuide.md         | 46 +++++++++++++++++++-
 1 file changed, 44 insertions(+), 2 deletions(-)

diff --git a/hadoop-common-project/hadoop-common/src/site/markdown/UnixShellGuide.md b/hadoop-common-project/hadoop-common/src/site/markdown/UnixShellGuide.md
index caa3aa7..8f3f261 100644
--- a/hadoop-common-project/hadoop-common/src/site/markdown/UnixShellGuide.md
+++ b/hadoop-common-project/hadoop-common/src/site/markdown/UnixShellGuide.md
@@ -87,7 +87,7 @@ Shell profiles may be installed in either `${HADOOP_CONF_DIR}/shellprofile.d`
 An example of a shell profile is in the libexec directory.
-## Shell API
+### Shell API
 Hadoop's shell code has a [function library](./UnixShellAPI.html) that is open for administrators
 and developers to use to assist in their configuration and advanced feature management.  These
 APIs follow the standard [Hadoop Interface Classification](./InterfaceClassification.html),
 with one addition: Replaceable.
@@ -95,6 +95,48 @@ The shell code allows for core functions to be overridden. However, not all func
 In order to replace a function, create a file called `hadoop-user-functions.sh` in the `${HADOOP_CONF_DIR}`
 directory.  Simply define the new, replacement function in this file and the system will pick
 it up automatically.  There may be as many replacement functions as needed in this file.
 Examples of function replacement are in the `hadoop-user-functions.sh.examples` file.
 
 Functions that are marked Public and Stable are safe to use in shell profiles as-is.  Other
 functions may change in a minor release.
+### Dynamic Subcommands
+
+Utilizing the above, it is possible for third parties to add their own subcommands to the
+primary Hadoop shell scripts (hadoop, hdfs, mapred, yarn).
+
+Prior to executing a subcommand, the primary scripts will check for the existence of a
+(scriptname)_subcommand_(subcommand) function.  This function gets executed with the parameters
+set to all remaining command line arguments.  For example, if the following function is defined:
+
+```bash
+function yarn_subcommand_hello
+{
+  echo "$@"
+}
+```
+
+then executing `yarn --debug hello world I see you` will activate script debugging and call the
+`yarn_subcommand_hello` function as:
+
+```bash
+yarn_subcommand_hello world I see you
+```
+
+which will result in the output of:
+
+```bash
+world I see you
+```
+
+It is also possible to add the new subcommands to the usage output. The `hadoop_add_subcommand`
+function adds text to the usage output.  Utilizing the standard HADOOP_SHELL_EXECNAME variable,
+we can limit which command gets our new function.
+
+```bash
+if [[ "${HADOOP_SHELL_EXECNAME}" = "yarn" ]]; then
+  hadoop_add_subcommand "hello" "Print some text to the screen"
+fi
+```
+
+This functionality may also be used to override the built-ins.  For example, defining:
+
+```bash
+function hdfs_subcommand_fetchdt
+{
+  ...
+}
+```
+
+... will replace the existing `hdfs fetchdt` subcommand with a custom one.
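The lookup-and-invoke behavior the added section documents can be exercised outside Hadoop with a small sketch. Here `dispatch_subcommand` is a hypothetical stand-in for the primary scripts' internal dispatch, not actual Hadoop code; only the `(scriptname)_subcommand_(subcommand)` naming convention is taken from the documentation above.

```bash
# Hypothetical sketch of the documented dispatch convention; not Hadoop's
# actual implementation.
function dispatch_subcommand
{
  local scriptname=$1
  local subcmd=$2
  shift 2
  # Check for a (scriptname)_subcommand_(subcommand) function, as the
  # primary scripts are documented to do before executing a subcommand.
  if declare -F "${scriptname}_subcommand_${subcmd}" >/dev/null; then
    # Invoke it with the remaining command line arguments.
    "${scriptname}_subcommand_${subcmd}" "$@"
  else
    echo "unknown subcommand: ${subcmd}" >&2
    return 1
  fi
}

# A user-defined subcommand, as in the documentation above:
function yarn_subcommand_hello
{
  echo "$@"
}
```

With these definitions, `dispatch_subcommand yarn hello world I see you` prints `world I see you`, mirroring the `yarn --debug hello world I see you` example in the diff.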
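The function-replacement mechanism in the context lines above (sourcing `hadoop-user-functions.sh` after the core library) rests on a plain bash property: a later function definition silently shadows an earlier one. A minimal, self-contained illustration, where `hadoop_mkdir` is a stand-in name rather than a claim about the real Shell API:

```bash
# "Core" definition, as a shell function library might provide it:
function hadoop_mkdir
{
  mkdir -p "$1"
}

# "User" replacement, as hadoop-user-functions.sh would supply it.
# Because it is defined later, this version wins.
function hadoop_mkdir
{
  echo "would mkdir $1"
}
```

After both definitions are sourced, any caller of `hadoop_mkdir` gets the replacement, which is exactly how the documented override of core functions works.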
