spark-dev mailing list archives

From aarondav <...@git.apache.org>
Subject [GitHub] spark pull request: [SPARK-1186] : Enrich the Spark Shell to suppo...
Date Thu, 13 Mar 2014 02:57:59 GMT
Github user aarondav commented on a diff in the pull request:

    https://github.com/apache/spark/pull/116#discussion_r10551389
  
    --- Diff: bin/spark-shell ---
    @@ -30,69 +30,367 @@ esac
     # Enter posix mode for bash
     set -o posix
     
    -CORE_PATTERN="^[0-9]+$"
    -MEM_PATTERN="^[0-9]+[m|g|M|G]$"
    -
    +## Global script variables
     FWDIR="$(cd `dirname $0`/..; pwd)"
     
    -if [ "$1" = "--help" ] || [ "$1" = "-h" ]; then
    -	echo "Usage: spark-shell [OPTIONS]"
    -	echo "OPTIONS:"
    -	echo "-c --cores num, the maximum number of cores to be used by the spark shell"
    -	echo "-em --execmem num[m|g], the memory used by each executor of spark shell"
    -	echo "-dm --drivermem num[m|g], the memory used by the spark shell and driver"
    -	echo "-h --help, print this help information" 
    -	exit
    -fi
    +VERBOSE=0
    +DRY_RUN=0
    +SPARK_REPL_OPTS="${SPARK_REPL_OPTS:-""}"
    +MASTER=""
    +
    +#CLI Color Templates
    +txtund=$(tput sgr 0 1)          # Underline
    +txtbld=$(tput bold)             # Bold
    +bldred=${txtbld}$(tput setaf 1) # red
    +bldyel=${txtbld}$(tput setaf 3) # yellow
    +bldblu=${txtbld}$(tput setaf 4) # blue
    +bldwht=${txtbld}$(tput setaf 7) # white
    +txtrst=$(tput sgr0)             # Reset
    +info=${bldwht}*${txtrst}        # Feedback
    +pass=${bldblu}*${txtrst}
    +warn=${bldred}*${txtrst}
    +ques=${bldblu}?${txtrst}
    +
    +# Helper function to describe the script usage
    +function usage() {
    +    cat << EOF
    +
    +${txtbld}Usage${txtrst}: spark-shell [OPTIONS]
    +
    +${txtbld}OPTIONS${txtrst}:
    +
    +${txtund}basic${txtrst}:
    +
    +    -h  --help              : print this help information.
    +    -c  --executor-cores    : the maximum number of cores to be used by the spark shell.
    +    -em --executor-memory   : num[m|g], the memory used by each executor of spark shell.
    +    -dm --drivermem  --driver-memory     : num[m|g], the memory used by the spark shell and driver.
    +
    +${txtund}soon to be deprecated${txtrst}:
    +
    +    --cores         : please use -c/--executor-cores
    +
    +${txtund}other options${txtrst}:
    +
    +    -mip --master-ip     : Spark Master IP/Host Address
    +    -mp  --master-port   : num, Spark Master Port
    --- End diff --
    
    I'm not sure the "num" qualifications are useful; I think all the parameters either start
with "number of" or refer to a "port", which is pretty clear.
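
    As an aside on the color-template approach in the diff: `tput` exits non-zero on
    terminals without color support, so the variables can end up holding error text.
    A minimal sketch of a defensive variant (the `warn` helper name here is purely
    illustrative, not part of the patch):

    ```shell
    #!/usr/bin/env bash
    # Build the color variables only if the terminal supports them; otherwise
    # fall back to empty strings so the script still works in dumb terminals
    # and when output is piped.
    if txtbld=$(tput bold 2>/dev/null); then
        bldred=${txtbld}$(tput setaf 1)  # bold red, as in the patch
        txtrst=$(tput sgr0)              # reset attributes
    else
        txtbld="" bldred="" txtrst=""
    fi

    # Example helper in the style of the patch's $warn marker.
    warn() {
        echo "${bldred}*${txtrst} $1"
    }

    warn "low memory"
    ```

    When stdout is not a TTY the escape sequences simply collapse to nothing, so
    the message text itself is unchanged.
    
    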


