spark-reviews mailing list archives

From andrewor14 <...@git.apache.org>
Subject [GitHub] spark pull request: SPARK-1565 (Addendum): Replace `run-example` w...
Date Fri, 09 May 2014 03:21:18 GMT
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/704#discussion_r12463032
  
    --- Diff: bin/run-example ---
    @@ -49,46 +31,31 @@ fi
     
     if [[ -z $SPARK_EXAMPLES_JAR ]]; then
       echo "Failed to find Spark examples assembly in $FWDIR/lib or $FWDIR/examples/target" >&2
    -  echo "You need to build Spark with sbt/sbt assembly before running this program" >&2
    +  echo "You need to build Spark before running this program" >&2
       exit 1
     fi
     
    +SPARK_EXAMPLES_JAR_REL=${SPARK_EXAMPLES_JAR#$FWDIR/}
     
    -# Since the examples JAR ideally shouldn't include spark-core (that dependency should be
    -# "provided"), also add our standard Spark classpath, built using compute-classpath.sh.
    -CLASSPATH=`$FWDIR/bin/compute-classpath.sh`
    -CLASSPATH="$SPARK_EXAMPLES_JAR:$CLASSPATH"
    -
    -if $cygwin; then
    -    CLASSPATH=`cygpath -wp $CLASSPATH`
    -    export SPARK_EXAMPLES_JAR=`cygpath -w $SPARK_EXAMPLES_JAR`
    -fi
    -
    -# Find java binary
    -if [ -n "${JAVA_HOME}" ]; then
    -  RUNNER="${JAVA_HOME}/bin/java"
    -else
    -  if [ `command -v java` ]; then
    -    RUNNER="java"
    -  else
    -    echo "JAVA_HOME is not set" >&2
    -    exit 1
    -  fi
    -fi
    +EXAMPLE_CLASS="<example-class>"
    +EXAMPLE_ARGS="[<example args>]"
    +EXAMPLE_MASTER=${MASTER:-"<master>"}
     
    -# Set JAVA_OPTS to be able to load native libraries and to set heap size
    -JAVA_OPTS="$SPARK_JAVA_OPTS"
    -# Load extra JAVA_OPTS from conf/java-opts, if it exists
    -if [ -e "$FWDIR/conf/java-opts" ] ; then
    -  JAVA_OPTS="$JAVA_OPTS `cat $FWDIR/conf/java-opts`"
    +if [ -n "$1" ]; then
    +  EXAMPLE_CLASS="$1"
    +  shift
     fi
    -export JAVA_OPTS
     
    -if [ "$SPARK_PRINT_LAUNCH_COMMAND" == "1" ]; then
    -  echo -n "Spark Command: "
    -  echo "$RUNNER" -cp "$CLASSPATH" $JAVA_OPTS "$@"
    -  echo "========================================"
    -  echo
    +if [ -n "$1" ]; then
    +  EXAMPLE_ARGS="$@"
     fi
     
    -exec "$RUNNER" -cp "$CLASSPATH" $JAVA_OPTS "$@"
    +echo "NOTE: This script has been replaced with ./bin/spark-submit. Please run:" >&2
    +echo
    +echo "./bin/spark-submit \\" >&2
    --- End diff --
    
    Well, but then you have streaming examples and mllib examples. Do we expect the user to type in `mllib.MovieLensALS` then? I actually think `org.apache.spark.examples.SparkPi` is more consistent with the rest (i.e. `SparkPi`). Maybe we should accept both.
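    A minimal sketch (my own illustration, not code from this PR) of how "accept both" could work: prepend the examples package prefix only when the user hasn't already supplied it, so bare names like `SparkPi`, subpackage names like `mllib.MovieLensALS`, and fully qualified names all resolve to the same class.

    ```shell
    #!/usr/bin/env bash
    # Hypothetical helper: normalize an example class name. If the argument
    # does not already start with the examples package, prepend it; otherwise
    # pass it through unchanged.
    resolve_example_class() {
      local class="$1"
      if [[ "$class" != org.apache.spark.examples* ]]; then
        class="org.apache.spark.examples.$class"
      fi
      echo "$class"
    }

    resolve_example_class SparkPi                           # org.apache.spark.examples.SparkPi
    resolve_example_class mllib.MovieLensALS                # org.apache.spark.examples.mllib.MovieLensALS
    resolve_example_class org.apache.spark.examples.SparkPi # org.apache.spark.examples.SparkPi
    ```

    This keeps the fully qualified form working for scripts while letting interactive users type the short name.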


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---
