spark-reviews mailing list archives

From nchammas <...@git.apache.org>
Subject [GitHub] spark pull request: [SPARK-4137] [EC2] Don't change working dir on...
Date Thu, 30 Oct 2014 04:10:02 GMT
Github user nchammas commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2988#discussion_r19587131
  
    --- Diff: ec2/spark_ec2.py ---
    @@ -718,12 +726,16 @@ def get_num_disks(instance_type):
             return 1
     
     
    -# Deploy the configuration file templates in a given local directory to
    -# a cluster, filling in any template parameters with information about the
    -# cluster (e.g. lists of masters and slaves). Files are only deployed to
    -# the first master instance in the cluster, and we expect the setup
    -# script to be run on that instance to copy them to other nodes.
     def deploy_files(conn, root_dir, opts, master_nodes, slave_nodes, modules):
    +    """
    +    Deploy the configuration file templates in a given local directory to
    --- End diff --
    
    Yeah, I thought I'd make this the first change toward having all the function descriptions
be in docstrings, but for consistency's sake you're right--it should be a comment on top.
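The distinction under discussion can be sketched as follows. This is a minimal illustration of the two styles, not the real `deploy_files` from `ec2/spark_ec2.py`; the function bodies here are hypothetical stand-ins.

```python
# Comment-block style (the convention the reviewer asks to keep for now):
# the description lives above the def and is invisible at runtime.

# Deploy configuration file templates to the first master instance,
# filling in template parameters with cluster information.
def deploy_files_comment_style(root_dir, master_nodes):
    return (root_dir, master_nodes[0])


def deploy_files_docstring_style(root_dir, master_nodes):
    """
    Deploy configuration file templates to the first master instance,
    filling in template parameters with cluster information.
    """
    return (root_dir, master_nodes[0])


# One practical difference: docstrings are introspectable via __doc__
# (and picked up by help() and doc tools), while comments are discarded
# by the parser.
print(deploy_files_comment_style.__doc__)          # None
print(deploy_files_docstring_style.__doc__ is not None)
```

Either way the rendered behavior is identical; the trade-off is runtime introspection (docstrings) versus consistency with the rest of the file (comments), which is the reviewer's point.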


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org

