spark-reviews mailing list archives

From shaneknapp <>
Subject [GitHub] spark issue #15821: [SPARK-13534][PySpark] Using Apache Arrow to increase pe...
Date Tue, 27 Jun 2017 17:35:12 GMT
Github user shaneknapp commented on the issue:
    i agree w/@MaheshIBM that we're looking at a bad CA cert.  i think the problem is on
their side, not our side.
    however, i do not like the thought of ignoring certs (on principle).  :)
    and finally, if i'm reading the run-pip-tests code correctly (and please correct me if
i'm wrong @holdenk), we're just creating a temp python environment in /tmp, installing some
packages, running the tests, and then moving on.
    some thoughts/suggestions:
    * our conda environment is pretty stagnant and hasn't been explicitly upgraded since we
deployed anaconda python over a year ago.
    * the py3k environment that exists in the workers' conda installation is solely used by
spark builds, so updating said environment w/the packages in the run-pip-tests will remove
the need to download them, but at the same time, make the tests a NOOP.
    * we can hope that continuum fixes their cert issue asap.  :\


