spark-reviews mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [spark] srowen commented on a change in pull request #29634: [SPARK-32783][DOCS][PYTHON] Development - Testing PySpark
Date Thu, 03 Sep 2020 13:33:04 GMT

srowen commented on a change in pull request #29634:
URL: https://github.com/apache/spark/pull/29634#discussion_r482980446



##########
File path: python/docs/source/development/testing.rst
##########
@@ -0,0 +1,61 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+..    http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+===============
+Testing PySpark
+===============
+
+In order to run PySpark tests, you should build Spark itself first via Maven
+or SBT. For example,
+
+.. code-block:: bash
+
+    build/mvn -DskipTests clean package
+
+After that, the PySpark test cases can be run via using ``python/run-tests``. For example,
+
+.. code-block:: bash
+
+    python/run-tests --python-executable=python3
+
+Note that:
+
+* If you are running tests on Mac OS, you may set ``OBJC_DISABLE_INITIALIZE_FORK_SAFETY``
+  environment variable to ``YES``.
+* If you are using JDK 11, you should set ``-Dio.netty.tryReflectionSetAccessible=true``
+  for Arrow related features. See also `Downloading <https://spark.apache.org/docs/latest/#downloading>`_.
+
+Please see the guidance on how to `build Spark <https://github.com/apache/spark#building-spark>`_,
+`run tests for a module, or individual tests <https://spark.apache.org/developer-tools.html>`_.
+
+
+Running Individual PySpark Tests
+--------------------------------
+
+You can run a specific test via using ``python/run-tests``, for example, as below:
+
+.. code-block:: bash
+
+    python/run-tests --testnames pyspark.sql.tests.test_arrow
+
+Please refer `Testing PySpark <https://spark.apache.org/developer-tools.html>`_ for
+more details.

Review comment:
       refer to
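
The two environment notes in the diff above can be combined into a single test invocation. A minimal sketch, assuming a built Spark checkout; using ``SPARK_SUBMIT_OPTS`` to carry the JVM flag is an assumption worth verifying against your Spark version:

```shell
# Minimal sketch combining the two notes from the diff above.
# Assumptions: run from a built Spark checkout; SPARK_SUBMIT_OPTS is one
# plausible way to hand the JVM flag to the test JVMs -- verify for your setup.
export OBJC_DISABLE_INITIALIZE_FORK_SAFETY=YES   # macOS: avoid fork-safety aborts in forked workers
export SPARK_SUBMIT_OPTS='-Dio.netty.tryReflectionSetAccessible=true'  # JDK 11 + Arrow features
[ -x python/run-tests ] && python/run-tests --python-executable=python3 || true
```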

##########
File path: python/docs/source/development/testing.rst
##########
@@ -0,0 +1,61 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+..    http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+===============
+Testing PySpark
+===============
+
+In order to run PySpark tests, you should build Spark itself first via Maven
+or SBT. For example,
+
+.. code-block:: bash
+
+    build/mvn -DskipTests clean package
+
+After that, the PySpark test cases can be run via using ``python/run-tests``. For example,
+
+.. code-block:: bash
+
+    python/run-tests --python-executable=python3
+
+Note that:
+
+* If you are running tests on Mac OS, you may set ``OBJC_DISABLE_INITIALIZE_FORK_SAFETY``
+  environment variable to ``YES``.
+* If you are using JDK 11, you should set ``-Dio.netty.tryReflectionSetAccessible=true``
+  for Arrow related features. See also `Downloading <https://spark.apache.org/docs/latest/#downloading>`_.

Review comment:
       Just curious, is this specifically necessary for debugging? It shouldn't be
necessary in Spark 3 in general.
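
One way to settle the question above locally is to check which JDK is active before adding the flag, since the flag only matters under the module restrictions of JDK 9 and newer. A hedged sketch; the version-string matching is an assumption about typical ``java -version`` output:

```shell
# Sketch: decide whether -Dio.netty.tryReflectionSetAccessible=true is
# relevant, based on the first line of `java -version` output.
# JDK 9+ module restrictions are what make the flag matter for Arrow.
needs_netty_flag() {
    # Return success (0) when the version string looks like JDK 9 or newer.
    case "$1" in
        *'version "1.'*) return 1 ;;   # '1.7', '1.8' style => JDK 8 or older
        *) return 0 ;;                 # '11.0.2', '17.0.1' style => JDK 9+
    esac
}

if command -v java >/dev/null 2>&1; then
    if needs_netty_flag "$(java -version 2>&1 | head -n1)"; then
        echo "JDK 9+: consider -Dio.netty.tryReflectionSetAccessible=true"
    else
        echo "JDK 8: flag not needed"
    fi
fi
```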

##########
File path: python/docs/source/development/testing.rst
##########
@@ -0,0 +1,61 @@
+..  Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+..    http://www.apache.org/licenses/LICENSE-2.0
+
+..  Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+===============
+Testing PySpark
+===============
+
+In order to run PySpark tests, you should build Spark itself first via Maven
+or SBT. For example,
+
+.. code-block:: bash
+
+    build/mvn -DskipTests clean package
+
+After that, the PySpark test cases can be run via using ``python/run-tests``. For example,
+
+.. code-block:: bash
+
+    python/run-tests --python-executable=python3
+
+Note that:
+
+* If you are running tests on Mac OS, you may set ``OBJC_DISABLE_INITIALIZE_FORK_SAFETY``
+  environment variable to ``YES``.
+* If you are using JDK 11, you should set ``-Dio.netty.tryReflectionSetAccessible=true``
+  for Arrow related features. See also `Downloading <https://spark.apache.org/docs/latest/#downloading>`_.
+
+Please see the guidance on how to `build Spark <https://github.com/apache/spark#building-spark>`_,
+`run tests for a module, or individual tests <https://spark.apache.org/developer-tools.html>`_.
+
+
+Running Individual PySpark Tests
+--------------------------------
+
+You can run a specific test via using ``python/run-tests``, for example, as below:
+
+.. code-block:: bash
+
+    python/run-tests --testnames pyspark.sql.tests.test_arrow
+
+Please refer `Testing PySpark <https://spark.apache.org/developer-tools.html>`_ for
+more details.
+
+
+Running tests using GitHub Actions
+----------------------------------
+
+You can run the full PySpark tests by using GitHub Actions in your own forked GitHub

Review comment:
       with a few clicks
   refer to
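
Beyond the module-level ``--testnames`` example in the diff, the linked developer-tools page describes narrowing to a single class or method by appending it after the module name. A sketch of a small helper that composes that argument; the concrete class and method names used in the example are illustrative assumptions, not taken from the diff:

```shell
# Sketch: compose the --testnames argument at three granularities
# (module, class, method). Concrete test names are illustrative only.
testnames_arg() {
    # $1 = module (required), $2 = class (optional), $3 = method (optional)
    if [ -n "$3" ]; then printf '%s %s.%s' "$1" "$2" "$3"
    elif [ -n "$2" ]; then printf '%s %s' "$1" "$2"
    else printf '%s' "$1"
    fi
}

# Usage, inside a built Spark checkout (class/method names are assumptions):
#   python/run-tests --testnames "$(testnames_arg pyspark.sql.tests.test_arrow ArrowTests)"
echo "$(testnames_arg pyspark.sql.tests.test_arrow ArrowTests test_createDataFrame_toggle)"
```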




----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org

