spark-commits mailing list archives

From lix...@apache.org
Subject spark git commit: [SPARK-22221][SQL][FOLLOWUP] Externalize spark.sql.execution.arrow.maxRecordsPerBatch
Date Tue, 30 Jan 2018 01:37:58 GMT
Repository: spark
Updated Branches:
  refs/heads/master b834446ec -> f235df66a


[SPARK-22221][SQL][FOLLOWUP] Externalize spark.sql.execution.arrow.maxRecordsPerBatch

## What changes were proposed in this pull request?

This is a followup to #19575, which added a documentation section on setting the maximum
number of Arrow records per batch. This change externalizes the conf that was referenced in those docs.

## How was this patch tested?
NA

Author: Bryan Cutler <cutlerb@gmail.com>

Closes #20423 from BryanCutler/arrow-user-doc-externalize-maxRecordsPerBatch-SPARK-22221.
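With `.internal()` removed, the conf becomes a public, documented setting that users can adjust like any other SQL conf. A minimal sketch of setting it (a config fragment, not a definitive example; assumes an active `SparkSession` named `spark`):

```scala
// Cap each ArrowRecordBatch at 5000 records when converting to/from Pandas.
// A value of zero or a negative value disables the limit.
spark.conf.set("spark.sql.execution.arrow.maxRecordsPerBatch", 5000)
```

The same setting can also be supplied at launch time, e.g. via `--conf spark.sql.execution.arrow.maxRecordsPerBatch=5000`.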


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f235df66
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/f235df66
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/f235df66

Branch: refs/heads/master
Commit: f235df66a4754cbb64d5b7b5cfd5a52bdd243b8a
Parents: b834446
Author: Bryan Cutler <cutlerb@gmail.com>
Authored: Mon Jan 29 17:37:55 2018 -0800
Committer: gatorsmile <gatorsmile@gmail.com>
Committed: Mon Jan 29 17:37:55 2018 -0800

----------------------------------------------------------------------
 .../src/main/scala/org/apache/spark/sql/internal/SQLConf.scala      | 1 -
 1 file changed, 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/f235df66/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
index 61ea03d..54a3559 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
@@ -1051,7 +1051,6 @@ object SQLConf {
 
   val ARROW_EXECUTION_MAX_RECORDS_PER_BATCH =
     buildConf("spark.sql.execution.arrow.maxRecordsPerBatch")
-      .internal()
       .doc("When using Apache Arrow, limit the maximum number of records that can be written " +
         "to a single ArrowRecordBatch in memory. If set to zero or negative there is no limit.")
       .intConf


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org

