spark-reviews mailing list archives

From GitBox <...@apache.org>
Subject [GitHub] [spark] LantaoJin commented on issue #25840: [SPARK-29166][SQL] Add parameters to limit the number of dynamic partitions for data source table
Date Tue, 15 Oct 2019 08:11:21 GMT
LantaoJin commented on issue #25840: [SPARK-29166][SQL] Add parameters to limit the number of dynamic partitions for data source table
URL: https://github.com/apache/spark/pull/25840#issuecomment-542093641
 
 
   > Do you mean data source and Hive tables would have different configs for the same feature?

   Yes, for now. Do you want the new configs to also restrict Hive tables? For example, if we set `spark.sql.dynamic.partition.maxPartitions=100`, should an insert into a Hive table also throw an exception when the number of dynamic partitions exceeds 100, just as it does when we set `spark.hadoop.hive.exec.max.dynamic.partitions=100`?
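
   For reference, a minimal sketch of the behavior under discussion, assuming the config name `spark.sql.dynamic.partition.maxPartitions` proposed in this PR (the table name and data below are illustrative only); the existing Hive-side limit `hive.exec.max.dynamic.partitions` is set alongside it for comparison:

   ```scala
   import org.apache.spark.sql.SparkSession

   val spark = SparkSession.builder()
     .appName("dynamic-partition-limit-sketch")
     .enableHiveSupport()
     // Existing Hive behavior: inserts creating more than 100 dynamic
     // partitions into a Hive table fail.
     .config("spark.hadoop.hive.exec.max.dynamic.partitions", "100")
     // Proposed data source table equivalent (name as proposed in this PR).
     .config("spark.sql.dynamic.partition.maxPartitions", "100")
     .getOrCreate()

   import spark.implicits._

   // A data source (non-Hive) partitioned table.
   spark.sql("CREATE TABLE t (value INT, part INT) USING parquet PARTITIONED BY (part)")

   // This insert produces 200 dynamic partitions; under the proposed config it
   // would throw, mirroring what Hive tables already do via
   // hive.exec.max.dynamic.partitions.
   spark.range(200)
     .select($"id".cast("int").as("value"), $"id".cast("int").as("part"))
     .write.insertInto("t")
   ```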


