spark-dev mailing list archives

From Jörn Franke <jornfra...@gmail.com>
Subject Re: Support Dynamic Partition Inserts params with SET command in Spark 2.0.1
Date Fri, 28 Jul 2017 10:48:46 GMT
Try sparksession.conf().set
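
A minimal sketch of that suggestion, assuming a Hive-enabled SparkSession named spark and reusing the 2000 value from the original mail (the error only requires at least 1344):

import org.apache.spark.sql.SparkSession

// Hive-enabled session, as in the original mail
val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

// Raise the Hive dynamic partition limit on the session's runtime config
// instead of going through sqlContext.setConf
spark.conf.set("hive.exec.max.dynamic.partitions", "2000")

// If the session has not been created yet, the same key can also be passed to the builder:
// SparkSession.builder().config("hive.exec.max.dynamic.partitions", "2000").enableHiveSupport().getOrCreate()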

> On 28. Jul 2017, at 12:19, Chetan Khatri <chetan.opensource@gmail.com> wrote:
> 
> Hey Dev/User,
> 
> I am working with Spark 2.0.1 and dynamic partitioning into Hive, and I am facing the issue below:
> 
> org.apache.hadoop.hive.ql.metadata.HiveException:
> Number of dynamic partitions created is 1344, which is more than 1000.
> To solve this try to set hive.exec.max.dynamic.partitions to at least 1344.
> 
> I tried the following, but it failed:
> 
> val spark = sparkSession.builder().enableHiveSupport().getOrCreate()
> 
> spark.sqlContext.setConf("hive.exec.max.dynamic.partitions", "2000")
> 
> Please help with an alternate workaround!
> 
> Thanks
