ambari-dev mailing list archives

From "Vitalyi Brodetskyi" <vbrodets...@hortonworks.com>
Subject Re: Review Request 33716: hive-site.xml packaged under /etc/spark/conf is not correct
Date Thu, 30 Apr 2015 08:43:51 GMT

-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/33716/#review82121
-----------------------------------------------------------

Ship it!


Ship It!

- Vitalyi Brodetskyi


On April 30, 2015, 8:41 a.m., Andrew Onischuk wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/33716/
> -----------------------------------------------------------
> 
> (Updated April 30, 2015, 8:41 a.m.)
> 
> 
> Review request for Ambari and Vitalyi Brodetskyi.
> 
> 
> Bugs: AMBARI-10859
>     https://issues.apache.org/jira/browse/AMBARI-10859
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> Ambari-2.1.0 for Dal is putting many more properties into /etc/spark/conf/hive-site.xml
> than desired. This leads to unnecessary exceptions when trying to load a
> HiveContext in the Spark shell. Here is the error:
> 
>     
>     
>     15/04/21 08:37:44 INFO ParseDriver: Parsing command: show tables
>     15/04/21 08:37:44 INFO ParseDriver: Parse Completed
>     java.lang.RuntimeException: java.lang.NumberFormatException: For input string: "5s"
>         at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:346)
>         at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:237)
>         at org.apache.spark.sql.hive.HiveContext$$anonfun$4.apply(HiveContext.scala:233)
>         at scala.Option.orElse(Option.scala:257)
>         at org.apache.spark.sql.hive.HiveContext.x$3$lzycompute(HiveContext.scala:233)
>         at org.apache.spark.sql.hive.HiveContext.x$3(HiveContext.scala:231)
>         at org.apache.spark.sql.hive.HiveContext.hiveconf$lzycompute(HiveContext.scala:231)
>         at org.apache.spark.sql.hive.HiveContext.hiveconf(HiveContext.scala:231)
>         at org.apache.spark.sql.hive.HiveMetastoreCatalog.<init>(HiveMetastoreCatalog.scala:56)
>         at org.apache.spark.sql.hive.HiveContext$$anon$2.<init>(HiveContext.scala:255)
>         at org.apache.spark.sql.hive.HiveContext.catalog$lzycompute(HiveContext.scala:255)
>         at org.apache.spark.sql.hive.HiveContext.catalog(HiveContext.scala:255)
>         at org.apache.spark.sql.hive.HiveContext$$anon$4.<init>(HiveContext.scala:265)
>         ....
>     
> 
> In the previous Ambari release we added only a handful of properties (< 10);
> now we add 150+ (attached). We should revert to the old behavior (a sketch of
> that approach follows the quoted request below).
> 
> 
> Diffs
> -----
> 
>   ambari-server/src/main/resources/common-services/SPARK/1.2.0.2.2/package/scripts/params.py c521b65 
>   ambari-server/src/main/resources/common-services/SPARK/1.2.0.2.2/package/scripts/setup_spark.py d8fbc8c 
> 
> Diff: https://reviews.apache.org/r/33716/diff/
> 
> 
> Testing
> -------
> 
> mvn clean test
> 
> 
> Thanks,
> 
> Andrew Onischuk
> 
>
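
For context: the NumberFormatException on "5s" most likely comes from the older
Hive client bundled with Spark, which parses time-valued properties such as
hive.metastore.client.socket.timeout as plain integers and cannot handle unit
suffixes. Below is a minimal sketch of the whitelist approach the description
asks for; SPARK_HIVE_PROPERTIES and filter_hive_site are hypothetical names for
illustration only, not taken from the actual params.py / setup_spark.py diff.

    # Hypothetical sketch, not the AMBARI-10859 patch itself: write only a
    # small whitelist of Hive properties into /etc/spark/conf/hive-site.xml
    # instead of copying the full cluster hive-site.

    # Assumed whitelist for illustration; the real set would live in params.py.
    SPARK_HIVE_PROPERTIES = (
        'hive.metastore.uris',
        'hive.metastore.client.connect.retry.delay',
        'hive.metastore.client.socket.timeout',
    )

    def filter_hive_site(full_hive_site):
        """Keep only the properties Spark's bundled Hive client understands."""
        return dict((key, value) for key, value in full_hive_site.items()
                    if key in SPARK_HIVE_PROPERTIES)

    # setup_spark.py could then hand the filtered dict to Ambari's XmlConfig
    # resource instead of the full config['configurations']['hive-site'], e.g.
    #
    #   XmlConfig("hive-site.xml",
    #             conf_dir=params.spark_conf,
    #             configurations=filter_hive_site(
    #                 params.config['configurations']['hive-site']),
    #             owner=params.spark_user,
    #             group=params.user_group)

An alternative would be to normalize values like "5s" to plain seconds before
writing them, but the whitelist keeps Spark's hive-site minimal, which is what
the description asks for.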

