ambari-dev mailing list archives

From "Alejandro Fernandez" <>
Subject Review Request 34982: During initial Stack Install, tarballs will not be copied to HDFS because the current_version is not yet known
Date Wed, 03 Jun 2015 05:34:13 GMT

This is an automatically generated e-mail. To reply, visit:

Review request for Ambari.

Bugs: AMBARI-11640

Repository: ambari


The workaround is to start all services so that the CURRENT version exists, and then rerun
HiveServer2 Start, Spark History Server Start, or the Tez Service Check.
The fix will be to detect whether params.current_version is None and, if so, call hdp-select versions
and use the reported value only if exactly one version number is returned.
Also, HiveServer2 Start needs to copy the Tez tarball.
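The fallback described above can be sketched roughly as follows. This is an illustrative
sketch, not the actual Ambari patch: the function names (resolve_current_version,
pick_single_version) are hypothetical, and only the behavior — shelling out to
"hdp-select versions" when params.current_version is None and trusting the output only
when exactly one version is listed — follows the description.

```python
# Hypothetical sketch of the proposed fallback. Names are illustrative;
# only the behavior mirrors the fix described in the review request.
import subprocess


def pick_single_version(output):
    """Parse `hdp-select versions` output (one version per line) and
    return the version only when exactly one is reported, else None."""
    versions = [line.strip() for line in output.splitlines() if line.strip()]
    return versions[0] if len(versions) == 1 else None


def resolve_current_version(current_version):
    """Return current_version, falling back to `hdp-select versions`
    when it is None (e.g. during an initial stack install)."""
    if current_version is not None:
        return current_version
    # `hdp-select versions` prints the installed HDP versions, one per line.
    output = subprocess.check_output(["hdp-select", "versions"]).decode()
    return pick_single_version(output)
```

The key design point is to use the fallback only when it is unambiguous: with two or
more installed versions, guessing could copy tarballs for the wrong stack version, so
the sketch returns None and leaves the caller to handle that case.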


  ambari-common/src/main/python/resource_management/libraries/functions/ f8a2570



Verified this worked on a brand new cluster install with HistoryServer Start, which copies
mapreduce.tar.gz to HDFS.

***Still need to fix unit test.***


Alejandro Fernandez
