ambari-dev mailing list archives

From "Alejandro Fernandez" <>
Subject Review Request 27061: Pig Service Check, Tez, and WebHCat need dynamic tar properties in cluster-env.xml
Date Thu, 23 Oct 2014 00:47:50 GMT

This is an automatically generated e-mail. To reply, visit:

Review request for Ambari.

Bugs: AMBARI-7913

Repository: ambari


The Pig Service Check fails because the cluster-env.xml file does not contain all of the following properties:
tez_tar_source, tez_tar_destination_folder
hive_tar_source, hive_tar_destination_folder
pig_tar_source, pig_tar_destination_folder
hadoop-streaming_tar_source, hadoop-streaming_tar_destination_folder
sqoop_tar_source, sqoop_tar_destination_folder

This happens because site_properties.js is not saving these properties to cluster-env.xml.
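For reference, each of these is declared as a stack-level property in cluster-env.xml. A sketch of one such pair for Pig (the values shown are illustrative, not necessarily what the patch uses):

```xml
<!-- Illustrative sketch of one missing tar property pair; actual values may differ. -->
<property>
  <name>pig_tar_source</name>
  <value>/usr/hdp/current/pig-client/pig.tar.gz</value>
  <description>Source path of the Pig tarball to be copied to HDFS.</description>
</property>
<property>
  <name>pig_tar_destination_folder</name>
  <value>hdfs:///hdp/apps/{{ hdp_stack_version }}/pig/</value>
  <description>Destination HDFS folder for the Pig tarball.</description>
</property>
```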


Diffs
-----

  ambari-server/src/main/resources/stacks/HDP/2.2/configuration/cluster-env.xml 5c7ea79 
  ambari-web/app/data/HDP2/site_properties.js 541a6d0 



Testing
-------

Started up ambari-server, symlinked the web folder to my local working copy, and during the
installation process verified that the following URL showed all of the properties.

When creating the cluster, selected HDFS, YARN, MR, Tez, Hive, Pig, Zookeeper, Sqoop.

After the deployment completed, verified that tez-site.xml contains tez.lib.uris with an actual
path, e.g., hdfs:///apps/hdp/
Next, re-ran a service check on Pig, which passed.
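The precondition the service checks depend on can be sketched as a simple membership test over the cluster-env property set (a hypothetical helper for illustration, not Ambari code):

```python
# Hypothetical check mirroring what the Pig/Tez/WebHCat service checks expect:
# every tar source/destination property must be present in cluster-env.

REQUIRED_TAR_PROPERTIES = [
    "tez_tar_source", "tez_tar_destination_folder",
    "hive_tar_source", "hive_tar_destination_folder",
    "pig_tar_source", "pig_tar_destination_folder",
    "hadoop-streaming_tar_source", "hadoop-streaming_tar_destination_folder",
    "sqoop_tar_source", "sqoop_tar_destination_folder",
]

def missing_tar_properties(cluster_env):
    """Return the required tar properties absent from a cluster-env property dict."""
    return [key for key in REQUIRED_TAR_PROPERTIES if key not in cluster_env]
```

With the fix applied, `missing_tar_properties` over the saved cluster-env would return an empty list; before it, the pairs above would be reported missing.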

Unit tests passed:
Total run: 669
Total errors: 0
Total failures: 0

[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:13.750s
[INFO] Finished at: Wed Oct 22 17:46:23 PDT 2014
[INFO] Final Memory: 52M/492M
[INFO] ------------------------------------------------------------------------

Pig Service Check results:
2014-10-23 00:41:56,977 - ExecuteHadoop['dfs -rmr pigsmoke.out passwd; hadoop --config /etc/hadoop/conf
dfs -put /etc/passwd passwd '] {'security_enabled': False, 'keytab': [EMPTY], 'conf_dir':
'/etc/hadoop/conf', 'try_sleep': 5, 'kinit_path_local': '', 'tries': 3, 'user': 'ambari-qa',
'bin_dir': '/usr/hdp/current/hadoop-client/bin'}
2014-10-23 00:41:56,978 - Execute['hadoop --config /etc/hadoop/conf dfs -rmr pigsmoke.out
passwd; hadoop --config /etc/hadoop/conf dfs -put /etc/passwd passwd '] {'logoutput': False,
'path': ['/usr/hdp/current/hadoop-client/bin'], 'tries': 3, 'user': 'ambari-qa', 'try_sleep':
2014-10-23 00:42:04,277 - File['/var/lib/ambari-agent/data/tmp/'] {'content': StaticFile(''),
'mode': 0755}
2014-10-23 00:42:04,288 - Execute['pig /var/lib/ambari-agent/data/tmp/'] {'path':
['/usr/hdp/current/pig-client/bin:/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin'], 'tries':
3, 'user': 'ambari-qa', 'try_sleep': 5}
2014-10-23 00:42:36,804 - ExecuteHadoop['fs -test -e pigsmoke.out'] {'bin_dir': '/usr/hdp/current/hadoop-client/bin',
'user': 'ambari-qa', 'conf_dir': '/etc/hadoop/conf'}
2014-10-23 00:42:36,806 - Execute['hadoop --config /etc/hadoop/conf fs -test -e pigsmoke.out']
{'logoutput': False, 'path': ['/usr/hdp/current/hadoop-client/bin'], 'tries': 1, 'user': 'ambari-qa',
'try_sleep': 0}


Alejandro Fernandez
