ambari-dev mailing list archives

From Sumit Mohanty <smoha...@hortonworks.com>
Subject Re: When does tar copying happen ?
Date Wed, 30 Dec 2015 05:46:19 GMT
Ambaripreupload.py is not used during Ambari-based cluster installations.


I believe the Sqoop tarball is uploaded as part of the Hive installation.


-Sumit

________________________________
From: Jeff Zhang <zjffdu@gmail.com>
Sent: Tuesday, December 29, 2015 9:42 PM
To: user@ambari.apache.org; dev@ambari.apache.org
Subject: When does tar copying happen ?

I installed Sqoop separately, but found that no Sqoop tarball was uploaded to HDFS.
I found the uploading script, Ambaripreupload.py, and am wondering when this script is called.
Is it called only during the first HDP installation? If so, some tarballs may be missing when I
install components separately afterwards.



print "Copying tarballs..."

  copy_tarballs_to_hdfs(format("/usr/hdp/{hdp_version}/hadoop/mapreduce.tar.gz"),
                        hdfs_path_prefix + "/hdp/apps/{{ hdp_stack_version }}/mapreduce/",
                        'hadoop-mapreduce-historyserver', params.mapred_user,
                        params.hdfs_user, params.user_group)

  copy_tarballs_to_hdfs(format("/usr/hdp/{hdp_version}/tez/lib/tez.tar.gz"),
                        hdfs_path_prefix + "/hdp/apps/{{ hdp_stack_version }}/tez/",
                        'hadoop-mapreduce-historyserver', params.mapred_user,
                        params.hdfs_user, params.user_group)

  copy_tarballs_to_hdfs(format("/usr/hdp/{hdp_version}/hive/hive.tar.gz"),
                        hdfs_path_prefix + "/hdp/apps/{{ hdp_stack_version }}/hive/",
                        'hadoop-mapreduce-historyserver', params.mapred_user,
                        params.hdfs_user, params.user_group)

  copy_tarballs_to_hdfs(format("/usr/hdp/{hdp_version}/pig/pig.tar.gz"),
                        hdfs_path_prefix + "/hdp/apps/{{ hdp_stack_version }}/pig/",
                        'hadoop-mapreduce-historyserver', params.mapred_user,
                        params.hdfs_user, params.user_group)

  copy_tarballs_to_hdfs(format("/usr/hdp/{hdp_version}/hadoop-mapreduce/hadoop-streaming.jar"),
                        hdfs_path_prefix + "/hdp/apps/{{ hdp_stack_version }}/mapreduce/",
                        'hadoop-mapreduce-historyserver', params.mapred_user,
                        params.hdfs_user, params.user_group)

  copy_tarballs_to_hdfs(format("/usr/hdp/{hdp_version}/sqoop/sqoop.tar.gz"),
                        hdfs_path_prefix + "/hdp/apps/{{ hdp_stack_version }}/sqoop/",
                        'hadoop-mapreduce-historyserver', params.mapred_user,
                        params.hdfs_user, params.user_group)
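For what it's worth, all of the calls above follow one destination layout, <prefix>/hdp/apps/<stack_version>/<component>/<file>. A minimal sketch of that mapping (illustrative only; `hdfs_dest` and the example stack version are not Ambari's actual API, and the real upload is done by `copy_tarballs_to_hdfs`):

```python
# Illustrative helper: build the HDFS destination path that the
# copy_tarballs_to_hdfs calls above write to. Names are assumptions,
# not part of Ambari itself.
def hdfs_dest(component, filename, stack_version, hdfs_path_prefix=""):
    """Return <prefix>/hdp/apps/<stack_version>/<component>/<filename>."""
    return "%s/hdp/apps/%s/%s/%s" % (
        hdfs_path_prefix, stack_version, component, filename)

# Example with a hypothetical stack version string:
print(hdfs_dest("sqoop", "sqoop.tar.gz", "2.3.4.0-3485"))
# -> /hdp/apps/2.3.4.0-3485/sqoop/sqoop.tar.gz
```

So if the Sqoop tarball were uploaded, you would expect to find it at a path like the one printed above (e.g. via `hdfs dfs -ls` against that directory).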



--
Best Regards

Jeff Zhang
