Subject: Re: Review Request 26965: WebHCat to support versioned rpms in Ambari
From: "Jonathan Hurley"
To: "Sumit Mohanty", "Dmitro Lisnichenko", "Srimanth Gunturi", "Andrew Onischuk", "Dmytro Sen", "Yusaku Sako", "Jonathan Hurley", "Nate Cole", "Sid Wagle"
Cc: "Ambari", "Alejandro Fernandez"
Date: Wed, 22 Oct 2014 12:41:28 -0000
Message-ID: <20141022124128.1283.20102@reviews.apache.org>
In-Reply-To: <20141022005444.1283.92019@reviews.apache.org>
References: <20141022005444.1283.92019@reviews.apache.org>
Reply-To: dev@ambari.apache.org
X-ReviewBoard-URL: https://reviews.apache.org
X-ReviewRequest-URL: https://reviews.apache.org/r/26965/
X-ReviewRequest-Repository: ambari

-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/26965/#review57775
-----------------------------------------------------------


ambari-common/src/main/python/resource_management/libraries/functions/version.py
    I normally try to avoid the use of double underscores, since the name mangling can cause problems. If this module is not going to be subclassed (and since there's no class, I assume that's the case), then a single underscore is enough.

ambari-common/src/main/python/resource_management/libraries/functions/version.py
    Extra comma?

ambari-common/src/main/python/resource_management/libraries/functions/version.py
    What happens if I pass in 2.10 and 2.9.9? Would this not indicate that 2.10 < 2.9.9?


- Jonathan Hurley


On Oct. 21, 2014, 8:54 p.m., Alejandro Fernandez wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/26965/
> -----------------------------------------------------------
> 
> (Updated Oct. 21, 2014, 8:54 p.m.)
> 
> 
> Review request for Ambari, Andrew Onischuk, Dmitro Lisnichenko, Dmytro Sen, Jonathan Hurley, Nate Cole, Sumit Mohanty, Srimanth Gunturi, Sid Wagle, and Yusaku Sako.
> 
> 
> Bugs: AMBARI-7892
>     https://issues.apache.org/jira/browse/AMBARI-7892
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> This is related to AMBARI-7842.
> WebHCat relies on the following tarballs/jars:
> 
> || File || Property ||
> | pig-*.tar.gz | templeton.pig.archive |
> | hive-*.tar.gz | templeton.hive.archive |
> | sqoop-*.tar.gz | templeton.sqoop.archive |
> | hadoop-streaming-*.jar | templeton.streaming.jar |
> 
> All of these need to be copied to HDFS, and the name of each file needs to be injected into its property as a fully qualified HDFS path.
> 
> 
> Diffs
> -----
> 
>   ambari-common/src/main/python/resource_management/libraries/functions/dynamic_variable_interpretation.py efe7e63 
>   ambari-common/src/main/python/resource_management/libraries/functions/version.py PRE-CREATION 
>   ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/configuration/webhcat-site.xml 0523dab 
>   ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py 7c86070 
>   ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py 4aad1a2 
>   ambari-server/src/main/resources/stacks/HDP/2.1/services/TEZ/package/scripts/params.py 71989c3 
>   ambari-server/src/main/resources/stacks/HDP/2.2/configuration/cluster-env.xml cc52fe3 
>   ambari-server/src/main/resources/stacks/HDP/2.2/services/HIVE/configuration/webhcat-site.xml 3435a63 
>   ambari-server/src/test/python/TestVersion.py PRE-CREATION 
>   ambari-server/src/test/python/stacks/2.0.6/HIVE/test_webhcat_server.py 5f92a2d 
>   ambari-server/src/test/python/stacks/2.0.6/configs/default.json 5e3bad0 
>   ambari-server/src/test/python/stacks/2.0.6/configs/secured.json d65b0ee 
>   ambari-server/src/test/python/stacks/2.2/configs/default.json ea474e8 
>   ambari-server/src/test/python/stacks/2.2/configs/secured.json 20678fa 
> 
> Diff: https://reviews.apache.org/r/26965/diff/
> 
> 
> Testing
> -------
> 
> Ran ambari-server unit tests:
> 
> ----------------------------------------------------------------------
> Total run: 667
> Total errors: 0
> Total failures: 0
> OK
> 
> And verified on a cluster using the following steps.
> 
> 1. Set properties:
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hive_tar_source "/usr/hdp/current/hive-client/hive-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hive_tar_destination_folder "hdfs:///hdp/apps/{{ hdp_stack_version }}/hive/"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env pig_tar_source "/usr/hdp/current/pig-client/pig-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env pig_tar_destination_folder "hdfs:///hdp/apps/{{ hdp_stack_version }}/pig/"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env sqoop_tar_source "/usr/hdp/current/sqoop-client/sqoop-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env sqoop_tar_destination_folder "hdfs:///hdp/apps/{{ hdp_stack_version }}/sqoop/"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hadoop-streaming_tar_source "/usr/hdp/current/hadoop-mapreduce-client/hadoop-streaming-{{ component_version }}.{{ hdp_stack_version }}.jar"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev cluster-env hadoop-streaming_tar_destination_folder "hdfs:///hdp/apps/{{ hdp_stack_version }}/mr/"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.jar "/usr/hdp/current/hive-webhcat/share/webhcat/svr/lib/hive-webhcat-*.jar"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.pig.archive "hdfs:///hdp/apps/{{ hdp_stack_version }}/pig/pig-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.hive.archive "hdfs:///hdp/apps/{{ hdp_stack_version }}/hive/hive-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.sqoop.archive "hdfs:///hdp/apps/{{ hdp_stack_version }}/sqoop/sqoop-{{ component_version }}.{{ hdp_stack_version }}.tar.gz"
> /var/lib/ambari-server/resources/scripts/configs.sh set localhost dev webhcat-site templeton.streaming.jar "hdfs:///hdp/apps/{{ hdp_stack_version }}/mr/hadoop-streaming-{{ component_version }}.{{ hdp_stack_version }}.jar"
> 
> 2. Verified the properties were saved:
> http://c6401.ambari.apache.org:8080/api/v1/clusters/dev/configurations?type=cluster-env
> http://c6401.ambari.apache.org:8080/api/v1/clusters/dev/configurations?type=webhcat-site
> 
> 3. Copy the changed files:
> yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/script/config_dictionary.py /usr/lib/ambari-server/lib/resource_management/libraries/script/config_dictionary.py
> yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/functions/version.py /usr/lib/ambari-server/lib/resource_management/libraries/functions/version.py
> yes | cp /vagrant/ambari/ambari-common/src/main/python/resource_management/libraries/functions/dynamic_variable_interpretation.py /usr/lib/ambari-server/lib/resource_management/libraries/functions/dynamic_variable_interpretation.py
> yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat.py
> yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/params.py
> yes | cp /vagrant/ambari/ambari-server/src/main/resources/stacks/HDP/2.1/services/TEZ/package/scripts/params.py /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/TEZ/package/scripts/params.py
> 
> 4. Check that the tarballs are not already in HDFS; if they are, delete them:
> [hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/hive/
> [hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/pig/
> [hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/sqoop/
> [hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/mr/
> [hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -rm -r /hdp/apps/2.2.0.0-974/
> 
> 5. Before starting WebHCat, check webhcat-site.xml for properties that should still be unversioned:
> less /etc/hive-webhcat/conf/webhcat-site.xml
> / templeton.*archive
> / templeton.*jar
> 
> 6. Restart WebHCat and verify the files are copied to HDFS:
> python /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HIVE/package/scripts/webhcat_server.py START /var/lib/ambari-agent/data/command-102.json /var/lib/ambari-agent/cache/stacks/HDP/2.0.6/services/HDFS /var/lib/ambari-agent/data/output-102.txt DEBUG /var/lib/ambari-agent/data/tmp
> [hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/hive/
> [hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/pig/
> [hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/sqoop/
> [hdfs@c6404 ~]$ hdfs --config /etc/hadoop/conf dfs -ls /hdp/apps/2.2.0.0-974/mr/
> 
> 7. Verify that webhcat-site.xml now has properties with actual values.
> 
> Check the /etc/hive-webhcat/conf/webhcat-site.xml file again. This time it should contain the properties with the versioned paths.
> 
> 
> Thanks,
> 
> Alejandro Fernandez
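
The third review comment above flags a real hazard: comparing dotted version strings lexically ranks "2.10" below "2.9.9". A minimal sketch of a numeric-aware comparison follows; the function name `compare_versions` and its -1/0/1 return convention are assumptions for illustration, since the actual `version.py` under review is not shown in this email.

```python
def compare_versions(left, right):
    """Return -1, 0, or 1, comparing two dotted version strings numerically.

    Hypothetical sketch only -- not the implementation under review.
    """
    left_parts = [int(part) for part in left.split(".")]
    right_parts = [int(part) for part in right.split(".")]
    # Pad the shorter list with zeros so "2.2" compares equal to "2.2.0".
    length = max(len(left_parts), len(right_parts))
    left_parts += [0] * (length - len(left_parts))
    right_parts += [0] * (length - len(right_parts))
    # Tuple/list comparison in Python is element-wise and numeric here.
    return (left_parts > right_parts) - (left_parts < right_parts)

# Lexical comparison gets this wrong ("1" sorts before "9"):
assert ("2.10" < "2.9.9") is True
# Numeric comparison gets it right:
assert compare_versions("2.10", "2.9.9") == 1
assert compare_versions("2.2", "2.2.0") == 0
```

Splitting on dots and comparing integer components is what makes "2.10" correctly rank above "2.9.9"; any string-based shortcut reintroduces the bug the reviewer is asking about.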
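
The testing steps above set properties containing `{{ component_version }}` and `{{ hdp_stack_version }}` placeholders that must be resolved to concrete values before WebHCat can use them. The sketch below shows one way such substitution could work; the helper name `interpolate` and the sample values are illustrative assumptions, as the real logic lives in `dynamic_variable_interpretation.py`, which is not included in this email.

```python
import re

def interpolate(template, variables):
    """Replace {{ name }} placeholders with values from a dict.

    Illustrative sketch only; unknown placeholders are left untouched.
    """
    def substitute(match):
        name = match.group(1)
        # Fall back to the original {{ name }} text if no value is known.
        return str(variables.get(name, match.group(0)))
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)

# Sample values below are made up for illustration.
source = ("/usr/hdp/current/hive-client/"
          "hive-{{ component_version }}.{{ hdp_stack_version }}.tar.gz")
print(interpolate(source, {"component_version": "0.14.0",
                           "hdp_stack_version": "2.2.0.0-974"}))
# -> /usr/hdp/current/hive-client/hive-0.14.0.2.2.0.0-974.tar.gz
```

This matches the behavior step 5 vs. step 7 checks for: before the restart the properties still contain the raw `{{ … }}` templates, and afterwards they hold fully versioned paths.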