Date: Tue, 17 Mar 2015 11:48:38 +0000 (UTC)
From: "Hudson (JIRA)"
To: dev@ambari.apache.org
Subject: [jira] [Commented] (AMBARI-10100) WEBHCAT_SERVER START fails after an Ambari-only upgrade from 1.4.4 to 2.0.0 (failed, parent directory /etc/hive-webhcat doesn't exist)

    [ https://issues.apache.org/jira/browse/AMBARI-10100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14365016#comment-14365016 ]

Hudson commented on AMBARI-10100:
---------------------------------

ABORTED: Integrated in Ambari-branch-2.0.0 #89 (See [https://builds.apache.org/job/Ambari-branch-2.0.0/89/])
AMBARI-10100 WEBHCAT_SERVER START is failed after Ambari only upgrade from 1.4.4 to 2.0.0 (failed, parent directory /etc/hive-webhcat doesn't exist) (dsen) (dsen: http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=1152fc081e1c062a12a3721f4533879ba0b8d4c1)
* ambari-server/src/test/python/stacks/2.0.6/HIVE/test_webhcat_server.py
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat.py


> WEBHCAT_SERVER START fails after an Ambari-only upgrade from 1.4.4 to 2.0.0 (failed, parent directory /etc/hive-webhcat doesn't exist)
> --------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-10100
>                 URL: https://issues.apache.org/jira/browse/AMBARI-10100
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.0.0
>         Environment: ambari-server version: ambari-server-2.0.0-137.noarch
> ambari-server --hash: 1e2a741e5fa00f6ecb7ec7d420f3dee0f0f71b8f
> HDP Stack: 2.0
> Ambari DB: PostgreSQL
> Oozie/Hive DB: MySQL/MySQL
> Security: no
> HA: no
>            Reporter: Dmytro Sen
>            Assignee: Dmytro Sen
>            Priority: Blocker
>             Fix For: 2.0.0
>
>         Attachments: AMBARI-10100.patch
>
>
> Steps to reproduce:
> 1) Deploy the old version (1.4.4) with all services
> 2) Perform an Ambari-only upgrade to 2.0.0
> Actual result:
> WEBHCAT_SERVER START fails after the upgrade from 1.4.4 to 2.0.0
> {code}
> {
>   "href" : "http://172.18.145.150:8080/api/v1/clusters/cl1/requests/7/tasks/294",
>   "Tasks" : {
>     "attempt_cnt" : 1,
>     "cluster_name" : "cl1",
>     "command" : "START",
>     "command_detail" : "WEBHCAT_SERVER START",
>     "end_time" : 1426503383601,
"/var/lib/ambari-agent/data/errors-294.txt", > "exit_code" : 1, > "host_name" : "amb-upg14423-rhel6postgres1426496145-4.cs1cloud.intern= al", > "id" : 294, > "output_log" : "/var/lib/ambari-agent/data/output-294.txt", > "request_id" : 7, > "role" : "WEBHCAT_SERVER", > "stage_id" : 5, > "start_time" : 1426503336711, > "status" : "FAILED", > "stderr" : "2015-03-16 10:56:23,390 - Error while executing command '= start':\nTraceback (most recent call last):\n File \"/usr/lib/python2.6/si= te-packages/resource_management/libraries/script/script.py\", line 214, in = execute\n method(env)\n File \"/var/lib/ambari-agent/cache/common-servi= ces/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py\", line 47, in start\= n self.configure(env) # FOR SECURITY\n File \"/var/lib/ambari-agent/cac= he/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py\", lin= e 41, in configure\n webhcat()\n File \"/var/lib/ambari-agent/cache/com= mon-services/HIVE/0.12.0.2.0/package/scripts/webhcat.py\", line 152, in web= hcat\n cd_access=3D'a',\n File \"/usr/lib/python2.6/site-packages/resou= rce_management/core/base.py\", line 148, in __init__\n self.env.run()\n = File \"/usr/lib/python2.6/site-packages/resource_management/core/environme= nt.py\", line 152, in run\n self.run_action(resource, action)\n File \"= /usr/lib/python2.6/site-packages/resource_management/core/environment.py\",= line 118, in run_action\n provider_action()\n File \"/usr/lib/python2.= 6/site-packages/resource_management/core/providers/system.py\", line 169, i= n action_create\n raise Fail(\"Applying %s failed, parent directory %s d= oesn't exist\" % (self.resource, dirname))\nFail: Applying u\"Directory['/e= tc/hive-webhcat/conf']\" failed, parent directory /etc/hive-webhcat doesn't= exist", > "stdout" : "2015-03-16 10:55:44,925 - u\"Directory['/var/lib/ambari-a= gent/data/tmp/AMBARI-artifacts/']\" {'recursive': True}\n2015-03-16 10:55:4= 5,102 - u\"File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jce_polic= y-6.zip']\" {'content': DownloadSource('http://amb-upg14423-rhel6postgres14= 26496145-7.cs1cloud.internal:8080/resources//jce_policy-6.zip')}\n2015-03-1= 6 10:55:45,201 - Not downloading the file from http://amb-upg14423-rhel6pos= tgres1426496145-7.cs1cloud.internal:8080/resources//jce_policy-6.zip, becau= se /var/lib/ambari-agent/data/tmp/jce_policy-6.zip already exists\n2015-03-= 16 10:55:45,356 - u\"Group['hadoop']\" {'ignore_failures': False}\n2015-03-= 16 10:55:45,357 - Modifying group hadoop\n2015-03-16 10:55:45,485 - u\"Grou= p['nobody']\" {'ignore_failures': False}\n2015-03-16 10:55:45,485 - Modifyi= ng group nobody\n2015-03-16 10:55:45,652 - u\"Group['users']\" {'ignore_fai= lures': False}\n2015-03-16 10:55:45,652 - Modifying group users\n2015-03-16= 10:55:45,779 - u\"User['nobody']\" {'gid': 'hadoop', 'ignore_failures': Fa= lse, 'groups': [u'nobody']}\n2015-03-16 10:55:45,779 - Modifying user nobod= y\n2015-03-16 10:55:45,828 - u\"User['oozie']\" {'gid': 'hadoop', 'ignore_f= ailures': False, 'groups': [u'users']}\n2015-03-16 10:55:45,828 - Modifying= user oozie\n2015-03-16 10:55:45,876 - u\"User['hive']\" {'gid': 'hadoop', = 'ignore_failures': False, 'groups': [u'hadoop']}\n2015-03-16 10:55:45,877 -= Modifying user hive\n2015-03-16 10:55:45,924 - u\"User['mapred']\" {'gid':= 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n2015-03-16 10:= 55:45,925 - Modifying user mapred\n2015-03-16 10:55:45,973 - u\"User['hbase= ']\" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n20= 15-03-16 
> stdout:
> {code}
> 2015-03-16 10:55:44,925 - u"Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/']" {'recursive': True}
> 2015-03-16 10:55:45,102 - u"File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jce_policy-6.zip']" {'content': DownloadSource('http://amb-upg14423-rhel6postgres1426496145-7.cs1cloud.internal:8080/resources//jce_policy-6.zip')}
> 2015-03-16 10:55:45,201 - Not downloading the file from http://amb-upg14423-rhel6postgres1426496145-7.cs1cloud.internal:8080/resources//jce_policy-6.zip, because /var/lib/ambari-agent/data/tmp/jce_policy-6.zip already exists
> 2015-03-16 10:55:45,356 - u"Group['hadoop']" {'ignore_failures': False}
> 2015-03-16 10:55:45,357 - Modifying group hadoop
> 2015-03-16 10:55:45,485 - u"Group['nobody']" {'ignore_failures': False}
> 2015-03-16 10:55:45,485 - Modifying group nobody
> 2015-03-16 10:55:45,652 - u"Group['users']" {'ignore_failures': False}
> 2015-03-16 10:55:45,652 - Modifying group users
> 2015-03-16 10:55:45,779 - u"User['nobody']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
> 2015-03-16 10:55:45,779 - Modifying user nobody
> 2015-03-16 10:55:45,828 - u"User['oozie']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2015-03-16 10:55:45,828 - Modifying user oozie
> 2015-03-16 10:55:45,876 - u"User['hive']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-16 10:55:45,877 - Modifying user hive
> 2015-03-16 10:55:45,924 - u"User['mapred']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-16 10:55:45,925 - Modifying user mapred
> 2015-03-16 10:55:45,973 - u"User['hbase']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-16 10:55:45,974 - Modifying user hbase
> 2015-03-16 10:55:46,025 - u"User['ambari-qa']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
> 2015-03-16 10:55:46,026 - Modifying user ambari-qa
> 2015-03-16 10:55:46,076 - u"User['zookeeper']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-16 10:55:46,076 - Modifying user zookeeper
> 2015-03-16 10:55:46,126 - u"User['false']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-16 10:55:46,126 - Modifying user false
> 2015-03-16 10:55:46,176 - u"User['hdfs']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-16 10:55:46,177 - Modifying user hdfs
> 2015-03-16 10:55:46,225 - u"User['sqoop']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-16 10:55:46,226 - Modifying user sqoop
> 2015-03-16 10:55:46,279 - u"User['yarn']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-16 10:55:46,280 - Modifying user yarn
> 2015-03-16 10:55:46,331 - u"User['hcat']" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
> 2015-03-16 10:55:46,331 - Modifying user hcat
> 2015-03-16 10:55:46,380 - u"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']" {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2015-03-16 10:55:46,704 - u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']" {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
> 2015-03-16 10:55:46,758 - Skipping u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']" due to not_if
> 2015-03-16 10:55:46,758 - u"Directory['/grid/0/hadoop/hbase']" {'owner': 'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}
> 2015-03-16 10:55:47,359 - u"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']" {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
> 2015-03-16 10:55:47,668 - u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase']" {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
> 2015-03-16 10:55:47,715 - Skipping u"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase']" due to not_if
> 2015-03-16 10:55:47,715 - u"Group['hdfs']" {'ignore_failures': False}
> 2015-03-16 10:55:47,716 - Modifying group hdfs
> 2015-03-16 10:55:47,843 - u"User['hdfs']" {'ignore_failures': False, 'groups': [u'hadoop', 'users', 'hdfs', 'hadoop', u'hdfs']}
> 2015-03-16 10:55:47,843 - Modifying user hdfs
> 2015-03-16 10:55:47,892 - u"Directory['/etc/hadoop']" {'mode': 0755}
> 2015-03-16 10:55:48,044 - u"Directory['/etc/hadoop/conf.empty']" {'owner': 'root', 'group': 'hadoop', 'recursive': True}
> 2015-03-16 10:55:48,204 - u"Link['/etc/hadoop/conf']" {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
> 2015-03-16 10:55:48,256 - Skipping u"Link['/etc/hadoop/conf']" due to not_if
> 2015-03-16 10:55:48,272 - u"File['/etc/hadoop/conf/hadoop-env.sh']" {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
> 2015-03-16 10:55:48,533 - u"Execute['('setenforce', '0')']" {'sudo': True, 'only_if': 'test -f /selinux/enforce'}
> 2015-03-16 10:55:48,601 - Skipping u"Execute['('setenforce', '0')']" due to only_if
> 2015-03-16 10:55:48,601 - u"Directory['/grid/0/log/hadoop']" {'owner': 'root', 'mode': 0775, 'group': 'hadoop', 'recursive': True, 'cd_access': 'a'}
> 2015-03-16 10:55:49,140 - u"Directory['/var/run/hadoop']" {'owner': 'root', 'group': 'root', 'recursive': True, 'cd_access': 'a'}
> 2015-03-16 10:55:49,579 - u"Directory['/tmp/hadoop-hdfs']" {'owner': 'hdfs', 'recursive': True, 'cd_access': 'a'}
> 2015-03-16 10:55:49,938 - u"File['/etc/hadoop/conf/commons-logging.properties']" {'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}
> 2015-03-16 10:55:50,191 - u"File['/etc/hadoop/conf/health_check']" {'content': Template('health_check-v2.j2'), 'owner': 'hdfs'}
> 2015-03-16 10:55:50,435 - u"File['/etc/hadoop/conf/log4j.properties']" {'content': '...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}
> 2015-03-16 10:55:51,666 - u"File['/etc/hadoop/conf/hadoop-metrics2.properties']" {'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}
> 2015-03-16 10:55:53,018 - u"File['/etc/hadoop/conf/task-log4j.properties']" {'content': StaticFile('task-log4j.properties'), 'mode': 0755}
> 2015-03-16 10:55:53,333 - u"File['/etc/hadoop/conf/configuration.xsl']" {'owner': 'hdfs', 'group': 'hadoop'}
> 2015-03-16 10:55:53,814 - u"HdfsDirectory['/apps/webhcat']" {'security_enabled': False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '', 'mode': 0755, 'owner': 'hcat', 'bin_dir': '/usr/bin', 'action': ['create_delayed']}
> 2015-03-16 10:55:53,815 - u"HdfsDirectory['/user/hcat']" {'security_enabled': False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '', 'mode': 0755, 'owner': 'hcat', 'bin_dir': '/usr/bin', 'action': ['create_delayed']}
> 2015-03-16 10:55:53,815 - u"HdfsDirectory['None']" {'security_enabled': False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '', 'action': ['create'], 'bin_dir': '/usr/bin'}
> 2015-03-16 10:55:53,818 - u"Execute['hadoop --config /etc/hadoop/conf fs -mkdir -p /apps/webhcat /user/hcat && hadoop --config /etc/hadoop/conf fs -chmod 755 /apps/webhcat /user/hcat && hadoop --config /etc/hadoop/conf fs -chown hcat /apps/webhcat /user/hcat']" {'not_if': "ambari-sudo.sh su hdfs -l -s /bin/bash -c 'hadoop --config /etc/hadoop/conf fs -ls /apps/webhcat /user/hcat'", 'user': 'hdfs', 'path': ['/usr/bin']}
> 2015-03-16 10:55:56,738 - Skipping u"Execute['hadoop --config /etc/hadoop/conf fs -mkdir -p /apps/webhcat /user/hcat && hadoop --config /etc/hadoop/conf fs -chmod 755 /apps/webhcat /user/hcat && hadoop --config /etc/hadoop/conf fs -chown hcat /apps/webhcat /user/hcat']" due to not_if
> 2015-03-16 10:55:56,739 - u"Directory['/var/run/webhcat']" {'owner': 'hcat', 'group': 'hadoop', 'recursive': True, 'mode': 0755}
> 2015-03-16 10:55:57,096 - u"Directory['/grid/0/log/webhcat']" {'owner': 'hcat', 'group': 'hadoop', 'recursive': True, 'mode': 0755}
> 2015-03-16 10:55:57,444 - u"Directory['/etc/hcatalog/conf']" {'owner': 'hcat', 'group': 'hadoop', 'recursive': True}
> 2015-03-16 10:55:57,799 - Changing owner for /etc/hcatalog/conf from 0 to hcat
> 2015-03-16 10:55:57,901 - Changing group for /etc/hcatalog/conf from 0 to hadoop
> 2015-03-16 10:55:58,027 - u"CopyFromLocal['/usr/lib/hadoop-mapreduce/hadoop-streaming-*.jar']" {'hadoop_conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'owner': 'hcat', 'mode': 0755, 'dest_dir': '/apps/webhcat', 'hadoop_bin_dir': '/usr/bin', 'kinnit_if_needed': ''}
> 2015-03-16 10:55:58,029 - u"ExecuteHadoop['fs -copyFromLocal /usr/lib/hadoop-mapreduce/hadoop-streaming-*.jar /apps/webhcat']" {'not_if': "ambari-sudo.sh su hcat -l -s /bin/bash -c 'PATH=$PATH:/usr/bin hadoop fs -ls /apps/webhcat/hadoop-streaming-*.jar'", 'bin_dir': '/usr/bin', 'user': 'hcat', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-16 10:56:04,284 - u"Execute['hadoop --config /etc/hadoop/conf fs -copyFromLocal /usr/lib/hadoop-mapreduce/hadoop-streaming-*.jar /apps/webhcat']" {'logoutput': None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hcat', 'path': ['/usr/bin']}
> 2015-03-16 10:56:07,311 - u"ExecuteHadoop['fs -chown hcat /apps/webhcat/hadoop-streaming-*.jar']" {'bin_dir': '/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-16 10:56:07,312 - u"Execute['hadoop --config /etc/hadoop/conf fs -chown hcat /apps/webhcat/hadoop-streaming-*.jar']" {'logoutput': None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': ['/usr/bin']}
> 2015-03-16 10:56:09,046 - u"ExecuteHadoop['fs -chmod 755 /apps/webhcat/hadoop-streaming-*.jar']" {'bin_dir': '/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-16 10:56:09,048 - u"Execute['hadoop --config /etc/hadoop/conf fs -chmod 755 /apps/webhcat/hadoop-streaming-*.jar']" {'logoutput': None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': ['/usr/bin']}
> 2015-03-16 10:56:10,791 - u"CopyFromLocal['/usr/share/HDP-webhcat/pig.tar.gz']" {'hadoop_conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'owner': 'hcat', 'mode': 0755, 'dest_dir': '/apps/webhcat', 'hadoop_bin_dir': '/usr/bin', 'kinnit_if_needed': ''}
> 2015-03-16 10:56:10,795 - u"ExecuteHadoop['fs -copyFromLocal /usr/share/HDP-webhcat/pig.tar.gz /apps/webhcat']" {'not_if': "ambari-sudo.sh su hcat -l -s /bin/bash -c 'PATH=$PATH:/usr/bin hadoop fs -ls /apps/webhcat/pig.tar.gz'", 'bin_dir': '/usr/bin', 'user': 'hcat', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-16 10:56:12,898 - Skipping u"ExecuteHadoop['fs -copyFromLocal /usr/share/HDP-webhcat/pig.tar.gz /apps/webhcat']" due to not_if
> 2015-03-16 10:56:12,900 - u"ExecuteHadoop['fs -chown hcat /apps/webhcat/pig.tar.gz']" {'bin_dir': '/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-16 10:56:12,902 - u"Execute['hadoop --config /etc/hadoop/conf fs -chown hcat /apps/webhcat/pig.tar.gz']" {'logoutput': None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': ['/usr/bin']}
> 2015-03-16 10:56:14,845 - u"ExecuteHadoop['fs -chmod 755 /apps/webhcat/pig.tar.gz']" {'bin_dir': '/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-16 10:56:14,847 - u"Execute['hadoop --config /etc/hadoop/conf fs -chmod 755 /apps/webhcat/pig.tar.gz']" {'logoutput': None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': ['/usr/bin']}
> 2015-03-16 10:56:16,913 - u"CopyFromLocal['/usr/share/HDP-webhcat/hive.tar.gz']" {'hadoop_conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'owner': 'hcat', 'mode': 0755, 'dest_dir': '/apps/webhcat', 'hadoop_bin_dir': '/usr/bin', 'kinnit_if_needed': ''}
> 2015-03-16 10:56:16,916 - u"ExecuteHadoop['fs -copyFromLocal /usr/share/HDP-webhcat/hive.tar.gz /apps/webhcat']" {'not_if': "ambari-sudo.sh su hcat -l -s /bin/bash -c 'PATH=$PATH:/usr/bin hadoop fs -ls /apps/webhcat/hive.tar.gz'", 'bin_dir': '/usr/bin', 'user': 'hcat', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-16 10:56:18,922 - Skipping u"ExecuteHadoop['fs -copyFromLocal /usr/share/HDP-webhcat/hive.tar.gz /apps/webhcat']" due to not_if
> 2015-03-16 10:56:18,923 - u"ExecuteHadoop['fs -chown hcat /apps/webhcat/hive.tar.gz']" {'bin_dir': '/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-16 10:56:18,924 - u"Execute['hadoop --config /etc/hadoop/conf fs -chown hcat /apps/webhcat/hive.tar.gz']" {'logoutput': None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': ['/usr/bin']}
> 2015-03-16 10:56:20,660 - u"ExecuteHadoop['fs -chmod 755 /apps/webhcat/hive.tar.gz']" {'bin_dir': '/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}
> 2015-03-16 10:56:20,662 - u"Execute['hadoop --config /etc/hadoop/conf fs -chmod 755 /apps/webhcat/hive.tar.gz']" {'logoutput': None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': ['/usr/bin']}
> 2015-03-16 10:56:22,477 - u"XmlConfig['webhcat-site.xml']" {'owner': 'hcat', 'group': 'hadoop', 'conf_dir': '/etc/hcatalog/conf', 'configuration_attributes': {}, 'configurations': ...}
> 2015-03-16 10:56:22,493 - Generating config: /etc/hcatalog/conf/webhcat-site.xml
> 2015-03-16 10:56:22,494 - u"File['/etc/hcatalog/conf/webhcat-site.xml']" {'owner': 'hcat', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}
> 2015-03-16 10:56:22,721 - Writing u"File['/etc/hcatalog/conf/webhcat-site.xml']" because contents don't match
> 2015-03-16 10:56:22,915 - u"File['/etc/hcatalog/conf/webhcat-env.sh']" {'content': InlineTemplate(...), 'owner': 'hcat', 'group': 'hadoop'}
> 2015-03-16 10:56:23,122 - Writing u"File['/etc/hcatalog/conf/webhcat-env.sh']" because contents don't match
> 2015-03-16 10:56:23,287 - u"Directory['/etc/hive-webhcat/conf']" {'cd_access': 'a'}
> 2015-03-16 10:56:23,339 - Creating directory u"Directory['/etc/hive-webhcat/conf']"
> 2015-03-16 10:56:23,390 - Error while executing command 'start':
> Traceback (most recent call last):
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 214, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py", line 47, in start
>     self.configure(env) # FOR SECURITY
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py", line 41, in configure
>     webhcat()
>   File "/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat.py", line 152, in webhcat
>     cd_access='a',
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 152, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 118, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 169, in action_create
>     raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
> Fail: Applying u"Directory['/etc/hive-webhcat/conf']" failed, parent directory /etc/hive-webhcat doesn't exist
> {code}
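For anyone verifying the failure (or a fix) on a live cluster, the task record quoted above can be re-read straight from the Ambari REST API using the "href" field it reports. A minimal sketch follows; the admin:admin credentials are a placeholder assumption, not something this report states, and the Python 2 style matches the Python 2.6 agent shown in the traceback.

{code}
# Hypothetical verification sketch: fetch task 294 (the failed
# WEBHCAT_SERVER START) and print its status plus the final Fail line.
import base64
import json
import urllib2

# URL taken from the "href" field of the task JSON above.
TASK_URL = 'http://172.18.145.150:8080/api/v1/clusters/cl1/requests/7/tasks/294'

request = urllib2.Request(TASK_URL)
request.add_header('Authorization',
                   'Basic ' + base64.b64encode('admin:admin'))  # placeholder credentials
task = json.load(urllib2.urlopen(request))['Tasks']

print task['status'], '-', task['command_detail']  # FAILED - WEBHCAT_SERVER START
print task['stderr'].splitlines()[-1]              # Fail: ... /etc/hive-webhcat doesn't exist
{code}

Once /etc/hive-webhcat exists (or webhcat.py creates it recursively), retrying the WEBHCAT_SERVER START command from the Ambari UI or API should get past this point.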
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)