ambari-dev mailing list archives

From "Hudson (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AMBARI-10100) WEBHCAT_SERVER START is failed after Ambari only upgrade from 1.4.4 to 2.0.0 (failed, parent directory /etc/hive-webhcat doesn't exist)
Date Tue, 17 Mar 2015 11:55:38 GMT

    [ https://issues.apache.org/jira/browse/AMBARI-10100?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14365021#comment-14365021
] 

Hudson commented on AMBARI-10100:
---------------------------------

FAILURE: Integrated in Ambari-trunk-Commit #2054 (See [https://builds.apache.org/job/Ambari-trunk-Commit/2054/])
AMBARI-10100 WEBHCAT_SERVER START is failed after Ambari only upgrade from 1.4.4 to 2.0.0
(failed, parent directory /etc/hive-webhcat doesn't exist) (dsen) (dsen: http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=4f84615da39c64aab3ee31b23420b98db7926223)
* ambari-server/src/test/python/stacks/2.0.6/HIVE/test_webhcat_server.py
* ambari-server/src/main/resources/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat.py


> WEBHCAT_SERVER START is failed after Ambari only upgrade from 1.4.4 to 2.0.0 (failed,
parent directory /etc/hive-webhcat doesn't exist)
> ---------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-10100
>                 URL: https://issues.apache.org/jira/browse/AMBARI-10100
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.0.0
>         Environment: ambari-server version: ambari-server-2.0.0-137.noarch
> ambari-server --hash: 1e2a741e5fa00f6ecb7ec7d420f3dee0f0f71b8f
> HDP Stack: 2.0
> Ambari DB: :PostgreSQL
> Oozie/Hive DB: MySQL/MySQL
> Security:no
> HA: no
>            Reporter: Dmytro Sen
>            Assignee: Dmytro Sen
>            Priority: Blocker
>             Fix For: 2.0.0
>
>         Attachments: AMBARI-10100.patch
>
>
> STR:
> 1) Deploy the old version with all services
> 2) Perform an Ambari-only upgrade to 2.0.0
> Actual result:
> WEBHCAT_SERVER START fails after the upgrade from 1.4.4 to 2.0.0
> {code}
> --------------------------------------------------------------------------------
> {
>   "href" : "http://172.18.145.150:8080/api/v1/clusters/cl1/requests/7/tasks/294",
>   "Tasks" : {
>     "attempt_cnt" : 1,
>     "cluster_name" : "cl1",
>     "command" : "START",
>     "command_detail" : "WEBHCAT_SERVER START",
>     "end_time" : 1426503383601,
>     "error_log" : "/var/lib/ambari-agent/data/errors-294.txt",
>     "exit_code" : 1,
>     "host_name" : "amb-upg14423-rhel6postgres1426496145-4.cs1cloud.internal",
>     "id" : 294,
>     "output_log" : "/var/lib/ambari-agent/data/output-294.txt",
>     "request_id" : 7,
>     "role" : "WEBHCAT_SERVER",
>     "stage_id" : 5,
>     "start_time" : 1426503336711,
>     "status" : "FAILED",
>     "stderr" : "2015-03-16 10:56:23,390 - Error while executing command 'start':\nTraceback
(most recent call last):\n  File \"/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py\",
line 214, in execute\n    method(env)\n  File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py\",
line 47, in start\n    self.configure(env) # FOR SECURITY\n  File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py\",
line 41, in configure\n    webhcat()\n  File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat.py\",
line 152, in webhcat\n    cd_access='a',\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/base.py\",
line 148, in __init__\n    self.env.run()\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\",
line 152, in run\n    self.run_action(resource, action)\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\",
line 118, in run_action\n    provider_action()\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py\",
line 169, in action_create\n    raise Fail(\"Applying %s failed, parent directory %s doesn't
exist\" % (self.resource, dirname))\nFail: Applying u\"Directory['/etc/hive-webhcat/conf']\"
failed, parent directory /etc/hive-webhcat doesn't exist",
>     "stdout" : "2015-03-16 10:55:44,925 - u\"Directory['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts/']\"
{'recursive': True}\n2015-03-16 10:55:45,102 - u\"File['/var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jce_policy-6.zip']\"
{'content': DownloadSource('http://amb-upg14423-rhel6postgres1426496145-7.cs1cloud.internal:8080/resources//jce_policy-6.zip')}\n2015-03-16
10:55:45,201 - Not downloading the file from http://amb-upg14423-rhel6postgres1426496145-7.cs1cloud.internal:8080/resources//jce_policy-6.zip,
because /var/lib/ambari-agent/data/tmp/jce_policy-6.zip already exists\n2015-03-16 10:55:45,356
- u\"Group['hadoop']\" {'ignore_failures': False}\n2015-03-16 10:55:45,357 - Modifying group
hadoop\n2015-03-16 10:55:45,485 - u\"Group['nobody']\" {'ignore_failures': False}\n2015-03-16
10:55:45,485 - Modifying group nobody\n2015-03-16 10:55:45,652 - u\"Group['users']\" {'ignore_failures':
False}\n2015-03-16 10:55:45,652 - Modifying group users\n2015-03-16 10:55:45,779 - u\"User['nobody']\"
{'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}\n2015-03-16 10:55:45,779
- Modifying user nobody\n2015-03-16 10:55:45,828 - u\"User['oozie']\" {'gid': 'hadoop', 'ignore_failures':
False, 'groups': [u'users']}\n2015-03-16 10:55:45,828 - Modifying user oozie\n2015-03-16 10:55:45,876
- u\"User['hive']\" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n2015-03-16
10:55:45,877 - Modifying user hive\n2015-03-16 10:55:45,924 - u\"User['mapred']\" {'gid':
'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n2015-03-16 10:55:45,925 - Modifying
user mapred\n2015-03-16 10:55:45,973 - u\"User['hbase']\" {'gid': 'hadoop', 'ignore_failures':
False, 'groups': [u'hadoop']}\n2015-03-16 10:55:45,974 - Modifying user hbase\n2015-03-16
10:55:46,025 - u\"User['ambari-qa']\" {'gid': 'hadoop', 'ignore_failures': False, 'groups':
[u'users']}\n2015-03-16 10:55:46,026 - Modifying user ambari-qa\n2015-03-16 10:55:46,076 -
u\"User['zookeeper']\" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n2015-03-16
10:55:46,076 - Modifying user zookeeper\n2015-03-16 10:55:46,126 - u\"User['false']\" {'gid':
'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n2015-03-16 10:55:46,126 - Modifying
user false\n2015-03-16 10:55:46,176 - u\"User['hdfs']\" {'gid': 'hadoop', 'ignore_failures':
False, 'groups': [u'hadoop']}\n2015-03-16 10:55:46,177 - Modifying user hdfs\n2015-03-16 10:55:46,225
- u\"User['sqoop']\" {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}\n2015-03-16
10:55:46,226 - Modifying user sqoop\n2015-03-16 10:55:46,279 - u\"User['yarn']\" {'gid': 'hadoop',
'ignore_failures': False, 'groups': [u'hadoop']}\n2015-03-16 10:55:46,280 - Modifying user
yarn\n2015-03-16 10:55:46,331 - u\"User['hcat']\" {'gid': 'hadoop', 'ignore_failures': False,
'groups': [u'hadoop']}\n2015-03-16 10:55:46,331 - Modifying user hcat\n2015-03-16 10:55:46,380
- u\"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']\" {'content': StaticFile('changeToSecureUid.sh'),
'mode': 0555}\n2015-03-16 10:55:46,704 - u\"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh
ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']\"
{'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}\n2015-03-16 10:55:46,758 - Skipping
u\"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa']\"
due to not_if\n2015-03-16 10:55:46,758 - u\"Directory['/grid/0/hadoop/hbase']\" {'owner':
'hbase', 'recursive': True, 'mode': 0775, 'cd_access': 'a'}\n2015-03-16 10:55:47,359 - u\"File['/var/lib/ambari-agent/data/tmp/changeUid.sh']\"
{'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}\n2015-03-16 10:55:47,668 - u\"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh
hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase']\" {'not_if':
'(test $(id -u hbase) -gt 1000) || (false)'}\n2015-03-16 10:55:47,715 - Skipping u\"Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh
hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/grid/0/hadoop/hbase']\" due to
not_if\n2015-03-16 10:55:47,715 - u\"Group['hdfs']\" {'ignore_failures': False}\n2015-03-16
10:55:47,716 - Modifying group hdfs\n2015-03-16 10:55:47,843 - u\"User['hdfs']\" {'ignore_failures':
False, 'groups': [u'hadoop', 'users', 'hdfs', 'hadoop', u'hdfs']}\n2015-03-16 10:55:47,843
- Modifying user hdfs\n2015-03-16 10:55:47,892 - u\"Directory['/etc/hadoop']\" {'mode': 0755}\n2015-03-16
10:55:48,044 - u\"Directory['/etc/hadoop/conf.empty']\" {'owner': 'root', 'group': 'hadoop',
'recursive': True}\n2015-03-16 10:55:48,204 - u\"Link['/etc/hadoop/conf']\" {'not_if': 'ls
/etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}\n2015-03-16 10:55:48,256 - Skipping u\"Link['/etc/hadoop/conf']\"
due to not_if\n2015-03-16 10:55:48,272 - u\"File['/etc/hadoop/conf/hadoop-env.sh']\" {'content':
InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}\n2015-03-16 10:55:48,533 - u\"Execute['('setenforce',
'0')']\" {'sudo': True, 'only_if': 'test -f /selinux/enforce'}\n2015-03-16 10:55:48,601 -
Skipping u\"Execute['('setenforce', '0')']\" due to only_if\n2015-03-16 10:55:48,601 - u\"Directory['/grid/0/log/hadoop']\"
{'owner': 'root', 'mode': 0775, 'group': 'hadoop', 'recursive': True, 'cd_access': 'a'}\n2015-03-16
10:55:49,140 - u\"Directory['/var/run/hadoop']\" {'owner': 'root', 'group': 'root', 'recursive':
True, 'cd_access': 'a'}\n2015-03-16 10:55:49,579 - u\"Directory['/tmp/hadoop-hdfs']\" {'owner':
'hdfs', 'recursive': True, 'cd_access': 'a'}\n2015-03-16 10:55:49,938 - u\"File['/etc/hadoop/conf/commons-logging.properties']\"
{'content': Template('commons-logging.properties.j2'), 'owner': 'hdfs'}\n2015-03-16 10:55:50,191
- u\"File['/etc/hadoop/conf/health_check']\" {'content': Template('health_check-v2.j2'), 'owner':
'hdfs'}\n2015-03-16 10:55:50,435 - u\"File['/etc/hadoop/conf/log4j.properties']\" {'content':
'...', 'owner': 'hdfs', 'group': 'hadoop', 'mode': 0644}\n2015-03-16 10:55:51,666 - u\"File['/etc/hadoop/conf/hadoop-metrics2.properties']\"
{'content': Template('hadoop-metrics2.properties.j2'), 'owner': 'hdfs'}\n2015-03-16 10:55:53,018
- u\"File['/etc/hadoop/conf/task-log4j.properties']\" {'content': StaticFile('task-log4j.properties'),
'mode': 0755}\n2015-03-16 10:55:53,333 - u\"File['/etc/hadoop/conf/configuration.xsl']\" {'owner':
'hdfs', 'group': 'hadoop'}\n2015-03-16 10:55:53,814 - u\"HdfsDirectory['/apps/webhcat']\"
{'security_enabled': False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user':
'hdfs', 'kinit_path_local': '', 'mode': 0755, 'owner': 'hcat', 'bin_dir': '/usr/bin', 'action':
['create_delayed']}\n2015-03-16 10:55:53,815 - u\"HdfsDirectory['/user/hcat']\" {'security_enabled':
False, 'keytab': [EMPTY], 'conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local':
'', 'mode': 0755, 'owner': 'hcat', 'bin_dir': '/usr/bin', 'action': ['create_delayed']}\n2015-03-16
10:55:53,815 - u\"HdfsDirectory['None']\" {'security_enabled': False, 'keytab': [EMPTY], 'conf_dir':
'/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'kinit_path_local': '', 'action': ['create'], 'bin_dir':
'/usr/bin'}\n2015-03-16 10:55:53,818 - u\"Execute['hadoop --config /etc/hadoop/conf fs -mkdir
-p /apps/webhcat /user/hcat && hadoop --config /etc/hadoop/conf fs -chmod  755 /apps/webhcat
/user/hcat && hadoop --config /etc/hadoop/conf fs -chown  hcat /apps/webhcat /user/hcat']\"
{'not_if': \"ambari-sudo.sh su hdfs -l -s /bin/bash -c 'hadoop --config /etc/hadoop/conf fs
-ls /apps/webhcat /user/hcat'\", 'user': 'hdfs', 'path': ['/usr/bin']}\n2015-03-16 10:55:56,738
- Skipping u\"Execute['hadoop --config /etc/hadoop/conf fs -mkdir -p /apps/webhcat /user/hcat
&& hadoop --config /etc/hadoop/conf fs -chmod  755 /apps/webhcat /user/hcat &&
hadoop --config /etc/hadoop/conf fs -chown  hcat /apps/webhcat /user/hcat']\" due to not_if\n2015-03-16
10:55:56,739 - u\"Directory['/var/run/webhcat']\" {'owner': 'hcat', 'group': 'hadoop', 'recursive':
True, 'mode': 0755}\n2015-03-16 10:55:57,096 - u\"Directory['/grid/0/log/webhcat']\" {'owner':
'hcat', 'group': 'hadoop', 'recursive': True, 'mode': 0755}\n2015-03-16 10:55:57,444 - u\"Directory['/etc/hcatalog/conf']\"
{'owner': 'hcat', 'group': 'hadoop', 'recursive': True}\n2015-03-16 10:55:57,799 - Changing
owner for /etc/hcatalog/conf from 0 to hcat\n2015-03-16 10:55:57,901 - Changing group for
/etc/hcatalog/conf from 0 to hadoop\n2015-03-16 10:55:58,027 - u\"CopyFromLocal['/usr/lib/hadoop-mapreduce/hadoop-streaming-*.jar']\"
{'hadoop_conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'owner': 'hcat', 'mode': 0755,
'dest_dir': '/apps/webhcat', 'hadoop_bin_dir': '/usr/bin', 'kinnit_if_needed': ''}\n2015-03-16
10:55:58,029 - u\"ExecuteHadoop['fs -copyFromLocal /usr/lib/hadoop-mapreduce/hadoop-streaming-*.jar
/apps/webhcat']\" {'not_if': \"ambari-sudo.sh su hcat -l -s /bin/bash -c 'PATH=$PATH:/usr/bin
hadoop fs -ls /apps/webhcat/hadoop-streaming-*.jar'\", 'bin_dir': '/usr/bin', 'user': 'hcat',
'conf_dir': '/etc/hadoop/conf'}\n2015-03-16 10:56:04,284 - u\"Execute['hadoop --config /etc/hadoop/conf
fs -copyFromLocal /usr/lib/hadoop-mapreduce/hadoop-streaming-*.jar /apps/webhcat']\" {'logoutput':
None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hcat', 'path': ['/usr/bin']}\n2015-03-16
10:56:07,311 - u\"ExecuteHadoop['fs -chown hcat /apps/webhcat/hadoop-streaming-*.jar']\" {'bin_dir':
'/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}\n2015-03-16 10:56:07,312 - u\"Execute['hadoop
--config /etc/hadoop/conf fs -chown hcat /apps/webhcat/hadoop-streaming-*.jar']\" {'logoutput':
None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': ['/usr/bin']}\n2015-03-16
10:56:09,046 - u\"ExecuteHadoop['fs -chmod 755 /apps/webhcat/hadoop-streaming-*.jar']\" {'bin_dir':
'/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}\n2015-03-16 10:56:09,048 - u\"Execute['hadoop
--config /etc/hadoop/conf fs -chmod 755 /apps/webhcat/hadoop-streaming-*.jar']\" {'logoutput':
None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path': ['/usr/bin']}\n2015-03-16
10:56:10,791 - u\"CopyFromLocal['/usr/share/HDP-webhcat/pig.tar.gz']\" {'hadoop_conf_dir':
'/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'owner': 'hcat', 'mode': 0755, 'dest_dir': '/apps/webhcat',
'hadoop_bin_dir': '/usr/bin', 'kinnit_if_needed': ''}\n2015-03-16 10:56:10,795 - u\"ExecuteHadoop['fs
-copyFromLocal /usr/share/HDP-webhcat/pig.tar.gz /apps/webhcat']\" {'not_if': \"ambari-sudo.sh
su hcat -l -s /bin/bash -c 'PATH=$PATH:/usr/bin hadoop fs -ls /apps/webhcat/pig.tar.gz'\",
'bin_dir': '/usr/bin', 'user': 'hcat', 'conf_dir': '/etc/hadoop/conf'}\n2015-03-16 10:56:12,898
- Skipping u\"ExecuteHadoop['fs -copyFromLocal /usr/share/HDP-webhcat/pig.tar.gz /apps/webhcat']\"
due to not_if\n2015-03-16 10:56:12,900 - u\"ExecuteHadoop['fs -chown hcat /apps/webhcat/pig.tar.gz']\"
{'bin_dir': '/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}\n2015-03-16 10:56:12,902
- u\"Execute['hadoop --config /etc/hadoop/conf fs -chown hcat /apps/webhcat/pig.tar.gz']\"
{'logoutput': None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path':
['/usr/bin']}\n2015-03-16 10:56:14,845 - u\"ExecuteHadoop['fs -chmod 755 /apps/webhcat/pig.tar.gz']\"
{'bin_dir': '/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}\n2015-03-16 10:56:14,847
- u\"Execute['hadoop --config /etc/hadoop/conf fs -chmod 755 /apps/webhcat/pig.tar.gz']\"
{'logoutput': None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path':
['/usr/bin']}\n2015-03-16 10:56:16,913 - u\"CopyFromLocal['/usr/share/HDP-webhcat/hive.tar.gz']\"
{'hadoop_conf_dir': '/etc/hadoop/conf', 'hdfs_user': 'hdfs', 'owner': 'hcat', 'mode': 0755,
'dest_dir': '/apps/webhcat', 'hadoop_bin_dir': '/usr/bin', 'kinnit_if_needed': ''}\n2015-03-16
10:56:16,916 - u\"ExecuteHadoop['fs -copyFromLocal /usr/share/HDP-webhcat/hive.tar.gz /apps/webhcat']\"
{'not_if': \"ambari-sudo.sh su hcat -l -s /bin/bash -c 'PATH=$PATH:/usr/bin hadoop fs -ls
/apps/webhcat/hive.tar.gz'\", 'bin_dir': '/usr/bin', 'user': 'hcat', 'conf_dir': '/etc/hadoop/conf'}\n2015-03-16
10:56:18,922 - Skipping u\"ExecuteHadoop['fs -copyFromLocal /usr/share/HDP-webhcat/hive.tar.gz
/apps/webhcat']\" due to not_if\n2015-03-16 10:56:18,923 - u\"ExecuteHadoop['fs -chown hcat
/apps/webhcat/hive.tar.gz']\" {'bin_dir': '/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}\n2015-03-16
10:56:18,924 - u\"Execute['hadoop --config /etc/hadoop/conf fs -chown hcat /apps/webhcat/hive.tar.gz']\"
{'logoutput': None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path':
['/usr/bin']}\n2015-03-16 10:56:20,660 - u\"ExecuteHadoop['fs -chmod 755 /apps/webhcat/hive.tar.gz']\"
{'bin_dir': '/usr/bin', 'user': 'hdfs', 'conf_dir': '/etc/hadoop/conf'}\n2015-03-16 10:56:20,662
- u\"Execute['hadoop --config /etc/hadoop/conf fs -chmod 755 /apps/webhcat/hive.tar.gz']\"
{'logoutput': None, 'try_sleep': 0, 'environment': {}, 'tries': 1, 'user': 'hdfs', 'path':
['/usr/bin']}\n2015-03-16 10:56:22,477 - u\"XmlConfig['webhcat-site.xml']\" {'owner': 'hcat',
'group': 'hadoop', 'conf_dir': '/etc/hcatalog/conf', 'configuration_attributes': {}, 'configurations':
...}\n2015-03-16 10:56:22,493 - Generating config: /etc/hcatalog/conf/webhcat-site.xml\n2015-03-16
10:56:22,494 - u\"File['/etc/hcatalog/conf/webhcat-site.xml']\" {'owner': 'hcat', 'content':
InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}\n2015-03-16 10:56:22,721
- Writing u\"File['/etc/hcatalog/conf/webhcat-site.xml']\" because contents don't match\n2015-03-16
10:56:22,915 - u\"File['/etc/hcatalog/conf/webhcat-env.sh']\" {'content': InlineTemplate(...),
'owner': 'hcat', 'group': 'hadoop'}\n2015-03-16 10:56:23,122 - Writing u\"File['/etc/hcatalog/conf/webhcat-env.sh']\"
because contents don't match\n2015-03-16 10:56:23,287 - u\"Directory['/etc/hive-webhcat/conf']\"
{'cd_access': 'a'}\n2015-03-16 10:56:23,339 - Creating directory u\"Directory['/etc/hive-webhcat/conf']\"\n2015-03-16
10:56:23,390 - Error while executing command 'start':\nTraceback (most recent call last):\n
 File \"/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py\",
line 214, in execute\n    method(env)\n  File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py\",
line 47, in start\n    self.configure(env) # FOR SECURITY\n  File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat_server.py\",
line 41, in configure\n    webhcat()\n  File \"/var/lib/ambari-agent/cache/common-services/HIVE/0.12.0.2.0/package/scripts/webhcat.py\",
line 152, in webhcat\n    cd_access='a',\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/base.py\",
line 148, in __init__\n    self.env.run()\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\",
line 152, in run\n    self.run_action(resource, action)\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/environment.py\",
line 118, in run_action\n    provider_action()\n  File \"/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py\",
line 169, in action_create\n    raise Fail(\"Applying %s failed, parent directory %s doesn't
exist\" % (self.resource, dirname))\nFail: Applying u\"Directory['/etc/hive-webhcat/conf']\"
failed, parent directory /etc/hive-webhcat doesn't exist",
>     "structured_out" : { }
>   }
> }
> --------------------------------------------------------------------------------
> {code}
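The traceback shows the `Directory['/etc/hive-webhcat/conf']` resource failing in `action_create` because its parent, `/etc/hive-webhcat`, does not exist and the resource was declared with only `cd_access='a'`. The sketch below is a hypothetical, self-contained reproduction of that check (it does not use Ambari's actual `resource_management` library): a `create_dir` helper that, like the provider in `providers/system.py`, refuses to create a directory under a missing parent unless asked to create parents recursively.

```python
import os
import tempfile

def create_dir(path, recursive=False):
    """Hypothetical stand-in for the Directory provider's action_create."""
    parent = os.path.dirname(path)
    if not os.path.isdir(parent):
        if not recursive:
            # Mirrors the Fail raised in resource_management/core/providers/system.py
            raise RuntimeError(
                "Applying Directory['%s'] failed, parent directory %s doesn't exist"
                % (path, parent))
        os.makedirs(parent)  # create the missing parent chain first
    if not os.path.isdir(path):
        os.mkdir(path)

root = tempfile.mkdtemp()
conf_dir = os.path.join(root, "hive-webhcat", "conf")

try:
    create_dir(conf_dir)  # parent missing -> raises, as in the log above
except RuntimeError as e:
    print(e)

create_dir(conf_dir, recursive=True)  # creating parents makes the call succeed
print(os.path.isdir(conf_dir))
```

This illustrates why the fix lands in `webhcat.py`: the resource declaration for `/etc/hive-webhcat/conf` must ensure the parent directory exists (e.g. by requesting recursive creation) before the upgrade path reaches it.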



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
