ambari-dev mailing list archives

From "Hudson (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (AMBARI-7732) Zookeeper can not be installed for HDP-2.2
Date Fri, 10 Oct 2014 17:06:34 GMT

    [ https://issues.apache.org/jira/browse/AMBARI-7732?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14167125#comment-14167125 ]

Hudson commented on AMBARI-7732:
--------------------------------

SUCCESS: Integrated in Ambari-branch-1.7.0 #54 (See [https://builds.apache.org/job/Ambari-branch-1.7.0/54/])
AMBARI-7732. Zookeeper can not be installed for HDP-2.2 (aonishuk) (aonishuk: http://git-wip-us.apache.org/repos/asf?p=ambari.git&a=commit&h=5bb80bd8e7b885d0f8cb9c8958f3ab84761a60f8)
* ambari-server/src/main/resources/stacks/HDP/2.0.6/hooks/before-START/scripts/shared_initialization.py
* ambari-server/src/main/resources/stacks/HDP/1.3.2/hooks/before-START/scripts/shared_initialization.py


> Zookeeper can not be installed for HDP-2.2
> ------------------------------------------
>
>                 Key: AMBARI-7732
>                 URL: https://issues.apache.org/jira/browse/AMBARI-7732
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: Andrew Onischuk
>            Assignee: Andrew Onischuk
>             Fix For: 1.7.0
>
>
>     
>     stderr: 
>     2014-10-08 15:27:32,527 - Error while executing command 'install':
>     Traceback (most recent call last):
>       File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 122, in execute
>         method(env)
>       File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 31, in hook
>         setup_hdp_install_directory()
>       File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 27, in setup_hdp_install_directory
>         only_if=format('ls -d /usr/hdp/{rpm_version}-*')
>       File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>         self.env.run()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
>         self.run_action(resource, action)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
>         provider_action()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 237, in action_run
>         raise ex
>     Fail: Execution of 'ln -s /usr/hdp/2.2.0.0-* /usr/hdp/current' returned 1. ln: target `/usr/hdp/current' is not a directory
>      stdout:
>     2014-10-08 15:24:26,248 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://perf400-a-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] {'environment': ..., 'not_if': 'test -e /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip', 'ignore_failures': True, 'path': ['/bin', '/usr/bin/']}
>     2014-10-08 15:24:26,269 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/;     curl -kf -x "" --retry 10     http://perf400-a-1.c.pramod-thangali.internal:8080/resources//UnlimitedJCEPolicyJDK7.zip -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//UnlimitedJCEPolicyJDK7.zip'] due to not_if
>     2014-10-08 15:24:26,270 - Group['hadoop'] {'ignore_failures': False}
>     2014-10-08 15:24:26,272 - Modifying group hadoop
>     2014-10-08 15:24:26,343 - Group['nobody'] {'ignore_failures': False}
>     2014-10-08 15:24:26,343 - Modifying group nobody
>     2014-10-08 15:24:26,379 - Group['users'] {'ignore_failures': False}
>     2014-10-08 15:24:26,379 - Modifying group users
>     2014-10-08 15:24:26,416 - Group['nagios'] {'ignore_failures': False}
>     2014-10-08 15:24:26,417 - Modifying group nagios
>     2014-10-08 15:24:26,458 - User['hive'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,459 - Modifying user hive
>     2014-10-08 15:24:26,489 - User['oozie'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,490 - Modifying user oozie
>     2014-10-08 15:24:26,513 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
>     2014-10-08 15:24:26,513 - Modifying user nobody
>     2014-10-08 15:24:26,549 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,550 - Modifying user nagios
>     2014-10-08 15:24:26,568 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
>     2014-10-08 15:24:26,569 - Modifying user ambari-qa
>     2014-10-08 15:24:26,603 - User['flume'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,604 - Modifying user flume
>     2014-10-08 15:24:26,631 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,632 - Modifying user hdfs
>     2014-10-08 15:24:26,652 - User['storm'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,652 - Modifying user storm
>     2014-10-08 15:24:26,677 - User['mapred'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,678 - Modifying user mapred
>     2014-10-08 15:24:26,702 - User['hbase'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,703 - Modifying user hbase
>     2014-10-08 15:24:26,724 - User['tez'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
>     2014-10-08 15:24:26,726 - Modifying user tez
>     2014-10-08 15:24:26,752 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,753 - Modifying user zookeeper
>     2014-10-08 15:24:26,776 - User['falcon'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,777 - Modifying user falcon
>     2014-10-08 15:24:26,811 - User['sqoop'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,812 - Modifying user sqoop
>     2014-10-08 15:24:26,835 - User['yarn'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,835 - Modifying user yarn
>     2014-10-08 15:24:26,860 - User['hcat'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
>     2014-10-08 15:24:26,860 - Modifying user hcat
>     2014-10-08 15:24:26,888 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>     2014-10-08 15:24:26,892 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
>     2014-10-08 15:24:26,917 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
>     2014-10-08 15:24:26,918 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
>     2014-10-08 15:24:26,920 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] {'not_if': 'test $(id -u hbase) -gt 1000'}
>     2014-10-08 15:24:26,948 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/hadoop/hbase 2>/dev/null'] due to not_if
>     2014-10-08 15:24:26,959 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
>     2014-10-08 15:24:26,960 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
>     2014-10-08 15:24:26,984 - Skipping Link['/etc/hadoop/conf'] due to not_if
>     2014-10-08 15:24:27,007 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs'}
>     2014-10-08 15:24:27,028 - Repository['HDP-2.2'] {'base_url': 'http://10.240.118.134/repo/HDP-2.2', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP', 'mirror_list': None}
>     2014-10-08 15:24:27,040 - File['/etc/yum.repos.d/HDP.repo'] {'content': Template('repo_suse_rhel.j2')}
>     2014-10-08 15:24:27,042 - Repository['HDP-UTILS-1.1.0.20'] {'base_url': 'http://10.240.118.134/repo/HDP-UTILS-1.1.0.20', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': 'repo_suse_rhel.j2', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
>     2014-10-08 15:24:27,048 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': Template('repo_suse_rhel.j2')}
>     2014-10-08 15:24:27,049 - Package['unzip'] {}
>     2014-10-08 15:24:27,130 - Skipping installing existent package unzip
>     2014-10-08 15:24:27,131 - Package['curl'] {}
>     2014-10-08 15:24:27,169 - Skipping installing existent package curl
>     2014-10-08 15:24:27,171 - Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ;   curl -kf -x ""   --retry 10 http://perf400-a-1.c.pramod-thangali.internal:8080/resources//jdk-7u67-linux-x64.tar.gz -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz'] {'environment': ..., 'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'path': ['/bin', '/usr/bin/']}
>     2014-10-08 15:24:27,199 - Skipping Execute['mkdir -p /var/lib/ambari-agent/data/tmp/AMBARI-artifacts/ ;   curl -kf -x ""   --retry 10 http://perf400-a-1.c.pramod-thangali.internal:8080/resources//jdk-7u67-linux-x64.tar.gz -o /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz'] due to not_if
>     2014-10-08 15:24:27,201 - Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz > /dev/null 2>&1'] {'not_if': 'test -e /usr/jdk64/jdk1.7.0_67/bin/java', 'path': ['/bin', '/usr/bin/']}
>     2014-10-08 15:24:27,216 - Skipping Execute['mkdir -p /usr/jdk64 ; cd /usr/jdk64 ; tar -xf /var/lib/ambari-agent/data/tmp/AMBARI-artifacts//jdk-7u67-linux-x64.tar.gz > /dev/null 2>&1'] due to not_if
>     2014-10-08 15:24:27,482 - Package['zookeeper_2_2_0_0_*'] {}
>     2014-10-08 15:24:27,518 - Installing package zookeeper_2_2_0_0_* ('/usr/bin/yum -d 0 -e 0 -y install zookeeper_2_2_0_0_*')
>     2014-10-08 15:27:32,081 - Directory['/etc/zookeeper/conf'] {'owner': 'zookeeper', 'group': 'hadoop', 'recursive': True}
>     2014-10-08 15:27:32,084 - Creating directory Directory['/etc/zookeeper/conf']
>     2014-10-08 15:27:32,092 - Changing owner for /etc/zookeeper/conf from 0 to zookeeper
>     2014-10-08 15:27:32,092 - Changing group for /etc/zookeeper/conf from 0 to hadoop
>     2014-10-08 15:27:32,106 - File['/etc/zookeeper/conf/zookeeper-env.sh'] {'owner': 'zookeeper', 'content': InlineTemplate(...), 'group': 'hadoop'}
>     2014-10-08 15:27:32,106 - Writing File['/etc/zookeeper/conf/zookeeper-env.sh'] because it doesn't exist
>     2014-10-08 15:27:32,107 - Changing owner for /etc/zookeeper/conf/zookeeper-env.sh from 0 to zookeeper
>     2014-10-08 15:27:32,107 - Changing group for /etc/zookeeper/conf/zookeeper-env.sh from 0 to hadoop
>     2014-10-08 15:27:32,125 - File['/etc/zookeeper/conf/zoo.cfg'] {'owner': 'zookeeper', 'content': Template('zoo.cfg.j2'), 'group': 'hadoop'}
>     2014-10-08 15:27:32,126 - Writing File['/etc/zookeeper/conf/zoo.cfg'] because it doesn't exist
>     2014-10-08 15:27:32,127 - Changing owner for /etc/zookeeper/conf/zoo.cfg from 0 to zookeeper
>     2014-10-08 15:27:32,128 - Changing group for /etc/zookeeper/conf/zoo.cfg from 0 to hadoop
>     2014-10-08 15:27:32,130 - File['/etc/zookeeper/conf/configuration.xsl'] {'owner': 'zookeeper', 'content': Template('configuration.xsl.j2'), 'group': 'hadoop'}
>     2014-10-08 15:27:32,130 - Writing File['/etc/zookeeper/conf/configuration.xsl'] because it doesn't exist
>     2014-10-08 15:27:32,131 - Changing owner for /etc/zookeeper/conf/configuration.xsl from 0 to zookeeper
>     2014-10-08 15:27:32,131 - Changing group for /etc/zookeeper/conf/configuration.xsl from 0 to hadoop
>     2014-10-08 15:27:32,131 - Directory['/var/run/zookeeper'] {'owner': 'zookeeper', 'group': 'hadoop', 'recursive': True}
>     2014-10-08 15:27:32,132 - Directory['/var/log/zookeeper'] {'owner': 'zookeeper', 'group': 'hadoop', 'recursive': True}
>     2014-10-08 15:27:32,133 - Directory['/grid/0/hadoop/zookeeper'] {'owner': 'zookeeper', 'group': 'hadoop', 'recursive': True}
>     2014-10-08 15:27:32,134 - Creating directory Directory['/grid/0/hadoop/zookeeper']
>     2014-10-08 15:27:32,156 - Changing owner for /grid/0/hadoop/zookeeper from 0 to zookeeper
>     2014-10-08 15:27:32,156 - Changing group for /grid/0/hadoop/zookeeper from 0 to hadoop
>     2014-10-08 15:27:32,157 - File['/grid/0/hadoop/zookeeper/myid'] {'content': '1', 'mode': 0644}
>     2014-10-08 15:27:32,158 - Writing File['/grid/0/hadoop/zookeeper/myid'] because it doesn't exist
>     2014-10-08 15:27:32,158 - File['/etc/zookeeper/conf/log4j.properties'] {'content': '...', 'owner': 'zookeeper', 'group': 'hadoop', 'mode': 0644}
>     2014-10-08 15:27:32,159 - Writing File['/etc/zookeeper/conf/log4j.properties'] because it doesn't exist
>     2014-10-08 15:27:32,159 - Changing owner for /etc/zookeeper/conf/log4j.properties from 0 to zookeeper
>     2014-10-08 15:27:32,160 - Changing group for /etc/zookeeper/conf/log4j.properties from 0 to hadoop
>     2014-10-08 15:27:32,160 - File['/etc/zookeeper/conf/zoo_sample.cfg'] {'owner': 'zookeeper', 'group': 'hadoop'}
>     2014-10-08 15:27:32,160 - Writing File['/etc/zookeeper/conf/zoo_sample.cfg'] because it doesn't exist
>     2014-10-08 15:27:32,161 - Changing owner for /etc/zookeeper/conf/zoo_sample.cfg from 0 to zookeeper
>     2014-10-08 15:27:32,161 - Changing group for /etc/zookeeper/conf/zoo_sample.cfg from 0 to hadoop
>     2014-10-08 15:27:32,435 - Execute['ln -s /usr/hdp/2.2.0.0-* /usr/hdp/current'] {'not_if': 'ls /usr/hdp/current', 'only_if': 'ls -d /usr/hdp/2.2.0.0-*'}
>     2014-10-08 15:27:32,527 - Error while executing command 'install':
>     Traceback (most recent call last):
>       File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 122, in execute
>         method(env)
>       File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 31, in hook
>         setup_hdp_install_directory()
>       File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 27, in setup_hdp_install_directory
>         only_if=format('ls -d /usr/hdp/{rpm_version}-*')
>       File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
>         self.env.run()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
>         self.run_action(resource, action)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
>         provider_action()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 237, in action_run
>         raise ex
>     Fail: Execution of 'ln -s /usr/hdp/2.2.0.0-* /usr/hdp/current' returned 1. ln: target `/usr/hdp/current' is not a directory
>     
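The `Fail` above follows directly from how `ln` handles globs: when `/usr/hdp/2.2.0.0-*` expands to more than one path, `ln` receives multiple sources and requires the final argument to be an existing directory. A minimal reproduction of that behavior, using illustrative `/tmp` paths rather than the real `/usr/hdp` layout:

```shell
# Simulate two installed HDP build directories matching the glob
# (paths are illustrative, not the actual /usr/hdp layout).
mkdir -p /tmp/hdp-demo/2.2.0.0-1 /tmp/hdp-demo/2.2.0.0-2

# The glob expands to two sources, so ln treats /tmp/hdp-demo/current as a
# destination directory; it does not exist, so ln exits non-zero with
# "target ... is not a directory" -- the same error as in the log above.
ln -s /tmp/hdp-demo/2.2.0.0-* /tmp/hdp-demo/current \
  || echo "ln failed: multiple glob matches, target is not a directory"
```

With a single glob match the same command would instead create the `current` symlink, which is why the failure only surfaces once more than one versioned directory exists.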



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
