ambari-dev mailing list archives

From "Dmytro Sen (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (AMBARI-7484) Versioned RPMS install fails on preinstall Hook.
Date Thu, 25 Sep 2014 12:14:34 GMT

     [ https://issues.apache.org/jira/browse/AMBARI-7484?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Dmytro Sen updated AMBARI-7484:
-------------------------------
    Description: 
Versioned RPM install fails in the after-INSTALL hook.

{code}
': ['/usr/libexec/hdp/ganglia', '/usr/sbin', '/sbin:/usr/local/bin', '/bin', '/usr/bin']}
2014-09-24 23:35:34,085 - Execute['/usr/libexec/hdp/ganglia/setupGanglia.sh -c HDPFlumeServer -m -o root -g hadoop'] {'path': ['/usr/libexec/hdp/ganglia', '/usr/sbin', '/sbin:/usr/local/bin', '/bin', '/usr/bin']}
2014-09-24 23:35:34,119 - Execute['/usr/libexec/hdp/ganglia/setupGanglia.sh -c HDPSlaves -m -o root -g hadoop'] {'path': ['/usr/libexec/hdp/ganglia', '/usr/sbin', '/sbin:/usr/local/bin', '/bin', '/usr/bin']}
2014-09-24 23:35:34,154 - Execute['chkconfig gmond off'] {'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin']}
2014-09-24 23:35:34,165 - Execute['chkconfig gmetad off'] {'path': ['/usr/sbin:/sbin:/usr/local/bin:/bin:/usr/bin']}
2014-09-24 23:35:34,284 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2014-09-24 23:35:34,285 - Creating directory Directory['/etc/hadoop/conf.empty']
2014-09-24 23:35:34,287 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2014-09-24 23:35:34,297 - Creating symbolic Link['/etc/hadoop/conf']
2014-09-24 23:35:34,309 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs'}
2014-09-24 23:35:34,309 - Writing File['/etc/hadoop/conf/hadoop-env.sh'] because it doesn't exist
2014-09-24 23:35:34,310 - Changing owner for /etc/hadoop/conf/hadoop-env.sh from 0 to hdfs
2014-09-24 23:35:34,310 - Execute['ln -s /usr/hdp/2.9.9.9* /usr/hdp/current'] {'not_if': 'ls /usr/hdp/current'}
2014-09-24 23:35:34,329 - Error while executing command 'install':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 122, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 31, in hook
    setup_hadoop_env()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 44, in setup_hadoop_env
    not_if=format('ls {versioned_hdp_root}')
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 237, in action_run
    raise ex
Fail: Execution of 'ln -s /usr/hdp/2.9.9.9* /usr/hdp/current' returned 1. ln: creating symbolic link `/usr/hdp/current': No such file or directory
{code}
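The ln failure is reproducible outside Ambari: ln reports "No such file or directory" when the parent directory of the link path does not exist, which suggests /usr/hdp itself was absent when the hook ran. A minimal sketch (the temporary paths are illustrative, not the actual stack layout):

```shell
set -u
demo=$(mktemp -d)

# 1) Parent directory of the link path is missing, so ln fails with
#    "No such file or directory" -- the same error as in the log above.
if ln -s "$demo/2.9.9.9-1" "$demo/usr/hdp/current" 2>/dev/null; then
  first_attempt=ok
else
  first_attempt=failed
fi
echo "first attempt: $first_attempt"

# 2) Guarded variant: ensure the parent directory exists before linking.
mkdir -p "$demo/usr/hdp"
ln -sf "$demo/2.9.9.9-1" "$demo/usr/hdp/current"
target=$(readlink "$demo/usr/hdp/current")
echo "link target: $target"
```

The hook's not_if guard ('ls /usr/hdp/current') only skips the command when the link already exists; it does not create /usr/hdp first, so the link creation itself can still fail.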


  was:
Versioned RPMS install fails on preinstall Hook.




> Versioned RPMS install fails on preinstall Hook.
> ------------------------------------------------
>
>                 Key: AMBARI-7484
>                 URL: https://issues.apache.org/jira/browse/AMBARI-7484
>             Project: Ambari
>          Issue Type: Bug
>          Components: stacks
>    Affects Versions: 1.7.0
>         Environment: HDP2.2
>            Reporter: Dmytro Sen
>            Assignee: Dmytro Sen
>            Priority: Blocker
>             Fix For: 1.7.0
>
>
> Versioned RPMS install fails on after-install Hook.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
