ambari-dev mailing list archives

From "apachehadoop (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (AMBARI-10007) Errors informations of HDP deployment failed on RedHat6.5 with python2.6.6(Ambari 1.7.0,HDP2.2.0,HDP-Utils-1.1.0.20)
Date Tue, 10 Mar 2015 15:23:38 GMT

     [ https://issues.apache.org/jira/browse/AMBARI-10007?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

apachehadoop updated AMBARI-10007:
----------------------------------
    Description: 
When deploying HDP on RedHat 6.5 with Python 2.6.6 (Ambari 1.7.0, HDP 2.2.0, HDP-Utils-1.1.0.20), I ran into the following errors. Please help.

======================================================================================================
2015-03-10 18:15:18,926 - Error while executing command 'any':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 31, in hook
    setup_hadoop_env()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 113, in setup_hadoop_env
    content=InlineTemplate(params.hadoop_env_sh_template)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 93, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
Fail: Applying File['/etc/hadoop/conf/hadoop-env.sh'] failed, parent directory /etc/hadoop/conf doesn't exist
Error: Error: Unable to run the custom hook script ['/usr/bin/python2.6', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-1999.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-1999.json', 'INFO', '/var/lib/ambari-agent/data/tmp']
stdout:   /var/lib/ambari-agent/data/output-1999.txt

2015-03-10 18:15:18,132 - Group['hadoop'] {'ignore_failures': False}
2015-03-10 18:15:18,133 - Adding group Group['hadoop']
2015-03-10 18:15:18,211 - Group['nobody'] {'ignore_failures': False}
2015-03-10 18:15:18,212 - Modifying group nobody
2015-03-10 18:15:18,279 - Group['users'] {'ignore_failures': False}
2015-03-10 18:15:18,280 - Modifying group users
2015-03-10 18:15:18,348 - Group['nagios'] {'ignore_failures': False}
2015-03-10 18:15:18,348 - Adding group Group['nagios']
2015-03-10 18:15:18,415 - User['nobody'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'nobody']}
2015-03-10 18:15:18,415 - Modifying user nobody
2015-03-10 18:15:18,488 - User['nagios'] {'gid': 'nagios', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-03-10 18:15:18,489 - Adding user User['nagios']
2015-03-10 18:15:18,558 - User['ambari-qa'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'users']}
2015-03-10 18:15:18,559 - Adding user User['ambari-qa']
2015-03-10 18:15:18,627 - User['zookeeper'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-03-10 18:15:18,628 - Adding user User['zookeeper']
2015-03-10 18:15:18,697 - User['hdfs'] {'gid': 'hadoop', 'ignore_failures': False, 'groups': [u'hadoop']}
2015-03-10 18:15:18,697 - Adding user User['hdfs']
2015-03-10 18:15:18,767 - File['/var/lib/ambari-agent/data/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2015-03-10 18:15:18,770 - Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] {'not_if': 'test $(id -u ambari-qa) -gt 1000'}
2015-03-10 18:15:18,839 - Skipping Execute['/var/lib/ambari-agent/data/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 2>/dev/null'] due to not_if
2015-03-10 18:15:18,839 - Directory['/etc/hadoop/conf.empty'] {'owner': 'root', 'group': 'root', 'recursive': True}
2015-03-10 18:15:18,840 - Link['/etc/hadoop/conf'] {'not_if': 'ls /etc/hadoop/conf', 'to': '/etc/hadoop/conf.empty'}
2015-03-10 18:15:18,907 - Skipping Link['/etc/hadoop/conf'] due to not_if
2015-03-10 18:15:18,926 - File['/etc/hadoop/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs'}
2015-03-10 18:15:18,926 - Error while executing command 'any':
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 123, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 31, in hook
    setup_hadoop_env()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 113, in setup_hadoop_env
    content=InlineTemplate(params.hadoop_env_sh_template)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 148, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 149, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 115, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 93, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
Fail: Applying File['/etc/hadoop/conf/hadoop-env.sh'] failed, parent directory /etc/hadoop/conf doesn't exist
Error: Error: Unable to run the custom hook script ['/usr/bin/python2.6', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-1999.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-1999.json', 'INFO', '/var/lib/ambari-agent/data/tmp']
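Note the contradiction in the log: Link['/etc/hadoop/conf'] is skipped because its not_if check (`ls /etc/hadoop/conf`) reports the path present, yet moments later the File provider says the parent directory /etc/hadoop/conf doesn't exist. One plausible cause (an assumption, not confirmed by this report) is that /etc/hadoop/conf is a broken symlink: different checks treat symlinks differently, so the entry can "exist" to one test and be missing to another. A minimal self-contained sketch of that behavior, using temporary paths rather than the real /etc/hadoop/conf:

```python
import os
import tempfile

# Illustration only (not Ambari's code): a dangling symlink exists as a
# directory entry but fails any check that follows the link to its target.
workdir = tempfile.mkdtemp()
conf = os.path.join(workdir, "conf")
os.symlink(os.path.join(workdir, "missing-target"), conf)  # dangling link

print(os.path.islink(conf))   # True  - the symlink entry itself exists
print(os.path.lexists(conf))  # True  - lstat() sees the entry
print(os.path.isdir(conf))    # False - resolving the link finds no directory
```

If that is what happened here, inspecting the link target (e.g. `ls -l /etc/hadoop/conf` and removing or repointing a stale link) would let the Link resource recreate /etc/hadoop/conf pointing at /etc/hadoop/conf.empty.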


> Errors informations of HDP deployment failed on RedHat6.5 with python2.6.6(Ambari 1.7.0,HDP2.2.0,HDP-Utils-1.1.0.20)
> --------------------------------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-10007
>                 URL: https://issues.apache.org/jira/browse/AMBARI-10007
>             Project: Ambari
>          Issue Type: Bug
>            Reporter: apachehadoop
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
