ambari-dev mailing list archives

From "Alejandro Fernandez (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (AMBARI-14596) Install cluster failed on Accumulo as tried to write config when hadoop conf dir is missing
Date Fri, 08 Jan 2016 23:58:40 GMT

     [ https://issues.apache.org/jira/browse/AMBARI-14596?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Alejandro Fernandez updated AMBARI-14596:
-----------------------------------------
    Description: 
Cluster installation failed on the Accumulo Client because it was one of the first tasks
scheduled, before the HDFS Client had been installed; installing the HDFS Client pulls in
the hadoop rpm, which creates the /etc/hadoop/conf folder.

If a host does not contain /etc/hadoop/conf, the after-install hooks should not attempt to
write config files to it. Once a component that ships the hadoop rpm is installed on the
host, that component becomes responsible for writing out the configs.
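A minimal sketch of the intended guard (hypothetical function and parameter names; the
actual fix lives in the after-INSTALL hook's shared_initialization.py, per the patch
attached to this issue):

```python
import os

def setup_config(hadoop_conf_dir="/etc/hadoop/conf"):
    """Write Hadoop client configs only if the conf dir already exists.

    A host that has not yet installed the hadoop rpm has no
    /etc/hadoop/conf, so the after-INSTALL hook must skip it; the
    component that ships the rpm writes the configs later.
    Returns True if configs would be written, False if skipped.
    """
    if not os.path.isdir(hadoop_conf_dir):
        # No hadoop rpm on this host yet -- nothing to configure.
        return False
    # ... XmlConfig("core-site.xml", conf_dir=hadoop_conf_dir, ...)
    # and friends would run here ...
    return True
```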

Ambari 2.2.1.0-71
HDP 2.4.0.0-47

{code}Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 38, in <module>
    AfterInstallHook().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 219, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/hook.py", line 33, in hook
    setup_config()
  File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/after-INSTALL/scripts/shared_initialization.py", line 55, in setup_config
    only_if=format("ls {hadoop_conf_dir}"))
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/providers/xml_config.py", line 67, in action_create
    encoding = self.resource.encoding
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 87, in action_create
    raise Fail("Applying %s failed, parent directory %s doesn't exist" % (self.resource, dirname))
resource_management.core.exceptions.Fail: Applying File['/usr/hdp/current/hadoop-client/conf/core-site.xml'] failed, parent directory /usr/hdp/current/hadoop-client/conf doesn't exist{code}
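The final frame of the traceback is a parent-directory check of roughly this shape
(simplified sketch for illustration, not the exact resource_management source):

```python
import os

class Fail(Exception):
    """Simplified stand-in for resource_management.core.exceptions.Fail."""

class FileProvider:
    """Toy model of the File resource provider's create action."""

    def __init__(self, path):
        self.path = path

    def action_create(self):
        dirname = os.path.dirname(self.path)
        if not os.path.isdir(dirname):
            # This is the check that fired during the Accumulo Client
            # install: /usr/hdp/current/hadoop-client/conf was absent.
            raise Fail("Applying File['%s'] failed, parent directory %s "
                       "doesn't exist" % (self.path, dirname))
        # ... actual file creation would happen here ...
```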

  was: (previous revision of the description: the same text and traceback as above, minus
the sentence noting that the component shipping the hadoop rpm becomes responsible for
writing the configs)

> Install cluster failed on Accumulo as tried to write config when hadoop conf dir is missing
> -------------------------------------------------------------------------------------------
>
>                 Key: AMBARI-14596
>                 URL: https://issues.apache.org/jira/browse/AMBARI-14596
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>    Affects Versions: 2.2.0
>            Reporter: Alejandro Fernandez
>            Assignee: Alejandro Fernandez
>             Fix For: 2.2.1
>
>         Attachments: AMBARI-14596.trunk.patch
>
>
> (description and traceback quoted in full above)



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
