ambari-dev mailing list archives

From "Dmitry Lysnichenko (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (AMBARI-14217) RU: Spark install failed after upgrade
Date Fri, 04 Dec 2015 15:25:10 GMT

     [ https://issues.apache.org/jira/browse/AMBARI-14217?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]

Dmitry Lysnichenko updated AMBARI-14217:
----------------------------------------
    Component/s: ambari-server

> RU: Spark install failed after upgrade
> --------------------------------------
>
>                 Key: AMBARI-14217
>                 URL: https://issues.apache.org/jira/browse/AMBARI-14217
>             Project: Ambari
>          Issue Type: Bug
>          Components: ambari-server
>            Reporter: Dmitry Lysnichenko
>            Assignee: Dmitry Lysnichenko
>         Attachments: AMBARI-14217.patch
>
>
> After performing a Rolling Upgrade, an attempt to add Spark to the cluster failed with the following error:
> {code}
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 59, in <module>
>     SparkClient().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 217, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 34, in install
>     self.install_packages(env)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 393, in install_packages
>     Package(name)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 49, in action_install
>     self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/zypper.py", line 57, in install_package
>     self.checked_call_until_not_locked(cmd, sudo=True, logoutput=self.get_logoutput())
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 72, in checked_call_until_not_locked
>     return self.wait_until_not_locked(cmd, is_checked=True, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 80, in wait_until_not_locked
>     code, out = func(cmd, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
>     tries=tries, try_sleep=try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
>     raise Fail(err_msg)
> resource_management.core.exceptions.Fail: Execution of '/usr/bin/zypper --quiet install --auto-agree-with-licenses --no-confirm 'spark_2_3_*'' returned 4. The following NEW packages are going to be installed:
> spark_2_3_0_0_2557 spark_2_3_0_0_2557-master spark_2_3_0_0_2557-python spark_2_3_0_0_2557-worker spark_2_3_4_0_3322 spark_2_3_4_0_3322-master spark_2_3_4_0_3322-python spark_2_3_4_0_3322-worker
> The following packages are not supported by their vendor:
> spark_2_3_0_0_2557 spark_2_3_0_0_2557-master spark_2_3_0_0_2557-python spark_2_3_0_0_2557-worker spark_2_3_4_0_3322 spark_2_3_4_0_3322-master spark_2_3_4_0_3322-python spark_2_3_4_0_3322-worker
> 8 new packages to install.
> Overall download size: 495.0 MiB. After the operation, additional 562.6 MiB will be used.
> Continue? [y/n/?] (y): y
> Permission to access 'http://repo/HDP/suse11sp3/2.x/BUILDS/2.3.4.0-3322/spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm' denied.
> Abort, retry, ignore? [a/r/i/?] (a): a
> Failed to provide Package spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557. Do you want to retry retrieval?
> [HDP-2.3|http://repo/HDP/suse11sp3/2.x/BUILDS/2.3.4.0-3322/] Can't provide file './spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm' from repository 'HDP-2.3'
> History:
> - Permission to access 'http://repo/HDP/suse11sp3/2.x/BUILDS/2.3.4.0-3322/spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm' denied.
> - Can't provide ./spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm
> Abort, retry, ignore? [a/r/i] (a): a
> Problem occured during or after installation or removal of packages:
> Installation aborted by user",
> "stdout" : "2015-11-25 15:53:22,997 - Group['spark'] {}
> {code}
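> The zypper output above shows the likely trigger: the wildcard {{spark_2_3_*}} matches packages from both the pre-upgrade build (2.3.0.0-2557) and the post-upgrade build (2.3.4.0-3322), so zypper tries to fetch an old-build RPM from the new repository, where access is denied. A minimal sketch (not Ambari code; the package list is copied from the log, the pattern names are illustrative) of why the broad glob over-matches:
> {code}
```python
# Sketch: shows that the glob 'spark_2_3_*' selects packages from two
# different HDP builds at once, as seen in the zypper output above.
# Package names are taken verbatim from the log; variable names are ours.
from fnmatch import fnmatch

available = [
    "spark_2_3_0_0_2557", "spark_2_3_0_0_2557-master",
    "spark_2_3_0_0_2557-python", "spark_2_3_0_0_2557-worker",
    "spark_2_3_4_0_3322", "spark_2_3_4_0_3322-master",
    "spark_2_3_4_0_3322-python", "spark_2_3_4_0_3322-worker",
]

# The broad pattern used in the failing command matches every build.
broad = [p for p in available if fnmatch(p, "spark_2_3_*")]
print(len(broad))  # all 8 packages, old and new builds alike

# A pattern pinned to the target build matches only the upgraded packages.
pinned = [p for p in available if fnmatch(p, "spark_2_3_4_0_3322*")]
print(pinned)
```
> {code}
> Pinning the pattern to the current stack build avoids pulling RPMs from the superseded build's repository path.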



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
