ambari-dev mailing list archives

From "Dmitro Lisnichenko" <dlysniche...@hortonworks.com>
Subject Re: Review Request 40973: RU: Spark install failed after upgrade
Date Mon, 07 Dec 2015 11:23:19 GMT


> On Dec. 7, 2015, 11:24 a.m., Jayush Luniya wrote:
> > ambari-common/src/main/python/resource_management/libraries/providers/repository.py, line 59
> > <https://reviews.apache.org/r/40973/diff/1/?file=1153925#file1153925line59>
> >
> >     Does this need to be handled for other OSes too apart from SLES?

I don't think so. Under RHEL, yum seems to handle that automatically, and under Ubuntu we manage the "apt-get update" calls ourselves.


- Dmitro


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/40973/#review109133
-----------------------------------------------------------


On Dec. 4, 2015, 5:30 p.m., Dmitro Lisnichenko wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/40973/
> -----------------------------------------------------------
> 
> (Updated Dec. 4, 2015, 5:30 p.m.)
> 
> 
> Review request for Ambari, Alejandro Fernandez, Jonathan Hurley, Jayush Luniya, and Nate Cole.
> 
> 
> Bugs: AMBARI-14217
>     https://issues.apache.org/jira/browse/AMBARI-14217
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
> After performing a Rolling Upgrade, an attempt to add Spark to the cluster failed with the following error:
> {code}
> Traceback (most recent call last):
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 59, in <module>
>     SparkClient().execute()
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 217, in execute
>     method(env)
>   File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.0.2.2/package/scripts/spark_client.py", line 34, in install
>     self.install_packages(env)
>   File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 393, in install_packages
>     Package(name)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 154, in __init__
>     self.env.run()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 158, in run
>     self.run_action(resource, action)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 121, in run_action
>     provider_action()
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 49, in action_install
>     self.install_package(package_name, self.resource.use_repos, self.resource.skip_repos)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/zypper.py", line 57, in install_package
>     self.checked_call_until_not_locked(cmd, sudo=True, logoutput=self.get_logoutput())
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 72, in checked_call_until_not_locked
>     return self.wait_until_not_locked(cmd, is_checked=True, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/providers/package/__init__.py", line 80, in wait_until_not_locked
>     code, out = func(cmd, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 70, in inner
>     result = function(command, **kwargs)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 92, in checked_call
>     tries=tries, try_sleep=try_sleep)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 140, in _call_wrapper
>     result = _call(command, **kwargs_copy)
>   File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 291, in _call
>     raise Fail(err_msg)
> resource_management.core.exceptions.Fail: Execution of '/usr/bin/zypper --quiet install --auto-agree-with-licenses --no-confirm 'spark_2_3_*'' returned 4. The following NEW packages are going to be installed:
>   spark_2_3_0_0_2557 spark_2_3_0_0_2557-master spark_2_3_0_0_2557-python spark_2_3_0_0_2557-worker spark_2_3_4_0_3322 spark_2_3_4_0_3322-master spark_2_3_4_0_3322-python spark_2_3_4_0_3322-worker
> 
> The following packages are not supported by their vendor:
>   spark_2_3_0_0_2557 spark_2_3_0_0_2557-master spark_2_3_0_0_2557-python spark_2_3_0_0_2557-worker spark_2_3_4_0_3322 spark_2_3_4_0_3322-master spark_2_3_4_0_3322-python spark_2_3_4_0_3322-worker
> 
> 8 new packages to install.
> Overall download size: 495.0 MiB. After the operation, additional 562.6 MiB will be used.
> Continue? [y/n/?] (y): y
> Permission to access 'http://repo/HDP/suse11sp3/2.x/BUILDS/2.3.4.0-3322/spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm' denied.
> 
> Abort, retry, ignore? [a/r/i/?] (a): a
> Failed to provide Package spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557. Do you want to retry retrieval?
> 
> [HDP-2.3|http://repo/HDP/suse11sp3/2.x/BUILDS/2.3.4.0-3322/] Can't provide file './spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm' from repository 'HDP-2.3'
> History:
> - Permission to access 'http://repo/HDP/suse11sp3/2.x/BUILDS/2.3.4.0-3322/spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm' denied.
> - Can't provide ./spark/spark_2_3_0_0_2557-1.3.1.2.3.0.0-2557.noarch.rpm
> 
> Abort, retry, ignore? [a/r/i] (a): a
> Problem occured during or after installation or removal of packages:
> Installation aborted by user
> stdout: 2015-11-25 15:53:22,997 - Group['spark'] {}
> {code}
> 
> 
> Diffs
> -----
> 
>   ambari-agent/src/test/python/resource_management/TestRepositoryResource.py b3e2291 
>   ambari-common/src/main/python/resource_management/libraries/providers/repository.py 11002cc 
> 
> Diff: https://reviews.apache.org/r/40973/diff/
> 
> 
> Testing
> -------
> 
> mvn clean test
> 
> 
> Thanks,
> 
> Dmitro Lisnichenko
> 
>

