ambari-issues mailing list archives

From "Alejandro Fernandez (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (AMBARI-20910) HDP 3.0 TP - Unable to install Spark, cannot find package/scripts dir
Date Tue, 02 May 2017 20:48:04 GMT

     [ https://issues.apache.org/jira/browse/AMBARI-20910?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Alejandro Fernandez updated AMBARI-20910:
-----------------------------------------
    Resolution: Fixed
        Status: Resolved  (was: Patch Available)

Pushed to trunk:
commit 4b588a9237a72465f3ca83c207a8d4234d9c4c12

> HDP 3.0 TP - Unable to install Spark, cannot find package/scripts dir
> ---------------------------------------------------------------------
>
>                 Key: AMBARI-20910
>                 URL: https://issues.apache.org/jira/browse/AMBARI-20910
>             Project: Ambari
>          Issue Type: Bug
>          Components: stacks
>    Affects Versions: 3.0.0
>            Reporter: Alejandro Fernandez
>            Assignee: Alejandro Fernandez
>             Fix For: trunk
>
>         Attachments: AMBARI-20910.patch
>
>
> STR:
> * Install Ambari 3.0 (last build was 650)
> * Install HDP 3.0 (last build is 197) with ZK, HDFS, YARN. Note: this will fail on RM and Service Checks.
> * Because Hive is not yet compiling, temporarily comment out Hive as a required service for Spark, and HIVE_METASTORE as a required co-hosted component, in
> /var/lib/ambari-server/resources/common-services/SPARK/2.2.0/metainfo.xml
> * Restart Ambari Server
> * Attempt to add Spark as a service.
> Error:
> {noformat}
> Caught an exception while executing custom service command: <type 'exceptions.KeyError'>: 'service_package_folder'; 'service_package_folder'
> {noformat}
> This is coming from CustomServiceOrchestrator.py:
> {code}
>     except Exception, e: # We do not want to let agent fail completely
>       exc_type, exc_obj, exc_tb = sys.exc_info()
>       message = "Caught an exception while executing "\
>         "custom service command: {0}: {1}; {2}".format(exc_type, exc_obj, str(e))
>       logger.exception(message)
> {code}
> Looks like Spark 2.2.0 doesn't have the package/scripts directory.
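As context for the fix, the KeyError above is what a bare dictionary lookup raises when the command metadata has no 'service_package_folder' entry (the duplicated key in the log is just the KeyError's string form appearing as both exc_obj and str(e)). A minimal sketch of a guarded lookup that would surface a clear message instead — the key name comes from the error above, but the shape of the command dict and the function name are assumptions for illustration, not the actual agent API:

```python
def resolve_service_package_folder(command):
    """Return the service package folder from a command dict, or raise a
    descriptive error when the stack definition lacks one.

    Hypothetical helper: 'commandParams' / 'serviceName' layout is assumed.
    """
    params = command.get("commandParams", {})
    folder = params.get("service_package_folder")
    if folder is None:
        # Fail with an actionable message rather than a bare KeyError.
        service = command.get("serviceName", "<unknown>")
        raise ValueError(
            "Service {0} has no 'service_package_folder'; does its stack "
            "definition include a package/scripts directory?".format(service)
        )
    return folder
```

With a command dict that carries the key this returns the folder; with an empty commandParams (the Spark 2.2.0 case, which ships no package/scripts directory) it raises a ValueError naming the missing key and the service.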



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
