thrift-dev mailing list archives

From "James E. King III (JIRA)" <>
Subject [jira] [Commented] (THRIFT-4802) Automatic library deployment on online CDN/Repository services
Date Thu, 14 Feb 2019 00:06:00 GMT


James E. King III commented on THRIFT-4802:

We would likely use a dedicated build job in Travis CI that is handed the credentials needed
to publish.  Each package manager has a different set of instructions and criteria
for getting things published.  Some of them (like Maven) require manual steps, where you first
publish to a staging repository and then have to release it.  I don't think it can be entirely
automated for a while, but it is definitely a good goal to work toward.  When we go through
the next release I have a task to thoroughly document the steps required for each external
package manager.  Then we'll have a better sense of what will be involved.
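As a rough sketch, such a dedicated publish job could look like the following Travis CI fragment (the stage name, script path, and use of `$TRAVIS_TAG` are illustrative assumptions, not the actual Thrift CI configuration; credentials would be stored as encrypted or settings-level environment variables visible only to this job):

```yaml
# Hypothetical .travis.yml fragment: a publish stage that only runs
# for release tags; credentials come from encrypted env variables.
jobs:
  include:
    - stage: deploy
      # only run the publish job when a release tag is pushed
      if: tag IS present
      script: python deployment/deploy.py --release "$TRAVIS_TAG"
```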

> Automatic library deployment on online CDN/Repository services
> --------------------------------------------------------------
>                 Key: THRIFT-4802
>                 URL:
>             Project: Thrift
>          Issue Type: Improvement
>          Components: Deployment
>    Affects Versions: 0.12.0
>            Reporter: Thibault Piana
>            Priority: Critical
>              Labels: cdn, deploy, deployment, library
>         Attachments: Workflow.png
> Hello everyone,
> This thread follows some other threads dealing with concerns about online libraries (THRIFT-4688,
THRIFT-4687), as well as emails sent to the user mailing list about the Docker container
on DockerHub.
> If I understand the situation correctly, at the moment, with each new version of the
software, you must prepare and update each library on each website independently.
> In addition, each of those libraries is managed by different people, with whom contact
is not always easy.
> With that in mind, I would like to propose here a system that would offer the following advantages:
>  - No more worrying about who is maintaining each platform/CDN/repository
>  - No more manual uploading on each platform at each release
>  - Allow anyone to develop an extension that publishes a Thrift package on a particular
site (for example, a version for Yocto, as I requested in a recent email), and be sure this
package will be updated at each new release.
> I will try to explain my idea in the following.
> h2. Vocabulary
>  * *CDN:* Content delivery network
>  * *Manager:* An algorithm to upload content to a CDN/online repository
>  * *Orchestrator:* The algorithm that orchestrates the execution of managers, provides
credentials, manages manager feedback, and generates the final report
>  * *Package:* The library/component we want to upload to the CDN
> In the following, all online resource providers/repositories (like Pypi, NPMJs, DockerHub
...) will be called "CDN".
> h2. General principle
> The idea would be to develop a single interface for module deployment, with an orchestrator
that automatically configures and performs these deployments.
> The organization would be like this:
>  * A KeePass database will store all passwords/identifiers required for each platform
(NPMJS, Pypi, DockerHub, ...):
>  ** Each module will specify what credentials it needs from this database.
>  ** This database could be committed directly to the repo, so that each committer
has the most recent version.
>  ** The master password of this database will only be known by committers (or project
> I think KeePass is well suited for this: KeePass databases are very lightweight, strongly
encrypted, and very hard to brute-force when a good master password is used.
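A minimal sketch of that credential lookup, in Python: each manager declares which database entries it needs, and the orchestrator resolves them before running the manager. The `CredentialStore` class and the entry titles below are hypothetical stand-ins; a real implementation would read the `.kdbx` file with a KeePass library such as pykeepass.

```python
# Sketch: a stand-in for a KeePass-backed credential store.
# Each manager only states what it needs; it never opens the database itself.

class CredentialStore:
    """Maps entry titles to (username, password) pairs."""

    def __init__(self, entries):
        # In a real setup these would be loaded from access.kdbx.
        self._entries = entries

    def resolve(self, required_titles):
        # Fail early and explicitly if a manager asks for a missing entry.
        missing = [t for t in required_titles if t not in self._entries]
        if missing:
            raise KeyError(f"missing credentials: {missing}")
        return {t: self._entries[t] for t in required_titles}


# Hypothetical example: the PyPI manager declares one required entry.
PYPI_MANAGER_REQUIRES = ["pypi"]

store = CredentialStore({"pypi": ("thrift-bot", "s3cret"),
                         "dockerhub": ("thrift-bot", "hunter2")})
creds = store.resolve(PYPI_MANAGER_REQUIRES)
```

The point of the design is that the master password unlocks everything once, and each manager only ever sees the subset of credentials it declared.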
>  * For each CDN, a manager is in charge of bundling (creating the package to upload from
the current Thrift release sources), uploading, and checking the package on the CDN:
>  ** First, the orchestrator loads the required credentials from the KeePass database.
>  ** The orchestrator creates a temporary directory where the manager can bundle
everything it needs.
>  *** The manager receives the credentials, a path to the temporary directory, and a path
to the Thrift sources.
>  *** The manager checks whether the current version of the package already exists on the
CDN. If not, it uploads it.
>  *** The manager prepares the bundled package, uploads it, and checks that everything
went well.
> All these steps could be executed in a dedicated Docker/Vagrant environment containing
all the libraries needed to bundle the package (so that the person/committer executing the
script does not have to download 500 dependencies).
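The orchestrator loop described above can be sketched as follows. The `Manager` interface and the `DummyManager` are hypothetical illustrations; real managers would shell out to tools like `twine`, `npm publish`, or `docker push`.

```python
# Sketch of the orchestrator: one manager per CDN, each run in its own
# temporary directory, with results collected into a final report.
import tempfile
from pathlib import Path


class Manager:
    """One manager per CDN: check, bundle, upload."""

    name = "base"
    requires = []  # credential entry titles this manager needs

    def already_published(self, version):
        raise NotImplementedError

    def bundle(self, sources, workdir):
        raise NotImplementedError

    def upload(self, workdir, credentials):
        raise NotImplementedError


class DummyManager(Manager):
    """Illustrative manager that 'publishes' to an in-memory set."""

    name = "dummy"
    requires = ["dummy-cdn"]

    def __init__(self):
        self.published = set()
        self.version = None

    def already_published(self, version):
        self.version = version
        return version in self.published

    def bundle(self, sources, workdir):
        # A real manager would build a wheel, npm tarball, image, etc.
        (Path(workdir) / "package.tar.gz").write_bytes(b"bundle")

    def upload(self, workdir, credentials):
        self.published.add(self.version)
        return True


def orchestrate(managers, credentials, sources, version):
    """Run every manager in a fresh temp dir; return a name -> status report."""
    report = {}
    for mgr in managers:
        if mgr.already_published(version):
            report[mgr.name] = "skipped (already online)"
            continue
        with tempfile.TemporaryDirectory() as workdir:
            mgr.bundle(sources, workdir)
            ok = mgr.upload(workdir, {k: credentials[k] for k in mgr.requires})
        report[mgr.name] = "uploaded" if ok else "failed"
    return report


report = orchestrate([DummyManager()], {"dummy-cdn": ("user", "pass")},
                     sources="/path/to/thrift", version="0.12.0")
```

Running it again with the same manager instance would yield "skipped (already online)", which is the idempotence property the proposal relies on.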
> The algorithm diagram is attached (it's a draft).
> The directory structure of this solution could be something like this (I chose Python
arbitrarily here):
> {code:java}
> root
> └── deployment/
>     ├── access.kdbx
>     ├──
>     ├── libraries # Some managers
>     │   ├── javascript
>     │   │   ├──
>     │   │   ├──
>     │   │   └──
>     │   ├── nodejs
>     │   │   └──
>     │   └── python
>     │       ├── pypi.js
>     │       └── ...
>     └── miscellaneous # Some other managers
>         ├── docker
>         │   └──
>         └── yocto
>             ├──
>             └── ...{code}
> I am not a regular contributor to open source projects, so I don't know if I'm making
my proposal in the best way. But I really like this project and I would like to help improve
it.
> I had this idea based on what was put in place in a project I was recently leading.
It is not perfect, but it does the job properly, and it's a solution I find quite elegant because
we don't have to worry about the credentials of each site.
> If you find this idea interesting, I could propose a prototype in a personal repository
with example managers for Npm, Pypi, and DockerHub in a few weeks.

This message was sent by Atlassian JIRA
