incubator-bloodhound-dev mailing list archives

From: Olemis Lang <ole...@gmail.com>
Subject: Managing plugin dependencies WAS: Tests and CI (was: A word on Release 2)
Date: Thu, 11 Oct 2012 03:29:18 GMT
On 10/10/12, Branko Čibej <brane@wandisco.com> wrote:
> On 11.10.2012 02:12, Gary Martin wrote:
>> On 10/10/12 18:26, Peter Koželj wrote:
>>> I admit it, some of my fears could just be caused by the fact that I am
>>>
[...]
>>>
>>> Am I missing something?
>>
[...]
>>
>> So far I have found little reason to put any "super heavy shifting"
>> into trac itself as they appear to have provided enough interfaces to
>> allow for some fairly complex subverting of standard trac processing.
>> The namespacing for multiproducts is a case in point where, so far, it
>> appears to be achievable outside of trac. If it is not possible to do
>> all that we want to with, say, per-product configuration or workflow,
>> it might be that we will need to introduce a new interface to Trac to
>> allow us to take over processing at the appropriate time but I am
>> hoping that this will not be necessary.
>
> The point Peter is trying to make is this: if we take as an example the
> multiproduct plugin:
>
> This plugin has to change the trac database schema

Not a MUST... it can be done this way, but it is possible to get things
done without that.

> (or rather, add a
> more-or-less independent namespace mapping to the existing schema).

OK, if your point is that something should be done, that's correct
;)

> Consequently, plugins must be allowed to either update the core
> schema, or create their own additional schema. Whichever it is affects
>

Create their own additional schema... but they could update the core
schema too. They are free to do so, yes... now that I see it.

>   * database schema upgrades (from any older version to any newer version)
>       o either each plugin has to have its own schema upgrade mechanism
>         -- a nightmare, or

This is it. There's a trial/error approach. Plugins implementing the
IEnvironmentSetupParticipant interface first check whether their upgrades
have already been committed. If not, they «propose» their upgrades to the
Environment, which tries to apply them all in a single transaction. On
failure the transaction is rolled back. Roughly, that's the way it works...
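
For illustration, a rough sketch of a plugin participating in that
mechanism (untested; MyPluginSetup, the myplugin_data table and the
myplugin_version entry are made-up names, and the exact method signatures
vary slightly between Trac versions):

from trac.core import Component, implements
from trac.env import IEnvironmentSetupParticipant

PLUGIN_SCHEMA_VERSION = 1

class MyPluginSetup(Component):
    """Hypothetical plugin managing its own additional schema."""
    implements(IEnvironmentSetupParticipant)

    def environment_created(self):
        # New environments get the plugin schema right away.
        self.upgrade_environment(self.env.get_db_cnx())

    def environment_needs_upgrade(self, db):
        # Check whether our upgrades have already been committed.
        cursor = db.cursor()
        cursor.execute("SELECT value FROM system "
                       "WHERE name='myplugin_version'")
        row = cursor.fetchone()
        return row is None or int(row[0]) < PLUGIN_SCHEMA_VERSION

    def upgrade_environment(self, db):
        # «Propose» the upgrade; the Environment applies all pending
        # participant upgrades together and rolls back on failure.
        # (A real plugin would also handle upgrading existing installs.)
        cursor = db.cursor()
        cursor.execute("CREATE TABLE myplugin_data "
                       "(id INTEGER PRIMARY KEY, value TEXT)")
        cursor.execute("INSERT INTO system (name, value) "
                       "VALUES ('myplugin_version', %s)",
                       (str(PLUGIN_SCHEMA_VERSION),))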

>       o the plugin API must define unified schema management protocols,
>         which /all/ plugins that fiddle with the schema have to conform to

... hmmm... more or less explained above.

>   * data export/import
>       o as above, either each plugin handles its own export/import (and
>         incidentally defines its own export format), which implies that
>         anyone who wants to write an exporter from another issue
>         tracking system to Bloodhound's export format has to know
>         exactly which plugins are in fact part of core Bloodhound; or,

About export... Trac contains a MIME API. It is the main architectural
mechanism responsible for handling these conversions. The CSV, RSS, ...
and similar links in the «Download in other formats» area are in part a
consequence of this, and as usual there are interfaces, components
implementing interfaces, ... etc.

Like I said before, the plugin is not the minimal unit of functionality
in Trac; it's the Component. Your plugin might not even care about
export/import, and you could implement 12 different plugins (packages)
providing conversions to 200 different formats.

>       o the plugin API must define a common event streaming protocol
>         that can be converted to a generic export format; which does not
>         seem trivial to me.
>

Trac MIME API... it's not event- or stream-based, it's... guess what...
full of interfaces
:)

In a few words, the old and simple API takes an input content + type plus
a target type, and the component implementing the MIME API interfaces
performs the conversion. Use streaming, events, whatever you need... the
API doesn't even care about those implementation details. MIME API v2 is
slightly different, but most of this still applies.
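
For a concrete flavour of that old API, here's a hedged sketch of an
IContentConverter component, loosely modelled on the stock ticket-to-CSV
conversion (the class name is made up):

from trac.core import Component, implements
from trac.mimeview.api import IContentConverter

class TicketCsvConverter(Component):
    """Sketch of a converter: ticket object in, CSV text out."""
    implements(IContentConverter)

    def get_supported_conversions(self):
        # (key, name, file extension, input MIME type, output MIME type,
        #  quality) -- 'trac.ticket.Ticket' is the internal input type.
        yield ('csv', 'Comma-delimited Text', 'csv',
               'trac.ticket.Ticket', 'text/csv', 8)

    def convert_content(self, req, mimetype, content, key):
        # `content` is the ticket to convert; return the converted data
        # and its MIME type. Streaming, events, etc. are implementation
        # details the API doesn't care about.
        ticket = content
        data = u'id,summary,status\r\n%s,%s,%s\r\n' % (
            ticket.id, ticket['summary'], ticket['status'])
        return data.encode('utf-8'), 'text/csv'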

> As another example, a "new search" plugin would conceivably allow
> defining event notification triggers based on the results of a custom
> query having changed.

You create your own interface and/or subscribe to ITicketChangeListener,
IWikiChangeListener, ... or whatever other interface you might need to get
this done. If there is no such thing available, you can actually create
your own and give it some meaning.
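
For example, something like this minimal sketch (the saved-query
re-evaluation and notification are hypothetical; only ITicketChangeListener
itself is Trac's):

from trac.core import Component, implements
from trac.ticket.api import ITicketChangeListener

class QueryTriggerListener(Component):
    """Sketch: re-evaluate saved queries whenever a ticket changes."""
    implements(ITicketChangeListener)

    def ticket_created(self, ticket):
        self._reevaluate(ticket)

    def ticket_changed(self, ticket, comment, author, old_values):
        self._reevaluate(ticket)

    def ticket_deleted(self, ticket):
        self._reevaluate(ticket)

    def _reevaluate(self, ticket):
        # Hypothetical: re-run the custom query and notify subscribers
        # if its result set changed.
        self.log.debug("Ticket #%s changed; re-checking saved queries",
                       ticket.id)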

> Any export format has to have a way to represent
> this (if only because JIRA has this feature and you'd eventually want to
> be able to import these definitions if BH achieves feature parity in
> that respect).
>
> Almost every new feature that Bloodhound adopts and implements via a
> plugin interferes with the goal of creating a common export format,

I don't think so... unless that's a goal in the first place. The concepts
in Trac are the same as in Jira, in the end. You have tickets, users,
changesets, comments, permissions...

I don't see why CSV, XML, iCal, ... formats would change due to the fact
that somebody implements some component in BH. Maybe it turns out that
there is no standard format to exchange e.g. tickets, permissions, ...
between different systems. That's a problem much bigger than, and
substantially different from, anything related to BH architecture.

If there's a standard exchange format, there's no problem implementing a
Trac component that imports/exports data by following its rules. Like I
just said, it doesn't matter exactly what package it belongs to. It just
has to be there and enabled.

> a
> concept that trac does not have and therefore isn't likely to be
> supported in the plugin API.
>

What are you thinking of exactly? Something like JiraToTracIntegration [1]_?
In theory you just need to implement a MIME conversion from whatever MIME
type a Jira backup might be represented with onto the internal
'x-trac-ticket' (<= or similar...) and that should be enough. If it is
bi-directional, you are exporting Trac data to Jira. At least that's the
theory; I have not reviewed the source code of that plugin in detail.
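
In code, that theory might look roughly like the following sketch; the
Jira backup MIME type, the 'x-trac-ticket' target and the parsing helper
are all placeholders here, not real registered types:

from trac.core import Component, implements
from trac.mimeview.api import IContentConverter

class JiraBackupConverter(Component):
    """Sketch only: declare a conversion from a Jira backup to tickets."""
    implements(IContentConverter)

    def get_supported_conversions(self):
        # 'application/x-jira-backup+xml' and 'x-trac-ticket' are
        # placeholders, not MIME types Trac actually knows about.
        yield ('jira-import', 'Jira backup', 'xml',
               'application/x-jira-backup+xml', 'x-trac-ticket', 8)

    def convert_content(self, req, mimetype, content, key):
        # Parse the backup and hand back ticket data in the internal
        # representation; a real importer would create Ticket objects here.
        tickets = self._parse_backup(content)   # hypothetical helper
        return tickets, 'x-trac-ticket'

    def _parse_backup(self, content):
        # Placeholder: extract ticket dicts from the Jira XML backup.
        return []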

There are 1000 ways to get to Rome.
;)

PS: ... but in the end, it's all about components and interfaces (... and
component managers... but those are quite unlikely to be needed most of
the time ;)

.. [1] Import Jira backup files into Trac
        (http://trac-hacks.org/wiki/JiraToTracIntegration)

-- 
Regards,

Olemis.

Blog ES: http://simelo-es.blogspot.com/
Blog EN: http://simelo-en.blogspot.com/

Featured article:
