incubator-ooo-dev mailing list archives

From Rob Weir <>
Subject Re: mirrors, release publishing...again
Date Thu, 19 Apr 2012 22:17:52 GMT
On Thu, Apr 19, 2012 at 11:37 PM, Kay Schenk <> wrote:
> On 04/19/2012 02:19 PM, Rob Weir wrote:
>> On Thu, Apr 19, 2012 at 10:52 PM, Kay Schenk<>  wrote:
>>> Hi all--
>>> Please see:
>>> esp the paragraph on "Distribution" sentence...
>>> =======================
>>> Distribution
>>> The Apache infrastructure must be the primary source for all artifacts
>>> officially released by the ASF.
>>> The Apache Infrastructure team maintains the Apache release distribution
>>> infrastructure. This infrastructure has two parts: the mirrored
>>> directories
>>> on and the Maven repository on
>>> =====================
>>> To me, this means we will use Apache infra and mirrors for the
>>> distribution
>>> of release 3.4, when that happens.
>> But in our case Apache Infra has agreed that we could/should accept
>> help from SourceForge due to the size and volume involved in this
>> release.
>> I think this statement from Joe is the most recent and authoritative
>> statement from Infra on this specific topic:
>> -Rob
> Rob--Thanks for pointing this out and pinning this down. Despite the fact
> that I read through this thread, I obviously missed this --and it was just a
> few days ago! ack! (I will now file this in a safe place.)
> OK, but it is not clear to me at all how to incorporate more than one mirror
> system (automatically), and like others I don't think it can be done really.
> So Joe's statement--
> "Infra's position is currently that, for the upcoming release ONLY,
> continuing to use the legacy mirrorbrain system in conjunction with ASF
> mirrors and SF downloads is A-OK"
> is a big ??? for me. How would "in conjunction" work? Maybe Joe or one of
> the other infra staff have some ideas about this.
> Change the current DL script to present the user with a choice of download
> options? ASF or MirrorBrain/SourceForge? We could do this I think.
> Also, it would be difficult to keep actual DL statistics if we split things
> up.
> Thoughts?

A few ways, some worse than others:

1) Offer several download links:  "Download from Apache, from
SourceForge, from MirrorBrain". Of course that doesn't balance the
load by itself, but maybe it would if we randomized the order in
which the links are presented.

2) Have a single link, but make it JavaScript that then directs to
one of the three mirror systems.  This makes it easy to distribute
the load according to a defined schedule.  Marcus prototyped an
approach like this, and it looked like it was working.  I'm not sure,
however, whether it handled fallbacks.  For example, you randomly
select the Apache mirror network, but the particular mirror operator
chosen is down.  The user experience for backing out of that and
retrying may not be as nice as it could be.

3) Some variation on 2 where we handle the fallbacks better, or at
least handle failures better, so the user just needs to click again.
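To make option 2/3 concrete, here is a rough sketch of what the
weighted selection with a fallback order might look like.  All the
mirror names, weights, and function names below are illustrative
assumptions on my part, not Marcus's actual prototype:

```javascript
// Hypothetical weight schedule for the three mirror systems.
// The weights must sum to 1; values here are made up.
var MIRRORS = [
  { name: 'Apache',      weight: 0.4 },
  { name: 'SourceForge', weight: 0.4 },
  { name: 'MirrorBrain', weight: 0.2 }
];

// Weighted random choice: walk the cumulative weights until the
// random value (e.g. Math.random()) falls inside a mirror's slice.
function pickMirror(mirrors, rand) {
  var acc = 0;
  for (var i = 0; i < mirrors.length; i++) {
    acc += mirrors[i].weight;
    if (rand < acc) return mirrors[i].name;
  }
  return mirrors[mirrors.length - 1].name; // guard against rounding
}

// Fallback order: try the weighted pick first, then the remaining
// systems, so a down mirror just means clicking through to the next.
function downloadOrder(mirrors, rand) {
  var primary = pickMirror(mirrors, rand);
  var order = [primary];
  mirrors.forEach(function (m) {
    if (m.name !== primary) order.push(m.name);
  });
  return order;
}
```

For example, `downloadOrder(MIRRORS, Math.random())` yields a list the
script could walk until one mirror system responds.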

From a download tracking perspective, we can get these numbers if we
have a single script entry point.  In that script we can fire a
Google Analytics "event", which acts like a pseudo page view
indicating that the user clicked a link that took them to a mirror
outside of our website.  We could track how many users went to each
mirror network, as well as what platform and language they
downloaded.  (Well, not really downloaded.  We only know that they
requested the download.  Whether they waited for it to complete is
unknown.)
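As a sketch of the event idea, using the classic async Google
Analytics queue (`_gaq`), where events are pushed as
`['_trackEvent', category, action, label]` — the function name and
the category/label strings below are my own invention:

```javascript
// Set up the classic Google Analytics command queue if the page
// hasn't already done so (the ga.js snippet normally drains it).
var _gaq = _gaq || [];

// Hypothetical helper: record which mirror network, platform, and
// language the user chose before redirecting them off-site.
// GA counts the click, not the completed download.
function trackDownload(mirror, platform, language) {
  _gaq.push(['_trackEvent', 'Download', mirror, platform + '/' + language]);
}

// Example: user picked the SourceForge mirror for a Windows en-US build.
trackDownload('SourceForge', 'win32', 'en-US');
```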


>>> --
>>> ------------------------------------------------------------------------
>>> MzK
>>> "Women and cats will do as they please,
>>>  and men and dogs should relax and get used to the idea."
>>>                                    -- Robert Heinlein
