allura-dev mailing list archives

From Alvaro del Castillo <>
Subject Re: Presentation and Metrics work in Allura
Date Fri, 14 Sep 2012 10:53:20 GMT

On Thu, 13-09-2012 at 16:01 -0400, Dave Brondsema wrote:
> On 9/11/12 3:37 PM, Alvaro del Castillo wrote:
> > Hi guys,
> > 
> > Time to update Allura community about our progress with Metrics Tools:
> > 
> >> Ok. So next steps:
> >>
> >> - We are expanding the kind of graphs we support and also the metrics.
> >> We will integrate them in our Allura module.
> > 
> > You can see our M0 (Milestone 0) reports at our GNOME Shell sample:
> > 
> > * SCM and ITS analysis integrated
> >
> > 
> > * Different isolated graphs also showing SCM and ITS activity.
> >
> > 
> > We are working on some other visualizations, like comparing issues and
> > commits in the same graph:
> > 
> >
> Nice graphs.  Previously your examples had been shown as graphs within Allura,
> right?  These look like static HTML.  Is something different here?

No, the idea behind it is the same: generate JSON with the metrics so
you can plot them. The JSON files are like the juice from a fruit: the
data sources. In order to squeeze the juice we gather the data, convert
it to a common format, analyze it with SQL, R and other techniques, and
generate the results, normally metrics, as JSON.

And all of this works inside Allura web pages as well.
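As a rough illustration of that pipeline, here is a minimal, hypothetical sketch (the function names and the metric shape are my own, not from our codebase): raw activity records are aggregated into a per-month metric and dumped as JSON for a client-side grapher to plot.

```python
import json
from collections import Counter
from datetime import date

def commits_per_month(commit_dates):
    """Aggregate raw commit dates into a {YYYY-MM: count} metric."""
    counts = Counter(d.strftime("%Y-%m") for d in commit_dates)
    return dict(sorted(counts.items()))

def write_metrics_json(metrics, path):
    """Publish the squeezed 'juice' as a static JSON file."""
    with open(path, "w") as f:
        json.dump(metrics, f, indent=2)

# Fake data standing in for what CVSAnalY would extract from a repo:
raw = [date(2012, 8, 1), date(2012, 8, 15), date(2012, 9, 3)]
metrics = commits_per_month(raw)
# metrics == {"2012-08": 2, "2012-09": 1}
```

The real analysis runs through SQL and R, but the output contract is the same: plain JSON the browser can fetch and plot.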

> Gnome shell is a git repo, right?  Does the reporting work with other repo types?

Yes, GNOME Shell is a git repo. Currently we depend on our CVSAnalY
tool, which supports CVS, Subversion and git.

> >>
> >> - Control metrics tools execution from taskd.
> > 
> > I am not sure whether the tools' execution should be scheduled using
> > taskd. What do you think?
> How do you run them now?  If taskd doesn't run them automatically after each
> commit, maybe a cron job would be a good way to deploy it.  I think end-users
> could be okay with reports that are generated daily.

Right now we execute it manually from the command line, and in some
cases we have update scripts that are executed automatically. For
small/mid projects with incremental support for their data sources,
taskd could be an option, executing the tools after each commit or
ticket change. But in general, a daily report could be created and we
can give the user a "button" to trigger the update if she is in a hurry
to have the data refreshed right away.
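That "daily report plus refresh button" idea could be sketched like this (a hypothetical guard of my own, not existing code; the path handling and the `force` flag are illustrative):

```python
import os
import time

MAX_AGE = 24 * 60 * 60  # regenerate at most once a day, in seconds

def needs_update(json_path, force=False, now=None):
    """True if the report is missing, older than MAX_AGE, or a
    refresh was explicitly requested via the UI 'button'."""
    if force or not os.path.exists(json_path):
        return True
    age = (now or time.time()) - os.path.getmtime(json_path)
    return age > MAX_AGE
```

A cron job would call this with `force=False`; the button handler (or a taskd task queued from it) would pass `force=True`.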

> > 
> > 
> >>
> >> - Allura project configuration to enable graphs and auto configure the
> >> metrics tools to use the Allura backends.
> > 
> > We have improved the Bicho tool, the ticket/issue gathering engine,
> > so it now supports Allura Tickets completely and uses incremental
> > queries; after the first analysis it is really quick to reanalyze a
> > project's tickets and create the JSON files with the data needed to
> > graph the ticket activity.
> Cool
> > 
> > Right now we are working on the configuration of the tools inside
> > Allura.
> > 
> > Our plan is to continue working in our Allura fork at GitHub and,
> > once the Metrics Module is in good shape, where is the best place to
> > coordinate future development?
> > 
> Is the right repo for me to be
> looking at?


> An earlier email mentioned R is used for some of the statistics.  Could you
> point me to where that is used in the code?  Or a brief explanation of
> dependencies in email is cool too.

Sure. The R scripts are in:

For git: scm-analysis.R
For Allura Tickets: its-analysis.R

But we are now working on building an R library for the reports.

> This mailing list is a good place to talk about development, just like you're
> doing :)


>   Out of curiosity, are you interested in this for general purpose
> availability, or specifically interested in SourceForge being able to adopt
> this?

I am interested in improving Allura project metrics and, in general,
improving project metrics in developer tools. I am also interested in
the Allura project because it is a good open source product, with a
solid and extensible architecture, for deploying forges.

At Bitergia we are also interested in SourceForge adopting this kind of
metrics, of course. It is a wonderful use case and a good way to reach
millions of users.

>   I'm sure we'd be interested in getting it up and running, which is why
> I'm asking about supported SCMs, dependencies, scalability, etc :)


* SCMs: git, svn, cvs
* Dependencies:
- CVSAnalY:
* Scalability: The visualization process is done entirely client side.
On the server the hard work is generating the JSON. Static JSON files
can be distributed easily in a scalable way. For generating the JSON we
need:
	+ Allura project data sources with an efficient way to access them
	+ Bicho and CVSAnalY do basic download, parsing and write operations
in MySQL. We should measure this for thousands of projects, but it
should be doable.
	+ The R scripts for the current basic reports are really fast. We can
also do the work using just Python in some cases.
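To make the "static JSON, plotted client side" point concrete, here is a hypothetical example of what such a metrics file might look like (the field names and values are invented for illustration): flotr2 consumes series as lists of [x, y] points, so the browser only needs to fetch the file and draw it.

```python
import json

# Invented example of a per-project metrics report. The server writes
# this once (daily, or on demand); the JS graphing code just fetches it.
report = {
    "project": "gnome-shell",
    "scm": {"commits": [[201207, 340], [201208, 412], [201209, 125]]},
    "its": {"opened": [[201207, 58], [201208, 71], [201209, 23]],
            "closed": [[201207, 44], [201208, 66], [201209, 19]]},
}

payload = json.dumps(report)    # what the server publishes as a static file
restored = json.loads(payload)  # what the client-side grapher sees
```

Because the files are static, they can sit behind any web server or CDN with no per-request computation.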

Dave, if you want to play directly with the tools, I can support you if
you run into problems. For example, we can analyze the Allura project
itself directly.

All the visualization graphs are done with flotr2 and envision, which
are JS-framework agnostic and use the jQuery, Underscore and Bonzo
libraries. I have integrated all of this inside Allura-generated pages
and it works nicely.

Kind regards!
|\_____/| Alvaro del Castillo San Félix
 [o] [o] - Chief Technical Officer (CTO)
 |  V  |
  |   |   "Bridging the gap between developers and stakeholders"
