river-dev mailing list archives

From Peter Firmstone <j...@zeus.net.au>
Subject Re: Develop new spec for RMIClassLoader replacement
Date Wed, 29 Aug 2012 12:28:42 GMT

Where can I find the artifact URL handler? I only seem to be able to
find the mvn URL handler.



On 27/08/2012 10:50 PM, Dennis Reedy wrote:
>>> I'm not sure if this helps (or is of interest to) you, but what I've
>>> been doing with respect to codebase support is to use the dependency
>>> resolution that you find with Maven-based artifacts (note you don't
>>> have to have a Maven project, you just deploy your jars to a Maven
>>> repository). We've been finding it much easier to configure services
>>> in a versioned and easy-to-deploy way.
>>> So what you end up with is the runtime dependencies of a particular
>>> artifact resolved (direct and transitive dependencies) as the codebase
>>> for a service or a service-UI. Your dependency graph is complete, in as
>>> much as it is correctly constructed for your artifact. This comes
>>> naturally for Maven/Gradle projects: you can't produce your artifact
>>> unless the dependencies have been declared correctly.
>>> Note that this becomes especially important for a client that uses a
>>> service. With the artifact URL scheme, instead of annotating a
>>> service's codebase with http:// based jars, the service's codebase
>>> contains the artifact URL, which, when resolved at the client, yields
>>> the dependencies for the service's codebase. This not only provides a
>>> performance boost (why load classes over HTTP if they can be loaded
>>> locally?), but also addresses lost-codebase issues.
>> So, the artifact url sticks with the classloader?  Or is it that the
>> service's artifact url implies a set of jars that will be resolved for
>> anyone else as well?
> The artifact URL gets resolved to locally available jars. These are then passed to the
> default RMI provider instance as the codebase to load.
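The flow described above (an artifact URL resolved to locally cached jars, whose URLs become the codebase handed to a class loader) can be sketched roughly as follows. The directory layout in `localPath` is the standard Maven repository convention; everything else (class and method names, and the omission of transitive-dependency resolution) is illustrative, not the actual River/Rio implementation.

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

// Rough sketch, not the River/Rio implementation: map an artifact's
// coordinates to its jar in a local Maven repository and build a
// class loader over it. A real resolver would also pull in the
// artifact's direct and transitive dependencies.
public class CodebaseSketch {

    // Standard Maven local-repository layout:
    // groupId-as-dirs/artifactId/version/artifactId-version.jar
    static File localPath(File repoRoot, String groupId,
                          String artifactId, String version) {
        String path = groupId.replace('.', '/') + "/" + artifactId + "/"
                + version + "/" + artifactId + "-" + version + ".jar";
        return new File(repoRoot, path);
    }

    static ClassLoader codebaseLoader(File repoRoot, String groupId,
                                      String artifactId, String version,
                                      ClassLoader parent) throws Exception {
        URL jar = localPath(repoRoot, groupId, artifactId, version)
                .toURI().toURL();
        return new URLClassLoader(new URL[] { jar }, parent);
    }
}
```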
>>> Add to that secure repository connections that require a uid/password,
>>> and you can confirm that the artifact you must download in order to
>>> use a service comes from a site you trust.
>> Does everyone in the Djinn need to agree on where the Maven
>> repository (or repositories) is located?
> The artifact URL scheme decomposes as follows:
> artifact:groupId/artifactId/version[/type[/classifier]][;repository[@repositoryId]]
> Here are some examples:
> An artifact URL for groupId, artifactId and version:
> artifact:org.rioproject.examples.calculator/calculator-service/2.0.1
> An artifact URL for groupId, artifactId, version and repository with an id
> artifact:org.rioproject.examples.calculator/calculator-proxy/2.0.1;http://www.rio-project.org@rio
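As a rough illustration of how that scheme decomposes, here is a small parser sketch. The class and field names are hypothetical, not the actual Rio artifact URL handler; the defaulting of `type` to "jar" is an assumption.

```java
// Hypothetical sketch of parsing the artifact URL scheme shown above;
// not the real Rio handler.
public class ArtifactUrl {
    public final String groupId, artifactId, version, type, classifier,
            repository, repositoryId;

    private ArtifactUrl(String g, String a, String v, String t, String c,
                        String repo, String repoId) {
        groupId = g; artifactId = a; version = v; type = t; classifier = c;
        repository = repo; repositoryId = repoId;
    }

    // artifact:groupId/artifactId/version[/type[/classifier]][;repository[@repositoryId]]
    public static ArtifactUrl parse(String url) {
        if (!url.startsWith("artifact:"))
            throw new IllegalArgumentException("Not an artifact URL: " + url);
        String body = url.substring("artifact:".length());
        String repo = null, repoId = null;
        int semi = body.indexOf(';');
        if (semi >= 0) {
            String repoPart = body.substring(semi + 1);
            body = body.substring(0, semi);
            // The repository URL itself may contain '@' (user-info), so
            // split the optional repositoryId off the last '@'.
            int at = repoPart.lastIndexOf('@');
            if (at >= 0) {
                repo = repoPart.substring(0, at);
                repoId = repoPart.substring(at + 1);
            } else {
                repo = repoPart;
            }
        }
        String[] parts = body.split("/");
        if (parts.length < 3)
            throw new IllegalArgumentException("Expected groupId/artifactId/version");
        return new ArtifactUrl(parts[0], parts[1], parts[2],
                parts.length > 3 ? parts[3] : "jar",   // assumed default
                parts.length > 4 ? parts[4] : null,
                repo, repoId);
    }
}
```

Applied to the second example above, this would yield the calculator-proxy coordinates plus the http://www.rio-project.org repository with id "rio".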
> The receiving client does not need to know the repository in advance. If the repository
> is secure, though, the client must have locally available credentials to apply when making
> the connection. Using this approach you could easily verify the source before you download:
> basically, you grant trust to the code you download before you use it.
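The credential step mentioned here could look something like the following. This is standard HTTP Basic auth header construction; the class name and the user/password values are placeholders, and the actual mechanism a given resolver uses may differ.

```java
import java.util.Base64;

// Illustrative only: building the Authorization header a client could
// attach when contacting a secured repository with locally held
// credentials. Standard HTTP Basic auth; values are placeholders.
public class RepoAuth {
    static String basicAuthHeader(String user, String password) {
        String token = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes());
        return "Basic " + token;
    }
}
```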
>> The thing I like about the codebase annotation
>> is that the service provider can go ahead and set up a web server and the
>> client finds out everything it needs to know from the annotation.  Can
>> the clients find the repository configurations dynamically?  Plus, I
>> suppose you actually have to have a repository set up somewhere, or can a
>> service container host its own repository?  These may be stupid
>> questions - I haven't used Maven very much.
> One of the things I have found is that having well-known places to get things from (like
> service jars) in a deployment environment is very helpful. It allows organizations (both
> development and commercial) to control, organize and secure their assets. In my experience
> the dynamic codebase issues that come along with River add unnecessary complexity.
> Setting up a repository is needed. That's easy, though: it's really just a web server that
> manages a structure of Maven-deployed artifacts. Or it can be a Maven repository manager
> (Archiva, Artifactory, Nexus). In the projects I've been working with this becomes a natural
> extension of the development process; it's hooked into the continuous integration we do.
> Then when we deploy services, we deploy them using an artifact that represents the whole
> system. We maintain multiple versions side by side.
> I'm curious as to how others develop and deploy their systems. I know many on this list
> are anti-Maven; what do you use?
> Dennis
