commons-commits mailing list archives

From l..@apache.org
Subject svn commit: r758074 - in /commons/proper/math/trunk/src: java/org/apache/commons/math/optimization/package.html site/xdoc/userguide/analysis.xml site/xdoc/userguide/optimization.xml
Date Tue, 24 Mar 2009 22:56:49 GMT
Author: luc
Date: Tue Mar 24 22:56:49 2009
New Revision: 758074

URL: http://svn.apache.org/viewvc?rev=758074&view=rev
Log:
updated documentation after changes in the optimization framework
JIRA: MATH-177

Modified:
    commons/proper/math/trunk/src/java/org/apache/commons/math/optimization/package.html
    commons/proper/math/trunk/src/site/xdoc/userguide/analysis.xml
    commons/proper/math/trunk/src/site/xdoc/userguide/optimization.xml

Modified: commons/proper/math/trunk/src/java/org/apache/commons/math/optimization/package.html
URL: http://svn.apache.org/viewvc/commons/proper/math/trunk/src/java/org/apache/commons/math/optimization/package.html?rev=758074&r1=758073&r2=758074&view=diff
==============================================================================
--- commons/proper/math/trunk/src/java/org/apache/commons/math/optimization/package.html (original)
+++ commons/proper/math/trunk/src/java/org/apache/commons/math/optimization/package.html Tue Mar 24 22:56:49 2009
@@ -19,68 +19,54 @@
 <body>
 <p>
 This package provides common interfaces for the optimization algorithms
-provided in sub-packages. The main interfaces defines objective functions
-and optimizers.
-</p>
-<p>
-Objective functions interfaces are intended to be implemented by
-user code to represent the problem to minimize or maximize. When the goal is to
-minimize, the objective function is often called a cost function. Objective
-functions can be either scalar or vectorial and can be either differentiable or
-not. There are four different interfaces, one for each case:
-<ul>
-  <li>{@link org.apache.commons.math.optimization.ScalarObjectiveFunction
-      ScalarObjectiveFunction}</li>
-  <li>{@link org.apache.commons.math.optimization.ScalarDifferentiableObjectiveFunction
-      ScalarDifferentiableObjectiveFunction}</li>
-  <li>{@link org.apache.commons.math.optimization.VectorialObjectiveFunction
-      VectorialObjectiveFunction}</li>
-  <li>{@link org.apache.commons.math.optimization.VectorialDifferentiableObjectiveFunction
-      VectorialDifferentiableObjectiveFunction}</li>
-</ul>
-</p>
-
-<p>
+provided in sub-packages. The main interfaces define optimizers and convergence
+checkers. The functions that are optimized by the algorithms provided by this
+package and its sub-packages are a subset of the ones defined in the <code>analysis</code>
+package, namely the real and vector valued functions. These functions are called
+objective functions here. When the goal is to minimize, the functions are often called
+cost functions, but this name is not used in this package.
 </p>
 
 <p>
 Optimizers are the algorithms that will either minimize or maximize, the objective function
-by changing its input variables set until an optimal set is found. There are only three
-interfaces defining the common behavior of optimizers, one for each type of objective
-function except {@link org.apache.commons.math.optimization.VectorialObjectiveFunction
-VectorialObjectiveFunction}:
+by changing its input variable set until an optimal set is found. There are only four
+interfaces defining the common behavior of optimizers, one for each supported type of objective
+function:
 <ul>
-  <li>{@link org.apache.commons.math.optimization.ScalarOptimizer
-      ScalarOptimizer}</li>
-  <li>{@link org.apache.commons.math.optimization.ScalarDifferentiableOptimizer
-      ScalarDifferentiableOptimizer}</li>
-  <li>{@link org.apache.commons.math.optimization.VectorialDifferentiableOptimizer
-      VectorialDifferentiableOptimizer}</li>
+  <li>{@link org.apache.commons.math.optimization.UnivariateRealOptimizer
+      UnivariateRealOptimizer} for {@link org.apache.commons.math.analysis.UnivariateRealFunction
+      univariate real functions}</li>
+  <li>{@link org.apache.commons.math.optimization.MultivariateRealOptimizer
+      MultivariateRealOptimizer} for {@link org.apache.commons.math.analysis.MultivariateRealFunction
+      multivariate real functions}</li>
+  <li>{@link org.apache.commons.math.optimization.DifferentiableMultivariateRealOptimizer
+      DifferentiableMultivariateRealOptimizer} for {@link
+      org.apache.commons.math.analysis.DifferentiableMultivariateRealFunction
+      differentiable multivariate real functions}</li>
+  <li>{@link org.apache.commons.math.optimization.DifferentiableMultivariateVectorialOptimizer
+      DifferentiableMultivariateVectorialOptimizer} for {@link
+      org.apache.commons.math.analysis.DifferentiableMultivariateVectorialFunction
+      differentiable multivariate vectorial functions}</li>
 </ul>
 </p>
 
 <p>
-Despite there are only three types of supported optimizers, it is possible to optimize a
-transform a non-differentiable {@link
-org.apache.commons.math.optimization.VectorialObjectiveFunction VectorialObjectiveFunction}
-by transforming into a {@link org.apache.commons.math.optimization.ScalarObjectiveFunction
-ScalarObjectiveFunction} thanks to the {@link
+Although there are only four types of supported optimizers, it is possible to optimize a
+{@link org.apache.commons.math.analysis.MultivariateVectorialFunction
+non-differentiable multivariate vectorial function} by converting it to a {@link
+org.apache.commons.math.analysis.MultivariateRealFunction non-differentiable multivariate
+real function} thanks to the {@link
+org.apache.commons.math.optimization.LeastSquaresConverter LeastSquaresConverter} helper class.
+The transformed function can then be optimized using any implementation of the {@link
+org.apache.commons.math.optimization.MultivariateRealOptimizer MultivariateRealOptimizer} interface.
 </p>
 
 <p>
-There are also three special implementations which wrap classical optimizers in order to
-add them a multi-start feature. This feature call the underlying optimizer several times
-in sequence with different starting points and returns the best optimum found or all optima
-if desired. This is a classical way to prevent being trapped into a local extremum when
-looking for a global one. The multi-start wrappers are {@link
-org.apache.commons.math.optimization.MultiStartScalarOptimizer MultiStartScalarOptimizer},
-{@link org.apache.commons.math.optimization.MultiStartScalarDifferentiableOptimizer
-MultiStartScalarDifferentiableOptimizer} and {@link
-org.apache.commons.math.optimization.MultiStartMultiStartVectorialOptimizer
-MultiStartVectorialOptimizer}.
+For each of the four types of supported optimizers, there is a special implementation which
+wraps a classical optimizer in order to add a multi-start feature to it. This feature calls the
+underlying optimizer several times in sequence with different starting points and returns
+the best optimum found or all optima if desired. This is a classical way to prevent being
+trapped in a local extremum when looking for a global one.
 </p>
 </body>
 </html>
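The conversion described above can be illustrated with a short self-contained sketch. It mimics the idea behind the LeastSquaresConverter helper class, wrapping a vector valued function into a scalar cost by summing squared residuals against observed values. The class and method names here are hypothetical stand-ins for illustration, not the Commons Math API itself:

```java
import java.util.function.Function;

public class LeastSquaresSketch {

    /**
     * Wraps a vector-valued function into a scalar cost function by summing
     * the squared residuals against observed target values. This is the idea
     * behind a least-squares converter, sketched with plain JDK types.
     */
    static Function<double[], Double> toScalarCost(Function<double[], double[]> vectorial,
                                                   double[] observations) {
        return point -> {
            double[] values = vectorial.apply(point);
            double sum = 0;
            for (int i = 0; i < observations.length; i++) {
                double residual = values[i] - observations[i];
                sum += residual * residual;
            }
            return sum;
        };
    }

    public static void main(String[] args) {
        // Vectorial function f(x) = (x0 + x1, x0 - x1), observed target (3, 1).
        Function<double[], double[]> f =
            x -> new double[] { x[0] + x[1], x[0] - x[1] };
        Function<double[], Double> cost = toScalarCost(f, new double[] { 3, 1 });

        // The cost vanishes at the exact solution x0 = 2, x1 = 1 ...
        System.out.println(cost.apply(new double[] { 2, 1 }));  // 0.0
        // ... and is strictly positive elsewhere.
        System.out.println(cost.apply(new double[] { 0, 0 })); // 10.0
    }
}
```

The resulting scalar function can then be handed to any multivariate real optimizer, as the paragraph above explains.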

Modified: commons/proper/math/trunk/src/site/xdoc/userguide/analysis.xml
URL: http://svn.apache.org/viewvc/commons/proper/math/trunk/src/site/xdoc/userguide/analysis.xml?rev=758074&r1=758073&r2=758074&view=diff
==============================================================================
--- commons/proper/math/trunk/src/site/xdoc/userguide/analysis.xml (original)
+++ commons/proper/math/trunk/src/site/xdoc/userguide/analysis.xml Tue Mar 24 22:56:49 2009
@@ -34,6 +34,13 @@
          coefficients as differentiable real functions.
         </p>
         <p>
+         Function interfaces are intended to be implemented by user code to represent
+         their domain problems. The algorithms provided by the library will then operate
+         on these functions to find their roots, integrate them, and so on. Functions can
+         be multivariate or univariate, real, vectorial or matrix valued, and they can be
+         differentiable or not.
+        </p>
+        <p>
           Possible future additions may include numerical differentiation.
         </p>
       </subsection>
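The division of labor described in the added paragraph, user code implements a function interface and a library algorithm operates on it, can be sketched in a few self-contained lines. The interface and the bisection root-finder below are illustrative stand-ins, not the Commons Math API:

```java
public class RootFindingSketch {

    /** Minimal stand-in for a univariate real function interface. */
    interface UnivariateFunction {
        double value(double x);
    }

    /**
     * Plain bisection: repeatedly halves an interval whose endpoints bracket
     * a sign change until the interval is smaller than the tolerance.
     */
    static double bisect(UnivariateFunction f, double lo, double hi, double tol) {
        if (f.value(lo) * f.value(hi) > 0) {
            throw new IllegalArgumentException("endpoints must bracket a root");
        }
        while (hi - lo > tol) {
            double mid = 0.5 * (lo + hi);
            if (f.value(lo) * f.value(mid) <= 0) {
                hi = mid;   // the root lies in the lower half
            } else {
                lo = mid;   // the root lies in the upper half
            }
        }
        return 0.5 * (lo + hi);
    }

    public static void main(String[] args) {
        // User code implements the function interface: f(x) = x^2 - 2.
        UnivariateFunction f = x -> x * x - 2;
        double root = bisect(f, 0, 2, 1e-10);
        System.out.println(root);  // close to sqrt(2) = 1.41421356...
    }
}
```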

Modified: commons/proper/math/trunk/src/site/xdoc/userguide/optimization.xml
URL: http://svn.apache.org/viewvc/commons/proper/math/trunk/src/site/xdoc/userguide/optimization.xml?rev=758074&r1=758073&r2=758074&view=diff
==============================================================================
--- commons/proper/math/trunk/src/site/xdoc/userguide/optimization.xml (original)
+++ commons/proper/math/trunk/src/site/xdoc/userguide/optimization.xml Tue Mar 24 22:56:49 2009
@@ -44,80 +44,75 @@
         </p>
         <p>
         The top level optimization package provides common interfaces for the optimization
-        algorithms provided in sub-packages. The main interfaces defines objective functions
-        and optimizers.
-        </p>
-        <p>
-        Objective functions interfaces are intended to be implemented by
-        user code to represent the problem to minimize or maximize. When the goal is to
-        minimize, the objective function is often called a cost function. Objective
-        functions can be either scalar or vectorial and can be either differentiable or
-        not. There are four different interfaces, one for each case:
+        algorithms provided in sub-packages. The main interfaces define optimizers
+        and convergence checkers. The functions that are optimized by the algorithms provided
+        by this package and its sub-packages are a subset of the ones defined in the
+        <code>analysis</code> package, namely the real and vector valued functions. These
+        functions are called objective functions here. When the goal is to minimize, the
+        functions are often called cost functions, but this name is not used in this package.
+        </p>
+        <p>
+        The type of goal, i.e. minimization or maximization, is defined by the enumerated
+        type <a href="../apidocs/org/apache/commons/math/optimization/GoalType.html">
+        GoalType</a>, which has only two values: <code>MAXIMIZE</code> and <code>MINIMIZE</code>.
+        </p>
+        <p>
+        Optimizers are the algorithms that will either minimize or maximize the objective
+        function by changing its input variable set until an optimal set is found. There
+        are only four interfaces defining the common behavior of optimizers, one for each
+        supported type of objective function:
         <ul>
-          <li><a href="../apidocs/org/apache/commons/math/optimization/ScalarObjectiveFunction.html">
-              ScalarObjectiveFunction</a></li>
-          <li><a href="../apidocs/org/apache/commons/math/optimization/ScalarDifferentiableObjectiveFunction.html">
-              ScalarDifferentiableObjectiveFunction</a></li>
-          <li><a href="../apidocs/org/apache/commons/math/optimization/VectorialObjectiveFunction.html">
-              VectorialObjectiveFunction</a></li>
-          <li><a href="../apidocs/org/apache/commons/math/optimization/VectorialDifferentiableObjectiveFunction.html">
-              VectorialDifferentiableObjectiveFunction</a></li>
+          <li><a href="../apidocs/org/apache/commons/math/optimization/UnivariateRealOptimizer.html">
+              UnivariateRealOptimizer</a> for <a
+              href="../apidocs/org/apache/commons/math/analysis/UnivariateRealFunction.html">
+              univariate real functions</a></li>
+          <li><a href="../apidocs/org/apache/commons/math/optimization/MultivariateRealOptimizer.html">
+              MultivariateRealOptimizer</a> for <a
+              href="../apidocs/org/apache/commons/math/analysis/MultivariateRealFunction.html">
+              multivariate real functions</a></li>
+          <li><a href="../apidocs/org/apache/commons/math/optimization/DifferentiableMultivariateRealOptimizer.html">
+              DifferentiableMultivariateRealOptimizer</a> for <a
+              href="../apidocs/org/apache/commons/math/analysis/DifferentiableMultivariateRealFunction.html">
+              differentiable multivariate real functions</a></li>
+          <li><a href="../apidocs/org/apache/commons/math/optimization/DifferentiableMultivariateVectorialOptimizer.html">
+              DifferentiableMultivariateVectorialOptimizer</a> for <a
+              href="../apidocs/org/apache/commons/math/analysis/DifferentiableMultivariateVectorialFunction.html">
+              differentiable multivariate vectorial functions</a></li>
         </ul>
         </p>
 
         <p>
-        Optimizers are the algorithms that will either minimize or maximize, the objective function
-        by changing its input variables set until an optimal set is found. There are only three
-        interfaces defining the common behavior of optimizers, one for each type of objective
-        function except <a href="../apidocs/org/apache/commons/math/optimization/VectorialObjectiveFunction.html">
-        VectorialObjectiveFunction</a>:
-        <ul>
-          <li><a href="../apidocs/org/apache/commons/math/optimization/ScalarOptimizer.html">
-              ScalarOptimizer</a></li>
-          <li><a href="../apidocs/org/apache/commons/math/optimization/ScalarDifferentiableOptimizer.html">
-              ScalarDifferentiableOptimizer</a></li>
-          <li><a href="../apidocs/org/apache/commons/math/optimization/VectorialDifferentiableOptimizer.html">
-              VectorialDifferentiableOptimizer</a></li>
-        </ul>
-        </p>
-
-        <p>
-        Despite there are only three types of supported optimizers, it is possible to optimize a
-        transform a non-differentiable <a
-        href="../apidocs/org/apache/commons/math/optimization/VectorialObjectiveFunction.html">
-        VectorialObjectiveFunction</a> by transforming into a <a
-        href="../apidocs/org/apache/commons/math/optimization/ScalarObjectiveFunction.html">
-        ScalarObjectiveFunction</a> thanks to the <a
+        Although there are only four types of supported optimizers, it is possible to optimize
+        a <a
+        href="../apidocs/org/apache/commons/math/analysis/MultivariateVectorialFunction.html">
+        non-differentiable multivariate vectorial function</a> by converting it to a <a
+        href="../apidocs/org/apache/commons/math/analysis/MultivariateRealFunction.html">
+        non-differentiable multivariate real function</a> thanks to the <a
         href="../apidocs/org/apache/commons/math/optimization/LeastSquaresConverter.html">
-        LeastSquaresConverter</a> helper class. The transformed function can be optimized using any
-        implementation of the <a href="../apidocs/org/apache/commons/math/optimization/ScalarOptimizer.html">
-        ScalarOptimizer</a> interface.
+        LeastSquaresConverter</a> helper class. The transformed function can be optimized using
+        any implementation of the <a
+        href="../apidocs/org/apache/commons/math/optimization/MultivariateRealOptimizer.html">
+        MultivariateRealOptimizer</a> interface.
         </p>
 
         <p>
-        There are also three special implementations which wrap classical optimizers in order to
-        add them a multi-start feature. This feature call the underlying optimizer several times
-        in sequence with different starting points and returns the best optimum found or all optima
-        if desired. This is a classical way to prevent being trapped into a local extremum when
-        looking for a global one. The multi-start wrappers are <a
-        href="../apidocs/org/apache/commons/math/optimization/MultiStartScalarOptimizer.html">
-        MultiStartScalarOptimizer</a>, <a
-        href="../apidocs/org/apache/commons/math/optimization/MultiStartScalarDifferentiableOptimizer.html">
-        MultiStartScalarDifferentiableOptimizer</a> and <a
-        href="../apidocs/org/apache/commons/math/optimization/MultiStartVectorialDifferentiableOptimizer.html">
-        MultiStartVectorialDifferentiableOptimizer</a>.
+        For each of the four types of supported optimizers, there is a special implementation
+        which wraps a classical optimizer in order to add a multi-start feature to it. This
+        feature calls the underlying optimizer several times in sequence with different starting
+        points and returns the best optimum found or all optima if desired. This is a classical
+        way to prevent being trapped in a local extremum when looking for a global one.
         </p>
       </subsection>
       <subsection name="12.2 Univariate Functions" href="univariate">
         <p>
-          A <a href="../apidocs/org/apache/commons/math/optimization/univariate/UnivariateRealMinimizer.html">
-          UnivariateRealMinimizer</a> is used to find the minimal values of a univariate scalar-valued function
-          <code>f</code>.
+          A <a href="../apidocs/org/apache/commons/math/optimization/UnivariateRealOptimizer.html">
+          UnivariateRealOptimizer</a> is used to find the minimal value of a univariate
+          real-valued function <code>f</code>.
         </p>
         <p>
-          Minimization algorithms usage is very similar to root-finding algorithms usage explained
+          Usage of these algorithms is very similar to usage of the root-finding algorithms explained
           in the analysis package. The main difference is that the <code>solve</code> methods in root
-          finding algorithms is replaced by <code>minimize</code> methods.
+          finding algorithms are replaced by <code>optimize</code> methods.
         </p>
       </subsection>
       <subsection name="12.3 Linear Programming" href="linear">
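The multi-start feature described above can be sketched as a plain loop: run a local optimizer from several starting points and keep the best optimum found. The sketch below is self-contained; the naive coordinate-descent routine is a hypothetical stand-in for a real local optimizer, not one of the library's multi-start wrappers:

```java
import java.util.function.Function;

public class MultiStartSketch {

    /**
     * The multi-start idea: call a local optimizer from several starting
     * points and keep the best of the local optima found. This is how
     * multi-start wrappers avoid being trapped in a local extremum.
     */
    static double[] multiStart(Function<double[], Double> f, double[][] starts) {
        double[] best = null;
        for (double[] start : starts) {
            double[] optimum = localSearch(f, start);
            if (best == null || f.apply(optimum) < f.apply(best)) {
                best = optimum;  // keep the best optimum seen so far
            }
        }
        return best;
    }

    /** Naive coordinate descent with a shrinking step, standing in for a real optimizer. */
    static double[] localSearch(Function<double[], Double> f, double[] start) {
        double[] x = start.clone();
        for (double step = 1.0; step > 1e-6; step *= 0.5) {
            boolean improved = true;
            while (improved) {
                improved = false;
                for (int i = 0; i < x.length; i++) {
                    for (double sign : new double[] { +1, -1 }) {
                        double[] trial = x.clone();
                        trial[i] += sign * step;
                        if (f.apply(trial) < f.apply(x)) {
                            x = trial;
                            improved = true;
                        }
                    }
                }
            }
        }
        return x;
    }

    public static void main(String[] args) {
        // Two basins: a shallow local minimum near x0 = +0.93 and the
        // global minimum near x0 = -1.06.
        Function<double[], Double> f = x -> {
            double t = x[0] * x[0] - 1;
            return t * t + 0.5 * x[0] + x[1] * x[1];
        };
        double[] trapped = localSearch(f, new double[] { 0.9, 0 });
        double[] best = multiStart(f, new double[][] { { 2, 0 }, { 0.9, 0 }, { -2, 0 } });
        System.out.println("single start x0: " + trapped[0]);  // positive basin
        System.out.println("multi-start  x0: " + best[0]);     // negative, global basin
    }
}
```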
@@ -198,13 +193,24 @@
         <p>
           In order to solve a vectorial optimization problem, the user must provide it as
           an object implementing the <a
-          href="../apidocs/org/apache/commons/math/optimization/VectorialDifferentiableObjectiveFunction.html">
-          VectorialDifferentiableObjectiveFunction</a> interface. The object will be provided to
+          href="../apidocs/org/apache/commons/math/analysis/DifferentiableMultivariateVectorialFunction.html">
+          DifferentiableMultivariateVectorialFunction</a> interface. The object will be provided to
           the <code>estimate</code> method of the optimizer, along with the target and weight arrays,
           thus allowing the optimizer to compute the residuals at will. The last parameter to the
           <code>estimate</code> method is the point from which the optimizer will start its
           search for the optimal point.
         </p>
+        <p>
+          In addition to least squares solving, the <a
+          href="../apidocs/org/apache/commons/math/optimization/general/NonLinearConjugateGradientOptimizer.html">
+          NonLinearConjugateGradientOptimizer</a> class provides a non-linear conjugate gradient
+          algorithm to optimize a <a
+          href="../apidocs/org/apache/commons/math/analysis/DifferentiableMultivariateRealFunction.html">
+          DifferentiableMultivariateRealFunction</a>. Both the Fletcher-Reeves and the Polak-Ribi&#232;re
+          search direction update methods are supported. It is also possible to set up a preconditioner
+          or to change the line-search algorithm of the inner loop if desired (the default one is a Brent
+          solver).
+        </p>
       </subsection>
      </section>
   </body>
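The Fletcher-Reeves direction update mentioned in the last hunk can be sketched in a few self-contained lines. The example below minimizes a small convex quadratic; since the objective is quadratic, the inner line search is computed exactly from the Hessian (where a general implementation, like the one the text describes, would use a univariate solver such as Brent's method). Names and the objective are illustrative, not the NonLinearConjugateGradientOptimizer API:

```java
public class ConjugateGradientSketch {

    // Objective: f(x) = (x0 - 1)^2 + 10 * (x1 + 2)^2, a convex quadratic
    // whose unique minimum is at (1, -2); its Hessian is diag(2, 20).
    static double[] gradient(double[] x) {
        return new double[] { 2 * (x[0] - 1), 20 * (x[1] + 2) };
    }

    static double dot(double[] u, double[] v) {
        return u[0] * v[0] + u[1] * v[1];
    }

    /** Non-linear conjugate gradient with the Fletcher-Reeves update. */
    static double[] minimize(double[] start, int iterations) {
        double[] x = start.clone();
        double[] g = gradient(x);
        double[] d = { -g[0], -g[1] };                 // first direction: steepest descent
        for (int k = 0; k < iterations; k++) {
            // Exact step length for this quadratic: alpha = -g.d / d.Hd.
            double[] hd = { 2 * d[0], 20 * d[1] };
            double alpha = -dot(g, d) / dot(d, hd);
            x = new double[] { x[0] + alpha * d[0], x[1] + alpha * d[1] };
            double[] gNew = gradient(x);
            // Fletcher-Reeves: beta = |gNew|^2 / |g|^2.  The Polak-Ribiere
            // variant would use dot(gNew, gNew - g) / |g|^2 instead.
            double beta = dot(gNew, gNew) / dot(g, g);
            d = new double[] { -gNew[0] + beta * d[0], -gNew[1] + beta * d[1] };
            g = gNew;
        }
        return x;
    }

    public static void main(String[] args) {
        // On an n-dimensional quadratic, conjugate gradient terminates
        // in n iterations, so 2 iterations suffice here.
        double[] optimum = minimize(new double[] { 0, 0 }, 2);
        System.out.printf("(%.6f, %.6f)%n", optimum[0], optimum[1]);  // (1.000000, -2.000000)
    }
}
```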


