On 7/23/11 1:37 PM, Ted Dunning wrote:
> Also, if the QRDecomposition interface *is* extended with getP, it is the
> work of a moment to change it to an abstract class instead of an interface
> and provide a default method for getP that throws
> UnsupportedOperationException. Since there are very unlikely to be any user
> extensions of QRDecomposition in the wild, we might as well do both changes
> at once. For that matter, getP can also easily return an identity
> permutation by default.
>
> Much like an immutable set or list implements the standard remove method,
> but complains when it is called, it is quite reasonable for QRDecomposition
> to not be quite the lowest common denominator, but something more like a
> midline that contains the things that you might expect a QR decomposition
> to do. Having a few (very few) methods of this sort is preferable to having
> an elaborate class structure of alternative interfaces.
+1
Looks like a reasonable approach to me, with getP returning identity
by default.
Phil
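To make the suggestion concrete, here is a minimal sketch of the abstract-class-with-default approach Ted describes, with getP() returning an identity permutation. The class name and array-based signatures are illustrative only, not the actual commons-math API (the real QRDecomposition works with RealMatrix):

```java
// Sketch only: an abstract base with a default getP(), as suggested.
// Names and double[][]-based signatures are hypothetical.
abstract class AbstractQRDecomposition {

    public abstract double[][] getQ();

    public abstract double[][] getR();

    /** Column count of the decomposed matrix. */
    protected abstract int columns();

    /**
     * Permutation matrix P with A*P = Q*R. Non-pivoting
     * implementations inherit this identity default; a pivoting
     * implementation overrides it. (Alternatively the default could
     * throw UnsupportedOperationException, as Ted notes.)
     */
    public double[][] getP() {
        final int n = columns();
        final double[][] p = new double[n][n];
        for (int i = 0; i < n; i++) {
            p[i][i] = 1.0;
        }
        return p;
    }
}
```

Either default keeps the interface/class small while still letting a pivoting implementation report its permutation.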
>
> On Sat, Jul 23, 2011 at 12:45 PM, Chris Nix <chris.nix@gmail.com> wrote:
>
>> We can do this with the current Householder reflection implementation,
>> except instead of just obtaining reflections from columns in sequence
>> across the input matrix, we select the column with the greatest L2 norm
>> at each iteration. The resulting permutation matrix is thus built and
>> can be returned later with a getP() method. A pleasing byproduct is that
>> the resulting R matrix is 'rank-revealing', allowing for a quicker
>> getRank() than currently exists in the SingularValueDecompositionImpl
>> class.
>>
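The column-selection step Chris describes could be sketched as below. This is a hypothetical standalone helper, not commons-math code; it only computes the pivot order, and the comment marks where the Householder reflection would be applied in a full implementation:

```java
// Sketch of greatest-L2-norm column pivoting (helper name hypothetical).
public final class ColumnPivot {

    /** Returns the column pivot order for an (rows x cols) matrix m. */
    public static int[] pivotOrder(double[][] m) {
        final int rows = m.length;
        final int cols = m[0].length;
        final double[][] a = new double[rows][];
        for (int i = 0; i < rows; i++) {
            a[i] = m[i].clone();          // work on a copy
        }
        final int[] pivot = new int[cols];
        for (int j = 0; j < cols; j++) {
            pivot[j] = j;
        }
        for (int k = 0; k < cols; k++) {
            // pick the remaining column with the largest norm below row k
            int best = k;
            double bestNorm = -1.0;
            for (int j = k; j < cols; j++) {
                double s = 0.0;
                for (int i = k; i < rows; i++) {
                    s += a[i][j] * a[i][j];
                }
                if (s > bestNorm) {
                    bestNorm = s;
                    best = j;
                }
            }
            // swap columns k and best, and record the swap
            for (int i = 0; i < rows; i++) {
                final double t = a[i][k];
                a[i][k] = a[i][best];
                a[i][best] = t;
            }
            final int t = pivot[k];
            pivot[k] = pivot[best];
            pivot[best] = t;
            // (the Householder reflection zeroing a[k+1..rows-1][k]
            // would be applied to the trailing submatrix here)
        }
        return pivot;
    }
}
```

The recorded pivot array is exactly what a getP()/getPivots() accessor would expose.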
>> Does it make sense to extend the current QRDecomposition interface to one
>> for rank-revealing QR decompositions that has the existing methods, plus a
>> getP() and getRank()? The implementing class could extend the current
>> QRDecompositionImpl class to save reproducing code, at the cost of opening
>> up some private variables and methods, and the solver.
>>
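One plausible getRank() for such a rank-revealing R: with column pivoting, the diagonal of R is non-increasing in magnitude, so the rank is the count of leading diagonal entries above a tolerance. The helper below is a sketch (name and signature hypothetical, not the proposed API):

```java
// Sketch: rank from the diagonal of a rank-revealing R.
public final class RankFromR {

    /** Counts leading diagonal entries of R with |r_kk| > tol. */
    public static int rank(double[][] r, double tol) {
        final int n = Math.min(r.length, r[0].length);
        int rank = 0;
        while (rank < n && Math.abs(r[rank][rank]) > tol) {
            rank++;
        }
        return rank;
    }
}
```

This is O(n) once the decomposition exists, hence the "quicker getRank()" than going through a full singular value decomposition.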
>> I'll open a JIRA issue.
>>
>> Chris.
>>
>> On 23 July 2011 18:44, Greg Sterijevski <gsterijevski@gmail.com> wrote:
>>
>>> Chris, you had an algorithm in mind? Greg
>>>
>>> On Sat, Jul 23, 2011 at 11:29 AM, Phil Steitz <phil.steitz@gmail.com>
>>> wrote:
>>>
>>>> On 7/22/11 11:40 AM, Greg Sterijevski wrote:
>>>>> Sorry,
>>>>>
>>>>> public class ConstrainedOLSMultipleRegression extends OLSMultipleRegression {}
>>>>>
>>>>> should read:
>>>>>
>>>>> public class ConstrainedOLSMultipleRegression extends OLSMultipleRegression {
>>>>>     @Override
>>>>>     public void newSampleData(double[] data, double[][] coeff,
>>>>>                               double[] rhs, int nobs, int nvars) {
>>>>>         adjustData(data, coeff, rhs);
>>>>>         super.newSampleData(data, nobs, nvars);
>>>>>         qr = new QRDecompositionImpl(X);
>>>>>     }
>>>>> }
>>>>> The data would be transformed on the way in, and everything else
>> would
>>>>> remain the same...
>>>> Got it. Sounds good. Patch away...
>>>>
>>>> Couple of things to keep in mind:
>>>>
>>>> 0) We may want to dispense with the QRDecomposition interface
>>>> altogether. If we keep it, we should trim it down to what is common
>>>> and meaningfully implemented in all impls. So both the Householder
>>>> and permutation getters are dropped. If you need a pivoting impl,
>>>> go ahead and code it up and we can reassess the interface.
>>>>
>>>> 1) We should be aiming to standardize the regression API. Let's pick
>>>> up the other thread on regression refactoring. Before hacking too
>>>> much more on OLS, let's refine and retrofit the new regression API on
>>>> this class.
>>>>
>>>> Phil
>>>>>
>>>>> On Fri, Jul 22, 2011 at 1:37 PM, Greg Sterijevski <
>>>> gsterijevski@gmail.com>wrote:
>>>>>> On the need for pivoting:
>>>>>>
>>>>>> Here is my first approach for changing OLSMultipleRegression to do
>>>>>> constrained estimation:
>>>>>>
>>>>>> public double[] calculateBeta(double[][] coeff, double[] rhs) {
>>>>>>     if (rhs.length != coeff.length) {
>>>>>>         throw new IllegalArgumentException("");
>>>>>>     }
>>>>>>     for (double[] rest : coeff) {
>>>>>>         if (rest.length != this.X.getColumnDimension()) {
>>>>>>             throw new IllegalArgumentException("");
>>>>>>         }
>>>>>>     }
>>>>>>     RealMatrix Coeff = new Array2DRowRealMatrix(coeff, false);
>>>>>>     RealVector rhsVec = new ArrayRealVector(rhs);
>>>>>>     QRDecomposition coeffQRd = new QRDecompositionImpl(Coeff.transpose());
>>>>>>     RealMatrix Qcoeff = coeffQRd.getQ();
>>>>>>     RealMatrix R = X.multiply(Qcoeff);
>>>>>>
>>>>>>     final int nvars = X.getColumnDimension();
>>>>>>     final int nobs = X.getRowDimension();
>>>>>>     final int ncons = coeff.length;
>>>>>>
>>>>>>     RealMatrix R2 = R.getSubMatrix(0, nobs - 1, ncons, nvars - 1);
>>>>>>
>>>>>>     RealMatrix R1 = R.getSubMatrix(0, nobs - 1, 0, ncons - 1);
>>>>>>
>>>>>>     RealVector gamma = rhsVec.copy();
>>>>>>
>>>>>>     RealMatrix coeffR = coeffQRd.getR().getSubMatrix(
>>>>>>             0, ncons - 1, 0, ncons - 1);
>>>>>>
>>>>>>     MatrixUtils.solveLowerTriangularSystem(coeffR.transpose(), gamma);
>>>>>>     RealVector gammPrime = Y.subtract(R1.operate(gamma));
>>>>>>
>>>>>>     QRDecomposition qr2 = new QRDecompositionImpl(R2);
>>>>>>
>>>>>>     RealVector constrainedSolution = qr2.getSolver().solve(gammPrime);
>>>>>>
>>>>>>     RealVector stackedVector = new ArrayRealVector(
>>>>>>             gamma.toArray(),
>>>>>>             constrainedSolution.toArray());
>>>>>>
>>>>>>     stackedVector = Qcoeff.operate(stackedVector);
>>>>>>
>>>>>>     return stackedVector.toArray();
>>>>>> }
>>>>>>
>>>>>> This approach is based on Dongarra et al:
>>>>>>
>>>>>> LAPACK Working Note
>>>>>> Generalized QR Factorization and its Applications
>>>>>> Work in Progress
>>>>>> E. Anderson, Z. Bai and J. Dongarra
>>>>>> December 9, 1991
>>>>>> August 9, 1994
>>>>>>
>>>>>> There is nothing terrible about this approach; the coding is not
>>>>>> finished and tidy, but it's a work in progress.
>>>>>>
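For reference, my reading of the algebra behind the calculateBeta code above (notation mine, hedged): to minimize ||y - Xb|| subject to Cb = d, with C of size ncons x nvars,

```latex
% QR of the transposed constraint matrix, then a change of variables:
\begin{align*}
  C^{T} &= Q_c R_c
    && \text{(coeffQRd)}\\
  w &= Q_c^{T} b = (\gamma, z)
    && \text{partition: } \gamma \in \mathbb{R}^{n_{\mathrm{cons}}}\\
  Cb = d &\iff R_{c,1}^{T}\,\gamma = d
    && \text{lower-triangular solve for } \gamma\\
  X Q_c &= [\,R_1 \;\; R_2\,]
    && \text{so } y - Xb = (y - R_1\gamma) - R_2 z\\
  z &= \arg\min_{z}\,\| (y - R_1\gamma) - R_2 z \|_2
    && \text{unconstrained LS (qr2)}\\
  b &= Q_c \begin{pmatrix}\gamma\\ z\end{pmatrix}
    && \text{stack and map back}
\end{align*}
```

The constraint fixes the first ncons components of w exactly, and the remaining components are free for the ordinary least-squares step, which is why the code can reuse the standard QR solver for the final solve.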
>>>>>> I am also aware of a second approach. I do not have a cite for it; I
>>>>>> think I may have derived it myself, but it would not surprise me if it
>>>>>> is in some textbook somewhere... That second approach takes the QR
>>>>>> decomposition of the coefficient matrix and calculates adjustment
>>>>>> matrices for the design matrix and dependent vector. The problem is
>>>>>> that I need to reorganize the design matrix by the pivots of the QR
>>>>>> decomposition. Once I have the adjustment matrices, everything should
>>>>>> proceed as in the case of an unconstrained estimation. I like the idea
>>>>>> that if we transform the data, everything works the same way.
>>>>>>
>>>>>> The ConstrainedOLSMultipleRegression class then looks like:
>>>>>>
>>>>>> public class ConstrainedOLSMultipleRegression extends OLSMultipleRegression {
>>>>>> }
>>>>>>
>>>>>>
>>>>>> As for the fact that the QRDecompositionImpl reflects its interface:
>>>>>> we should probably add the functions
>>>>>>
>>>>>> public int[] getPivots();
>>>>>> public boolean isPivoting();
>>>>>>
>>>>>> to the interface. As Christopher pointed out, if the current
>>>>>> decomposition is non-pivoting, its pivot record is the canonical one,
>>>>>> {0, 1, 2, ..., n-1}.
>>>>>> Greg
>>>>>>

To unsubscribe, e-mail: dev-unsubscribe@commons.apache.org
For additional commands, e-mail: dev-help@commons.apache.org
