Never compute the inverse of a matrix. Use a QR or SVD decomposition for
least-squares problems, or an optimization technique for general convex
problems.
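For example, here is a rough Python/NumPy sketch (toy data, made up purely
for illustration) of solving a least-squares problem via SVD or QR without
ever forming an inverse:

    import numpy as np

    # toy problem: 10 observations, 5 predictors (made-up numbers)
    rng = np.random.default_rng(0)
    X = rng.standard_normal((10, 5))
    y = rng.standard_normal(10)

    # SVD-based least squares; never computes inv(X'X)
    beta, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)

    # equivalently, via an explicit QR factorization of X
    Q, R = np.linalg.qr(X)
    beta_qr = np.linalg.solve(R, Q.T @ y)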
What you have is a small underdetermined system that can be easily handled
using a package like R. You don't need to worry about scaling. You need to
worry (A LOT) about overfitting.
Normal OLS will fail disastrously on the problem as you state it. It might
be possible to use a regularization technique, but with only 10 data points
and 200 parameters to fit, you are unlikely to succeed unless you know a LOT
about your problem that you can encode as a prior.
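If you do try regularization, ridge regression is the usual starting point:
it amounts to a Gaussian prior pulling the coefficients toward zero. A rough
Python/NumPy sketch (made-up data, and a made-up penalty lambda that you
would really have to choose by cross-validation):

    import numpy as np

    # 10 points, 200 dimensions: badly underdetermined (made-up data)
    rng = np.random.default_rng(1)
    X = rng.standard_normal((10, 200))
    y = rng.standard_normal(10)

    lam = 1.0  # ridge penalty; strength of the prior that beta is near zero

    # Ridge solution beta = (X'X + lam*I)^{-1} X' y, computed without an
    # explicit inverse.  With n << p, the equivalent dual (n x n) form is
    # much cheaper:  beta = X' (X X' + lam*I)^{-1} y
    n = X.shape[0]
    alpha = np.linalg.solve(X @ X.T + lam * np.eye(n), y)
    beta = X.T @ alpha

Even then, with 10 points and 200 unknowns, the penalty (i.e. the prior) is
doing essentially all of the work.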
On Sun, Apr 11, 2010 at 5:32 PM, prasenjit mukherjee
<prasen.bea@gmail.com> wrote:
> I am trying to compute regression coefficients, where the number of
> dimensions is ~200 and the number of points is 10. Basically I need to
> compute the beta matrix using OLS (refer
> http://en.wikipedia.org/wiki/Ordinary_least_squares ). The main
> bottleneck seems to be computing the inverse of a 200x200 matrix.
>
> Any pointers/suggestions ?
>
> Thanks,
> Prasen
>
