But is it actually QR of Y?
On Tue, Jan 8, 2013 at 3:41 PM, Sean Owen <srowen@gmail.com> wrote:
> There's definitely a QR decomposition in there for me, since solving A
> = X Y' for X is X = A Y (Y' * Y)^-1, and you need some means to
> compute the inverse of that (small) matrix.
>
> On Tue, Jan 8, 2013 at 5:27 PM, Ted Dunning <ted.dunning@gmail.com> wrote:
> > This particular part of the algorithm can be seen as similar to a least
> > squares problem that might normally be solved by QR. I don't think that
> > the updates are quite the same, however.
> >
> > On Tue, Jan 8, 2013 at 3:10 PM, Sebastian Schelter <ssc@apache.org> wrote:
> >
> >> This factorization is iteratively refined. In each iteration, ALS first
> >> fixes the item-feature vectors and solves a least-squares problem for
> >> each user, and then fixes the user-feature vectors and solves a
> >> least-squares problem for each item.
> >>
>

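For reference, the half-iteration Sean describes can be sketched in NumPy. This is an illustrative, unregularized sketch (not the actual Mahout implementation, and the matrices are hypothetical dense examples): with the item-feature matrix Y held fixed, each user's feature row is the least-squares solution x_u = a_u Y (Y'Y)^-1, and NumPy's `lstsq` solves it via an orthogonal factorization rather than an explicit inverse.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((6, 4))   # toy user-item ratings matrix (dense for simplicity)
Y = rng.random((4, 2))   # fixed item-feature matrix, rank k = 2

# Solve Y X' ~= A' in the least-squares sense; lstsq uses an
# orthogonal factorization internally, avoiding an explicit inverse.
X = np.linalg.lstsq(Y, A.T, rcond=None)[0].T

# Equivalent closed form via the normal equations: X = A Y (Y'Y)^-1.
X_normal = A @ Y @ np.linalg.inv(Y.T @ Y)
assert np.allclose(X, X_normal)
```

The other half-iteration is symmetric: fix X and solve the same kind of system for each item's feature row.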