Pseudo code for ALS in Octave:

% A is the m-by-n user-item rating matrix, k the number of latent features
Users = rand(m, k);    % random initial guess for the user-feature matrix
for i = 1:iterations
  Items = Users \ A;   % fix Users, least-squares solve for Items (k-by-n)
  Users = A / Items;   % fix Items, least-squares solve for Users (m-by-k)
endfor
Start with a random guess of the Users matrix (m x k), then compute Items
as the least-squares solution given Users, then recompute Users as the
least-squares solution given Items, and iterate until convergence.
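For concreteness, the same loop can be sketched in Python with NumPy, where
np.linalg.lstsq plays the role of Octave's backslash. The dimensions m, n, k,
the iteration count, and the dense random A are illustrative assumptions only
(a real rating matrix would be sparse and the solves regularized):

```python
import numpy as np

# Toy dimensions: m users, n items, k latent features (illustrative values)
m, n, k, iterations = 20, 15, 3, 30

rng = np.random.default_rng(0)
A = rng.random((m, n))       # dense stand-in for the user-item rating matrix

Users = rng.random((m, k))   # random initial guess for the user-feature matrix
for _ in range(iterations):
    # Fix Users, solve Users @ Items ~= A for Items (k x n) in least squares
    Items = np.linalg.lstsq(Users, A, rcond=None)[0]
    # Fix Items, solve Users @ Items ~= A for Users (m x k); transpose to
    # reuse lstsq, which solves for the left factor's unknowns column-wise
    Users = np.linalg.lstsq(Items.T, A.T, rcond=None)[0].T

error = np.linalg.norm(A - Users @ Items)
print(error)
```

Each half-step only improves the least-squares fit, so the reconstruction
error is non-increasing across iterations.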
On Tue, Jan 8, 2013 at 3:10 PM, Sebastian Schelter <ssc@apache.org> wrote:
> Koobas,
>
> ALS doesn't apply QR decomposition to the user-item matrix. It
> factorizes this matrix into two smaller matrices which contain the user
> and item features.
>
> Each user-feature vector is modeled as a linear combination of the
> feature vectors of the items which the user rated (and vice versa for
> the item-feature vectors).
>
> This factorization is iteratively refined. In each iteration, ALS first
> fixes the item-feature vectors and solves a least-squares problem for
> each user and then fixes the user-feature vectors and solves a
> least-squares problem for each item.
>
> I suggest that you read the papers I referred to in the last mail, which
> explain the algorithm in detail.
>
> Best,
> Sebastian
>
>
> On 08.01.2013 23:55, Koobas wrote:
> > I am trying to wrap my head around it.
> > From the Mahout documentation it looks like it's a standard (dense) QR
> with
> > Householder reflectors.
> > But the user-item matrix is usually extremely sparse.
> > So, is the sparsity somehow taken into account,
> > or are the sparse right-hand-side vectors packed in dense storage and hit
> > with Householder?
> > The underlying question being the computational complexity, i.e. number
> of
> > floating point operations involved.
> >
> >
> > On Tue, Jan 8, 2013 at 4:03 PM, Sebastian Schelter <ssc@apache.org>
> wrote:
> >
> >> Hi Koobas,
> >>
> >> We have two classes that implement the solutions described in the
> >> ALS-related papers:
> >>
> >> For explicit feedback data [1] we have AlternatingLeastSquaresSolver,
> >> for implicit feedback data [2] we have
> >> ImplicitFeedbackAlternatingLeastSquaresSolver. Both can be found in the
> >> org.apache.mahout.math.als package.
> >>
> >> Internally they use Mahout's QRDecomposition to solve the linear systems
> >> associated with ALS.
> >>
> >> Best,
> >> Sebastian
> >>
> >> [1]
> >>
> >>
> http://www.hpl.hp.com/personal/Robert_Schreiber/papers/2008%20AAIM%20Netflix/netflix_aaim08(submitted).pdf
> >> [2] http://research.yahoo.com/pub/2433
> >>
> >>
> >> On 08.01.2013 21:53, Koobas wrote:
> >>> Perhaps somebody can shed some light on the topic.
> >>> What algorithm is used to solve the least squares problem
> >>> when computing a low-rank approximation using Alternating Least Squares?
> >>>
> >>
> >>
> >
>
>

Mohit
