[ https://issues.apache.org/jira/browse/MAHOUT-1106?page=com.atlassian.jira.plugin.system.issuetabpanels:commenttabpanel&focusedCommentId=13498742#comment13498742 ]
Agnonchik edited comment on MAHOUT-1106 at 11/16/12 11:37 AM:

May I ask an abstract question here regarding the SVD++ algorithm? It has nothing to do
with the code, so please excuse me if I'm posting it in the wrong place.
I wonder whether the optimization problem solved by the SVD++ algorithm has a unique solution.
It seems that in some cases, for example when the regularization parameter lambda is equal to
zero, the problem admits multiple solutions.
We can write the SVD++ model as
ratingPrediction(user, item) = mu + bu(user) + bi(item) + (p(user) + |N(user)|^(-0.5) * sum_{implItem in N(user)} y(implItem)) * q(item)^T
and the learning algorithm tries to minimize the following cost function
sum_{(user, item) in R} (ratingPrediction - observedRating)^2 + lambda * (||bu||_2^2 + ||bi||_2^2 + ||P||_F^2 + ||Q||_F^2 + ||Y||_F^2)
where P = [p(1); ... ; p(m)], Q = [q(1); ... ; q(n)], Y = [y(1); ... ; y(n)].
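For concreteness, the prediction rule can be sketched in a few lines of Python; everything here (the toy values, the dict layout) is hypothetical and made up for illustration, and I use the standard SVD++ normalization |N(user)|^(-0.5):

```python
mu = 3.5                                   # global rating mean
bu = {0: 0.1}                              # user biases
bi = {0: -0.2}                             # item biases
p = {0: [0.3, -0.1]}                       # user factor vectors
q = {0: [0.5, 0.2]}                        # item factor vectors
y = {0: [0.05, 0.02], 1: [-0.03, 0.04]}    # implicit-feedback item factors
N = {0: [0, 1]}                            # implicit-feedback items per user

def predict(user, item):
    k = len(q[item])
    # |N(user)|^(-0.5) * sum of the y vectors of the user's implicit items
    norm = len(N[user]) ** -0.5
    imp = [norm * sum(y[j][f] for j in N[user]) for f in range(k)]
    # mu + bu + bi + (p + implicit part) . q
    return mu + bu[user] + bi[item] + sum(
        (p[user][f] + imp[f]) * q[item][f] for f in range(k))

print(predict(0, 0))
```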
Let's introduce the matrix Z such that
[Z * Y](user) = |N(user)|^(-0.5) * sum_{implItem in N(user)} y(implItem)
Then for any solution (P, Y) of the optimization problem with lambda = 0 and an arbitrary
matrix Y2, the pair P2 = P + Z * (Y - Y2), Y2 is also a solution, since every prediction
(and hence the squared-error term) is unchanged.
Am I right?
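To sanity-check the claim, here is a small numeric experiment in pure Python with toy random data (biases are dropped since they play no role in the argument): it builds Z row-by-row, picks an arbitrary alternative Y2, sets P2 = P + Z * (Y - Y2), and confirms that every prediction, and hence the lambda = 0 cost, is unchanged.

```python
import random

random.seed(0)
n_users, n_items, k = 3, 4, 2
mu = 3.0
N = {0: [0, 1], 1: [2], 2: [1, 3]}         # implicit-feedback items per user

def rnd_mat(r, c):
    return [[random.uniform(-1, 1) for _ in range(c)] for _ in range(r)]

P, Q, Y = rnd_mat(n_users, k), rnd_mat(n_items, k), rnd_mat(n_items, k)

# Z[u][j] = |N(u)|^(-0.5) if item j is in N(u), else 0, so that
# row u of Z * Y is the implicit part of user u's profile.
Z = [[(len(N[u]) ** -0.5 if j in N[u] else 0.0) for j in range(n_items)]
     for u in range(n_users)]

def matmul(A, B):
    return [[sum(A[i][t] * B[t][j] for t in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def predict(P, Y, u, i):
    zy = [sum(Z[u][j] * Y[j][f] for j in range(n_items)) for f in range(k)]
    return mu + sum((P[u][f] + zy[f]) * Q[i][f] for f in range(k))

Y2 = rnd_mat(n_items, k)                   # an arbitrary alternative Y
D = matmul(Z, [[Y[j][f] - Y2[j][f] for f in range(k)] for j in range(n_items)])
P2 = [[P[u][f] + D[u][f] for f in range(k)] for u in range(n_users)]

# every prediction is unchanged, so with lambda = 0 the cost is too
for u in range(n_users):
    for i in range(n_items):
        assert abs(predict(P, Y, u, i) - predict(P2, Y2, u, i)) < 1e-9
```

With lambda > 0 the regularizer breaks this tie, which is why the non-uniqueness only shows up in the unregularized case.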
If yes, then my point is that applying SVD++ doesn't make much sense compared to biased
SVD, which simply ignores the implicit feedback (the Y parameter).
Thanks!
> SVD++
> -----
>
> Key: MAHOUT-1106
> URL: https://issues.apache.org/jira/browse/MAHOUT-1106
> Project: Mahout
> Issue Type: New Feature
> Components: Collaborative Filtering
> Reporter: Zeno Gantner
> Assignee: Sebastian Schelter
> Attachments: SVDPlusPlusFactorizer.java
>
>
> Initial shot at SVD++.
> Relies on the RatingsSGDFactorizer class introduced in MAHOUT-1089.
> One could also think about several enhancements, e.g. having separate regularization constants for user and item factors.
> I am also the author of the SVDPlusPlus class in MyMediaLite, so if there are any similarities, no need to worry: I am okay with relicensing this to the Apache 2.0 license.
> https://github.com/zenogantner/MyMediaLite/blob/master/src/MyMediaLite/RatingPrediction/SVDPlusPlus.cs

This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators
For more information on JIRA, see: http://www.atlassian.com/software/jira
