mahout-user mailing list archives

From Sean Owen <>
Subject Re: weighted score
Date Tue, 23 Feb 2010 11:49:54 GMT
Yes I want to keep thinking about this. I'm not satisfied that the
right answer is clear.

Ted do you have any standard advice about how people do weighted
averages when weights are negative?

Capping the estimated preference could be reasonable in practice. It
feels funny, but it's also rare that the weighted average comes out
negative, and it merely affects estimates on items that are not going
to be recommended.

I'd have to add to recommenders an ability to specify the minimum and
maximum possible preference. Not hard.

Any thoughts on this?
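To make the capping idea concrete, here is a small sketch (my own illustration, not Mahout's actual code): a generic weighted average over preferences whose weights (similarities) may be negative, with the result clamped into a configured [min, max] preference range. The method and class names are hypothetical.

```java
// Hypothetical sketch of capping an estimated preference.
// Not Mahout's actual implementation; names are illustrative only.
public class ClampedEstimate {

  // Weighted average of preferences; weights (e.g. similarities)
  // may be negative. Using |w| in the denominator keeps it positive,
  // but the numerator can still push the result out of range.
  static double weightedAverage(double[] prefs, double[] weights) {
    double num = 0.0;
    double den = 0.0;
    for (int i = 0; i < prefs.length; i++) {
      num += weights[i] * prefs[i];
      den += Math.abs(weights[i]);
    }
    return num / den;
  }

  // Force the estimate back into the legal preference range.
  static double clamp(double estimate, double min, double max) {
    return Math.max(min, Math.min(max, estimate));
  }

  public static void main(String[] args) {
    // Preferences on a 1..5 scale, with weights 1,1,-1,-1 as in the
    // example below: the raw weighted average is -2, off the scale.
    double[] prefs = {1.0, 1.0, 5.0, 5.0};
    double[] weights = {1.0, 1.0, -1.0, -1.0};
    double raw = weightedAverage(prefs, weights);
    double capped = clamp(raw, 1.0, 5.0);
    System.out.println(raw + " -> " + capped); // -2.0 -> 1.0
  }
}
```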

On Mon, Feb 22, 2010 at 6:47 PM, Ted Dunning <> wrote:
> This all smells a lot like the problems that crop up in training
> classifiers.
> Lots of systems have trouble with asymmetric goals (like 1 to 5) and are
> massively helped by offsetting by the mean (changing rating scale to -2 to 2
> might help, or actually subtracting the observed mean).
> This comes up all the time in neural net training algorithms.
> Only in properly done regression systems like logistic regression, where you
> actually take the loss function fully into account, does this not bite you.
> Even so, the form of the loss function may be much simpler with one
> formulation or the other and interpretation of weights can also be simpler.
> I am not familiar with the details under discussion here, but just looking
> at the words being used makes it sound like pretty much the same problem.
> On Mon, Feb 22, 2010 at 9:05 AM, Sean Owen <> wrote:
>> What if the weights are 1,1,-1,-1? The estimate is -2 then. This is
>> why I say this won't work
> --
> Ted Dunning, CTO
> DeepDyve
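A sketch of the mean-offset idea Ted describes (my own illustration, not Mahout code, with hypothetical names): subtract the user's observed mean rating before the weighted combination, then add it back, so the combination happens on a roughly symmetric -2..2 scale rather than the asymmetric 1..5 one.

```java
// Hypothetical sketch of mean-centering before a weighted combination.
// Not Mahout's actual implementation; names are illustrative only.
public class MeanCenteredEstimate {

  static double mean(double[] xs) {
    double sum = 0.0;
    for (double x : xs) {
      sum += x;
    }
    return sum / xs.length;
  }

  // Weighted average of mean-centered preferences, shifted back afterwards.
  static double estimate(double[] prefs, double[] weights) {
    double m = mean(prefs);
    double num = 0.0;
    double den = 0.0;
    for (int i = 0; i < prefs.length; i++) {
      num += weights[i] * (prefs[i] - m); // combine deviations from the mean
      den += Math.abs(weights[i]);        // |w| keeps the denominator positive
    }
    return m + num / den;                  // shift back to the original scale
  }

  public static void main(String[] args) {
    // Same 1,1,-1,-1 weights as in the thread: the raw weighted average
    // of these prefs is -2, but centering on the mean (3) and shifting
    // back yields 1, which is at least on the 1..5 scale.
    double[] prefs = {1.0, 1.0, 5.0, 5.0};
    double[] weights = {1.0, 1.0, -1.0, -1.0};
    System.out.println(estimate(prefs, weights)); // 1.0
  }
}
```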
