mahout-user mailing list archives

From Ted Dunning <ted.dunn...@gmail.com>
Subject Re: weighted score
Date Mon, 22 Feb 2010 18:47:59 GMT
This all smells a lot like the problems that crop up in training
classifiers.

Lots of systems have trouble with asymmetric goals (like 1 to 5) and are
massively helped by offsetting by the mean (changing the rating scale to -2 to
2 might help, or actually subtracting the observed mean).
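
As a rough illustration (not from the original thread), here is a minimal Java
sketch of what subtracting the observed mean from 1-to-5 ratings looks like;
the class name and the sample ratings are made up for the example:

    // Hypothetical sketch: center 1-to-5 ratings around the observed mean
    // so downstream training works with roughly symmetric targets.
    public class RatingCentering {
      public static void main(String[] args) {
        double[] ratings = {5, 4, 1, 2, 5, 3};   // raw 1-to-5 ratings

        double mean = 0;
        for (double r : ratings) {
          mean += r;
        }
        mean /= ratings.length;

        double[] centered = new double[ratings.length];
        for (int i = 0; i < ratings.length; i++) {
          centered[i] = ratings[i] - mean;       // now roughly centered on 0
        }

        System.out.println("mean = " + mean);
        System.out.println(java.util.Arrays.toString(centered));
      }
    }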

This comes up all the time in neural net training algorithms.

Only in properly done regression systems like logistic regression, where you
actually take the loss function fully into account, does this not bite you.
Even so, the form of the loss function may be much simpler with one
formulation than the other, and the interpretation of the weights can also be
simpler.
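
To make the point about formulations concrete (again, not from the original
thread): the standard logistic loss can be written for {0,1} labels or for
{-1,+1} labels, and the two forms are algebraically equivalent even though one
may be much simpler to work with depending on how the targets are encoded.
The class and method names below are invented for the sketch:

    // Hypothetical sketch: the same logistic loss in its {0,1}-label form
    // and its {-1,+1}-label form; both give identical values.
    public class LogisticLossForms {

      // label y in {0, 1}, margin = w . x
      static double lossZeroOne(double y, double margin) {
        double p = 1.0 / (1.0 + Math.exp(-margin));
        return -(y * Math.log(p) + (1 - y) * Math.log(1 - p));
      }

      // label y in {-1, +1}, margin = w . x
      static double lossPlusMinus(double y, double margin) {
        return Math.log(1.0 + Math.exp(-y * margin));
      }

      public static void main(String[] args) {
        double margin = 0.7;
        System.out.println(lossZeroOne(1.0, margin));   // same value...
        System.out.println(lossPlusMinus(1.0, margin)); // ...as this
      }
    }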

I am not familiar with the details under discussion here, but just looking
at the words being used makes it sound like pretty much the same problem.

On Mon, Feb 22, 2010 at 9:05 AM, Sean Owen <srowen@gmail.com> wrote:

> >> What if the weights are 1,1,-1,-1? The estimate is -2 then. This is
> >> why I say this won't work
>



-- 
Ted Dunning, CTO
DeepDyve
