mahout-user mailing list archives

From Stanley Xu <>
Subject Re: How could I set a loss function in SGD?
Date Fri, 22 Apr 2011 02:09:46 GMT
Hi Ted,

I think I understand, but I want to confirm once more since I am not a native
English speaker.

By "add weight", do you mean redefining the train method to take a weight
parameter and using that parameter to adjust the learning rate returned by
currentLearningRate()? That is, not the weight parameters that already exist
on the features. Am I correct?

Thanks for your patience to a Machine Learning newbie like me.

Best wishes,
Stanley Xu

On Fri, Apr 22, 2011 at 6:14 AM, Ted Dunning <> wrote:

> On Tue, Apr 19, 2011 at 11:02 PM, Stanley Xu <> wrote:
>> What make me still a little confused is that, when training the model, I
>> probably knew the errors, could we thought that the penalty I wanted was
>> already counted in a loss function?
>  It could be, but usually isn't.
>> And for weighting the items differently, did you mean I should adjust the
>> number of positive and negative examples in the training dataset, like doing
>> a down-sampling?
> Repeating the samples is not good because it appears to be more data than
> it really is.
> Down-sampling positives and negatives differently won't work either because
> you are just adjusting the offset term in the logistic regression.  It is
> reasonable to downsample
> the most common target in order to speed up learning and to avoid
> regularizing away positive features, but it won't really change the results
> in terms of AUC.  It will shift the threshold
> required for any desired level of false positive, but you could have
> shifted the threshold without downsampling to get the same effect.
> Changing the weights should be done by passing a weight into the training
> method and using that as an additional factor on the learning rate.
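For the archive, here is a minimal self-contained sketch of the idea described above: pass a per-example weight into the train method and use it as an extra factor on the learning rate. This is hypothetical illustration code, not Mahout's actual API; the class name, dataset, and weight values are all made up, and the learning rate is held constant for simplicity.

```java
// Hypothetical sketch of weighted SGD for logistic regression.
// Not Mahout's actual classes; illustrates using a per-example weight
// as an additional factor on the learning rate.
public class WeightedSgdSketch {
    static double[] beta = new double[3]; // model coefficients
    static double mu0 = 0.1;              // base learning rate (assumed constant)

    // Stand-in for Mahout's currentLearningRate()
    static double currentLearningRate() {
        return mu0;
    }

    static double predict(double[] x) {
        double z = 0;
        for (int i = 0; i < x.length; i++) {
            z += beta[i] * x[i];
        }
        return 1.0 / (1.0 + Math.exp(-z)); // logistic function
    }

    // train() takes an extra weight that scales the gradient step,
    // rather than repeating or down-sampling examples.
    static void train(int target, double[] x, double weight) {
        double gradient = target - predict(x);
        double rate = weight * currentLearningRate();
        for (int i = 0; i < x.length; i++) {
            beta[i] += rate * gradient * x[i];
        }
    }

    public static void main(String[] args) {
        // Toy data: first column is a bias term; label follows the third feature.
        double[][] xs = {{1, 0, 1}, {1, 1, 0}, {1, 1, 1}, {1, 0, 0}};
        int[] ys = {1, 0, 1, 0};
        for (int pass = 0; pass < 100; pass++) {
            for (int i = 0; i < xs.length; i++) {
                // Positives weighted 3x relative to negatives (arbitrary choice).
                train(ys[i], xs[i], ys[i] == 1 ? 3.0 : 1.0);
            }
        }
        System.out.println(predict(xs[0]) > 0.5);
    }
}
```

The point is that the weight multiplies the step size for that one example, so heavily weighted examples pull the coefficients harder without pretending the dataset contains more data than it does.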
