mahout-user mailing list archives

From Ted Dunning <ted.dunn...@gmail.com>
Subject Re: Logistic Regression cost function
Date Mon, 13 Jan 2014 20:59:58 GMT
I think that this is the link in the code:

http://leon.bottou.org/research/stochastic
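[For readers following the thread: the update Frank asks about below is the standard stochastic gradient descent step for the logistic regression cross-entropy cost. This is a minimal, illustrative Python sketch of that step, not Mahout's actual Java implementation, which adds regularization, learning-rate annealing, and lazy updates on top of it.]

```python
import math

def sigmoid(z):
    # Logistic link: maps a raw score to a probability in (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def sgd_step(beta, x, y, learning_rate):
    """One online gradient descent update for logistic regression.

    For a single example (x, y) with hypothesis h = sigmoid(beta . x),
    the gradient of the cross-entropy cost is (h - y) * x, so the
    descent step adds learning_rate * (y - h) * x to the coefficients.
    """
    h = sigmoid(sum(b * xi for b, xi in zip(beta, x)))
    return [b + learning_rate * (y - h) * xi for b, xi in zip(beta, x)]

# One update nudges beta toward classifying (x, y=1) correctly:
beta = sgd_step([0.0, 0.0], x=[1.0, 2.0], y=1, learning_rate=0.1)
```

The per-example form is what makes it "online": there is no explicit sum over the training set as in the batch cost function, which is one reason the code can look different from the lecture-note formula while computing the same gradient in expectation.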




On Mon, Jan 13, 2014 at 11:58 AM, Frank Scholten <frank@frankscholten.nl> wrote:

> Do you know which paper it is? He has quite a few publications. I don't see
> any mention of one of his papers in the code. I only see
> www.eecs.tufts.edu/~dsculley/papers/combined-ranking-and-regression.pdf in
> MixedGradient but this is something different.
>
>
>
> On Mon, Jan 13, 2014 at 1:27 PM, Suneel Marthi <suneel_marthi@yahoo.com> wrote:
>
> > Mahout's impl is based off of Leon Bottou's paper on this subject.  I
> > don't have the link handy, but it's referenced in the code, or try a
> > Google search.
> >
> > Sent from my iPhone
> >
> > > On Jan 13, 2014, at 7:14 AM, Frank Scholten <frank@frankscholten.nl>
> > wrote:
> > >
> > > Hi,
> > >
> > > I followed the Coursera Machine Learning course quite a while ago and I
> > > am trying to find out how Mahout implements the Logistic Regression cost
> > > function in the code surrounding AbstractOnlineLogisticRegression.
> > >
> > > I am looking at the train method in AbstractOnlineLogisticRegression and
> > > I see the online gradient descent step where the beta matrix is updated,
> > > but to me it's unclear how it matches the cost function described at:
> > > http://www.holehouse.org/mlclass/06_Logistic_Regression.html
> > >
> > > Perhaps Mahout uses an optimized approach that does not directly map
> > > onto the formula at that link?
> > >
> > > Cheers,
> > >
> > > Frank
> >
>
