mahout-user mailing list archives

From: Lance Norskog <goksron@gmail.com>
Subject: Re: Vector truncation for visualization
Date: Fri, 10 Jun 2011 02:27:34 GMT
 For the singular vectors technique:

The matrix is all of the input vectors as rows.

1) Start with a matrix whose rows are all of the vectors in the space.
2) Subtract the global mean from each row.
3) Run SVD and get the ordered set of singular vectors.
3a) Truncate: keep only the first two singular vectors and project the rows onto them.
3b) So a truncated SVD is exactly what is needed here (a rough sketch follows below).
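
As a sketch of those steps (plain numpy here rather than Mahout's API; the data matrix and sizes are placeholders):

import numpy as np

# X: one input vector per row (n points x d dimensions)
X = np.random.rand(100, 50)            # placeholder data

# 2) subtract the global mean from each row
Xc = X - X.mean(axis=0)

# 3) SVD of the centered matrix
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# 3a) keep the first two right singular vectors and project onto them
coords_2d = Xc @ Vt[:2].T              # n x 2 coordinates, ready to plot

The same n x 2 result is U[:, :2] * s[:2], since projecting onto the top singular vectors just gives the scaled left singular vectors.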

For the affinity matrix, does the variable 'r_ij' mean 'random matrix
entry i,j'? And is this used for a direct projection onto 2 dimensions?

For another opinion: Lanczos Vectors versus Singular Vectors for
Effective Dimension Reduction

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.158.779&rep=rep1&type=pdf

On Wed, Jun 8, 2011 at 11:20 PM, Ted Dunning <ted.dunning@gmail.com> wrote:
> On Thu, Jun 9, 2011 at 2:27 AM, Lance Norskog <goksron@gmail.com> wrote:
>> Projecting to the first "two" singular vectors?
>
> Yes.
>
>> Do an SVD on a random matrix, and use the first 2 (or three) singular
>> vectors as a matrix?
>
> Not a random matrix.  A matrix of the positions, shifted so that their
> mean is zero (i.e. mean-centered, as in PCA).
>
>>
>> What goes into the affinity matrix?
>
> exp(-r_ij^2 / sigma) is a common choice.  sigma is chosen so that the
> affinity matrix comes out fairly sparse.  r_ij is the distance from i to j.
>
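
Taking that definition at face value (r_ij = distance from i to j), a throwaway numpy sketch of building such an affinity matrix; the sigma choice and the sparsifying threshold are made up for illustration:

import numpy as np
from scipy.spatial.distance import pdist, squareform

X = np.random.rand(100, 50)            # placeholder points, one per row

r = squareform(pdist(X))               # r[i, j] = distance from i to j
sigma = np.median(r) ** 2              # made-up scale; tune for the data
A = np.exp(-r ** 2 / sigma)            # Gaussian affinities
A[A < 1e-3] = 0.0                      # drop tiny entries so the matrix stays fairly sparse
np.fill_diagonal(A, 0.0)               # no self-affinity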



-- 
Lance Norskog
goksron@gmail.com
