### Dimensionality reduction: comparison of different methods


I’ve already given some answers in one of my first posts on manifold learning. Here I’ll give more complete results on the quality of the dimensionality reduction performed by the best-known techniques.

First of all, my test is about preserving geodesic distances in the reduced space. This is not possible for some manifolds, such as a 2D Gaussian surface. I used the S-curve to build the test, because the curve is traversed at unit speed, so distances in the coordinate space (the one I used to create the S-curve) equal the geodesic distances on the manifold. My test measures the matrix (Frobenius) norm of the difference between the original coordinates and the computed ones, up to an affine transform of the latter.
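This error measure can be sketched in a few lines of numpy. This is a minimal illustration, not the post’s actual code: the S-curve is built by hand, and the hypothetical helper `affine_residual` fits the best affine map from an embedding to the original coordinates and returns the Frobenius norm of the residual.

```python
import numpy as np

rng = np.random.default_rng(0)
# Build the S-curve by hand: the parameter t is unit-speed, so distances
# in (t, height) space match geodesic distances on the 3-D manifold.
t = 3 * np.pi * (rng.random(500) - 0.5)
height = 2 * rng.random(500)
coords = np.column_stack([t, height])  # ground-truth 2-D coordinates

def affine_residual(coords, Y):
    """Frobenius norm between coords and the best affine image of Y."""
    Y_aug = np.hstack([Y, np.ones((len(Y), 1))])  # allow a translation term
    A, *_ = np.linalg.lstsq(Y_aug, coords, rcond=None)
    return np.linalg.norm(coords - Y_aug @ A)

# Sanity check: an exact affine transform of the coordinates scores ~0.
Y = coords @ np.array([[2.0, 0.5], [-0.3, 1.0]]) + np.array([1.0, -2.0])
print(affine_residual(coords, Y))  # ~0
```

Any `(n_samples, 2)` embedding produced by a reduction method can be substituted for `Y`.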

I tested several noise levels:

• no noise
• 5% of Gaussian noise
• 2% of Laplacian noise (only 2% because there are many outliers in the Laplacian law)
• impulsive noise on 12.5% of the elements of the distance matrix, with 200% Laplacian noise (quantified with respect to the variance of the noise-free d1 − d2, where d2 was estimated with my robust cost function)
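A rough sketch of how such corruptions of a distance matrix might look; the function and parameter names here are my own illustration, not the post’s exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def add_noise(D, kind="gaussian", level=0.05, impulse_frac=0.125):
    """Corrupt a symmetric distance matrix D (illustrative sketch only)."""
    scale = level * D.std()
    if kind == "gaussian":
        noise = rng.normal(0.0, scale, D.shape)
    elif kind == "laplacian":
        noise = rng.laplace(0.0, scale, D.shape)
    elif kind == "impulsive":
        # Large Laplacian spikes on a random subset of the entries.
        noise = rng.laplace(0.0, 2.0 * D.std(), D.shape)
        noise *= rng.random(D.shape) < impulse_frac
    else:
        raise ValueError(kind)
    noise = (noise + noise.T) / 2           # keep the matrix symmetric
    Dn = np.clip(D + noise, 0.0, None)      # distances stay non-negative
    np.fill_diagonal(Dn, 0.0)
    return Dn

# Toy symmetric distance matrix for demonstration.
D = np.abs(rng.normal(5.0, 1.0, (50, 50)))
D = (D + D.T) / 2
np.fill_diagonal(D, 0.0)
D_noisy = add_noise(D, "impulsive")
```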

Here are the results:

| Method | No noise | Gaussian noise 5% | Laplacian noise 2% | Impulsive noise |
|---|---|---|---|---|
| PCA | 43.6 | 43.6 | 44.4 | n/a |
| Isomap | 3.01 | 8.55 | 7.01 | 3.80 |
| Sp | 2.29 | 2.94 | 6.46 | 2.93 |
| Ssam | 2.61 | 2.60 | 6.10 | 3.22 |
| Scca | 3.01 | 6.22 | 4.70 | 3.09 |
| Laplacian Eigenmaps | 21.13 | 23.51 | 23.47 | n/a |
| Diffusion Maps | 67.50 | 67.76 | 67.54 | n/a |
| Hessian Eigenmaps | 3.05 | 18.57 | 20.51 | n/a |
| LLE | 40.1 | 90.2 | 69.2 | n/a |
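To see why exact geodesic distances make recovery easy, here is a numpy-only sketch (my own illustration, independent of the code behind the table): feeding the true geodesic distances of a hand-built S-curve into classical MDS — the embedding step that Isomap uses — recovers the coordinates up to a rigid transform.

```python
import numpy as np

rng = np.random.default_rng(0)
t = 3 * np.pi * (rng.random(300) - 0.5)      # unit-speed curve parameter
height = 2 * rng.random(300)
coords = np.column_stack([t, height])         # true 2-D coordinates

# Exact geodesic distances on the S-curve = Euclidean distances in
# (t, height) space, because the curve is traversed at unit speed.
diff = coords[:, None, :] - coords[None, :, :]
D = np.sqrt((diff ** 2).sum(-1))

# Classical MDS: double-center the squared distances, then take the
# top eigenvectors of the resulting Gram matrix.
n = len(D)
J = np.eye(n) - np.ones((n, n)) / n           # centering matrix
B = -0.5 * J @ (D ** 2) @ J                   # Gram matrix of centered data
w, V = np.linalg.eigh(B)
Y = V[:, -2:] * np.sqrt(w[-2:])               # top-2 embedding

# Y matches coords up to rotation/reflection; align with Procrustes.
U, _, Vt = np.linalg.svd((coords - coords.mean(0)).T @ (Y - Y.mean(0)))
Y_aligned = (Y - Y.mean(0)) @ (U @ Vt).T + coords.mean(0)
err = np.linalg.norm(coords - Y_aligned) / np.linalg.norm(coords)
print(err)  # near zero for noise-free geodesic distances
```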

The geodesic-based algorithms obviously and logically perform better than every other algorithm. In my case this is exactly what I want, since I want to estimate a mapping function between the reduced space and the original space. This estimation, and the effect of the reduction algorithm on it, will be the subject of future posts.

## 4 thoughts on “Dimensionality reduction: comparison of different methods”

1. lousjiang says:

Thank you for your article — I find it of great value in my current research area, but I am new to this field. I really appreciate the article (Dimensionality reduction: comparison of different methods).
Here is my email: lousongjiang@hotmail.com. Thank you.

2. Hi!

I will post another blog entry in a few days (after my short holidays), because 99% of the code is already available as part of a scikit. So you can use my code, give me some feedback, … before I make the official announcement.
There is a small tutorial here: http://scipy.org/scipy/scikits/wiki/MachineLearning/ManifoldLearning

Thanks for the comment !

3. serra says:

Hi Matt,
The results of your test are very important for my research as well, but I found them a bit difficult to understand. For example, if you had done a classification in this reduced space and given the overall classification results, I would have understood better :) Sorry, but which methods use geodesic distances (I only know ISODATA), and what are the results in the table? If you give further information, I would be happy to read it :) Thanks in advance.
Regards,

1. Hi serra,

I don’t understand your first remark. I didn’t do a classification in this example; I only compared the distances in the original space to the distances in the reduced space. This is what the table displays (the percentage difference between the original distances and the distances in the reduced space). Classification can be done in this space, but that’s another subject 😉
The algorithms that use an approximation of the geodesic distance are:

• Isomap
• Sp
• Ssam
• Scca
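These methods share the same first step: approximate geodesic distances with shortest paths through a k-nearest-neighbour graph. A minimal sketch of that step, assuming numpy and scipy (the neighbourhood size `k = 8` and point count are illustrative choices of mine):

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import shortest_path

rng = np.random.default_rng(0)
t = 3 * np.pi * (rng.random(200) - 0.5)
X = np.column_stack([np.sin(t), 2 * rng.random(200),
                     np.sign(t) * (np.cos(t) - 1)])  # S-curve in 3-D

# Pairwise Euclidean distances in the ambient space.
diff = X[:, None, :] - X[None, :, :]
D = np.sqrt((diff ** 2).sum(-1))

# k-nearest-neighbour graph: keep only the k smallest distances per row.
k = 8
W = np.zeros_like(D)
for i in range(len(X)):
    nn = np.argsort(D[i])[1:k + 1]       # skip self at index 0
    W[i, nn] = D[i, nn]
W = np.maximum(W, W.T)                    # symmetrize the graph

# Approximate geodesics = shortest paths through the neighbour graph.
G = shortest_path(csr_matrix(W), method="D", directed=False)
```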
