
Questions tagged [dimensionality-reduction]

Techniques for reducing a large number of variables or dimensions spanned by data to a smaller number of dimensions while preserving as much information about the data as possible. Prominent methods include PCA, Factor Analysis, MDS, Independent Component Analysis, Multiple Correspondence Analysis, Isomap, etc. The two main subclasses of techniques are feature extraction and feature selection.
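As a concrete illustration of the feature extraction side, here is a minimal PCA sketch assuming scikit-learn; the toy data and variable names are placeholders, not taken from any question below:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))        # 200 samples, 50 original features

pca = PCA(n_components=5)             # keep the 5 directions of largest variance
X_reduced = pca.fit_transform(X)      # shape (200, 5)

print(X_reduced.shape)
print(pca.explained_variance_ratio_)  # fraction of variance kept per component
```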

0 votes · 0 answers · 10 views

Averaging classification metrics vs. direct inference

Suppose I have a tensor X_test with shape (10, 2, 512), where 10 is the number of IDs and 2 is the number of channels per ID, let's say ...
Muhammad Ikhwan Perwira
1 vote · 0 answers · 17 views

Dimensions in ICA

I'm going through Andrew Ng's notes on ICA and the blind source separation example mostly makes sense. In essence, we have $d$ microphone recordings $x \in R^d$ and also $d$ independent speakers in $s ...
James Liu
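For the ICA question above, a minimal sketch of the blind source separation setup it describes, assuming scikit-learn's FastICA; the synthetic sources and mixing matrix here are made up purely for illustration:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n, d = 2000, 3                          # n samples, d speakers = d microphones
t = np.linspace(0, 8, n)

# d independent source signals s (the "speakers")
S = np.column_stack([np.sin(2 * t),
                     np.sign(np.sin(3 * t)),
                     rng.laplace(size=n)])
A = rng.normal(size=(d, d))             # unknown mixing matrix
X = S @ A.T                             # d microphone recordings, x = A s

ica = FastICA(n_components=d, random_state=0)
S_hat = ica.fit_transform(X)            # recovered sources, up to scale and permutation
```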
-2 votes · 0 answers · 29 views

What is the basic difference between "feature selection" and "feature extraction" and "dimensionality reduction"? [duplicate]

What is the basic difference between "feature selection", "feature extraction", and "dimensionality reduction"? I have one thousand features and one million samples in ...
user366312 · 2,201
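The question above is marked as a duplicate, but the distinction it asks about is easy to show in code. A minimal sketch assuming scikit-learn: feature selection keeps a subset of the original columns, while feature extraction (here PCA) builds new combined columns.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif

X, y = make_classification(n_samples=1000, n_features=50, random_state=0)

# Feature selection: keep 10 of the original 50 columns
X_sel = SelectKBest(f_classif, k=10).fit_transform(X, y)

# Feature extraction: replace the 50 columns with 10 new linear combinations
X_ext = PCA(n_components=10).fit_transform(X)

print(X_sel.shape, X_ext.shape)   # both (1000, 10), but with different meanings
```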
1 vote · 1 answer · 26 views

Performing a PCA on data of different hierarchical levels

I (a novice) plan to run a PCA on several different, related (i.e., non-orthogonal) questionnaire measures. These measures have composite scores (item sums, etc.), and some of them have sub-facets. Also, ...
Livster · 11
4 votes · 2 answers · 297 views

When is multidimensional scaling exact for a graph?

For an undirected graph with one connected component and distance matrix given by the shortest path between nodes, I would like to embed the nodes in a high dimensional Euclidean space where all ...
user3433489
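A minimal sketch of the setup in the question above (not an answer to when the embedding is exact), assuming networkx for the shortest-path distance matrix and scikit-learn's metric MDS on a toy graph:

```python
import networkx as nx
import numpy as np
from sklearn.manifold import MDS
from sklearn.metrics import pairwise_distances

G = nx.cycle_graph(8)                          # toy connected, undirected graph
D = np.asarray(nx.floyd_warshall_numpy(G))     # shortest-path distance matrix

mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)                  # node coordinates in Euclidean space

# Largest discrepancy between embedded distances and graph distances
print(np.abs(pairwise_distances(coords) - D).max())
```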
0 votes · 0 answers · 16 views

How can I assign households to coordinates in a social space consistent with pairwise distance measures?

How can I assign households to coordinates in a social space consistent with pairwise distance measures? I have a question which is somewhat ill-defined, about creating an interesting and useful ...
andrewH · 3,157
0 votes · 0 answers · 18 views

Curse of dimensionality in Time series with K-means

I have been looking at the following notebook: time series clustering, where the author says that the dataset is affected by the "curse of dimensionality", so applying TimeSeriesKMeans ...
Zackbord
0 votes · 0 answers · 15 views

Meaning of within-class Covariance in Linear Discriminant Analysis Dimensionality Reduction

In Section 4.3.3 of The Elements of Statistical Learning by Hastie, Tibshirani, and Friedman, the authors list a procedure to reduce the dimensions of an input matrix $\mathbf{X}$, first using Linear ...
Jack Guan · 103
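For the LDA question above, a minimal sketch of LDA used purely as a dimensionality reduction step, assuming scikit-learn and the iris data; it only shows the projection onto discriminant coordinates, not the derivation discussed in ESL:

```python
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)              # 4 features, 3 classes

# At most (n_classes - 1) = 2 discriminant directions are available
lda = LinearDiscriminantAnalysis(n_components=2)
X_lda = lda.fit_transform(X, y)                # shape (150, 2)

print(X_lda.shape)
print(lda.explained_variance_ratio_)           # variance explained by each direction
```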
0 votes · 1 answer · 62 views

Matrix decomposition with constraints and weighted least squares

We have a matrix $\mathbf{X}$ of probability distributions over 6 different results, so each row $\mathbf{x}_i$ sums to 1. We want to perform dimension reduction so that each row is a linear ...
jgf1123
0 votes · 0 answers · 21 views

Is the behavior of log-likelihood and number of parameters correct in probabilistic PCA?

I am studying the behavior of Probabilistic PCA as described by Tipping and Bishop (1999). I am using the R package "Rdimtools" to help. I am puzzled about the number of parameters in the ...
Daniel Caetano
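This is not the Rdimtools workflow from the question above, but the log-likelihood behavior it asks about can be sketched with scikit-learn, whose PCA.score() reports the average per-sample log-likelihood under the Tipping and Bishop (1999) probabilistic PCA model; the toy data below are made up:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n, d, q_true = 500, 10, 3
W = rng.normal(size=(d, q_true))
X = rng.normal(size=(n, q_true)) @ W.T + 0.1 * rng.normal(size=(n, d))

# Average per-sample log-likelihood under probabilistic PCA, for each latent dimension q
for q in range(1, d):
    ll = PCA(n_components=q).fit(X).score(X)
    print(q, round(ll, 2))
```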
0 votes · 1 answer · 30 views

Conditional Independence: Equivalent Conditions

Let $X_1$ and $X_2$ be random variables, and $R(X_1)$ be a function of $X_1$. Here are two statements: (a) $X_1\perp\!\!\!\!\perp (X_2, Y) \mid R(X_1) $ (b) $X_1\perp\!\!\!\!\perp Y \mid \{R(X_1),X_2\}...
Hepdrey · 79
0 votes · 0 answers · 31 views

Why apply PCA instead of just removing highly correlated variables? Especially in prediction tasks [duplicate]

First of all, let's assume we have variables that are correlated or highly correlated. When we apply PCA we want to reduce dimensionality; PCA works better when we have a linear correlation between the ...
Gabriel_86400
1 vote · 0 answers · 35 views

Dimensionality reduction and precomputed distance matrix

I have a question about dimensionality reduction. I want to understand how methods like MDS and t-SNE work. In particular, I'd like to understand the difference when I precompute the distance matrix ...
Clemente Gotelli
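For the question above, a minimal sketch of passing a precomputed distance matrix to MDS and t-SNE, assuming scikit-learn and toy data; with metric="precomputed", t-SNE needs a random initialization:

```python
import numpy as np
from sklearn.manifold import MDS, TSNE
from sklearn.metrics import pairwise_distances

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))                 # toy data
D = pairwise_distances(X)                      # precomputed Euclidean distances

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
X_mds = mds.fit_transform(D)

tsne = TSNE(n_components=2, metric="precomputed", init="random",
            perplexity=30, random_state=0)
X_tsne = tsne.fit_transform(D)

print(X_mds.shape, X_tsne.shape)               # both (100, 2)
```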
0 votes · 1 answer · 45 views

Applying clustering algorithms after t-SNE in R

So I'm doing my bachelor's work and I'm applying different clustering algorithms to certain data. Before all the clustering, of course, I'm using a dimensionality reduction algorithm such as t-SNE for ...
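The question above is about R, but the workflow it describes can be sketched in Python with scikit-learn; whether clustering on t-SNE coordinates is advisable is a separate question, and the data here are toy blobs used only to illustrate the pipeline:

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.manifold import TSNE

X, _ = make_blobs(n_samples=300, n_features=30, centers=4, random_state=0)

# Reduce to 2 dimensions with t-SNE, then cluster the embedded points
X_embedded = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X_embedded)

print(labels[:10])
```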
0 votes · 1 answer · 39 views

Post-processing in PCA and making sense of an example

The example is as follows: A bunch of doctors were asked to score a list of desirable characteristics of sales representatives. The questions were like: "in-depth knowledge about his/her product" ...
figs_and_nuts
