
In classical multidimensional scaling, a Cartesian coordinate matrix can be obtained as $\mathbf{X} = \mathbf{V} \mathbf{\Lambda}^{1/2}$, where $\mathbf{\Lambda}$ is the diagonal matrix of eigenvalues and $\mathbf{V}$ the orthogonal matrix of eigenvectors of a known real symmetric matrix $\mathbf{B}$. The $i$th column of $\mathbf{X}$ is then $\mathbf{X}_i = \sqrt{\lambda_i}\mathbf{v}_i$, where $(\lambda_i,\mathbf{v}_i)$ is the $i$th eigenpair.

I would like to form the derivative of $\mathbf{X}_i$ with respect to a variable $t$, denoted $\dot{\mathbf{X}}_i$, in analytical form; $\dot{\mathbf{B}}$ is available analytically. Is there a specific trick to obtain $\dot{\mathbf{X}}_i$, especially in the presence of multiple (repeated) eigenvalues? I know that there are formulas for the eigenvalue/eigenvector derivatives of real symmetric matrices (https://onlinelibrary.wiley.com/doi/full/10.1002/nme.6442), but I am curious whether this special case can be simplified further.
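For context, in the non-degenerate case (all $\lambda_j$ distinct and $\lambda_i > 0$) the chain rule gives $\dot{\mathbf{X}}_i = \frac{\dot\lambda_i}{2\sqrt{\lambda_i}}\mathbf{v}_i + \sqrt{\lambda_i}\,\dot{\mathbf{v}}_i$, with the standard first-order perturbation formulas $\dot\lambda_i = \mathbf{v}_i^\top \dot{\mathbf{B}} \mathbf{v}_i$ and $\dot{\mathbf{v}}_i = \sum_{j\neq i} \frac{\mathbf{v}_j^\top \dot{\mathbf{B}} \mathbf{v}_i}{\lambda_i - \lambda_j}\mathbf{v}_j$. A minimal NumPy sketch of that simple-eigenvalue case (function name is illustrative; the repeated-eigenvalue case needs the machinery of the linked paper):

```python
import numpy as np

def X_dot_column(B, B_dot, i):
    """Derivative of X_i = sqrt(lambda_i) v_i, assuming lambda_i is
    simple (non-degenerate) and positive.  Uses the standard
    first-order perturbation formulas for a real symmetric B:
        lam_dot = v_i^T B_dot v_i
        v_dot   = sum_{j != i} (v_j^T B_dot v_i) / (lam_i - lam_j) * v_j
        X_dot   = lam_dot / (2 sqrt(lam_i)) * v_i + sqrt(lam_i) * v_dot
    """
    lam, V = np.linalg.eigh(B)          # ascending eigenvalues, orthonormal columns
    vi = V[:, i]
    lam_dot = vi @ B_dot @ vi           # eigenvalue derivative
    v_dot = np.zeros_like(vi)
    for j in range(len(lam)):
        if j != i:                      # project B_dot onto the other eigenvectors
            v_dot += (V[:, j] @ B_dot @ vi) / (lam[i] - lam[j]) * V[:, j]
    return lam_dot / (2.0 * np.sqrt(lam[i])) * vi + np.sqrt(lam[i]) * v_dot
```

Note the eigenvector sign ambiguity of `eigh`: the returned $\dot{\mathbf{X}}_i$ is only defined up to the sign chosen for $\mathbf{v}_i$, so a finite-difference check should align signs first.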
