
All Questions

5 votes
1 answer
155 views

If I can solve $Ax = b$ efficiently, is there a method to solve $(I+A)x=b$ efficiently?

Say I already have the LU factorization of a square matrix $A$; is there an efficient way to get the LU factorization of $I+A$? (We may assume all the matrices mentioned are invertible.) I know from ...
qdmj • 555
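
A hedged aside on the question above: one common workaround (not necessarily what the asker is after) is to reuse the existing LU factors of $A$ as a preconditioner for an iterative solve of $(I+A)x=b$ rather than refactoring $I+A$. A minimal sketch, with a made-up test matrix, using SciPy's `lu_factor`, `lu_solve`, and `gmres`:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve
from scipy.sparse.linalg import LinearOperator, gmres

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n)) + n * np.eye(n)   # arbitrary well-conditioned test matrix
b = rng.standard_normal(n)

lu, piv = lu_factor(A)                            # the LU of A we already "have"

# (I + A) as an operator, with A^{-1} (via the existing LU) as a preconditioner
IplusA = LinearOperator((n, n), matvec=lambda v: v + A @ v)
M = LinearOperator((n, n), matvec=lambda v: lu_solve((lu, piv), v))

x, info = gmres(IplusA, b, M=M)
print(info, np.linalg.norm((np.eye(n) + A) @ x - b))   # 0 and a small residual
```
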
0 votes
0 answers
59 views

QR algorithm fails to converge (bad shift?)

Problem: My code-based implementation of the implicit QR algorithm fails to converge for certain special cases, and it's because those cases have bad shift values. What are those special cases: While ...
Math Machine
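
A hedged illustration of one classic bad-shift case (the excerpt is cut off, so this may not match the asker's exact cases): with the Rayleigh-quotient shift, the $2\times 2$ exchange matrix is a fixed point of the shifted QR step, so the iteration stalls. The sketch uses an explicit shifted step for brevity instead of the implicit, bulge-chasing form.

```python
import numpy as np

def qr_step(A, shift):
    # one explicit shifted QR step: factor A - shift*I = QR, return RQ + shift*I
    n = A.shape[0]
    Q, R = np.linalg.qr(A - shift * np.eye(n))
    return R @ Q + shift * np.eye(n)

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])          # eigenvalues +1 and -1

# The Rayleigh-quotient shift (the trailing diagonal entry) is 0 here,
# and RQ + 0*I reproduces A exactly: the off-diagonal never decays.
B = A.copy()
for _ in range(5):
    B = qr_step(B, B[-1, -1])
print(B)                            # still [[0, 1], [1, 0]] (up to round-off)
```
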
0 votes
0 answers
71 views

Eigenvalue decomposition of $A^{\top}A$ for sparse $A$?

I have a sparse matrix $A \in \mathbb{R}^{n \times l^{2}}$, and I want to calculate the eigenvalue decomposition of $A^{\top}A$. Since $A^{\top}A$ is positive semidefinite, all the eigenvalues are non-...
wsz_fantasy • 1,722
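
A hedged sketch of one standard route (assumed here, since the excerpt is truncated): compute a truncated sparse SVD of $A$ instead of ever forming $A^{\top}A$; the squared singular values and the right singular vectors are the desired eigenpairs. Sizes and sparsity below are purely illustrative.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import svds

A = sp.random(500, 400, density=0.01, random_state=0, format='csr')

k = 10                                # number of leading eigenpairs wanted
u, s, vt = svds(A, k=k)               # sparse SVD; A^T A is never formed
eigvals = s**2                        # eigenvalues of A^T A are sigma^2
eigvecs = vt.T                        # columns are the matching eigenvectors

# sanity check on one computed pair
v = eigvecs[:, -1]
print(np.linalg.norm(A.T @ (A @ v) - eigvals[-1] * v))   # close to zero
```
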
0 votes
0 answers
40 views

How to approximately diagonalize a special symmetric Hermitian matrix?

Given a Hermitian matrix $H$ as follows: \begin{equation} H = \begin{bmatrix} H^1 & V^{12} \\ V^{21} & H^2 \end{bmatrix}. \end{equation} Here, $H^1,H^2\in\mathbb{C}^{N\times N}$ ...
bb wang
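
The excerpt above is cut off before the structure of the coupling blocks is specified, so only a heavily hedged, zeroth-order illustration is possible: if the off-diagonal coupling is weak, diagonalizing $H^1$ and $H^2$ separately already approximates the spectrum of $H$, with error bounded by the norm of the coupling (Weyl's inequality). All sizes and scales below are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 4

def random_hermitian():
    M = rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N))
    return (M + M.conj().T) / 2

H1, H2 = random_hermitian(), random_hermitian()
V12 = 0.05 * (rng.standard_normal((N, N)) + 1j * rng.standard_normal((N, N)))

H = np.block([[H1, V12], [V12.conj().T, H2]])   # Hermitian because V21 = (V12)^H

# zeroth-order approximation: eigenvalues of the decoupled blocks
approx = np.sort(np.concatenate([np.linalg.eigvalsh(H1), np.linalg.eigvalsh(H2)]))
exact = np.sort(np.linalg.eigvalsh(H))
print(np.max(np.abs(exact - approx)))           # bounded by ||V12||_2, small here
```
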
0 votes
0 answers
62 views

Numerical linear algebra problem (QR decomposition)

Problem: Given the QR-decomposition of a rectangular matrix $A \in \mathbb{R}^{m \times n}$, where $m > n > 1$, find the QR-decomposition of a matrix $A_k \in \mathbb{R}^{m \times (n - 1)}$, ...
the_dude • 596
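
A hedged pointer for the question above: deleting a column from an existing QR factorization is a standard Givens-rotation update, and SciPy ships it as `scipy.linalg.qr_delete`. A minimal sketch with an arbitrary matrix and column index:

```python
import numpy as np
from scipy.linalg import qr, qr_delete

rng = np.random.default_rng(0)
m, n = 8, 5
A = rng.standard_normal((m, n))

Q, R = qr(A)                        # full QR: Q is m x m, R is m x n
k = 2                               # index of the column to remove (arbitrary choice)
Q1, R1 = qr_delete(Q, R, k, which='col')

A_k = np.delete(A, k, axis=1)       # A with its k-th column removed
print(np.allclose(Q1 @ R1, A_k))    # True: updated factorization matches
```
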
3 votes
1 answer
139 views

Form of Q in extended QR decomposition calculated with Householder reflections

Let $A = QR$ be the extended QR decomposition of a matrix $A \in \mathbb{R}^{m \times n}$, which is calculated using $n$ Householder reflections. Prove by construction that there exists an upper ...
Hinko Pih Pih
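
Not the requested proof, but a small numerical illustration of the objects involved: with $n$ Householder reflections one obtains the extended (full) factorization, in which $R$ stacks an upper-triangular $n\times n$ block over an $(m-n)\times n$ zero block, and the first $n$ columns of $Q$ reproduce the reduced factorization. Sizes below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 6, 3
A = rng.standard_normal((m, n))

# 'complete' mode keeps the full m x m Q built from the n Householder reflections
Q, R = np.linalg.qr(A, mode='complete')

print(Q.shape, R.shape)                      # (6, 6) (6, 3)
print(np.allclose(R[n:, :], 0))              # True: rows below the triangle vanish
print(np.allclose(Q[:, :n] @ R[:n, :], A))   # True: reduced QR from the first n columns
```
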
1 vote
1 answer
95 views

How to efficiently compute an SVD decomposition with a generalized orthonormal condition?

A regular SVD decomposition of matrix $X\in\mathbb{R}^{n\times m}$ is $$ X = UDV^\top, \qquad U\in\mathbb{R}^{n\times r},\ D\in\mathbb{R}^{r\times r},V \in\mathbb{R}^{m\times r},$$ where $U$ and $V$ ...
Miles N. • 157
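
The excerpt is truncated before the generalized condition is stated, so the following is only a guess at a common variant: if the requirement is $U^{\top} M U = I$ for a given symmetric positive definite $M$ (a hypothetical weight matrix here), a Cholesky whitening reduces the problem to an ordinary SVD.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(0)
n, m = 6, 4
X = rng.standard_normal((n, m))
M = np.eye(n) + 0.1 * np.ones((n, n))      # hypothetical SPD weight matrix

L = cholesky(M, lower=True)                # M = L L^T
Y = L.T @ X                                # "whitened" data
Ut, d, Vt = np.linalg.svd(Y, full_matrices=False)
U = solve_triangular(L.T, Ut)              # U = L^{-T} Ut, so U^T M U = I

print(np.allclose(U @ np.diag(d) @ Vt, X))      # X = U D V^T
print(np.allclose(U.T @ M @ U, np.eye(m)))      # generalized orthonormality of U
```
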
0 votes
1 answer
468 views

Efficient low-rank approximation of the covariance matrix

Suppose we have $n$ samples, each containing $p$ features, arranged into a matrix $X \in \mathbb{R}^{n \times p}$. We focus on the high-dimensional setting where $p \gg n$. By definition, the ...
nalzok • 836
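
A hedged sketch of the usual shortcut in the $p \gg n$ regime: a thin SVD of the centered $n \times p$ data matrix yields the (at most $n$) nonzero eigenpairs of the sample covariance without ever forming the $p \times p$ matrix. The sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 5000                      # far more features than samples
X = rng.standard_normal((n, p))

Xc = X - X.mean(axis=0)              # center the samples
# thin SVD costs O(n^2 p); the p x p covariance Xc^T Xc / (n-1) is never formed
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

eigvals = s**2 / (n - 1)             # nonzero eigenvalues of the covariance
eigvecs = Vt.T                       # matching p-dimensional eigenvectors

# verify the leading eigenpair against the implicitly defined covariance
v = eigvecs[:, 0]
print(np.linalg.norm(Xc.T @ (Xc @ v) / (n - 1) - eigvals[0] * v))   # close to zero
```
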
0 votes
0 answers
53 views

Show that a matrix has a Cholesky factorization provided that it can be written as a product of a matrix and its transpose [duplicate]

$A$ is an invertible real square matrix ($A \in M_{n}(\mathbb{R})$ and $\det(A) \neq 0$). Let's consider another matrix $B \in M_{n}(\mathbb{R})$ such that: $$B = {}^\intercal A \cdot ...
Ramzi Baaguigui
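
A short sketch of the standard argument, assuming (as the title and the cut-off formula suggest) that $B = {}^\intercal A \cdot A$: take a QR factorization $A = QR$ with ${}^\intercal Q\,Q = I$ and $R$ upper triangular. Since $A$ is invertible, $R$ is invertible, and its diagonal can be made positive by absorbing signs into $Q$. Then
$$ B = {}^\intercal A \cdot A = {}^\intercal R\,{}^\intercal Q\,Q\,R = {}^\intercal R\,R, $$
so $L := {}^\intercal R$ is lower triangular with positive diagonal and $B = L\,{}^\intercal L$, i.e. a Cholesky factorization of $B$.
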
2 votes
1 answer
531 views

How to understand QR decomposition? Compare the power method, QR decomposition for finding eigenvalues and Lyapunov exponents.

The numerical methods for finding (the largest) eigenvalues and (the largest) Lyapunov exponents (LEs) look similar. The power method applies the matrix $B$ repeatedly to grow a vector $z$, and ...
Charlie Chang
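
As a hedged, minimal sketch of the power method described above (the matrix is an arbitrary example, not taken from the question): repeatedly multiply a vector by $B$, renormalise to avoid overflow, and read off the dominant eigenvalue from the Rayleigh quotient.

```python
import numpy as np

def power_method(B, iters=200):
    # grow a vector by repeated application of B; the limiting direction is the
    # dominant eigenvector and the Rayleigh quotient estimates its eigenvalue
    z = np.ones(B.shape[0])
    for _ in range(iters):
        z = B @ z
        z /= np.linalg.norm(z)       # renormalise, as one does when estimating LEs
    return z @ B @ z                 # Rayleigh quotient ~ dominant eigenvalue

B = np.array([[4.0, 1.0],
              [2.0, 3.0]])           # eigenvalues 5 and 2
print(power_method(B))               # ~5.0
```
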
0 votes
0 answers
103 views

On the numerical computation of eigenvalues and eigenvectors

Given a matrix $A$ of order $n$ with coefficients in $\mathbb{C}$, applying the shifted QR algorithm: $$ \begin{aligned} & (Q, R) = \text{qrfactor}(A - \omega\,I_n)\,; \\ & A = R\cdot Q + \...
Monster • 249
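
A minimal sketch of the quoted iteration, $(Q, R) = \text{qrfactor}(A - \omega\,I_n)$ followed by $A = R\cdot Q + \omega\,I_n$, using the trailing diagonal entry as the shift $\omega$; a practical implementation would add Hessenberg reduction, a Wilkinson shift, and deflation. The test matrix is arbitrary.

```python
import numpy as np

def shifted_qr(A, iters=100):
    # plain shifted QR iteration; eigenvalue estimates accumulate on the diagonal
    A = A.copy().astype(float)
    n = A.shape[0]
    for _ in range(iters):
        w = A[-1, -1]                           # simple trailing-entry shift
        Q, R = np.linalg.qr(A - w * np.eye(n))
        A = R @ Q + w * np.eye(n)
    return np.sort(np.diag(A))

A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])
print(shifted_qr(A))                   # ~[1.268, 3.000, 4.732]
print(np.sort(np.linalg.eigvalsh(A)))  # reference eigenvalues
```
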
0 votes
0 answers
285 views

Uniqueness of a QR-decomposition: Show that there exists an orthogonal diagonal matrix $S \in \mathbb{R}^{n \times n}$

Let $A=Q_{1} R_{1}=Q_{2} R_{2}$ be two $QR$-decompositions of a square matrix $A \in \mathbb{R}^{n \times n}$ with full rank, i.e., $\operatorname{rank}(A)=n$. This means $Q_{1}, Q_{2} \in ...
clementine1001
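
A short sketch of the usual argument, in case it helps frame the question: from $Q_1R_1 = Q_2R_2$ with all factors invertible,
$$ S := Q_1^{\top}Q_2 = R_1R_2^{-1}, $$
so $S$ is simultaneously orthogonal (a product of orthogonal matrices) and upper triangular (a product of upper triangular matrices). Then $S^{-1} = S^{\top}$ is both upper and lower triangular, hence diagonal, and an orthogonal diagonal matrix has entries $\pm 1$. Consequently $Q_2 = Q_1 S$ and $R_2 = S R_1$.
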
0 votes
0 answers
212 views

What is the general form of the rotation matrices that should be used in the QR decomposition?

Apply two iterations of the QR method to the given matrix $$ A=\left[\begin{array}{lll} 3 & 1 & 0 \\ 1 & 3 & 1 \\ 0 & 1 & 3 \end{array}\right] $$ Solution: $$ P_1=\...
WhyMeasureTheory
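
A hedged sketch of the Givens (plane) rotations the question refers to, applied to the matrix quoted above: each $P_i$ acts on two rows only and is chosen to annihilate one subdiagonal entry; the general form uses $c = a/r$, $s = b/r$ with $r = \sqrt{a^2 + b^2}$.

```python
import numpy as np

def givens_rotation(n, i, j, a, b):
    # n x n rotation in the (i, j) plane mapping (a, b) -> (r, 0),
    # i.e. applied from the left it zeroes the targeted entry in row j
    r = np.hypot(a, b)
    c, s = a / r, b / r
    G = np.eye(n)
    G[i, i] = G[j, j] = c
    G[i, j] = s
    G[j, i] = -s
    return G

A = np.array([[3.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 3.0]])

P1 = givens_rotation(3, 0, 1, A[0, 0], A[1, 0])   # zeroes the (1, 0) entry
B = P1 @ A
P2 = givens_rotation(3, 1, 2, B[1, 1], B[2, 1])   # zeroes the (2, 1) entry
R = P2 @ B
Q = (P2 @ P1).T
print(np.allclose(Q @ R, A), np.allclose(np.tril(R, -1), 0))   # True True
```
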
0 votes
0 answers
268 views

Implementing LU factorization with partial pivoting in C using only one matrix

I have designed the following C function in order to compute the PA = LU factorization, using only one matrix to store and compute the data: ...
Marc • 195
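
The asker's C code is not shown in the excerpt, so the following is only a hedged Python sketch of the same storage scheme: partial pivoting with the multipliers of $L$ overwriting the eliminated entries, so a single working array holds both factors.

```python
import numpy as np

def lu_inplace(A):
    # PA = LU with partial pivoting; L (unit lower, strictly below the diagonal)
    # and U (on and above the diagonal) share the single array A
    n = A.shape[0]
    perm = list(range(n))
    for k in range(n - 1):
        p = k + np.argmax(np.abs(A[k:, k]))        # pivot row
        if p != k:
            A[[k, p], :] = A[[p, k], :]            # swap full rows, L part included
            perm[k], perm[p] = perm[p], perm[k]
        A[k+1:, k] /= A[k, k]                      # multipliers stored in place
        A[k+1:, k+1:] -= np.outer(A[k+1:, k], A[k, k+1:])
    return perm

M = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
A = M.copy()
perm = lu_inplace(A)
L = np.tril(A, -1) + np.eye(3)
U = np.triu(A)
print(np.allclose(L @ U, M[perm, :]))              # True: PA = LU
```
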
0 votes
1 answer
48 views

Constructing a vector norm on $\mathbb{R}^n$ such that the subordinate matrix norm equals the spectral radius

Statement of problem: "Let $A$ be a square diagonalizable matrix. Construct a vector norm on $\mathbb{R}^n$ such that the subordinate matrix norm satisfies $\|A\|=\max_i|\lambda_i|$." I know that $A$ being ...
user5896534
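
One standard construction, sketched here in case it is the one intended: write $A = V\Lambda V^{-1}$ (diagonalizable) and define
$$ \|x\| := \|V^{-1}x\|_{\infty}. $$
Substituting $y = V^{-1}x$, the subordinate norm becomes
$$ \|A\| = \max_{x\neq 0}\frac{\|V^{-1}Ax\|_{\infty}}{\|V^{-1}x\|_{\infty}} = \max_{y\neq 0}\frac{\|\Lambda y\|_{\infty}}{\|y\|_{\infty}} = \|\Lambda\|_{\infty} = \max_i|\lambda_i|. $$
(If some eigenvalues are complex, $V$ is complex and this norm lives on $\mathbb{C}^n$; restricting it to $\mathbb{R}^n$, as the problem statement asks, needs a small additional argument.)
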
