<?xml version="1.0" encoding="utf-8" standalone="yes" ?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>Eigenvalues and singular values | Nicholas Hu</title>
    <link>https://www.math.ucla.edu/~njhu/notes/nla/eig/</link>
      <atom:link href="https://www.math.ucla.edu/~njhu/notes/nla/eig/index.xml" rel="self" type="application/rss+xml" />
    <description>Eigenvalues and singular values</description>
    <generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-ca</language><lastBuildDate>Mon, 16 Jun 2025 00:00:00 +0000</lastBuildDate>
    <image>
      <url>https://www.math.ucla.edu/~njhu/media/icon_hu_d46824b1c45312fd.png</url>
      <title>Eigenvalues and singular values</title>
      <link>https://www.math.ucla.edu/~njhu/notes/nla/eig/</link>
    </image>
    
    <item>
      <title>The Schur decomposition</title>
      <link>https://www.math.ucla.edu/~njhu/notes/nla/eig/schur/</link>
      <pubDate>Wed, 28 May 2025 00:00:00 +0000</pubDate>
      <guid>https://www.math.ucla.edu/~njhu/notes/nla/eig/schur/</guid>
      <description>&lt;div class=&#34;btn-links mb-3&#34;&gt;
&lt;a class=&#34;btn btn-outline-primary btn-page-header btn-sm&#34; href=&#34;../schur.pdf&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;
  PDF
&lt;/a&gt;
&lt;/div&gt;
&lt;!--
No newlines allowed between $$&#39;s below!
--&gt;
&lt;div style=&#34;display: none;&#34;&gt;
$$
%% Sets and functions %%
\newcommand{\set}[1]{\{ #1 \}}
\newcommand{\Set}[1]{\left \{ #1 \right\}}
\renewcommand{\emptyset}{\varnothing}
\newcommand{\N}{\mathbb{N}}
\newcommand{\Z}{\mathbb{Z}}
\newcommand{\R}{\mathbb{R}}
\newcommand{\Rn}{\mathbb{R}^n}
\newcommand{\Rm}{\mathbb{R}^m}
\newcommand{\C}{\mathbb{C}}
\newcommand{\F}{\mathbb{F}}
%% Linear algebra %%
\newcommand{\abs}[1]{\lvert #1 \rvert}
\newcommand{\Abs}[1]{\left\lvert #1 \right\rvert}
\newcommand{\inner}[2]{\langle #1, #2 \rangle}
\newcommand{\Inner}[2]{\left\langle #1, #2 \right\rangle}
\newcommand{\norm}[1]{\lVert #1 \rVert}
\newcommand{\Norm}[1]{\left\lVert #1 \right\rVert}
\newcommand{\tp}{{\top}}
\newcommand{\trans}{{\top}}
\renewcommand{\span}{\operatorname{span}}
\newcommand{\im}{\operatorname{im}}
\renewcommand{\ker}{\operatorname{ker}}
\newcommand{\rank}{\operatorname{rank}}
\newcommand{\proj}[1]{\mathop{\mathrm{proj}_{#1}}}
\newcommand{\refl}[1]{\mathop{\mathrm{refl}_{#1}}}
\newcommand{\K}{\mathcal{K}}
\newcommand{\L}{\mathcal{L}}
\renewcommand{\epsilon}{\varepsilon}
\newcommand{\conj}{\overline}
\newcommand{\sign}{\operatorname{sign}}
%% Colours %%
\definecolor{cblue}{RGB}{31, 119, 180}
\definecolor{corange}{RGB}{255, 127, 14}
\definecolor{cgreen}{RGB}{44, 160, 44}
\definecolor{cred}{RGB}{214, 39, 40}
\definecolor{cpurple}{RGB}{148, 103, 189}
\definecolor{cbrown}{RGB}{140, 86, 75}
\definecolor{cpink}{RGB}{227, 119, 194}
\definecolor{cgrey}{RGB}{127, 127, 127}
\definecolor{cyellow}{RGB}{188, 189, 34}
\definecolor{cteal}{RGB}{23, 190, 207}
$$
&lt;/div&gt;
&lt;!-- BODY --&gt;
&lt;h2 id=&#34;the-complex-schur-decomposition&#34;&gt;The complex Schur decomposition&lt;/h2&gt;
&lt;p&gt;Let 

$A \in \C^{n \times n}$. The &lt;strong&gt;(complex) Schur decomposition&lt;/strong&gt; is a factorization of 

$A$ as 

$UTU^{-1}$, where 

$U \in \C^{n \times n}$ is &lt;em&gt;unitary&lt;/em&gt; and 

$T \in \C^{n \times n}$ is &lt;em&gt;upper triangular&lt;/em&gt;.&lt;/p&gt;
&lt;p&gt;Such a factorization always exists and can be constructed recursively: let 

$\lambda \in \C$ be an eigenvalue of 

$A$ (which exists by the fundamental theorem of algebra) and 

$v_1 \in \C^n$ be a corresponding normalized eigenvector. Extending 

$\set{v_1}$ to an orthonormal basis 

$\set{v_j}_{j=1}^n$ and defining 

$V := \begin{bmatrix} v_1 &amp; \cdots &amp; v_n \end{bmatrix} \in \C^{n \times n}$, we obtain


$$
A = V \begin{bmatrix} \lambda &amp; b^* \\ &amp; \hat{A} \end{bmatrix} V^{-1}
$$
for some 

$\hat{A} \in \C^{(n-1) \times (n-1)}$ and 

$b \in \C^{n-1}$. Thus, if 

$\hat{A}$ has a Schur decomposition 

$\hat{U} \hat{T} \hat{U}^{-1}$, then


$$
A = \underbrace{V \begin{bmatrix} 1 &amp; \\ &amp; \hat{U} \end{bmatrix}}_{U} \underbrace{\begin{bmatrix} \lambda &amp; b^* \hat{U} \\ &amp; \hat{T} \end{bmatrix}}_T \underbrace{\begin{bmatrix} 1 &amp; \\ &amp; \hat{U}^{-1} \end{bmatrix} V^{-1}}_{U^{-1}}
$$
is a Schur decomposition of 

$A$, and its existence in the base case 

$n = 1$ is trivial.&lt;/p&gt;
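&lt;p&gt;As a sanity check, the decomposition can also be computed numerically. A minimal sketch using SciPy (the matrix, seed, and tolerances are illustrative choices, not part of these notes):&lt;/p&gt;

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))

# output='complex' requests U unitary and T upper triangular with A = U T U^*.
T, U = schur(A, output='complex')

assert np.allclose(U @ T @ U.conj().T, A)      # A = U T U^{-1}
assert np.allclose(U.conj().T @ U, np.eye(4))  # U is unitary
assert np.allclose(np.tril(T, -1), 0)          # T is upper triangular
```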
&lt;h3 id=&#34;complex-normal-matrices&#34;&gt;Complex normal matrices&lt;/h3&gt;
&lt;p&gt;If a normal matrix 

$T \in \C^{n \times n}$ can be partitioned as 

$\begin{bmatrix} T_{11} &amp; T_{12} \\ &amp; T_{22} \end{bmatrix}$ with 

$T_{11} \in \C^{m \times m}$, then by definition


$$
\begin{bmatrix} T_{11} &amp; T_{12} \\ &amp; T_{22} \end{bmatrix} \begin{bmatrix} T_{11}^* &amp; \\ T_{12}^* &amp; T_{22}^* \end{bmatrix} = \begin{bmatrix} T_{11}^* &amp; \\ T_{12}^* &amp; T_{22}^* \end{bmatrix} \begin{bmatrix} T_{11} &amp; T_{12} \\ &amp; T_{22} \end{bmatrix},
$$
so 

$T_{11} T_{11}^* + T_{12} T_{12}^* = T_{11}^* T_{11}$. Taking traces, we obtain 

$\norm{T_{11}}_\mathrm{F}^2 + \norm{T_{12}}_\mathrm{F}^2 = \norm{T_{11}}_\mathrm{F}^2$, which implies that 

$T_{12} = 0$ and hence that 

$T_{11}$ and 

$T_{22}$ are normal.&lt;/p&gt;
&lt;p&gt;Now if 

$A$ is normal, then so is the upper triangular matrix 

$T = U^* A U$ in its complex Schur factorization. It follows that 

$T$ must in fact be &lt;em&gt;diagonal&lt;/em&gt;, which yields the &lt;strong&gt;complex spectral theorem&lt;/strong&gt;: a (square) complex matrix is unitarily diagonalizable if and only if it is normal.&lt;/p&gt;
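&lt;p&gt;A quick numerical illustration of the spectral theorem: build a normal matrix and check that its Schur factor is diagonal (the construction below is my own example):&lt;/p&gt;

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(1)
# A normal matrix Q D Q^* with Q unitary and D diagonal (complex entries).
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4)))
D = np.diag(rng.standard_normal(4) + 1j * rng.standard_normal(4))
A = Q @ D @ Q.conj().T

assert np.allclose(A @ A.conj().T, A.conj().T @ A)  # A is normal
T, U = schur(A, output='complex')
assert np.allclose(T, np.diag(np.diag(T)))          # T is diagonal
```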
&lt;h2 id=&#34;the-real-schur-decomposition&#34;&gt;The real Schur decomposition&lt;/h2&gt;
&lt;p&gt;Let 

$A \in \R^{n \times n}$. The &lt;strong&gt;real Schur decomposition&lt;/strong&gt; is a factorization of 

$A$ as 

$QTQ^{-1}$, where 

$Q \in \R^{n \times n}$ is &lt;em&gt;orthogonal&lt;/em&gt; and 

$T \in \R^{n \times n}$ is &lt;em&gt;upper quasi-triangular&lt;/em&gt; – that is, block upper triangular with diagonal blocks that are 

$1 \times 1$ or 

$2 \times 2$.&lt;/p&gt;
&lt;p&gt;As in the complex case, such a factorization always exists; arguing as above, it suffices to show that 

$A$ has a one- or two-dimensional invariant subspace. Indeed, when regarded as a complex matrix, 

$A$ has an eigenvalue 

$\lambda = \alpha + i \beta \in \C$ (where 

$\alpha, \beta \in \R$) and an eigenvector 

$v_1 = x_1 + iy_1 \in \C^n$ (where 

$x_1, y_1 \in \R^n$).&lt;/p&gt;
&lt;p&gt;If 

$\lambda \in \R$, then 

$Ax_1 + i Ay_1 = \lambda x_1 + i \lambda y_1$, so 

$x_1$ or 

$y_1$ is an eigenvector with eigenvalue 

$\lambda$ (at least one of these vectors must be nonzero) and therefore spans a one-dimensional invariant subspace of 

$A$. On the other hand, if 

$\lambda \in \C \setminus \R$, then 

$Ax_1 + i Ay_1 = (\alpha x_1 - \beta y_1) + i(\beta x_1 + \alpha y_1)$. Moreover, 

$\conj{v_1}$ is an eigenvector of 

$A$ with eigenvalue 

$\conj{\lambda}$ and 

$\lambda \neq \conj{\lambda}$, so 

$\set{v_1, \conj{v_1}}$ is linearly independent. Hence 

$x_1 = \frac{1}{2}(v_1 + \conj{v_1})$ and 

$y_1 = \frac{1}{2i}(v_1 - \conj{v_1})$ span a two-dimensional invariant subspace of 

$A$.&lt;/p&gt;
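&lt;p&gt;Numerically, the real Schur decomposition is what SciPy returns for a real input when a real factorization is requested; a sketch with illustrative sizes and seed:&lt;/p&gt;

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))  # real, typically with complex conjugate eigenvalue pairs

# output='real' keeps everything real: Q orthogonal, T upper quasi-triangular.
T, Q = schur(A, output='real')

assert np.allclose(Q @ T @ Q.T, A)
assert np.allclose(Q.T @ Q, np.eye(5))
# Quasi-triangular: all entries below the first subdiagonal vanish.
assert np.allclose(np.tril(T, -2), 0)
```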
&lt;h3 id=&#34;real-normal-matrices&#34;&gt;Real normal matrices&lt;/h3&gt;
&lt;p&gt;If 

$A$ is normal, then 

$T = Q^* A Q$ will be “&lt;em&gt;quasi-diagonal&lt;/em&gt;” – block diagonal with diagonal blocks that are 

$1 \times 1$ or 

$2 \times 2$ – and each diagonal block will itself be normal. Thus, these blocks will be of the form


$$
\begin{bmatrix} \alpha \end{bmatrix} \quad (\alpha \in \R)
\qquad
\text{or}
\qquad
\begin{bmatrix} \alpha &amp; \beta \\ -\beta &amp; \alpha \end{bmatrix}
\quad
(\alpha \in \R,\, \beta \in \R \setminus \set{0}).
$$
In particular, if 

$A$ is symmetric, there will only be 

$1 \times 1$ blocks, which yields the &lt;strong&gt;real spectral theorem&lt;/strong&gt;: a (square) real matrix is orthogonally diagonalizable if and only if it is symmetric. If 

$A$ is orthogonal, we will have 

$\alpha = \pm 1$ in the 

$1 \times 1$ blocks and 

$\alpha = \cos(\theta)$ and 

$\beta = \sin(\theta)$ for some 

$\theta \in \R \setminus \pi \Z$ in the 

$2 \times 2$ blocks.&lt;/p&gt;
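&lt;p&gt;For a concrete instance, a plane rotation is orthogonal (hence normal), and its real Schur form consists of a single block of the form above. A small check (my own example; it relies on LAPACK standardizing 2-by-2 blocks to have equal diagonal entries):&lt;/p&gt;

```python
import numpy as np
from scipy.linalg import schur

theta = 0.7
A = np.array([[np.cos(theta), np.sin(theta)],
              [-np.sin(theta), np.cos(theta)]])  # rotation by theta

T, Q = schur(A, output='real')
assert np.allclose(Q @ T @ Q.T, A)
# The standardized block has alpha = cos(theta) on the diagonal and
# off-diagonal entries whose product is -sin(theta)^2.
assert np.allclose(np.diag(T), np.cos(theta))
assert np.allclose(T[0, 1] * T[1, 0], -np.sin(theta) ** 2)
```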
</description>
    </item>
    
    <item>
      <title>The singular value decomposition</title>
      <link>https://www.math.ucla.edu/~njhu/notes/nla/eig/svd/</link>
      <pubDate>Thu, 09 Jan 2025 00:00:00 +0000</pubDate>
      <guid>https://www.math.ucla.edu/~njhu/notes/nla/eig/svd/</guid>
      <description>&lt;div class=&#34;btn-links mb-3&#34;&gt;
&lt;a class=&#34;btn btn-outline-primary btn-page-header btn-sm&#34; href=&#34;../svd.pdf&#34; target=&#34;_blank&#34; rel=&#34;noopener&#34;&gt;
  PDF
&lt;/a&gt;
&lt;/div&gt;
&lt;!--
No newlines allowed between $$&#39;s below!
--&gt;
&lt;div style=&#34;display: none;&#34;&gt;
$$
%% Sets and functions %%
\newcommand{\set}[1]{\{ #1 \}}
\newcommand{\Set}[1]{\left \{ #1 \right\}}
\renewcommand{\emptyset}{\varnothing}
\newcommand{\N}{\mathbb{N}}
\newcommand{\Z}{\mathbb{Z}}
\newcommand{\R}{\mathbb{R}}
\newcommand{\Rn}{\mathbb{R}^n}
\newcommand{\Rm}{\mathbb{R}^m}
\newcommand{\C}{\mathbb{C}}
\newcommand{\F}{\mathbb{F}}
%% Linear algebra %%
\newcommand{\abs}[1]{\lvert #1 \rvert}
\newcommand{\Abs}[1]{\left\lvert #1 \right\rvert}
\newcommand{\inner}[2]{\langle #1, #2 \rangle}
\newcommand{\Inner}[2]{\left\langle #1, #2 \right\rangle}
\newcommand{\norm}[1]{\lVert #1 \rVert}
\newcommand{\Norm}[1]{\left\lVert #1 \right\rVert}
\newcommand{\trans}{{\top}}
\renewcommand{\span}{\mathop{\mathrm{span}}}
\newcommand{\im}{\mathop{\mathrm{im}}}
\renewcommand{\ker}{\mathop{\mathrm{ker}}}
\newcommand{\rank}{\mathop{\mathrm{rank}}}
%% Colours %%
\definecolor{cblue}{RGB}{31, 119, 180}
\definecolor{corange}{RGB}{255, 127, 14}
\definecolor{cgreen}{RGB}{44, 160, 44}
\definecolor{cred}{RGB}{214, 39, 40}
\definecolor{cpurple}{RGB}{148, 103, 189}
\definecolor{cbrown}{RGB}{140, 86, 75}
\definecolor{cpink}{RGB}{227, 119, 194}
\definecolor{cgrey}{RGB}{127, 127, 127}
\definecolor{cyellow}{RGB}{188, 189, 34}
\definecolor{cteal}{RGB}{23, 190, 207}
$$
&lt;/div&gt;
&lt;!-- BODY --&gt;
&lt;p&gt;Let 

$A \in \C^{m \times n}$. The &lt;strong&gt;singular value decomposition (SVD)&lt;/strong&gt; is a factorization of 

$A$ as 

$U \Sigma V^*$, where 

$U \in \C^{m \times m}$ and 

$V \in \C^{n \times n}$ are unitary and 

$\Sigma \in \R^{m \times n}$ is (rectangular) diagonal with &lt;em&gt;nonnegative&lt;/em&gt; entries. &lt;sup id=&#34;fnref:1&#34;&gt;&lt;a href=&#34;#fn:1&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;1&lt;/a&gt;&lt;/sup&gt; In other words, 

$A = \sum_{i=1}^{\min \set{m, n}} \sigma_i u_i v_i^*$, where 

$u_i$ and 

$v_i$ are the 

$i$&lt;sup&gt;th&lt;/sup&gt; columns of 

$U$ and 

$V$ and 

$\sigma_i$ is the 

$i$&lt;sup&gt;th&lt;/sup&gt; diagonal entry of 

$\Sigma$. The vectors 

$u_i$ and 

$v_i$ are called &lt;strong&gt;left&lt;/strong&gt; and &lt;strong&gt;right singular vectors&lt;/strong&gt; of 

$A$ and the scalars 

$\sigma_i$ are called &lt;strong&gt;singular values&lt;/strong&gt; of 

$A$; by convention, we arrange the singular values in decreasing order.&lt;/p&gt;
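&lt;p&gt;The rank-one expansion is easy to verify numerically; a minimal NumPy sketch (shapes and seed are my own choices):&lt;/p&gt;

```python
import numpy as np

rng = np.random.default_rng(3)
m, n = 5, 3
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))

# full_matrices=True gives square U and V; s holds the min(m, n) singular values.
U, s, Vh = np.linalg.svd(A, full_matrices=True)

# A equals the sum of the rank-one terms sigma_i u_i v_i^*
# (the rows of Vh are already the conjugated vectors v_i^*).
A_sum = sum(s[i] * np.outer(U[:, i], Vh[i, :]) for i in range(min(m, n)))
assert np.allclose(A_sum, A)
assert np.all(s[:-1] >= s[1:])  # singular values in decreasing order
```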
&lt;p&gt;If an SVD of 

$A$ has 

$r$ nonzero singular values, then 

$\set{u_i}_{i=1}^r$ is an orthonormal basis of 

$\im(A)$ because 

$Av_i = \sigma_i u_i$ for all 

$i$. Hence 

$r$ must be the rank of 

$A$ and 

$\set{u_i}_{i=r+1}^m$ an orthonormal basis of 

$\ker(A^*)$; similarly, 

$\set{v_i}_{i=1}^r$ and 

$\set{v_i}_{i=r+1}^n$ are orthonormal bases of 

$\im(A^*)$ and 

$\ker(A)$.&lt;/p&gt;
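&lt;p&gt;These four orthonormal bases can be read off directly from a computed SVD; a sketch on a matrix of known rank (sizes, seed, and the rank tolerance are my own choices):&lt;/p&gt;

```python
import numpy as np

rng = np.random.default_rng(4)
# A real 4 x 3 matrix of rank 2, built as a product of thin factors.
A = rng.standard_normal((4, 2)) @ rng.standard_normal((2, 3))

U, s, Vh = np.linalg.svd(A)
r = int(np.sum(s > 1e-10))  # numerical rank
assert r == 2

assert np.allclose(A @ Vh[:r, :].conj().T, U[:, :r] * s[:r])  # A v_i = sigma_i u_i
assert np.allclose(A.T @ U[:, r:], 0)   # trailing columns of U span ker(A^*)
assert np.allclose(A @ Vh[r:, :].T, 0)  # trailing rows of V^* give ker(A)
```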
&lt;h2 id=&#34;existence&#34;&gt;Existence&lt;/h2&gt;
&lt;p&gt;Assume without loss of generality that 

$m \geq n$. Clearly, the matrix 

$A^* A$ is (Hermitian) positive semidefinite, so by the spectral theorem, 

$A^* A = V \Lambda V^*$ for some unitary 

$V \in \C^{n \times n}$ and some diagonal 

$\Lambda \in \R^{n \times n}$ with diagonal entries 

$\lambda_1 \geq \cdots \geq \lambda_n \geq 0$. Set 

$\sigma_i = \sqrt{\lambda_i}$ for each 

$i$ and 

$Av_i = \sigma_i u_i$ for each nonzero 

$\sigma_i$. If 

$r$ is as above, 

$\hat{U} := \begin{bmatrix} u_1 &amp; \cdots &amp; u_r \end{bmatrix} \in \C^{m \times r}$, and 

$\hat{\Sigma} := \mathrm{diag}(\sigma_1, \dots, \sigma_r) \in \R^{r \times r}$, then by construction


$$
AV = \hat{U} \begin{bmatrix} \hat{\Sigma} &amp; 0_{r \times (n-r)} \end{bmatrix}.
$$
Moreover, 

$\inner{u_i}{u_j} = \inner{Av_i / \sigma_i}{Av_j / \sigma_j} = \inner{\lambda_i v_i}{v_j} / \sigma_i \sigma_j = \delta_{ij}$, so 

$\set{u_i}_{i=1}^r$ is orthonormal. Extending this set to an orthonormal basis 

$\set{u_i}_{i=1}^m$, and defining 

$U = \begin{bmatrix} u_1 &amp; \cdots &amp; u_m \end{bmatrix} \in \C^{m \times m}$ and


$$
\Sigma = \begin{bmatrix} \hat{\Sigma} &amp; 0_{r \times (n-r)} \\ 0_{(m-r) \times r} &amp; 0_{(m-r) \times (n-r)}\end{bmatrix} \in \R^{m \times n},
$$
we obtain 

$A = U \Sigma V^*$ as required. &lt;sup id=&#34;fnref:2&#34;&gt;&lt;a href=&#34;#fn:2&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;2&lt;/a&gt;&lt;/sup&gt;&lt;/p&gt;
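&lt;p&gt;The existence proof is itself an algorithm (though not the numerically preferred one); a direct sketch of the construction for a generic full-rank case, with my own variable names:&lt;/p&gt;

```python
import numpy as np

rng = np.random.default_rng(5)
m, n = 5, 3  # m is at least n, as assumed above
A = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))

# Spectral decomposition of the PSD matrix A^* A; reorder to decreasing eigenvalues.
lam, V = np.linalg.eigh(A.conj().T @ A)
lam, V = lam[::-1], V[:, ::-1]
sigma = np.sqrt(np.clip(lam, 0, None))

# Generically all sigma_i are nonzero, so u_i = A v_i / sigma_i (here r = n).
U_hat = (A @ V) / sigma
assert np.allclose(U_hat.conj().T @ U_hat, np.eye(n))       # orthonormal columns
assert np.allclose(U_hat @ np.diag(sigma) @ V.conj().T, A)  # thin SVD of A
```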
&lt;p&gt;Although an SVD is not unique, this argument shows that the singular values are unique and that the singular vectors are unique up to complex signs if 

$m = n$ and the singular values are distinct, since we must have 

$A^* A = V (\Sigma^* \Sigma) V^*$.&lt;/p&gt;
&lt;h2 id=&#34;low-rank-approximation&#34;&gt;Low-rank approximation&lt;/h2&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Eckart–Young theorem&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Suppose that 

$U \Sigma V^*$ is an SVD of a matrix 

$A \in \C^{m \times n}$ with rank 

$r$. If 

$k \leq r$ and 

$A_k := \sum_{i=1}^k \sigma_i u_i v_i^*$, then 

$\norm{A - B}_2 \geq \sigma_{k+1} = \norm{A - A_k}_2$ for all 

$B \in \C^{m \times n}$ such that 

$\rank(B) \leq k$ (where 

$\sigma_{r+1} := 0$). In particular, 

$\norm{A}_2 = \sigma_1$.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;em&gt;Proof.&lt;/em&gt; Suppose that 

$B \in \C^{m \times n}$ is such that 

$\rank(B) \leq k$. Then 

$\dim(\ker(B)) \geq n-k$, so there exists a 

$v \in \ker(B) \cap \span \set{v_i}_{i=1}^{k+1}$ such that 

$\norm{v}_2 = 1$. Hence 

$\norm{A - B}_2^2 \geq \norm{(A-B)v}_2^2 = \norm{Av}_2^2 = \sum_{i=1}^{k+1} \sigma_i^2 \abs{\inner{v}{v_i}}^2 \geq \sigma_{k+1}^2$. Similarly, if 

$v \in \C^n$ with 

$\norm{v}_2 = 1$, then 

$\norm{(A-A_k) v}_2^2 = \sum_{i=k+1}^r \sigma_i^2 \abs{\inner{v}{v_i}}^2 \leq \sigma_{k+1}^2$, with equality if 

$v = v_{k+1}$. ∎&lt;/p&gt;
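&lt;p&gt;The theorem is easy to test numerically: truncating the SVD attains the bound, and any other rank-k matrix does no better. A sketch (random example of my own):&lt;/p&gt;

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((6, 4))
U, s, Vh = np.linalg.svd(A)

k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vh[:k, :]  # truncated SVD

# The spectral-norm error of A_k is exactly sigma_{k+1} (0-based: s[k]).
assert np.isclose(np.linalg.norm(A - A_k, 2), s[k])

# A random rank-k competitor is at least as far from A.
B = rng.standard_normal((6, k)) @ rng.standard_normal((k, 4))
assert np.linalg.norm(A - B, 2) >= s[k] - 1e-12
```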
&lt;p&gt;An analogous theorem holds for the Frobenius norm (which can be proven similarly). In fact, we have the following generalization.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Eckart–Young–Mirsky theorem&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Suppose that 

$U \Sigma V^*$ is an SVD of a matrix 

$A \in \C^{m \times n}$ with rank 

$r$ and let 

$\norm{{}\cdot{}}$ be a &lt;em&gt;unitarily invariant&lt;/em&gt; norm. If 

$k \leq r$ and 

$A_k := \sum_{i=1}^k \sigma_i u_i v_i^*$, then 

$\norm{A - B} \geq \norm{A - A_k}$ for all 

$B \in \C^{m \times n}$ such that 

$\rank(B) \leq k$.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p&gt;&lt;em&gt;Proof.&lt;/em&gt; We begin by proving &lt;strong&gt;Weyl’s inequality&lt;/strong&gt; for singular values:


$$
\sigma_{i+j-1}(A+B) \leq \sigma_i(A) + \sigma_j(B),
$$
where 

$\sigma_i({}\cdot{})$ denotes the 

$i$&lt;sup&gt;th&lt;/sup&gt; singular value of a given matrix. Let 

$A_k := \sum_{i=1}^k \sigma_i(A) u_i v_i^*$, with $B_k$ defined analogously. Then 

$\rank(A_{i-1} + B_{j-1}) \leq (i-1) + (j-1) = i+j-2$, so by the Eckart–Young theorem, 

$\sigma_{i+j-1}(A+B) \leq \norm{(A + B) - (A_{i-1} + B_{j-1})}_2 \leq \norm{A - A_{i-1}}_2 + \norm{B - B_{j-1}}_2 = \sigma_i(A) + \sigma_j(B)$.&lt;/p&gt;
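&lt;p&gt;Weyl&amp;rsquo;s inequality can be checked numerically over all admissible index pairs; a small sketch (random matrices and tolerance of my own choosing):&lt;/p&gt;

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((5, 5))
B = rng.standard_normal((5, 5))

sA = np.linalg.svd(A, compute_uv=False)
sB = np.linalg.svd(B, compute_uv=False)
sAB = np.linalg.svd(A + B, compute_uv=False)

# 1-based sigma_{i+j-1}(A+B) is sAB[i + j] with 0-based i, j.
for i in range(5):
    for j in range(5):
        if i + j >= 5:
            continue
        assert sA[i] + sB[j] >= sAB[i + j] - 1e-10
```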
&lt;p&gt;Now suppose that 

$B \in \C^{m \times n}$ is such that 

$\rank(B) \leq k$. By Weyl’s inequality, 

$\sigma_{k+i}(A) \leq \sigma_{k+1}(B) + \sigma_i(A-B) = \sigma_i(A-B)$ for all 

$i$ (where 

$\sigma_{k+i}({}\cdot{}) := 0$ if 

$k + i &gt; \min \set{m, n}$). Thus, by unitary invariance, it suffices to show that 

$\Phi(x) \leq \Phi(y)$, where $\Phi(x) := \norm{\operatorname{diag}(x_1, x_2, \dots)}$, whenever 

$0 \leq x \leq y$ componentwise; by induction and permutation invariance, we may assume without loss of generality that 

$x_i = y_i$ for all 

$i &gt; 1$. &lt;sup id=&#34;fnref:3&#34;&gt;&lt;a href=&#34;#fn:3&#34; class=&#34;footnote-ref&#34; role=&#34;doc-noteref&#34;&gt;3&lt;/a&gt;&lt;/sup&gt; Accordingly, let 

$\theta \in [0, 1]$ be such that 

$x_1 = \theta y_1$. Then 

$\Phi(x) = \Phi(\frac{1+\theta}{2} y_1 + \frac{1-\theta}{2}(-y_1), y_2, \dots) \leq \frac{1+\theta}{2} \Phi(y_1, y_2, \dots) + \frac{1-\theta}{2} \Phi(-y_1, y_2, \dots) = \Phi(y)$, as was to be shown. ∎&lt;/p&gt;
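&lt;p&gt;For the Frobenius norm (which is unitarily invariant), the theorem specializes to a computable identity: the optimal error is the Euclidean norm of the discarded singular values. A quick check (my own example):&lt;/p&gt;

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((6, 4))
U, s, Vh = np.linalg.svd(A)

k = 2
A_k = U[:, :k] @ np.diag(s[:k]) @ Vh[:k, :]

# norm(A - A_k, 'fro')^2 equals the sum of sigma_i^2 over i exceeding k.
assert np.isclose(np.linalg.norm(A - A_k, 'fro'), np.linalg.norm(s[k:]))
```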
&lt;div class=&#34;footnotes&#34; role=&#34;doc-endnotes&#34;&gt;
&lt;hr&gt;
&lt;ol&gt;
&lt;li id=&#34;fn:1&#34;&gt;
&lt;p&gt;If 

$A \in \R^{m \times n}$, an SVD is defined analogously; i.e., with 

$U$ and 

$V$ orthogonal.&amp;#160;&lt;a href=&#34;#fnref:1&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id=&#34;fn:2&#34;&gt;
&lt;p&gt;If we instead add 

$n-r$ rows of zeroes to 

$\begin{bmatrix} \hat{\Sigma} &amp; 0 \end{bmatrix}$, forming a square matrix, the resulting decomposition is sometimes called the &lt;strong&gt;thin SVD&lt;/strong&gt;; if we instead omit the last 

$n-r$ columns, what remains is sometimes called the &lt;strong&gt;compact SVD&lt;/strong&gt;.&amp;#160;&lt;a href=&#34;#fnref:2&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li id=&#34;fn:3&#34;&gt;
&lt;p&gt;Clearly, if 

$A \mapsto \Phi(\sigma_1(A), \sigma_2(A), \dots)$ is a unitarily invariant norm, then 

$\Phi$ is a &lt;strong&gt;symmetric gauge function&lt;/strong&gt;: a norm on a real coordinate space that is &lt;em&gt;permutation invariant&lt;/em&gt; and &lt;em&gt;absolute&lt;/em&gt; (

$\Phi(\abs{x}) = \Phi(x)$ for all 

$x$). The converse is a theorem of von Neumann.&amp;#160;&lt;a href=&#34;#fnref:3&#34; class=&#34;footnote-backref&#34; role=&#34;doc-backlink&#34;&gt;&amp;#x21a9;&amp;#xfe0e;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;
</description>
    </item>
    
  </channel>
</rss>
