Hermitian Operator
Mathematics/Linear Algebra
Hermitian Definition 1. Let $T \in \mathcal{L}(V)$ where $V$ is an inner product space. We say that $T$ is hermitian (or self-adjoint) if $T = T^*$. A linear operator satisfying the condition above is called hermitian. Clearly, $T$ is hermitian if and only if $[T]_{\beta}$ is hermitian, where $\beta$ is an orthonormal basis. Recalling the condition for a linear operator to be normal, it is easy to see that every hermitian operator is normal. Lemma Lemma. Let $T$ be a hermitian operator on a..
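As a quick numerical sketch (not part of the original post), the defining condition can be checked for a small matrix standing in for $[T]_{\beta}$; the matrix below is a hypothetical example and NumPy is assumed.

```python
import numpy as np

# Hypothetical 2x2 complex matrix that equals its own conjugate transpose.
A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])

# Hermitian: A = A* (conjugate transpose).
print(np.allclose(A, A.conj().T))                    # True

# Every hermitian matrix is normal: A A* = A* A.
print(np.allclose(A @ A.conj().T, A.conj().T @ A))   # True
```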
Normal Operator
Mathematics/Linear Algebra
Normal Operator Definition 1. Let $T \in \mathcal{L}(V)$ where $V$ is an inner product space. We say that $T$ is normal if $TT^* = T^*T$. A linear operator satisfying the condition above is called normal. Clearly, $T$ is normal if and only if $[T]_{\beta}$ is normal, where $\beta$ is an orthonormal basis. Theorem 1 Theorem 1. Let $T$ be a normal operator on $V$ where $V$ is an inner product space. Then the following statements..
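To see that normal does not imply hermitian (an illustrative sketch with a hypothetical matrix, assuming NumPy; not from the original post), a real rotation matrix commutes with its adjoint but is not equal to it.

```python
import numpy as np

# Hypothetical example: a 90-degree rotation of the plane.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

A_star = A.conj().T
print(np.allclose(A @ A_star, A_star @ A))   # True  -> normal
print(np.allclose(A, A_star))                # False -> not hermitian
```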
Schur's Theorem
Mathematics/Linear Algebra
Schur's Theorem Theorem 1. Let $T \in \mathcal{L}(V)$ where $V$ is a finite-dimensional inner product space. Then there exists an orthonormal basis $\beta$ for $V$ such that $[T]_{\beta}$ is upper triangular. Proof. Let $n = \dim(V)$. The proof is by mathematical induction on $n$. If $n = 1$, the result is immediate. So suppose that the theorem is true for $n-1$ where $n-1 \geq 1$. Let $W$ be..
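For a concrete view of the statement (an illustrative sketch, not part of the original post, assuming NumPy and SciPy), `scipy.linalg.schur` computes a unitary $Q$ and an upper triangular $T$ with $A = QTQ^*$ for a hypothetical matrix $A$.

```python
import numpy as np
from scipy.linalg import schur

# Hypothetical matrix of T in the standard (orthonormal) basis.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Complex Schur form: A = Q T Q*, with Q unitary (an orthonormal change of basis)
# and T upper triangular, as Schur's theorem guarantees.
T, Q = schur(A, output='complex')

print(np.allclose(np.tril(T, -1), 0))           # T is upper triangular
print(np.allclose(Q @ Q.conj().T, np.eye(2)))   # Q is unitary
print(np.allclose(Q @ T @ Q.conj().T, A))       # A is recovered
```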
Adjoint of Linear Transformation
Mathematics/Linear Algebra
The adjoint of a matrix was defined as the conjugate transpose of the original matrix. We now want to define the adjoint of a linear transformation analogously. For a linear transformation $T$, it is natural to look for a linear transformation $U$ satisfying $([T]_{\beta}^{\gamma})^* = [U]_{\gamma}^{\beta}$ and to call that $U$ the adjoint of $T$. Adjoint of Linear Transformation Definition 1. Let $T \in \mathcal{L}(V, W)$ where $V$ and $W$ are finite-dimensional inner product spaces with inner products $\langle \cdot, \cdot \rangle _1$ and $\langle ..
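The characterizing property of the adjoint, $\langle T(x), y \rangle_2 = \langle x, U(y) \rangle_1$, can be checked numerically; the sketch below (not from the original post) uses a hypothetical matrix $A$ for $T$ with respect to orthonormal bases and assumes NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical matrix of T : C^3 -> C^2 with respect to orthonormal bases.
A = rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3))
A_star = A.conj().T   # matrix of the adjoint U = T*

x = rng.standard_normal(3) + 1j * rng.standard_normal(3)
y = rng.standard_normal(2) + 1j * rng.standard_normal(2)

# Standard inner products <u, v> = v* u.
lhs = np.vdot(y, A @ x)        # <T(x), y>_2
rhs = np.vdot(A_star @ y, x)   # <x, U(y)>_1
print(np.allclose(lhs, rhs))   # True
```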
Bessel's Inequality, and Parseval's Identity
Mathematics/Linear Algebra
Bessel's Inequality Theorem 1. Let $(V, \langle \cdot, \cdot \rangle)$ be an inner product space, and let $S = \{v_1, ..., v_n\}$ be an orthonormal subset of $V$. Then $\forall x \in V$, $$||x||^2 \geq \sum_{i=1}^n |\langle x, v_i \rangle|^2.$$ Proof. Let $\langle S \rangle = W$. Then $\exists! \, y \in W, z \in W^{\perp}$ such that $x = y + z$ by Theorem 1. Thus we have $$||x||^2 = ||y||^2 + ||z|..
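A small numeric check of the inequality (an illustrative sketch with hypothetical vectors, assuming NumPy; not part of the original post): with an orthonormal set that does not span the space, the inequality is strict.

```python
import numpy as np

# Hypothetical orthonormal subset of R^3 (two vectors, so not a basis).
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])

x = np.array([3.0, -2.0, 5.0])

lhs = np.dot(x, x)                           # ||x||^2 = 38
rhs = np.dot(x, v1)**2 + np.dot(x, v2)**2    # 9 + 4 = 13
print(lhs >= rhs)   # True; strict because x is not in span{v1, v2}
```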
Direct Sum
Mathematics/Linear Algebra
Sum Definition 1. Let $W_1, ..., W_k \leq V$. We define the sum of these subspaces to be the set $\{v_1 + \cdots + v_k \,|\, v_i \in W_i \text{ for } 1 \leq i \leq k\}$, which we denote by $$\sum_{i=1}^k W_i.$$ Direct Sum Definition 2. Let $W_1, ..., W_k \leq V$. We call $V$ the direct sum of $W_1, ..., W_k$ and write $$V = \bigoplus_{i=1}^k W_i,$$ if $V = \sum_{i=1}^k W_i$ and $W_j \cap \sum_{i..
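As a sketch of the definition (hypothetical subspaces, NumPy assumed; not from the original post): in $\mathbb{R}^3$, if bases of $W_1$ and $W_2$ together form a basis of the whole space, then $\mathbb{R}^3 = W_1 \oplus W_2$ and every vector decomposes uniquely.

```python
import numpy as np

# Hypothetical example in R^3: W1 = span{e1, e2}, W2 = span{e1 + e3}.
B1 = np.array([[1.0, 0.0, 0.0],
               [0.0, 1.0, 0.0]]).T    # basis vectors of W1 as columns
B2 = np.array([[1.0, 0.0, 1.0]]).T    # basis vector of W2 as a column

B = np.hstack([B1, B2])
# Rank 3 means W1 + W2 = R^3 while W1 and W2 share only the zero vector,
# so R^3 is the direct sum of W1 and W2.
print(np.linalg.matrix_rank(B) == 3)   # True

# Unique decomposition of an arbitrary vector: solve B c = v.
v = np.array([2.0, -1.0, 4.0])
c = np.linalg.solve(B, v)
w1, w2 = B1 @ c[:2], B2 @ c[2:]
print(np.allclose(w1 + w2, v))         # True
```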
Orthogonal Complement
Mathematics/Linear Algebra
Orthogonal Complement Definition 1. Let $(V, \langle \cdot, \cdot \rangle)$ be an inner product space, and let $\emptyset \neq S \subseteq V$. We define $S^{\perp}$ to be $S^{\perp} = \{x \in V \,|\, \langle x, y\rangle = 0, \forall y \in S\}$. The set $S^{\perp}$ is called the orthogonal complement of $S$. The set of all vectors orthogonal to every vector in $S$ is called the orthogonal complement of $S$. Clearly $S^{\perp}..
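For intuition (an illustrative sketch, not part of the original post, assuming NumPy and SciPy), the orthogonal complement of a single nonzero vector in $\mathbb{R}^3$ is the plane perpendicular to it, which can be computed as a null space.

```python
import numpy as np
from scipy.linalg import null_space

# Hypothetical S: a single vector in R^3; S^perp is the plane orthogonal to it.
s = np.array([[1.0, 2.0, 2.0]])

# The null space of s (as a 1x3 matrix) is exactly S^perp.
N = null_space(s)   # columns form an orthonormal basis of S^perp
print(N.shape)                  # (3, 2): a two-dimensional subspace
print(np.allclose(s @ N, 0))    # each basis vector is orthogonal to S
```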
Gram-Schmidt Process
Mathematics/Linear Algebra
Orthogonal Definition 1. Let $(V, \langle \cdot, \cdot \rangle)$ be an inner product space. Let $x, y \in V$, and let $S \subseteq V$. Then (a) $x$ and $y$ are orthogonal (or perpendicular) if $\langle x, y\rangle = 0$. (b) $S$ is orthogonal if any two distinct vectors in $S$ are orthogonal. If you learned the dot product in high school, you probably remember its definition as $x \cdot y = |x| |y| \cos \theta$. In that case, when $\theta = 90^{\circ}$, the two ..
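A minimal sketch of the Gram-Schmidt process itself (not part of the original post; the input vectors are hypothetical and NumPy is assumed): the loop subtracts from each vector its components along the previously produced vectors and then normalizes.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: turn a linearly independent list into an
    orthonormal list with the same span (standard inner product on R^n)."""
    basis = []
    for v in vectors:
        # Remove the components of v along the vectors already produced.
        w = v - sum(np.dot(v, u) * u for u in basis)
        basis.append(w / np.linalg.norm(w))
    return basis

# Hypothetical linearly independent vectors in R^3.
vs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]

Q = np.column_stack(gram_schmidt(vs))
print(np.allclose(Q.T @ Q, np.eye(3)))   # the columns are orthonormal
```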
Norm
Mathematics/Linear Algebra
Norm Definition 1. Let $V$ be a vector space over $F = \mathbb{R}$ or $\mathbb{C}$. A norm is a function $|| \cdot ||: V \longrightarrow \mathbb{R}$ such that $\forall x, y \in V, \forall a \in F$, the following hold: (a) $||x|| \geq 0$, and $||x|| = 0 \iff x = \mathbf{0}$. (b) $||ax|| = |a|\,||x||$. (c) $||x+y|| \leq ||x|| + ||y||$. Then $(V, ||\cdot||)$ is called a normed space. We measure the size of a complex number with the absolute value ..
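The three axioms can be spot-checked numerically for the Euclidean norm $||x|| = \sqrt{\langle x, x \rangle}$; the vectors below are hypothetical and NumPy is assumed (this sketch is not part of the original post).

```python
import numpy as np

# Hypothetical vectors in C^2 and a scalar, with the Euclidean norm.
x = np.array([3.0 + 4.0j, 1.0])
y = np.array([1.0 - 2.0j, -2.0])
a = 2.0 - 1.0j

print(np.linalg.norm(x) >= 0)                                          # (a) nonnegativity
print(np.isclose(np.linalg.norm(a * x), abs(a) * np.linalg.norm(x)))   # (b) homogeneity
print(np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y))  # (c) triangle inequality
```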
Adjoint of Matrix
Mathematics/Linear Algebra
Adjoint of Matrix Definition 1. Let $A \in M_{m \times n}(F)$. We define the adjoint or conjugate transpose of $A$ to be the $n \times m$ matrix $A^*$ such that $(A^*)_{ij} = \overline{A_{ji}}$ for all $i, j$. Theorem 1 Theorem 1. Let $A, B \in M_{m \times n}(F)$, and let $C \in M_{n \times p}(F)$. Then (a) $(A+B)^* = A^* + B^*$ (b) $(cA)^* = \overline{c} A^*, \forall c \in F$. (c) $(AC)^* = C^*A^*..
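A quick numerical check of the properties in Theorem 1 (an illustrative sketch with hypothetical random matrices, assuming NumPy; not from the original post):

```python
import numpy as np

rng = np.random.default_rng(1)
adj = lambda M: M.conj().T   # conjugate transpose (adjoint) of a matrix

# Hypothetical complex matrices with compatible shapes, and a scalar.
A = rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3))
B = rng.standard_normal((2, 3)) + 1j * rng.standard_normal((2, 3))
C = rng.standard_normal((3, 4)) + 1j * rng.standard_normal((3, 4))
c = 2.0 - 3.0j

print(np.allclose(adj(A + B), adj(A) + adj(B)))      # (a)
print(np.allclose(adj(c * A), np.conj(c) * adj(A)))  # (b)
print(np.allclose(adj(A @ C), adj(C) @ adj(A)))      # (c)
```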