All Posts

A team blog run by mathematics and physics undergraduates.
Math/Linear Algebra

Schur's Theorem

Schur's Theorem Theorem 1. Let $T \in \mathcal{L}(V)$ where $V$ is a finite-dimensional inner product space. Then there exists an orthonormal basis $\beta$ for $V$ such that $[T]_{\beta}$ is upper triangular. Proof. Let $n = \dim(V)$. The proof is by mathematical induction on $n$. If $n = 1$, the result is immediate. So suppose that the theorem is true for $n-1$ where $n-1 \geq 1$. Let $W$ b..
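
A tiny illustration of the statement (my own example, not taken from the proof above): on $V = \mathbb{R}^2$ with the standard inner product, let $T$ have the lower triangular matrix $\begin{pmatrix} 2 & 0 \\ 1 & 2 \end{pmatrix}$ in the standard basis $\{e_1, e_2\}$. Reordering to the orthonormal basis $\beta = \{e_2, e_1\}$ already produces an upper triangular matrix: $$[T]_{\beta} = \begin{pmatrix} 2 & 1 \\ 0 & 2 \end{pmatrix},$$ since $T(e_2) = 2e_2$ and $T(e_1) = 2e_1 + e_2$.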

Math/Linear Algebra

Adjoint of Linear Transformation

The adjoint of a matrix was defined as the conjugate transpose of the original matrix. We now want to define the adjoint of a linear transformation analogously. For a linear transformation $T$, it is natural to find a linear transformation $U$ satisfying $([T]_{\beta}^{\gamma})^* = [U]_{\gamma}^{\beta}$ and to define that $U$ to be the adjoint of $T$. Adjoint of Linear Transformation Definition 1. Let $T \in \mathcal{L}(V, W)$ where $V$ and $W$ are finite-dimensional inner product spaces with inner products $\langle \cdot, \cdot \rangle _1$ and $\langle ..
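
As a quick sanity check (my own example, assuming the usual defining relation $\langle T(x), y \rangle_2 = \langle x, T^*(y) \rangle_1$, which the truncated preview does not show): take $V = W = \mathbb{C}^2$ with the standard inner product and the standard basis $\beta = \gamma$. If $$[T]_{\beta} = \begin{pmatrix} 1 & i \\ 0 & 2 \end{pmatrix}, \qquad [T^*]_{\beta} = \begin{pmatrix} 1 & 0 \\ -i & 2 \end{pmatrix},$$ then for instance $\langle T(e_2), e_1 \rangle = i$ and $\langle e_2, T^*(e_1) \rangle = \overline{-i} = i$, as expected.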

Math/Linear Algebra

Bessel's Inequality, and Parseval's Identity

Bessel's Inequality Theorem 1. Let $(V, \langle \cdot, \cdot \rangle)$ be an inner product space, and let $S = \{v_1, ..., v_n\}$ be an orthonormal subset of $V$. Then $\forall x \in V$, $$||x||^2 \geq \sum_{i=1}^n |\langle x, v_i \rangle|^2.$$ Proof. Let $\langle S \rangle = W$. Then $\exists!\, y \in W, z \in W^{\perp}$ such that $x = y + z$ by Theorem 1. Thus we have $$||x||^2 = ||y||^2 + ||z|..
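
A concrete instance (mine, not from the post): in $\mathbb{R}^3$ with the standard inner product, take the orthonormal set $S = \{e_1, e_2\}$ and $x = (1, 2, 3)$. Then $$||x||^2 = 14 \geq |\langle x, e_1 \rangle|^2 + |\langle x, e_2 \rangle|^2 = 1 + 4 = 5,$$ and the gap $9 = |\langle x, e_3 \rangle|^2$ is exactly the part of $x$ lying outside $\langle S \rangle$.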

Math/Linear Algebra

Direct Sum

Sum Definition 1. Let $W_1, ..., W_k \leq V$. We define the sum of these subspaces to be the set $\{v_1 + \cdots + v_k \,|\, v_i \in W_i \text{ for } 1 \leq i \leq k\}$, which we denote by $$\sum_{i=1}^k W_i.$$ Direct Sum Definition 2. Let $W_1, ..., W_k \leq V$. We call $V$ the direct sum of $W_1, ..., W_k$ and write $$V = \bigoplus_{i=1}^k W_i,$$ if $V = \sum_{i=1}^k W_i$ and $W_j \cap \sum_{i..
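
A pair of examples (my own) in $\mathbb{R}^3$: the sum $\operatorname{span}\{e_1\} + \operatorname{span}\{e_2\} + \operatorname{span}\{e_3\}$ is direct, so $$\mathbb{R}^3 = \operatorname{span}\{e_1\} \oplus \operatorname{span}\{e_2\} \oplus \operatorname{span}\{e_3\}.$$ In contrast, $W_1 = \operatorname{span}\{e_1, e_2\}$ and $W_2 = \operatorname{span}\{e_2, e_3\}$ satisfy $W_1 + W_2 = \mathbb{R}^3$, but the sum is not direct since $W_1 \cap W_2 = \operatorname{span}\{e_2\} \neq \{\mathbf{0}\}$.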

Math/Linear Algebra

Orthogonal Complement

Orthogonal Complement Definition 1. Let $(V, \langle \cdot, \cdot \rangle)$ be an inner product space, and let $\emptyset \neq S \subseteq V$. We define $S^{\perp}$ to be $S^{\perp} = \{x \in V \,|\, \langle x, y\rangle = 0, \forall y \in S\}$. The set $S^{\perp}$ is called the orthogonal complement of $S$. The set collecting all vectors orthogonal to the vectors of $S$ is called the orthogonal complement of $S$. Trivially, $S^{\perp}..
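
For example (my own, with the standard inner product on $\mathbb{R}^3$): the definition allows any nonempty $S$, not just a subspace, so for the single-vector set $S = \{e_1\}$ we get $$\{e_1\}^{\perp} = \{(0, a, b) \,|\, a, b \in \mathbb{R}\} = \operatorname{span}\{e_2, e_3\},$$ which is a subspace of $\mathbb{R}^3$ even though $\{e_1\}$ itself is not.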

Math/Linear Algebra

Gram-Schmidt Process

Orthogonal Definition 1. Let $(V, \langle \cdot, \cdot \rangle)$ be an inner product space. Let $x, y \in V$, and let $S \subseteq V$. Then (a) $x$ and $y$ are orthogonal (or perpendicular) if $\langle x, y\rangle = 0$. (b) $S$ is orthogonal if any two distinct vectors in $S$ are orthogonal. If you learned the dot product in high school, you probably remember its definition as $x \cdot y = |x| |y| \cos \theta$. In that case, when $\theta = 90^{\circ}$, the two ..
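
A minimal run of the process (my own numbers, using the standard update $v_2 = w_2 - \frac{\langle w_2, v_1 \rangle}{||v_1||^2} v_1$, which the truncated preview does not reach): in $\mathbb{R}^2$ take $w_1 = (1, 1)$ and $w_2 = (1, 0)$. Then $v_1 = w_1$ and $$v_2 = (1, 0) - \frac{1}{2}(1, 1) = \left(\tfrac{1}{2}, -\tfrac{1}{2}\right), \qquad \langle v_1, v_2 \rangle = 0,$$ and normalizing gives the orthonormal set $\left\{ \tfrac{1}{\sqrt{2}}(1, 1), \tfrac{1}{\sqrt{2}}(1, -1) \right\}$.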

Math/Linear Algebra

Norm

Norm Definition 1. Let $V$ be a vector space over $F = \mathbb{R}$ or $\mathbb{C}$. A norm is a function $|| \cdot ||: V \longrightarrow \mathbb{R}$ such that $\forall x, y \in V, \forall a \in F$, the following hold: (a) $||x|| \geq 0$, and $||x|| = 0 \iff x = \mathbf{0}$. (b) $||ax|| = |a|\,||x||$. (c) $||x+y|| \leq ||x|| + ||y||$. Then $(V, ||\cdot||)$ is called a normed space. The size of a complex number is measured by its absolute value ..
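
To see the axioms in action (example mine): on $\mathbb{R}^2$ with $||x|| = \sqrt{x_1^2 + x_2^2}$, we get $||(3, 4)|| = 5$, $||2 \cdot (3, 4)|| = 10 = |2| \cdot 5$, and the triangle inequality holds in the instance $$||(3, 4) + (1, 0)|| = ||(4, 4)|| = 4\sqrt{2} \approx 5.66 \leq ||(3, 4)|| + ||(1, 0)|| = 6.$$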

Math/Linear Algebra

Adjoint of Matrix

Adjoint of Matrix Definition 1. Let $A \in M_{m \times n}(F)$. We define the adjoint or conjugate transpose of $A$ to be the $n \times m$ matrix $A^*$ such that $(A^*)_{ij} = \overline{A_{ji}}$ for all $i, j$. Theorem 1 Theorem 1. Let $A, B \in M_{m \times n}(F)$, and let $C \in M_{n \times p}(F)$. Then (a) $(A+B)^* = A^* + B^*$. (b) $(cA)^* = \overline{c} A^*, \forall c \in F$. (c) $(AC)^* = C^*A^*..
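
For instance (my own matrix), with $A \in M_{2 \times 3}(\mathbb{C})$, $$A = \begin{pmatrix} 1 & i & 0 \\ 2 & 3 & 1 - i \end{pmatrix} \quad \Longrightarrow \quad A^* = \begin{pmatrix} 1 & 2 \\ -i & 3 \\ 0 & 1 + i \end{pmatrix},$$ a $3 \times 2$ matrix obtained by transposing $A$ and conjugating every entry.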

Math/Linear Algebra

Inner Product Space

Throughout this post, $V$ is regarded as a vector space over $F$. Inner Product Definition 1. An inner product on $V$ is a function $\langle \cdot, \cdot \rangle: V \times V \longrightarrow F$, such that $\forall x, y, z \in V$ and $\forall c \in F$, the following hold: (a) $\langle x + z, y \rangle = \langle x, y \rangle + \langle z, y \rangle$. (b) $\langle cx, y \rangle = c \langle x, y \rangle$. (c) $\overline{ \langle x, y \r..
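
The canonical example (not quoted from the truncated preview) is the standard inner product on $F^n$, $\langle x, y \rangle = \sum_{i=1}^n x_i \overline{y_i}$. On $\mathbb{C}^2$, for instance, $$\langle (1, i), (i, 1) \rangle = 1 \cdot \overline{i} + i \cdot \overline{1} = -i + i = 0,$$ and the conjugation is what keeps $\langle x, x \rangle = \sum_i |x_i|^2$ real and nonnegative.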

Math/Linear Algebra

The Cayley-Hamilton Theorem

The Cayley-Hamilton Theorem Theorem 1. (The Cayley-Hamilton Theorem) Let $T \in \mathcal{L}(V)$, where $V$ is finite-dimensional, and let $f(t)$ be the characteristic polynomial of $T$. Then $f(T) = T_0$, the zero transformation. Proof. We need to show that $f(T)(v) = \mathbf{0}, \forall v \in V$. If $v = \mathbf{0}$, it is clear. Suppose that $v \neq \mathbf{0}$. Let $W$ be the $T$-cyclic subspace of ..
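
A direct $2 \times 2$ check (my own example): for $A = \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix}$ the characteristic polynomial is $f(t) = t^2 - 5t - 2$, and indeed $$A^2 - 5A - 2I = \begin{pmatrix} 7 & 10 \\ 15 & 22 \end{pmatrix} - \begin{pmatrix} 5 & 10 \\ 15 & 20 \end{pmatrix} - \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix} = \begin{pmatrix} 0 & 0 \\ 0 & 0 \end{pmatrix}.$$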
