Projection


Projection

Definition 1. Let $V$ be a vector space, let $W_1, \cdots, W_k \leq V$ be such that $V = \bigoplus_{i=1}^k W_i$, and let $T \in \mathcal{L}(V)$. Then $T$ is the projection on $W_j$ if, whenever $x = x_1 + \cdots + x_k$ with $x_i \in W_i \ (i = 1, \cdots, k)$, we have $T(x) = x_j$.
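For example, take $V = \mathbb{R}^2$, $W_1 = \text{span}\{(1, 0)\}$, and $W_2 = \text{span}\{(1, 1)\}$, so that $V = W_1 \oplus W_2$. Since $(a, b) = (a - b)(1, 0) + b(1, 1)$, the projection on $W_1$ is
$$T(a, b) = (a - b, 0),$$
which is not the orthogonal projection onto the $x$-axis; the projection on $W_j$ depends on the entire decomposition, not on $W_j$ alone.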

Theorem 1

Theorem 1. Let $V$ be a vector space and let $W_1, \cdots, W_k \leq V$ be such that $V = \bigoplus_{i=1}^k W_i$. If $T$ is the projection on $W_j$, then $R(T) = W_j$ and $N(T) = W'_j = \bigoplus_{i \neq j} W_i$, so $V = R(T) \oplus N(T)$.
Proof. Let $v \in W_j$. Then clearly $T(v) = v$, so $v \in R(T)$. Conversely, let $y \in R(T)$. Then $T(x) = y$ for some $x \in V$. Writing $x = x_1 + \cdots + x_k$ with $x_i \in W_i \ (i = 1, \cdots, k)$, we get $y = T(x) = x_j \in W_j$. Thus $R(T) = W_j$.
Now let $x \in N(T)$ and write $x = x_1 + \cdots + x_k$ with $x_i \in W_i \ (i = 1, \cdots, k)$. Then $T(x) = x_j = \mathbf{0}$, so $x = x_1 + \cdots + x_{j-1} + x_{j+1} + \cdots + x_k \in W'_j$. Conversely, let $v \in W'_j$. Then clearly $T(v) = \mathbf{0}$, so $v \in N(T)$. Thus $N(T) = W'_j$. Since $V = W_j \oplus W'_j$, it follows that $V = R(T) \oplus N(T)$. $\blacksquare$
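In the example above, $R(T) = W_1$ (the $x$-axis) and $N(T) = W_2 = \text{span}\{(1, 1)\}$, and indeed $\mathbb{R}^2 = R(T) \oplus N(T)$.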

Theorem 2

Theorem 2. Let $T \in \mathcal{L}(V)$ where $V$ is a vector space. Then $T$ is a projection $\iff$ $T = T^2$.
Proof.
($\Longrightarrow$)
Suppose that $T$ is a projection, say the projection on $W_j$, where $W_1, \cdots, W_k \leq V$ and $V = \bigoplus_{i=1}^k W_i$. Let $x \in V$ and write $x = x_1 + \cdots + x_k$ with $x_i \in W_i \ (i = 1, \cdots, k)$. Then $T^2(x) = T(T(x)) = T(x_j) = x_j = T(x)$. Thus $T = T^2$.
($\Longleftarrow$)
We claim that $V = W \oplus N(T)$, where $W = \{y \in V \mid T(y) = y\}$ (a subspace of $V$). For every $x \in V$ we have $x = T(x) + (x - T(x))$. Since $T = T^2$, we get $T(T(x)) = T^2(x) = T(x)$, so $T(x) \in W$, and $T(x - T(x)) = T(x) - T^2(x) = \mathbf{0}$, so $x - T(x) \in N(T)$. Hence $V = W + N(T)$.
Moreover, for any $y \in W \cap N(T)$ we have $y = T(y) = \mathbf{0}$, so $W \cap N(T) = \{\mathbf{0}\}$. This means that $V = W \oplus N(T)$.
Now let $x \in V$ and write $x = x_1 + x_2$ with $x_1 \in W$ and $x_2 \in N(T)$. Then $T(x_1) = x_1$ and $T(x_2) = \mathbf{0}$, so $T(x) = T(x_1) + T(x_2) = x_1$. Thus $T$ is the projection on $W$ with respect to the decomposition $V = W \oplus N(T)$. $\blacksquare$
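As a quick check of Theorem 2, the projection $T(a, b) = (a - b, 0)$ from the example above has the matrix $A = \begin{pmatrix} 1 & -1 \\ 0 & 0 \end{pmatrix}$ with respect to the standard basis of $\mathbb{R}^2$, and a direct computation gives
$$A^2 = \begin{pmatrix} 1 & -1 \\ 0 & 0 \end{pmatrix} \begin{pmatrix} 1 & -1 \\ 0 & 0 \end{pmatrix} = \begin{pmatrix} 1 & -1 \\ 0 & 0 \end{pmatrix} = A.$$
Conversely, applying the construction in the proof to this idempotent recovers $W = \{y \in \mathbb{R}^2 \mid T(y) = y\} = \text{span}\{(1, 0)\}$ and $N(T) = \text{span}\{(1, 1)\}$, so $T$ is the projection on the $x$-axis.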