The adjoint of a matrix was defined as the conjugate transpose of the original matrix. We would like to define the adjoint of a linear transformation analogously: for a linear transformation $T$, it is natural to look for a linear transformation $U$ satisfying $([T]_{\beta}^{\gamma})^* = [U]_{\gamma}^{\beta}$ and to define that $U$ to be the adjoint of $T$.
Adjoint of Linear Transformation
Definition 1. Let $T \in \mathcal{L}(V, W)$ where $V$ and $W$ are finite-dimensional inner product spaces with inner products $\langle \cdot, \cdot \rangle _1$ and $\langle \cdot, \cdot \rangle _2$, respectively. A function $T^*: W \longrightarrow V$ is called an adjoint of $T$ if $\langle T(x), y \rangle _2 = \langle x, T^*(y) \rangle _1, \forall x \in V, \forall y \in W$.
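As a quick numerical sanity check of the defining identity (the matrix and vectors below are arbitrary examples, not from the text): for $T = L_A : \mathbb{C}^2 \to \mathbb{C}^3$ with the standard inner products, the conjugate transpose $A^*$ plays the role of $T^*$.

```python
# Sanity check: in C^n with the standard inner product <x, y> = sum_i x_i conj(y_i)
# (conjugate-linear in the second slot, matching the convention used here),
# the conjugate transpose A* satisfies <Ax, y>_2 = <x, A*y>_1.

def inner(x, y):
    # Standard inner product, conjugate-linear in the second argument.
    return sum(a * b.conjugate() for a, b in zip(x, y))

def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

def conj_transpose(A):
    # (A*)_{ji} = conjugate of A_{ij}
    return [[A[i][j].conjugate() for i in range(len(A))] for j in range(len(A[0]))]

A = [[1 + 2j, 3 - 1j],
     [0 + 1j, 2 + 0j],
     [4 - 3j, 1 + 1j]]            # a 3x2 matrix: L_A maps C^2 into C^3
x = [1 - 1j, 2 + 3j]              # x in C^2
y = [2 + 1j, -1 + 0j, 0 - 2j]     # y in C^3

lhs = inner(matvec(A, x), y)                   # <T(x), y>_2
rhs = inner(x, matvec(conj_transpose(A), y))   # <x, T*(y)>_1
assert abs(lhs - rhs) < 1e-9
```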
Theorem 1
Theorem 1. Let $\mathsf{g} \in V^*$ where $V$ is a finite-dimensional inner product space over $F$. Then $\exists! \, y \in V$ such that $\mathsf{g}(x) = \langle x, y\rangle, \forall x \in V$.
Proof. Let $\beta = \{v_1, ..., v_n\}$ be an orthonormal basis for $V$, and define $$y = \sum_{i=1}^n \overline{\mathsf{g}(v_i)} v_i.$$ Since $\beta$ is orthonormal, $$x = \sum_{i=1}^n \langle x, v_i \rangle v_i,$$ so $$\mathsf{g}(x) = \mathsf{g}\left(\sum_{i=1}^n \langle x, v_i \rangle v_i\right) = \sum_{i=1}^n \langle x, v_i \rangle \mathsf{g}(v_i) = \left\langle x, \sum_{i=1}^n \overline{\mathsf{g}(v_i)} v_i \right\rangle = \langle x, y\rangle.$$ For uniqueness, suppose $\mathsf{g}(x) = \langle x, z\rangle, \forall x \in V$, for some $z \in V$. Then $\langle x, z \rangle = \langle x, y\rangle$, i.e. $\langle x, z - y \rangle = 0, \forall x \in V$; taking $x = z - y$ gives $z = y$. $\blacksquare$
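The construction in the proof can be illustrated numerically (the functional $\mathsf{g}$ below is a made-up example on $\mathbb{C}^3$, using the standard basis, which is orthonormal under the standard inner product):

```python
# Riesz-style construction from Theorem 1: y = sum_i conj(g(v_i)) v_i
# represents g, i.e. g(x) = <x, y> for all x.

def inner(x, y):
    # Standard inner product on C^3, conjugate-linear in the second slot.
    return sum(a * b.conjugate() for a, b in zip(x, y))

def g(x):
    # An example linear functional on C^3 (arbitrary coefficients).
    return (2 + 1j) * x[0] + (0 - 3j) * x[1] + (1 + 1j) * x[2]

basis = [[1 + 0j, 0j, 0j], [0j, 1 + 0j, 0j], [0j, 0j, 1 + 0j]]  # orthonormal
coeffs = [g(e).conjugate() for e in basis]                       # conj(g(v_i))
y = [sum(c * e[i] for c, e in zip(coeffs, basis)) for i in range(3)]

x = [1 - 2j, 3 + 0j, 0 + 1j]   # an arbitrary test vector
assert abs(g(x) - inner(x, y)) < 1e-9
```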
Theorem 2
Theorem 2. Let $T \in \mathcal{L}(V, W)$ where $V$ and $W$ are finite-dimensional inner product spaces with inner products $\langle \cdot, \cdot \rangle _1$ and $\langle \cdot, \cdot \rangle _2$, respectively. Then there exists a unique adjoint $T^*$ of $T$, and $T^* \in \mathcal{L}(W, V)$.
Proof. First, any function $T^*: W \longrightarrow V$ satisfying the defining identity is linear: $\langle y, T^*(cx + z) \rangle _1 = \langle T(y), cx + z \rangle _2$ = $\overline{c} \langle T(y), x \rangle _2 + \langle T(y), z \rangle _2$ = $\overline{c} \langle y, T^*(x) \rangle _1 + \langle y, T^*(z) \rangle _1$ = $\langle y, cT^*(x) + T^*(z) \rangle _1$ for all $x, z \in W, y \in V$, and $c \in F$. Since this holds for every $y \in V$, $T^*(cx + z) = cT^*(x) + T^*(z).$
Fix $y \in W$. Define $\mathsf{g}(x) = \langle T(x), y \rangle _2, \forall x \in V$. Then $\mathsf{g}$ is linear, so $\mathsf{g} \in V^*$; by Theorem 1, $\exists! \, y' \in V$ such that $\mathsf{g}(x) = \langle x, y' \rangle _1, \forall x \in V$.
Then by defining $T^*(y) = y', \forall y \in W$, we have $\langle T(x), y \rangle _2 = \langle x, T^*(y) \rangle _1, \forall x \in V$. Then $T^*$ is an adjoint of $T$.
Suppose that $\exists U \in \mathcal{L}(W, V)$ such that $\langle T(x), y \rangle _2 = \langle x, U(y) \rangle _1, \forall x \in V, y \in W$. Then $\langle x, U(y) \rangle _1 = \langle x, T^*(y) \rangle _1, \forall x \in V$, so we have $U(y) = T^*(y), \forall y \in W$. $\blacksquare$
Note
Note. $\langle x, T(y) \rangle _2 = \overline{\langle T(y), x\rangle _2} = \overline{\langle y, T^*(x) \rangle _1} = \langle T^*(x), y \rangle _1, \forall x \in W, \forall y \in V$.
Theorem 3
Theorem 3. Let $T \in \mathcal{L}(V, W)$, and let $\beta$ and $\gamma$ be orthonormal bases for $V$ and $W$, where $V$ and $W$ are finite-dimensional inner product spaces with inner products $\langle \cdot, \cdot \rangle _1$ and $\langle \cdot, \cdot \rangle _2$, respectively. Then $[T^*]_{\gamma}^{\beta} = ([T]_{\beta}^{\gamma})^*$.
Proof. Let $[T^*]_{\gamma}^{\beta} = A$ and $([T]_{\beta}^{\gamma})^* = B$, where $\beta = \{v_1, ..., v_n\}$ and $\gamma = \{w_1, ..., w_m\}$. Since $\beta$ and $\gamma$ are orthonormal, $$A_{ij} = \langle T^*(w_j), v_i \rangle _1 = \langle w_j, T(v_i) \rangle _2 = \overline{\langle T(v_i), w_j \rangle _2} = \overline{([T]_{\beta}^{\gamma})_{ji}} = B_{ij}.$$ Thus $A = B$. $\blacksquare$
Corollary 1
Corollary. Let $A \in M_{m \times n}(F)$. Then $L_{A^*} = (L_A)^*$.
Proof. Note that $[L_{A^*}]_{\gamma}^{\beta} = A^* = ([L_A]_{\beta}^{\gamma})^* = [(L_A)^*]_{\gamma}^{\beta},$ where $\beta$ and $\gamma$ are the standard ordered bases for $F^n$ and $F^m$, respectively. Thus we have $L_{A^*} = (L_A)^*$. $\blacksquare$
Theorem 4
Theorem 4. Let $T, U \in \mathcal{L}(V, W)$, and let $P \in \mathcal{L}(W, Z)$, where $V, W$, and $Z$ are finite-dimensional inner product spaces. Then
(a) $(T+U)^* = T^* + U^*$.
(b) $(cT)^* = \overline{c} T^*, \forall c \in F$.
(c) $(PT)^* = T^*P^*$.
(d) $T^{**} = T$.
(e) $I^* = I$.
Proof. Let $\langle \cdot, \cdot \rangle _1, \langle \cdot, \cdot \rangle _2$ and $\langle \cdot, \cdot \rangle _3$ be inner products of $V, W$ and $Z$, respectively. For all $x \in V, y \in W$ and $z \in Z$,
(a) Note that $$\langle (T^* + U^*)(y), x \rangle _1 = \langle T^*(y), x \rangle _1 + \langle U^*(y), x \rangle _1 = \langle y, T(x) \rangle _2 + \langle y, U(x) \rangle _2 \\ = \langle y, T(x) + U(x) \rangle _2 = \langle (T + U)^*(y), x \rangle _1.$$ Thus $(T+U)^* = T^* + U^*$.
(b) Note that $$\langle (\overline{c} T^*)(y), x \rangle _1 = \overline{c} \langle T^*(y), x \rangle _1 = \overline{c} \langle y, T(x) \rangle _2 = \langle y, cT(x) \rangle _2 = \langle (cT)^*(y), x \rangle _1.$$ Thus $(cT)^* = \overline{c} T^*$.
(c) Note that $$\langle (T^*P^*)(z), x \rangle _1 = \langle P^*(z), T(x) \rangle _2 = \langle z, PT(x) \rangle _3 = \langle (PT)^*(z), x \rangle _1.$$ Thus $(PT)^* = T^*P^*$.
(d) Note that $$\langle T(x), y \rangle _2 = \langle x, T^*(y) \rangle _1 = \langle T^{**}(x), y \rangle _2.$$ Thus $T^{**} = T$.
(e) Note that $$\langle I(x), y \rangle _1 = \langle x, y \rangle _1 = \langle x, I(y) \rangle _1$$ for all $x, y \in V$. Thus $I^* = I$. $\blacksquare$
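The algebraic identities in Theorem 4 can be spot-checked at the matrix level, since the adjoint of $L_A$ corresponds to the conjugate transpose $A^*$ (the matrices $A$, $P$ and the scalar $c$ below are arbitrary examples):

```python
# Spot-check of Theorem 4 (b) and (c) for matrices over C:
# (cA)* = conj(c) A*   and   (PA)* = A* P*.

def conj_transpose(A):
    return [[A[i][j].conjugate() for i in range(len(A))] for j in range(len(A[0]))]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

def scale(c, A):
    return [[c * v for v in row] for row in A]

A = [[1 + 1j, 2 - 1j], [0 + 3j, 1 + 0j]]   # plays the role of T
P = [[2 - 2j, 1 + 0j], [1 + 1j, 0 - 1j]]   # plays the role of P (composable with A)
c = 3 - 2j

assert conj_transpose(matmul(P, A)) == matmul(conj_transpose(A), conj_transpose(P))  # (c)
assert conj_transpose(scale(c, A)) == scale(c.conjugate(), conj_transpose(A))        # (b)
```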
Theorem 5
Theorem 5. Let $T \in \mathcal{L}(V, W)$ where $V$ and $W$ are finite-dimensional inner product spaces. Then rank($T^*$) = rank($T^*T$) = rank($TT^*$) = rank($T$).
Proof. Let $A = [T]_{\beta}^{\gamma}$ where $\beta$ and $\gamma$ are orthonormal bases for $V$ and $W$, respectively. By Theorem 3, $[T^*]_{\gamma}^{\beta} = A^*$, so rank($T$) = rank($A$) and rank($T^*$) = rank($A^*$). Since rank($A^*$) = rank($A^*A$) = rank($AA^*$) = rank($A$), the result follows. $\blacksquare$
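Theorem 5 can likewise be checked numerically; the sketch below (with a deliberately rank-deficient example matrix and a naive row-reduction rank routine, both of my own choosing) verifies rank($A$) = rank($A^*$) = rank($A^*A$) = rank($AA^*$):

```python
# Numerical spot-check of Theorem 5 on a rank-1 matrix A.

def conj_transpose(A):
    return [[A[i][j].conjugate() for i in range(len(A))] for j in range(len(A[0]))]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

def rank(M, tol=1e-9):
    # Rank via Gaussian elimination with partial pivoting (illustrative, not robust).
    M = [row[:] for row in M]
    m, n = len(M), len(M[0])
    r = 0
    for col in range(n):
        if r == m:
            break
        pivot = max(range(r, m), key=lambda i: abs(M[i][col]))
        if abs(M[pivot][col]) < tol:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(m):
            if i != r:
                f = M[i][col] / M[r][col]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A = [[1 + 0j, 2 + 0j, 1 + 1j],
     [2 + 0j, 4 + 0j, 2 + 2j]]   # second row = 2 * first row, so rank(A) = 1
As = conj_transpose(A)
assert rank(A) == rank(As) == rank(matmul(As, A)) == rank(matmul(A, As)) == 1
```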
Theorem 6
Theorem 6. Let $T \in \mathcal{L}(V, W)$ where $V$ and $W$ are finite-dimensional inner product spaces with inner products $\langle \cdot, \cdot \rangle _1$ and $\langle \cdot, \cdot \rangle _2$, respectively. Then $R(T^*)^{\perp} = N(T)$.
Proof. Let $x \in R(T^*)^{\perp}$. For every $s \in V$, $T^*(s) \in R(T^*)$, so $$\langle T(x), s \rangle _2 = \langle x, T^*(s) \rangle _1 = 0.$$ Since $s \in V$ is arbitrary, $T(x) = \mathbf{0}$, i.e. $x \in N(T)$. Thus $R(T^*)^{\perp} \subseteq N(T)$.
Conversely, let $z \in N(T)$. Every $y \in R(T^*)$ can be written as $y = T^*(s)$ for some $s \in V$, and $$\langle z, y \rangle _1 = \langle z, T^*(s) \rangle _1 = \langle T(z), s \rangle _2 = \langle \mathbf{0}, s \rangle _2 = 0.$$ Thus $z \in R(T^*)^{\perp}$. Hence $R(T^*)^{\perp} = N(T).$ $\blacksquare$
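A concrete instance of Theorem 6 (the matrix $A$ and null vector below are a made-up example): for $T = L_A$, a vector in $N(T)$ is orthogonal to $R(T^*)$, which is spanned by the columns of $A^*$.

```python
# Check: z in N(L_A) is orthogonal to every spanning vector of R((L_A)*).

def inner(x, y):
    # Standard inner product on C^2, conjugate-linear in the second slot.
    return sum(a * b.conjugate() for a, b in zip(x, y))

A = [[1 + 1j, 2 + 2j],
     [2 - 2j, 4 - 4j]]      # rank 1, so N(L_A) is nontrivial
z = [2 + 0j, -1 + 0j]       # both rows of A annihilate z, so z is in N(L_A)

# The columns of A* are the conjugated rows of A; they span R((L_A)*).
cols_of_A_star = [[v.conjugate() for v in row] for row in A]
for col in cols_of_A_star:
    assert abs(inner(z, col)) < 1e-9   # z is orthogonal to R((L_A)*)
```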
Theorem 7
Theorem 7. Let $T \in \mathcal{L}(V, W)$ where $V$ and $W$ are finite-dimensional inner product spaces. If $T$ is invertible, then $T^*$ is invertible and $(T^*)^{-1} = (T^{-1})^*$.
Proof. Let $A = [T]_{\beta}^{\gamma}$ where $\beta$ and $\gamma$ are orthonormal bases for $V$ and $W$, respectively. Since $T$ is invertible, $AA^{-1} = A^{-1}A = I$. Taking conjugate transposes, $(AA^{-1})^* = (A^{-1})^*A^* = I$ and $(A^{-1}A)^* = A^*(A^{-1})^* = I$. Thus $A^*$ is invertible with $(A^*)^{-1} = (A^{-1})^*$, so $T^*$ is invertible.
Since $A^* = [T^*]_{\gamma}^{\beta}$ and $A^{-1} = [T^{-1}]_{\gamma}^{\beta}$, we have $[(T^*)^{-1}]_{\beta}^{\gamma} = (A^*)^{-1} = (A^{-1})^* = [(T^{-1})^*]_{\beta}^{\gamma}$. Thus $(T^*)^{-1} = (T^{-1})^*$. $\blacksquare$
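Finally, a quick matrix-level check of Theorem 7 (the invertible matrix $A$ is an arbitrary example; `inv2` is a helper of my own using the 2×2 cofactor formula):

```python
# Check: (A*)^{-1} = (A^{-1})* for an invertible 2x2 complex matrix.

def conj_transpose(A):
    return [[A[i][j].conjugate() for i in range(len(A))] for j in range(len(A[0]))]

def inv2(A):
    # Closed-form inverse of a 2x2 matrix via the cofactor formula.
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

A = [[1 + 1j, 2 + 0j],
     [0 + 1j, 3 - 1j]]      # det = 4, so A is invertible

lhs = inv2(conj_transpose(A))   # (A*)^{-1}
rhs = conj_transpose(inv2(A))   # (A^{-1})*
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-9 for i in range(2) for j in range(2))
```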