Continuous Functions on Compact Metric Spaces
·
Mathematics/Real analysis
Theorem 44.1. If \( f \) is a continuous function from a compact metric space \( M_1 \) into a metric space \( M_2 \), then \( f(M_1) \) is compact.Proof. Let $\{ y_n \}$ be a sequence in $f(M_1)$. Then $y_n \in f(M_1), \forall n \in \mathbb{N}$, which means that $\exists x_n \in M_1$ such that $f(x_n) = y_n, \forall n \in \mathbb{N}$. Note that $\{ x_n \}_{n=1}^{\infty}$ is a sequen..
The Bolzano-Weierstrass Characterization of a Compact Metric Space
·
Mathematics/Real analysis
Lemma 43.1. If $M$ is a compact metric space, then every sequence in $M$ has a convergent subsequence.Proof. Suppose that there is a sequence $\{ x_n \}$ in $M$ that has no convergent subsequence. We claim that $\forall x \in M, \exists \varepsilon > 0$ such that the set $$\{ n \in \mathbb{N} \, | \, x_n \in B_{\varepsilon}(x) \}$$ is finite. $(\because)$ If not, then $\exists x \..
Wronskian
·
Mathematics/Linear Algebra
Definition. Let $f_1, ..., f_n$ be functions that are $(n-1)$-times differentiable. The Wronskian of $\{ f_1, ..., f_n \}$ is the matrix $$W = \begin{bmatrix} f_1 & f'_1 & f''_1 & \cdots & f^{(n-1)}_1 \\ f_2 & f'_2 & f''_2 & \cdots & f^{(n-1)}_2 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ f_n & f'_n & f''_n & \cdots & f^{(n-1)}_n \end{bmatrix}.$$Theorem. Let $W$ be the Wronskian of $\{ f_1..
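As a quick numerical check of the definition above, the sketch below builds the Wronskian matrix for the hypothetical functions $1, x, x^2$ (illustrative choices, not from the post) using sympy:

```python
import sympy as sp

x = sp.symbols('x')
fs = [sp.Integer(1), x, x**2]  # hypothetical example functions
n = len(fs)
# Entry (i, j) is the j-th derivative of f_i, matching the definition above
W = sp.Matrix(n, n, lambda i, j: sp.diff(fs[i], x, j))
det_W = sp.simplify(W.det())
print(det_W)  # 2, a nonzero constant
```

A nonzero determinant witnesses the linear independence of $\{1, x, x^2\}$, which is the direction the Wronskian theorem points in.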
Vandermonde Matrix
·
Mathematics/Linear Algebra
Definition. The Vandermonde matrix of order $n$ is the matrix $$V = \begin{bmatrix} 1 & x_1 & x^2_1 & \cdots & x_1^{n-1} \\ 1 & x_2 & x^2_2 & \cdots & x_2^{n-1} \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 1 & x_n & x^2_n & \cdots & x_n^{n-1} \end{bmatrix}.$$ Theorem. The determinant of the Vandermonde matrix $V$ of order $n$ is $$\det(V) = \prod_{1 \leq i < j \leq n} (x_j - x_i).$$
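The product formula can be checked numerically; a minimal sketch with hypothetical sample points (numpy's `vander` with `increasing=True` matches the row layout of the definition above):

```python
import numpy as np
from itertools import combinations

xs = np.array([1.0, 2.0, 4.0, 7.0])      # hypothetical sample points
n = len(xs)
V = np.vander(xs, increasing=True)       # row i is (1, x_i, x_i^2, ..., x_i^{n-1})
prod = 1.0
for i, j in combinations(range(n), 2):   # all pairs with i < j
    prod *= xs[j] - xs[i]
print(np.isclose(np.linalg.det(V), prod))  # True
```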
Basis of the Sum and Intersection
·
Mathematics/Linear Algebra
Theorem. Let \( V \) and \( W \) be two subspaces of \( \mathbb{R}^n \), and let $\alpha = \{v_1, ..., v_p\}$ and $\beta = \{w_1, ..., w_q \}$ be bases for \( V \) and \( W \), respectively. Put $$Q = \begin{bmatrix} | & & | & | & & | \\v_1 & \cdots & v_p & w_1 & \cdots & w_q \\| & & | & | & & |\end{bmatrix} \in M_{n \times (p + q)}(\mathbb{R}). $$ Then (1) $\mathcal{C}(Q) = V + W $ (2) $..
Structure of the System of Linear Equations
·
Mathematics/Linear Algebra
Theorem. Let $K$ be the solution set of a system of linear equations $Ax = b$. Then $\forall s \in K$, $K = \{s\} + \mathcal{N}(A) = \{ s + k \, | \, k \in \mathcal{N}(A) \}$. Proof. Fix $s \in K$. $\forall l \in K, Al = b = As \Longrightarrow A(l-s) = \mathbf{0}$. Then $l-s \in \mathcal{N}(A) \Longrightarrow \exists p \in \mathcal{N}(A)$ such that $l-s = p \Longrightarrow l = s+p \in \{s..
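A small numerical illustration of the theorem: every vector of the form $s + k$ with $k \in \mathcal{N}(A)$ solves $Ax = b$. The matrix, particular solution, and null-space vector below are hypothetical choices, not from the post:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0]])        # hypothetical 2x3 system with a nontrivial null space
b = np.array([6.0, 2.0])
s = np.array([2.0, 2.0, 0.0])          # a particular solution: A @ s == b
k = np.array([-1.0, -1.0, 1.0])        # spans N(A) here: A @ k == 0
assert np.allclose(A @ s, b)
assert np.allclose(A @ k, 0.0)
for t in (-2.0, 0.5, 3.0):
    assert np.allclose(A @ (s + t * k), b)   # s + N(A) stays inside the solution set
```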
Cramer's Rule
·
Mathematics/Linear Algebra
Cramer's Rule. Let $Ax = b$ be a system of linear equations for $A \in M_{n \times n}(F)$. If $A$ is invertible, then $Ax = b$ has the unique solution given by $$x_j = \frac{\det C_j}{\det A}, \quad j = 1, ..., n,$$ where $C_j$ is the matrix obtained from $A$ by replacing the $j$-th column with the column vector $b$.Proof. We have $\textbf{x} = A^{-1} \textbf{b}$. Then $$\textbf{x} ..
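Cramer's Rule translates directly into a few lines of code; a sketch on a hypothetical $2 \times 2$ system, cross-checked against `np.linalg.solve`:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])             # hypothetical invertible matrix
b = np.array([3.0, 5.0])
n = A.shape[0]
x = np.empty(n)
for j in range(n):
    C_j = A.copy()
    C_j[:, j] = b                      # replace the j-th column with b
    x[j] = np.linalg.det(C_j) / np.linalg.det(A)
print(np.allclose(x, np.linalg.solve(A, b)))  # True
```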
Row and Column Spaces
·
Mathematics/Linear Algebra
Given vectors $x_1, ..., x_n \in \mathbb{R}^m$, to decide whether they are linearly independent it suffices to solve the following equation: $$\begin{bmatrix} | & & | \\ x_1 & \cdots & x_n \\ | & & | \end{bmatrix} \begin{bmatrix} \alpha_1 \\ \vdots \\ \alpha_n \end{bmatrix} = \mathbf{0}$$ If a nonzero solution $(\alpha_1, ..., \alpha_n)$ exists, then $x_1, ..., x_n$ are linearly dependent; if only the trivial solution exists, they are linearly in..
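The criterion above can be phrased via the rank of the column matrix: a nonzero solution $\alpha$ exists exactly when the rank is smaller than the number of columns. A sketch with hypothetical vectors:

```python
import numpy as np

x1 = np.array([1.0, 0.0, 1.0])   # hypothetical vectors in R^3
x2 = np.array([0.0, 1.0, 1.0])
x3 = np.array([1.0, 1.0, 2.0])   # x3 = x1 + x2, so a nonzero solution exists
X = np.column_stack([x1, x2, x3])
dependent = np.linalg.matrix_rank(X) < X.shape[1]
print(dependent)  # True: the vectors are linearly dependent
```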
How To Compute The Inverse of a Matrix
·
Mathematics/Linear Algebra
Theorem 1. Let $A \in M_{n \times n}(F)$. Then (a) If $A$ is invertible, then $(A \vert I_n)$ can be transformed into $(I_n \vert A^{-1})$ by means of a finite number of elementary row operations, (b) If $A$ is invertible and $(A \vert I_n)$ is transformed into $(I_n \vert B)$ by means of a finite number of elementary row operations, then $B = A^{-1}$, (c) If $A$ is not invertible, then an..
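Parts (a) and (b) can be replayed with sympy's `Matrix.rref`, which performs exactly such a finite sequence of elementary row operations (the matrix below is a hypothetical example):

```python
import sympy as sp

A = sp.Matrix([[2, 1],
               [1, 1]])           # hypothetical invertible matrix
aug = A.row_join(sp.eye(2))       # form (A | I_n)
R, _ = aug.rref()                 # row-reduce; left half becomes I_n
B = R[:, 2:]                      # right half is then A^{-1}
print(B == A.inv())  # True
```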
Reduced Row Echelon Form
·
Mathematics/Linear Algebra
Forward Elimination and Backward Substitution. Definition 48. (1) Forward elimination is the procedure of turning the entries below each first nonzero coefficient in the nonzero rows of the augmented matrix of the system into zeros. (2) After the forward elimination, the first nonzero coefficients in the nonzero rows are the pivots. (3) The first nonzero number $1$'s at the pivotal positions are call..
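The pivots of the definition are exactly what sympy's `Matrix.rref` reports; a sketch on a hypothetical augmented matrix:

```python
import sympy as sp

# hypothetical augmented matrix of a small linear system
M = sp.Matrix([[1, 2, -1, 3],
               [2, 4,  1, 9],
               [1, 2,  0, 4]])
R, pivots = M.rref()   # R is the reduced row echelon form; pivot entries are 1
print(pivots)          # (0, 2): the columns holding the pivots
```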