== Orthogonality & least squares ==
Let $u,v \in \Re^n$. They are orthogonal iff:
* $u \cdot v = 0$
* or $\|u\|^2 + \|v\|^2 = \|u+v\|^2$

=== Inner (dot) product & uses ===
Let $u,v \in \Re^n$. Then $u \cdot v = u^T v \in \Re$.

In English: to calculate it, you multiply the vectors component-wise and sum all the results.

The regular algebraic rules apply (symmetry, linearity in each argument, scalar factoring).

$u \cdot u \geq 0$, with $u \cdot u = 0$ iff $u = 0$.

==== Length of a vector ====
Let $v \in \Re^n$, then the norm (length) of v is $\|v\| = \sqrt{v \cdot v}$.

Does the norm coincide with the length of line segments? Yes:

$v = \begin{bmatrix}a\\b\end{bmatrix}, \quad \|v\| = \sqrt{v \cdot v} = \sqrt{a^2 + b^2} = \text{Pythagoras}$

==== Distance between vectors ====
Let $u,v \in \Re^n$. Then $\text{dist}(u,v) = \|u-v\|$.

=== Orthogonal complement ===
Let $W \subseteq \Re^n$ be a subspace. The orthogonal complement of W is $W^\perp = \{x \in \Re^n \mid x \cdot v = 0 \;\forall v \in W \}$

Properties:
* $(\text{Col}\,A)^\perp = \text{Nul}\,A^T$
* $(\text{Nul}\,A)^\perp = \text{Col}\,A^T$

=== Orthogonal sets ===
A set $\{v_1 \dots v_p\}$ of nonzero vectors is orthogonal if $v_i \cdot v_j = 0 \;\forall i \neq j$. Such a set is linearly independent, hence a basis for $\text{Span}\{v_1 \dots v_p\}$.

An orthogonal basis is a basis that is also an orthogonal set.

Why an orthogonal basis? Let $W \subseteq \Re^n$ be a subspace with orthogonal basis $\{u_1 \dots u_p\}$. Then each $y \in W$ can be written $y = c_1 u_1 + \ldots + c_p u_p$, with $c_i = \frac{y \cdot u_i}{u_i \cdot u_i}$ for $i = 1 \dots p$.

An orthonormal set/basis is an orthogonal set/basis consisting of unit vectors (like $\{e_1, \ldots, e_n\}\text{ for }\Re^n$).

An m × n matrix A has orthonormal columns iff $A^T A = I_n$. In that case:
* $(Ax) \cdot (Ay) = x \cdot y$
* $\| Ax \| = \| x \|$

=== Orthogonal projections ===
==== Orthogonal decomposition ====
Let W be a subspace of $\Re^n$.
Each $y \in \Re^n$ can be written uniquely as $y = \hat{y}+z$ ($\hat{y} \in W,\; z \in W^\perp$).

If $\{u_1, \ldots, u_p\}$ is an orthogonal basis of W, then $\hat{y} = \frac{y \cdot u_1}{u_1 \cdot u_1} u_1 + \ldots + \frac{y \cdot u_p}{u_p \cdot u_p}u_p$

ŷ is the orthogonal projection of y onto W ($\text{proj}_W y$).

==== Best approximation ====
Let W be a subspace of $\Re^n$, y a vector in $\Re^n$, and ŷ the orthogonal projection of y onto W.

Then $\|y-\hat{y}\| < \|y-v\|$ for every $v \in W$ with $v \neq \hat{y}$.

==== When the basis for W is an orthonormal set ====
If $\{u_1 \ldots u_p\}$ is an orthonormal basis for a subspace W of $\Re^n$, then $\text{proj}_W y = (y \cdot u_1)u_1 + \dots + (y \cdot u_p) u_p$

If $U = \begin{bmatrix} u_1 & u_2 & \dots & u_p \end{bmatrix}$, then $\text{proj}_W y = UU^T y \quad \forall y \in \Re^n$

=== Gram-Schmidt process ===
An algorithm for producing an orthogonal or orthonormal basis for any nonzero subspace of $\Re^n$.

Given a basis $\{ x_1 \dots x_p \}$ for a nonzero subspace W of $\Re^n$, define:

{{$%align*$
v_1 &= x_1\\
v_2 &= x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1} v_1\\
v_3 &= x_3 - \frac{x_3 \cdot v_1}{v_1 \cdot v_1} v_1 - \frac{x_3 \cdot v_2}{v_2 \cdot v_2} v_2\\
\vdots \\
v_p &= x_p - \frac{x_p \cdot v_1}{v_1 \cdot v_1} v_1 - \dots - \frac{x_p \cdot v_{p-1}}{v_{p-1} \cdot v_{p-1}} v_{p-1}
}}$

Then $\{v_1 \dots v_p\}$ is an orthogonal basis for W.

$\text{Span}\{v_1 \dots v_k\} = \text{Span}\{x_1 \dots x_k\}$ for 1 ≤ k ≤ p.

==== QR factorization ====
If A is an m × n matrix with linearly independent columns, then A can be factored as $A = QR$, where Q is the m×n matrix whose columns form an orthonormal basis for Col A, and R is an n×n upper triangular invertible matrix with positive diagonal entries.

=== Least-squares problems ===
If a solution for $Ax = b$ does not exist but one is needed, try to find the x that best approximates a solution of $Ax = b$.
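The projection formula and the Gram-Schmidt steps above can be checked in a few lines of plain Python. This is a minimal sketch; the helper names (`dot`, `proj`, `gram_schmidt`) are mine for illustration, not a library API:

```python
# Sketch of orthogonal projection and Gram-Schmidt in plain Python.
# Helper names are illustrative, not from any library.

def dot(u, v):
    """Dot product: multiply component-wise and sum."""
    return sum(ui * vi for ui, vi in zip(u, v))

def sub(u, v):
    return [ui - vi for ui, vi in zip(u, v)]

def proj(y, basis):
    """proj_W y for an ORTHOGONAL basis of W: sum of (y.u_i / u_i.u_i) u_i."""
    p = [0.0] * len(y)
    for u in basis:
        c = dot(y, u) / dot(u, u)          # c_i = (y . u_i) / (u_i . u_i)
        p = [pi + c * ui for pi, ui in zip(p, u)]
    return p

def gram_schmidt(xs):
    """v_k = x_k minus its projection onto the span of the earlier v's."""
    vs = []
    for x in xs:
        v = x if not vs else sub(x, proj(x, vs))
        vs.append(v)
    return vs

# Example: orthogonalize a basis of a plane W in R^3, then decompose y = y_hat + z.
v1, v2 = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
print(dot(v1, v2))                     # ~0: the new basis is orthogonal
y = [1.0, 2.0, 3.0]
y_hat = proj(y, [v1, v2])
z = sub(y, y_hat)
print(dot(z, v1), dot(z, v2))          # ~0 ~0: z lies in W-perp, as the theorem says
```

The decomposition recovers the theorem directly: `y_hat` is in W, and the residual `z = y - y_hat` is orthogonal to every basis vector of W.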
The general least-squares problem is to find x that makes $\| b - Ax\|$ as small as possible.

If A is m×n and $b \in \Re^m$, a least-squares solution of $Ax = b$ is $\hat{x} \in \Re^n$ such that $\| b - A\hat{x} \| \leq \| b - Ax \| \quad \forall x \in \Re^n$.

The least-squares solution set of $Ax = b$ is the same as the solution set of the normal equations $A^T Ax = A^T b$.

Therefore, when $A^T A$ is invertible (equivalently, when A has linearly independent columns), $\hat{x} = (A^T A)^{-1} A^T b$.

Given an m×n matrix A with linearly independent columns, let $A = QR$ be a QR factorization of A. Then, for each $b \in \Re^m$, $Ax = b$ has the unique least-squares solution:

$\hat{x} = R^{-1} Q^T b$
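To make the normal equations concrete, here is a small sketch in plain Python that fits a line $y \approx c_0 + c_1 t$ to three points by solving $A^T A \hat{x} = A^T b$ directly. The data points and helper names are made up for illustration:

```python
# Sketch: least squares for an inconsistent system Ax = b via the normal
# equations A^T A x = A^T b. Data and helper names are illustrative.

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

# Fit y ~ c0 + c1*t through (0, 1), (1, 2), (2, 2).
# The points are not collinear, so Ax = b has no exact solution.
A = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
b = [[1.0], [2.0], [2.0]]

AtA = matmul(transpose(A), A)   # 2x2
Atb = matmul(transpose(A), b)   # 2x1

# Solve the 2x2 system AtA * x_hat = Atb by Cramer's rule.
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]
c0 = (Atb[0][0] * AtA[1][1] - AtA[0][1] * Atb[1][0]) / det
c1 = (AtA[0][0] * Atb[1][0] - Atb[0][0] * AtA[1][0]) / det
print(c0, c1)                   # best-fit intercept and slope

# The residual b - A*x_hat is orthogonal to both columns of A,
# i.e. it lies in (Col A)-perp, which is exactly the normal equations.
r = [b[i][0] - (A[i][0] * c0 + A[i][1] * c1) for i in range(3)]
print(sum(A[i][0] * r[i] for i in range(3)),
      sum(A[i][1] * r[i] for i in range(3)))   # ~0 ~0
```

The orthogonality check at the end is the geometric content of the normal equations: $\hat{x}$ is a least-squares solution exactly when $b - A\hat{x} \in (\text{Col}\,A)^\perp$.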