lectures.alex.balgavy.eu

Lecture notes from university.

== Introduction ==
=== Linear Equations ===
"the study of linear equations"

a linear equation in the variables $x_1, \dots, x_n$ has the form $a_1 x_1 + \dots + a_n x_n = b$, with $a_1, \dots, a_n$ being the _coefficients_.
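
to make this concrete: $3x_1 - 5x_2 = -2$ is a linear equation, with coefficients $3$ and $-5$, while $x_1 x_2 = 4$ is not linear, because the variables are multiplied together instead of only being multiplied by constants and added.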

geometric interpretation:

{{$
\begin{alignat*}{3}
&n=1\qquad &&a_1 x_1 = b \longrightarrow x_1 = \frac{b}{a_1}\qquad &&\text{(a point on the line $\Re$)}\\
&n=2\qquad &&a_1 x_1 + a_2 x_2 = b \longrightarrow x_2 = \frac{b}{a_2} - \frac{a_1}{a_2} x_1\qquad &&\text{(a line in the plane $\Re^2$)}\\
&n=3\qquad &&a_1 x_1 + a_2 x_2 + a_3 x_3 = b\qquad &&\text{(a plane in 3D space, $\Re^3$)}
\end{alignat*}
}}$

in general, a single linear equation in $n$ variables describes an $(n-1)$-dimensional plane (hyperplane) in $n$-dimensional space.

a system of linear equations in the variables $x_1, \dots, x_n$ is a collection of linear equations in these variables, for example:

$x_1 - 2x_2 = -1$

$-x_1 + 3x_2 = 3$

If you graph them, you get this:

{{file:img/graph-example.png|System of equations graph}}

the solution is the intersection.
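
working it out: adding the two equations eliminates $x_1$ and gives $x_2 = 2$; substituting back into the first equation gives $x_1 = 2x_2 - 1 = 3$, so the intersection point (and unique solution) is $(x_1, x_2) = (3, 2)$.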

a system of linear equations has:
    1. no solutions (inconsistent) -- e.g. parallel lines
    2. exactly 1 solution (consistent) -- e.g. two lines crossing in a single point
    3. infinitely many solutions (consistent) -- e.g. the same line twice

two linear systems are "equivalent" if they have the same solution set.

=== Matrix notation ===
| Equation | (Augmented) coefficient matrix notation |
|----------|-----------------------------------------|
| $\begin{alignat*}{6} &x_1 &&-&&2x_2 &&+&&x_3 &&= 0\\ & && &&2x_2 &&-&&8x_3 &&= 8\\ &5x_1 && && &&-&&5x_3 &&= 10\end{alignat*}$ | $\begin{bmatrix} 1 & -2 & 1 & 0\\ 0 & 2 & -8 & 8\\ 5 & 0 & -5 & 10 \end{bmatrix}$ |

the strategy to solve is to replace the system with an equivalent system that is easier to solve.

elementary row operations:
    1. replacement: replace one row by the sum of itself and a multiple of another row
    2. scaling: multiply a row by a non-zero constant
    3. interchange: swap two rows

all of these are reversible & don't change the solution set.
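
for example, a replacement operation $R_3 \to R_3 - 5R_1$ on the augmented matrix from the table above creates a zero in the first column of the third row:

{{$
\begin{bmatrix} 1 & -2 & 1 & 0\\ 0 & 2 & -8 & 8\\ 5 & 0 & -5 & 10 \end{bmatrix} \sim \begin{bmatrix} 1 & -2 & 1 & 0\\ 0 & 2 & -8 & 8\\ 0 & 10 & -10 & 10 \end{bmatrix}
}}$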

Matrices A and B are row equivalent ($A \sim B$) if there is a sequence of elementary row operations that transforms A into B.

If the augmented matrices of two systems are row equivalent, then the systems are equivalent.

Matrix A is in echelon form if:
    a) zero rows are below non-zero rows
    b) the leading entry of a row is contained in a column that is to the left of the leading entry of the row below it.

A is in reduced echelon form if:
    a) A is in echelon form
    b) all leading entries are 1
    c) each leading entry is the only non-zero entry in its column
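
for illustration: the first matrix below is in echelon form but not reduced echelon form (the leading $2$, and the $-3$ sitting above a leading $1$, break the rules); the second is in reduced echelon form:

{{$
\begin{bmatrix} 2 & -3 & 1\\ 0 & 1 & 4\\ 0 & 0 & 0 \end{bmatrix} \text{ (echelon form)} \qquad \begin{bmatrix} 1 & 0 & 5\\ 0 & 1 & 4\\ 0 & 0 & 0 \end{bmatrix} \text{ (reduced echelon form)}
}}$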

=== Reducing a matrix ===
The reduced echelon form of a matrix is unique.

every matrix is row-equivalent to a unique reduced echelon matrix.

the positions of the leading entries in an echelon matrix are unique:

$\begin{bmatrix} \textbf{1} & * & * & *\\ 0 & 0 & \textbf{1} & *\\ 0 & 0 & 0 & 0\\ 0 & 0 & 0 & 0 \end{bmatrix}$

the values in bold are pivot positions. the columns containing those values are pivot columns.

Row reduction algorithm:
    1. Start with the leftmost non-zero column (a pivot column).
    2. Select a non-zero entry in that column as the pivot and, if needed, interchange rows to move it into the pivot position.
    3. Create zeros below the pivot position using replacement operations.
    4. Ignore the row containing the pivot position & repeat steps 1-3 on the remaining rows until there are none left. The matrix will then be in echelon form.
    5. Starting with the rightmost pivot, scale each pivot to 1 and create zeros above the pivots in their pivot columns. The matrix will then be in reduced echelon form.

Side note: in practice, a computer chooses as pivot the entry that's largest in absolute value, to minimize the round-off error (partial pivoting).
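
a rough sketch of how the algorithm above could look in code, assuming numpy is available; the function name `rref`, the tolerance, and the test matrix are illustrative choices, and for simplicity it clears entries above and below each pivot in one pass instead of doing a separate backward phase:

{{{python
import numpy as np

def rref(matrix, tol=1e-12):
    """Return the reduced echelon form of `matrix` (works on a copy)."""
    A = np.array(matrix, dtype=float)
    rows, cols = A.shape
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # pick the entry largest in absolute value as pivot (partial pivoting)
        candidate = pivot_row + np.argmax(np.abs(A[pivot_row:, col]))
        if abs(A[candidate, col]) < tol:
            continue  # no usable pivot in this column, move one column right
        A[[pivot_row, candidate]] = A[[candidate, pivot_row]]  # interchange
        A[pivot_row] /= A[pivot_row, col]                      # scaling: make the pivot 1
        for r in range(rows):                                  # replacement: zeros above & below
            if r != pivot_row:
                A[r] -= A[r, col] * A[pivot_row]
        pivot_row += 1
    return A

# the augmented matrix from the matrix notation section
M = [[1, -2, 1, 0],
     [0, 2, -8, 8],
     [5, 0, -5, 10]]
print(rref(M))   # expected: [[1, 0, 0, 1], [0, 1, 0, 0], [0, 0, 1, -1]]
}}}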

Basic variables correspond to pivot columns. Free variables correspond to non-pivot columns. You solve the system by expressing the basic variables in terms of the free variables.

The solution can be written in parametric form, for example with $x_3$ being a free variable:

$\binom{x_1}{x_2} = \binom{1}{4} + x_3 \binom{5}{-1}, \qquad x_3 \in \Re$

if the system is consistent and there are any free variables, there are infinitely many solutions.
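
for instance, an augmented matrix in reduced echelon form that would produce exactly this solution set (constructed here for illustration) is $\begin{bmatrix} 1 & 0 & -5 & 1\\ 0 & 1 & 1 & 4 \end{bmatrix}$: the pivot columns are columns 1 and 2, so $x_1$ and $x_2$ are basic and $x_3$ is free, and reading off the rows gives $x_1 = 1 + 5x_3$ and $x_2 = 4 - x_3$.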

=== Vectors ===
A vector is a matrix with a single column. If you have a vector in the form $\begin{bmatrix} a\\b\end{bmatrix}$, you can draw it as an arrow from the origin ending at the point $(a,b)$.

To add vectors, add the corresponding entries together.
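
for example, $\begin{bmatrix} 1\\ -2 \end{bmatrix} + \begin{bmatrix} 3\\ 5 \end{bmatrix} = \begin{bmatrix} 4\\ 3 \end{bmatrix}$; multiplying by a scalar also works entry by entry: $2 \begin{bmatrix} 1\\ -2 \end{bmatrix} = \begin{bmatrix} 2\\ -4 \end{bmatrix}$.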

A vector equation $a_1 x_1 + a_2 x_2 + \dots + a_n x_n = b$ (where $a_1, \dots, a_n$ and $b$ are vectors) has the same solution set as the linear system whose augmented matrix is $\begin{bmatrix} a_1 & a_2 & \dots & a_n & b \end{bmatrix}$.

When asked whether $b$ is in $\text{Span} \{v_1, \dots, v_p\}$, you check whether the system with augmented matrix $\begin{bmatrix} v_1 & \dots & v_p & b \end{bmatrix}$ has a solution.
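
one way to do that check numerically, assuming numpy is available; this sketch uses the rank criterion ($b$ is in the span iff appending it as an extra column does not increase the rank), which these notes haven't introduced yet, and the vectors are made up for the example:

{{{python
import numpy as np

v1 = np.array([1.0, -2.0, 0.0])
v2 = np.array([3.0, 5.0, 1.0])
b  = np.array([5.0, 1.0, 1.0])        # is b in Span{v1, v2}?

A = np.column_stack([v1, v2])          # coefficient matrix [v1 v2]
augmented = np.column_stack([v1, v2, b])

in_span = np.linalg.matrix_rank(A) == np.linalg.matrix_rank(augmented)
print(in_span)                         # True here, since b = 2*v1 + v2
}}}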

$b$ is a linear combination of the columns of $A$ if $Ax = b$ has a solution.

The span is the set of all linear combinations of the vectors.

To calculate $Ax$, if the number of columns in A is the same as the number of entries in x, you can follow the definition:

{{$
Ax = \begin{bmatrix} a_1 & a_2 & \dots & a_n \end{bmatrix} \begin{bmatrix} x_1 \\ \vdots \\ x_n \end{bmatrix} = x_1 a_1 + x_2 a_2 + \dots + x_n a_n
}}$
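
a quick numerical check of this definition, assuming numpy; the matrix is the coefficient matrix from the matrix notation section and $x$ is its solution, so both computations should give the right-hand side $(0, 8, 10)$:

{{{python
import numpy as np

A = np.array([[1, -2, 1],
              [0,  2, -8],
              [5,  0, -5]])
x = np.array([1, 0, -1])

# Ax as a linear combination of the columns of A: x_1*a_1 + x_2*a_2 + x_3*a_3
by_columns = sum(x[i] * A[:, i] for i in range(A.shape[1]))
print(by_columns)   # [ 0  8 10]
print(A @ x)        # same result, computed directly
}}}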

You also have the rules (matrix A, vectors u and v, scalar c):
    * $A(u+v) = Au + Av$
    * $A(cu) = c(Au)$