lectures.alex.balgavy.eu

Lecture notes from university.
git clone git://git.alex.balgavy.eu/lectures.alex.balgavy.eu.git

index.md


+++
title = 'Linear transformations'
template = 'page-math.html'
+++

# Linear transformations
definitions:
* transformation, function, mapping: rule assigning to each vector $x$ in $\Re^n$ a vector $T(x)$ in $\Re^m$
* domain: the set $\Re^n$
* codomain: the set $\Re^m$
* image: the vector $T(x)$
* range: the set of all images $T(x)$

a projection transformation happens if you go to a lower dimension (e.g. $x_3$ becomes 0). a shear transformation happens if a 2D square is tilted sideways into a parallelogram.
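as a quick sketch (plain Python; the shear factor 2 and the test vectors are arbitrary choices, not from the notes), both of these are just matrix maps $x \mapsto Ax$:

```python
# apply a matrix to a vector: T(x) = Ax
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# projection in R^3 onto the x1x2-plane: x3 becomes 0
P = [[1, 0, 0],
     [0, 1, 0],
     [0, 0, 0]]

# horizontal shear in R^2: tilts the unit square into a parallelogram
S = [[1, 2],
     [0, 1]]

print(matvec(P, [4, 5, 6]))  # [4, 5, 0] -- the x3 component is dropped
print(matvec(S, [0, 1]))     # [2, 1] -- the corner (0,1) slides sideways
```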

a transformation T is linear if:
i) $T(u + v) = T(u) + T(v)$ for all $u,v \in \text{Domain}(T)$
ii) $T(cu) = cT(u)$ for all scalars c and all $u \in \text{Domain}(T)$

linear transformations preserve the operations of vector addition and scalar multiplication.
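checking both conditions numerically for a sample map (the matrix $\begin{bmatrix}2 & 1\\0 & 3\end{bmatrix}$ and the vectors u, v, c are arbitrary choices for illustration):

```python
def T(x):
    # a sample linear map R^2 -> R^2: x -> [[2,1],[0,3]] x
    return [2 * x[0] + 1 * x[1], 3 * x[1]]

u, v, c = [1, 2], [3, -1], 4

# (i) additivity: T(u + v) == T(u) + T(v)
lhs = T([ui + vi for ui, vi in zip(u, v)])
rhs = [a + b for a, b in zip(T(u), T(v))]
assert lhs == rhs

# (ii) homogeneity: T(cu) == cT(u)
assert T([c * ui for ui in u]) == [c * ti for ti in T(u)]
```

a couple of sample vectors can't prove linearity, but a single failing pair is enough to disprove it.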

if T is a linear transformation, then:
* $T(0) = 0$
* $T(cu + dv) = cT(u) + dT(v)$
* $T(c_1 v_1 + \dots + c_p v_p) = c_1 T(v_1) + \dots + c_p T(v_p)$ (superposition principle)
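the superposition principle, checked for $p = 3$ (the map, vectors, and weights below are arbitrary picks):

```python
def T(x):
    # sample linear map R^2 -> R^2: x -> [[1,2],[3,4]] x
    return [x[0] + 2 * x[1], 3 * x[0] + 4 * x[1]]

vs = [[1, 0], [2, 5], [-3, 1]]   # v1, v2, v3
cs = [2, -1, 3]                  # c1, c2, c3

# T(c1 v1 + c2 v2 + c3 v3) ...
combo = [sum(c * v[i] for c, v in zip(cs, vs)) for i in range(2)]
lhs = T(combo)
# ... equals c1 T(v1) + c2 T(v2) + c3 T(v3)
rhs = [sum(c * T(v)[i] for c, v in zip(cs, vs)) for i in range(2)]
assert lhs == rhs
```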

given a scalar r, define $T: \Re^2 \rightarrow \Re^2$ by $T(x) = rx$
* contraction: when $0 \leq r \leq 1$
* dilation: when $r > 1$
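a minimal sketch of both cases (the factors 0.5 and 3 are arbitrary):

```python
def scale(r):
    # T(x) = rx as a function of the vector x
    return lambda x: [r * xi for xi in x]

contract = scale(0.5)  # 0 <= r <= 1: contraction
dilate = scale(3)      # r > 1: dilation

print(contract([4, 8]))  # [2.0, 4.0]
print(dilate([4, 8]))    # [12, 24]
```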

every linear transformation $T: \Re^n \rightarrow \Re^m$ is a matrix transformation $x \mapsto Ax$.

$A = [T(e_1) \dots T(e_n)]$, where $e_j$ is the jth column of the identity matrix in $\Re^n$
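recovering the standard matrix by feeding T the columns of the identity matrix, sketched in plain Python (the example T is the shear from above, chosen arbitrarily):

```python
def standard_matrix(T, n):
    # j-th column of A is T(e_j), where e_j is the j-th standard basis vector
    cols = []
    for j in range(n):
        e = [0] * n
        e[j] = 1
        cols.append(T(e))
    # collect the columns into a row-major matrix
    m = len(cols[0])
    return [[cols[j][i] for j in range(n)] for i in range(m)]

# hypothetical T: horizontal shear in R^2
T = lambda x: [x[0] + 2 * x[1], x[1]]
print(standard_matrix(T, 2))  # [[1, 2], [0, 1]]
```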

geometric linear transformations of $\Re^2$:

![Reflections](geo-reflections.png) ![Contractions/expansions and shears](geo-contract-shears.png) ![Projections](geo-projections.png)

types of mappings:
* $T: \Re^n \rightarrow \Re^m$ is 'onto' $\Re^m$ if _each_ b in $\Re^m$ is the image of _at least one_ x in $\Re^n$.
* $T: \Re^n \rightarrow \Re^m$ is one-to-one if _each_ b in $\Re^m$ is the image of _at most one_ x in $\Re^n$.
    * so T is one-to-one iff $T(x) = 0$ has only the trivial solution

for a mapping $T: \Re^n \rightarrow \Re^m$ with standard matrix $A$:
* T maps $\Re^n$ onto $\Re^m$ iff the columns of $A$ span $\Re^m$
* T is one-to-one iff the columns of $A$ are linearly independent.
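in the square case ($n = m$), the columns span $\Re^n$ exactly when they are linearly independent, so both conditions reduce to $\det A \neq 0$. a sketch for $2 \times 2$ matrices (the shear and projection matrices repeat the earlier examples):

```python
def det2(A):
    # determinant of a 2x2 matrix
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

shear = [[1, 2], [0, 1]]  # det = 1: one-to-one and onto R^2
proj  = [[1, 0], [0, 0]]  # det = 0: neither (x2 information is lost)

print(det2(shear) != 0)  # True
print(det2(proj) != 0)   # False
```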