+++
title = "Lecture 9"
template = "page-math.html"
+++

*[TM]: Turing Machine
*[BFS]: breadth-first search

# Variations of Turing machines
## With multiple tapes
A TM with two tapes can be simulated by a TM with one tape: a transition on two tapes translates to multiple transitions on one tape.
The difference in time complexity between a TM with one tape and a TM with multiple tapes is a polynomial factor.
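
To make the encoding concrete, here is a minimal sketch in Python (my own illustration, not from the lecture) of the standard multi-track trick: each cell of the single tape stores one symbol per simulated tape, plus a marker for where that tape's head is.

```python
from dataclasses import dataclass

@dataclass
class Cell:
    symbols: tuple  # one symbol per simulated tape
    heads: tuple    # one flag per simulated tape: is that head here?

def encode(tapes, heads, blank='_'):
    """Pack k tapes (lists of symbols) and their head positions into one tape."""
    k = len(tapes)
    length = max(len(t) for t in tapes)
    return [Cell(tuple(t[i] if i < len(t) else blank for t in tapes),
                 tuple(heads[j] == i for j in range(k)))
            for i in range(length)]

# One simulated step of the two-tape machine takes two sweeps over this
# single tape (read the marked symbols, then write and move the markers),
# which is where the polynomial slowdown comes from.
print(encode([list("abba"), list("01")], heads=[0, 1]))
```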

## Nondeterministic Turing machines
A nondeterministic TM has transition function $\delta : Q \times \Gamma \rightarrow 2^{Q \times \Gamma \times \lbrace L, R \rbrace}$

A nondeterministic TM can be simulated by a deterministic TM:
- the deterministic TM can use breadth-first search to simulate all executions of the nondeterministic TM in parallel (see the sketch below)
- the branches of the BFS can be stored on the tape in the form of a queue

The difference in time complexity between a deterministic and a nondeterministic TM is an exponential factor.
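
A minimal sketch of this simulation (my own, with an invented configuration encoding; `max_steps` is just a practical budget):

```python
from collections import deque

def simulate_ntm(delta, q0, finals, word, blank='_', max_steps=10_000):
    """delta maps (state, symbol) to a *set* of (state, symbol, move) triples."""
    start = (q0, word, 0)           # a configuration: (state, tape, head)
    queue = deque([start])          # the BFS queue the notes mention
    seen = {start}
    for _ in range(max_steps):
        if not queue:
            return False            # all branches halted without accepting
        state, tape, head = queue.popleft()
        if state in finals:
            return True             # some execution branch accepts
        symbol = tape[head] if head < len(tape) else blank
        for nstate, nsym, move in delta.get((state, symbol), ()):
            ntape = tape[:head] + nsym + tape[head + 1:] if head < len(tape) else tape + nsym
            nhead = head + 1 if move == 'R' else max(0, head - 1)
            conf = (nstate, ntape, nhead)
            if conf not in seen:
                seen.add(conf)
                queue.append(conf)
    return False                    # budget exhausted (simulation cutoff only)

# Example: accept words containing an 'a' (guess nondeterministically where).
delta = {('q0', 'a'): {('qf', 'a', 'R')},
         ('q0', 'b'): {('q0', 'b', 'R')}}
print(simulate_ntm(delta, 'q0', {'qf'}, 'bba'))  # True
```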

## Universal Turing machines
A TM is "universal" if it can simulate every TM.

A universal TM gets as input: a Turing machine M (described as a word w), and an input word u.
It simulates M on u.
The inputs can be written on two different tapes, or behind each other on one tape.

The theorem is that such a universal Turing machine exists. You just gotta believe it.
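
For intuition, here is a sketch (my own, not the lecture's construction) of the same idea one level up: an interpreter that receives a TM as data and runs it on an input word, just as a universal TM receives M encoded as a word w.

```python
def run_tm(tm, word, blank='_', max_steps=10_000):
    """tm = (delta, q0, finals); delta: (state, symbol) -> (state, symbol, move)."""
    delta, q0, finals = tm
    tape = dict(enumerate(word))     # sparse tape: position -> symbol
    state, head = q0, 0
    for _ in range(max_steps):
        if state in finals:
            return True
        key = (state, tape.get(head, blank))
        if key not in delta:
            return False             # no applicable transition: halt and reject
        state, tape[head], move = delta[key]
        head += 1 if move == 'R' else -1
    raise RuntimeError('step budget exhausted')

# M: move right over a's, accept at the first blank (i.e. M accepts a*).
M = ({('q0', 'a'): ('q0', 'a', 'R'),
      ('q0', '_'): ('qf', '_', 'R')},
     'q0', {'qf'})
print(run_tm(M, 'aaa'))   # True
```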

# Unrestricted grammars
These correspond to Turing machines.

An unrestricted grammar contains rules x → y, where x ≠ λ (and that's the only constraint).

A language L is generated by an unrestricted grammar ↔ L is accepted by a Turing machine.

## Unrestricted grammars → Turing machines
For every unrestricted grammar there is a Turing machine such that the languages generated/accepted are identical.

How do you do it?
- the input for machine M is a word w, written on the tape
- M can do a BFS for a derivation of w from S (see the sketch below)
- if a derivation is found, then w is accepted by M
- then the two languages accepted/generated are equal
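
A minimal sketch of this BFS (my own; rule application as string replacement). Like the TM, it only semi-decides membership: on a word outside the language it may search forever, so `max_steps` is a practical cutoff.

```python
from collections import deque

def derives(rules, start, w, max_steps=1_000):
    """BFS for a derivation of w from start; rules is a list of (x, y) with x != ''."""
    queue, seen = deque([start]), {start}
    for _ in range(max_steps):
        if not queue:
            return False                 # search space exhausted: w not derivable
        form = queue.popleft()
        if form == w:
            return True                  # derivation found: accept
        for x, y in rules:               # apply every rule at every position
            i = form.find(x)
            while i != -1:
                new = form[:i] + y + form[i + len(x):]
                if new not in seen:
                    seen.add(new)
                    queue.append(new)
                i = form.find(x, i + 1)
    return False                         # budget exhausted (answer unknown)

rules = [('S', 'aSb'), ('S', 'ab')]      # generates { a^n b^n : n >= 1 }
print(derives(rules, 'S', 'aaabbb'))     # True
print(derives(rules, 'S', 'aab'))        # False (after exhausting the budget)
```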

## Turing machines → unrestricted grammars
For every Turing machine M there is an unrestricted grammar such that the languages generated/accepted are identical.

How do you do it?
- Sorry, the stuff he put in the slides is too abstract for me here. When I actually have to learn it, I'll try to write it in a way that's easier to understand. (The usual idea, for reference: the grammar derives a word together with an encoding of M's computation on it, and keeps the word only if the computation accepts.)

# Context-sensitive grammars
## Context-sensitive grammars
A grammar is context-sensitive if, for every rule x → y, |x| ≤ |y| and x ≠ λ.

For every context-sensitive grammar, there is a grammar that generates the same language, with rules of the form xAy → xvy (with v ≠ λ).

A language is context-sensitive if there is a context-sensitive grammar for it.
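
For example (a standard textbook grammar, not from this lecture), the language $\lbrace a^{n}b^{n}c^{n} \mid n \geq 1 \rbrace$ is context-sensitive:

- S → aSBc | abc
- cB → Bc
- bB → bb

Every rule is non-contracting, and for instance S ⇒ aSBc ⇒ aabcBc ⇒ aabBcc ⇒ aabbcc.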

## Linear bounded automata
A linear bounded automaton (LBA) is a nondeterministic TM (Q, Σ, Γ, δ, q₀, F).

There is no blank symbol □; instead, the symbols [ and ] are placed around the input word.

For every state q:
- every transition in δ(q,[) is of the form (q',[,R)
- every transition in δ(q,]) is of the form (q',],L)

So the memory is restricted to the length of the input word.

A word is accepted by a linear bounded automaton if a configuration of the form `[uqv]` is reachable from the start configuration, where q is a final state and u,v ∈ Γ\*.

## Context-sensitive grammars to LBAs
For every context-sensitive grammar, there is an LBA that accepts the same language.

Since the rules are non-contracting, a derivation of a word w contains only sentential forms of the same or smaller length.
A nondeterministic Turing machine can therefore simulate (guess) this derivation without leaving the bounds of w, as in the sketch below.
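
A minimal sketch of the in-place idea (my own; nondeterminism replaced by backtracking): since rules never shrink, w can be reduced back to S by applying rules in reverse, never using more than |w| cells. It reuses the $a^{n}b^{n}c^{n}$ grammar from above.

```python
def reduces_to_start(rules, start, w, seen=None):
    """Try every reverse rule application; length never grows, so this terminates."""
    if seen is None:
        seen = set()
    if w == start:
        return True
    if w in seen:
        return False
    seen.add(w)
    for x, y in rules:                      # rule x -> y with |x| <= |y|
        i = w.find(y)
        while i != -1:                      # reverse-apply: replace y by x
            if reduces_to_start(rules, start, w[:i] + x + w[i + len(y):], seen):
                return True
            i = w.find(y, i + 1)
    return False

rules = [('S', 'aSBc'), ('S', 'abc'), ('cB', 'Bc'), ('bB', 'bb')]
print(reduces_to_start(rules, 'S', 'aabbcc'))  # True
print(reduces_to_start(rules, 'S', 'aabcc'))   # False
```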

## LBAs to context-sensitive grammars
For every LBA, its language is context-sensitive.

Build an unrestricted grammar for the same language; then all production rules are context-sensitive except for □ → λ.
However, an LBA does not use □, since it never leaves the borders of the input word.
So, drop the rule □ → λ and the rules including □.

## Basic properties of context-sensitive languages
If $L_1$ and $L_2$ are context-sensitive, then so are $L_1 \cup L_2$, $L_1 \cap L_2$, $L_{1}^{R}$, $L_{1}L_{2}$, $L_{1}^{*}$, $\overline{L_1}$, $L_{1} \setminus L_2$.

Proofs:
- $L_1 \cup L_2$, $L_{1}^{R}$, $L_1 L_2$, $L_{1}^{*}$: using grammars/automata
- $L_1 \cap L_2$: using LBAs
- $L_1 \setminus L_2 = L_1 \cap \overline{L_2}$
- $\overline{L_1}$: proven by people smarter than us (Immerman, Szelepcsenyi)

We don't know if deterministic LBAs are as expressive as nondeterministic LBAs.

# Recursively enumerable languages
A language is recursively enumerable if it is accepted by a Turing machine.

Turing machines themselves are recursively enumerable:
- a TM can be represented as a word
- a parser can check whether a word represents a TM, and if so, accept
- so, there is a recursive enumeration of all TMs (see the sketch below)
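
A minimal sketch of that enumeration (my own; `looks_like_tm` is a hypothetical stand-in for a real parser of TM descriptions):

```python
from itertools import count, product

def all_words(alphabet):
    """Yield every word over `alphabet`: λ, then length 1, length 2, ..."""
    for n in count():
        for tup in product(alphabet, repeat=n):
            yield ''.join(tup)

def looks_like_tm(word):
    # Hypothetical parser: pretend a "TM description" is any word of even length.
    return len(word) % 2 == 0

def enumerate_tms(alphabet):
    return (w for w in all_words(alphabet) if looks_like_tm(w))

tms = enumerate_tms('01')
print([next(tms) for _ in range(5)])  # ['', '00', '01', '10', '11']
```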

## Properties of recursively enumerable languages
The class of recursively enumerable languages is closed under ∪ and ∩.
There exist recursively enumerable languages whose complement is not recursively enumerable.

# Recursive languages
A language is recursive if both it and its complement are recursively enumerable.
Not every recursively enumerable language is recursive.

A language is recursive ↔ it is accepted by a deterministic TM that halts for every input.

Context-sensitive languages are recursive, but not every recursive language is context-sensitive.

# The Chomsky hierarchy

![164376dff69f7c23d2933a9d4042b8c9.png](b6bfee1559554792afe8148c52bd516f.png)

# Countability
Turing machines are countable, languages aren't.

There are countably many Turing machines over an input alphabet Σ.
There are uncountably many languages over Σ.

Why:
- let a be a letter in Σ
- assume L₀, L₁, ... is an enumeration of all languages over {a}
- define a language L as follows: for every i ≥ 0, $a^{i} \in L \Leftrightarrow a^{i} \notin L_i$ (see the sketch below)
- then for every i ≥ 0, $L \neq L_i$, since L and $L_i$ differ on $a^{i}$
- then L is not part of the above enumeration; contradiction
- so, not all languages are recursively enumerable (the recursively enumerable languages *can* be enumerated, via TMs)
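
A minimal sketch of the diagonal step (my own illustration): represent each language over {a} by its membership function on word lengths, and flip the diagonal.

```python
def diagonal(enumeration):
    """Membership test for L: a^i in L  <=>  a^i not in L_i (identify a^i with i)."""
    return lambda i: not enumeration(i)(i)

def toy_enumeration(i):
    """A toy 'enumeration of languages': L_i = { a^j : j <= i }."""
    return lambda j: j <= i

L = diagonal(toy_enumeration)
for i in range(4):
    # L disagrees with L_i on a^i, so L equals none of the L_i.
    print(i, L(i), toy_enumeration(i)(i))   # always opposite truth values
```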