lectures.alex.balgavy.eu

Lecture notes from university.
git clone git://git.alex.balgavy.eu/lectures.alex.balgavy.eu.git

lecture-1.md


+++
title = "Lecture 1"
template = "page-math.html"
+++

Some definitions:
- digit: 0 or 1
- word: sequence of digits
- length: number of digits in a word, written |word|
- binary code: a set C of words

Assumptions about the transmission channel:
- length of the sent word == length of the received word
- it is easy to find the beginning of the first word
- noise is scattered randomly (not in bursts)

Properties of a binary channel:
- symmetric if 0 and 1 are transmitted with the same accuracy
- reliability p: the probability that the digit sent == the digit received
    - we assume ½ ≤ p < 1

Information rate of a code of length n: $\frac{1}{n} \log_{2} |C|$

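For example, the length-3 repetition code $C = \lbrace 000, 111 \rbrace$ has information rate $\frac{1}{3} \log_{2} 2 = \frac{1}{3}$, while the code of all $2^{3}$ words of length 3 has rate 1.
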
## Most likely codeword
Let $\phi_{p} (v,w)$ be the probability that, if word v is sent over a BSC with reliability p, word w is received.

$\phi_{p} (v, w) = p^{n-d} (1-p)^d$ if v and w are words of length n that disagree in d positions.

If v₁ and w disagree in d₁ positions, and v₂ and w in d₂, then $\phi_{p} (v_{1}, w) \leq \phi_{p} (v_{2}, w)$ iff d₁ ≥ d₂.
- in English: the most likely sent word is the one that disagrees with the received word in the fewest digits

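A minimal Python sketch of $\phi_{p}$ (function and variable names are my own, not from the lecture):

```python
def phi(p: float, v: str, w: str) -> float:
    """P(w received | v sent) over a BSC with reliability p."""
    assert len(v) == len(w)
    d = sum(a != b for a, b in zip(v, w))  # number of positions where v and w disagree
    return p ** (len(v) - d) * (1 - p) ** d
```

For p = 0.9, `phi(0.9, "000", "000")` ≈ 0.729 while `phi(0.9, "000", "100")` ≈ 0.081, illustrating that fewer disagreements means a more likely received word.
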
## Weight and distance

K = {0, 1}, $K^{n}$ = the set of all binary words of length n

(Hamming) weight: the number of ones in a word

(Hamming) distance: the number of digits in which two words differ

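Both are one-liners in Python (a sketch, with my own names):

```python
def weight(v: str) -> int:
    """Hamming weight: the number of ones in v."""
    return v.count("1")

def distance(v: str, w: str) -> int:
    """Hamming distance: the number of positions where v and w differ."""
    assert len(v) == len(w)
    return sum(a != b for a, b in zip(v, w))
```

Note that distance(v, w) equals the weight of the error pattern v + w, where + is digit-wise addition mod 2.
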
## Max likelihood decoding (MLD)
Complete:
- if exactly one codeword is at minimum distance from the received word, decode to it
- else, arbitrarily select one of the closest codewords

Incomplete (IMLD), sketched in code below:
- if exactly one codeword is at minimum distance from the received word, decode to it
- else, ask for retransmission
    - i.e. look for the smallest weight among the error patterns v + w for the codewords v in C (e.g. the patterns 0 + w and 1 + w)
    - retransmit if two error patterns tie for the smallest weight

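A minimal IMLD sketch in Python (my own names; `None` signals a retransmission request):

```python
def imld(code: list[str], w: str) -> str | None:
    """Incomplete MLD: decode w to the unique closest codeword, else request retransmission."""
    dist = lambda v: sum(a != b for a, b in zip(v, w))  # Hamming distance from v to w
    dmin = min(dist(v) for v in code)
    closest = [v for v in code if dist(v) == dmin]
    return closest[0] if len(closest) == 1 else None  # None = ask for retransmission
```

Complete MLD differs only in the tie case: instead of returning None it would pick one of `closest` arbitrarily.
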
Reliability: the probability that, if v is sent over a BSC with reliability p, IMLD concludes that v was sent.

$\theta_{p} (C, v) = \sum_{w \in L(v)} \phi_{p} (v, w)$ where $L(v) = \lbrace w \in K^{n} \mid \text{IMLD correctly concludes v sent} \rbrace$
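
Reusing `phi` and `imld` from the sketches above, $\theta_{p}$ can be computed by brute force over $K^{n}$ (fine for small n):

```python
from itertools import product

def theta(p: float, code: list[str], v: str) -> float:
    """Probability that IMLD concludes v was sent, given that v was sent."""
    total = 0.0
    for bits in product("01", repeat=len(v)):
        w = "".join(bits)
        if imld(code, w) == v:  # i.e. w is in L(v)
            total += phi(p, v, w)
    return total
```

For example, `theta(0.9, ["000", "111"], "000")` ≈ 0.972: with the repetition code, IMLD decodes correctly exactly when at most one digit is flipped.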