lectures.alex.balgavy.eu

Lecture notes from university.
git clone git://git.alex.balgavy.eu/lectures.alex.balgavy.eu.git

commit 6aa9c763cafb5ee9cbfb3d615bdf86ab3674ea72
parent eaf1c210d9d41b96b1ca6eda9bdfc1daa00cdf01
Author: Alex Balgavy <alex@balgavy.eu>
Date:   Sun, 13 Jun 2021 16:25:28 +0200

Physical computing notes

Diffstat:
Mcontent/_index.md | 2+-
Dcontent/physcomp-notes/Audio signals.html | 4----
Acontent/physcomp-notes/Audio signals.md | 61+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Dcontent/physcomp-notes/Classification 2.html | 4----
Rcontent/physcomp-notes/Classification 2.resources/screenshot_4.png -> content/physcomp-notes/Classification 2/41a367424533a6e08fb95638b9c2b11e.png | 0
Rcontent/physcomp-notes/Classification 2.resources/screenshot_3.png -> content/physcomp-notes/Classification 2/6690bab9dc8c17cf8396b94cd09f3d6f.png | 0
Rcontent/physcomp-notes/Classification 2.resources/screenshot_1.png -> content/physcomp-notes/Classification 2/9ae5328bcc3da5561497f7ac32aff1a6.png | 0
Rcontent/physcomp-notes/Classification 2.resources/screenshot_2.png -> content/physcomp-notes/Classification 2/e57408d5aa6439eabd8137bda295d117.png | 0
Rcontent/physcomp-notes/Classification 2.resources/screenshot.png -> content/physcomp-notes/Classification 2/f43f65c9be0fe3566b08e933c48e957a.png | 0
Acontent/physcomp-notes/Classification 2/index.md | 67+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Dcontent/physcomp-notes/Classification.html | 4----
Rcontent/physcomp-notes/Classification.resources/screenshot_1.png -> content/physcomp-notes/Classification/4f6a745e243088e8189c73f7eca571a0.png | 0
Rcontent/physcomp-notes/Classification.resources/screenshot.png -> content/physcomp-notes/Classification/b7159d61e48e0091c9bcc952dbdd9472.png | 0
Acontent/physcomp-notes/Classification/index.md | 106+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Dcontent/physcomp-notes/Control systems.html | 4----
Acontent/physcomp-notes/Control systems.md | 43+++++++++++++++++++++++++++++++++++++++++++
Dcontent/physcomp-notes/Image processing.html | 89-------------------------------------------------------------------------------
Rcontent/physcomp-notes/Image processing.resources/screenshot_1.png -> content/physcomp-notes/Image processing/0fd98b013f8e2be2b21982bb51ab196d.png | 0
Rcontent/physcomp-notes/Image processing.resources/screenshot_2.png -> content/physcomp-notes/Image processing/8f49a40caa697b2f19f15994ac58ee51.png | 0
Rcontent/physcomp-notes/Image processing.resources/screenshot.png -> content/physcomp-notes/Image processing/ae1cea520f55cfdc0379fdf98f0651cd.png | 0
Acontent/physcomp-notes/Image processing/index.md | 52++++++++++++++++++++++++++++++++++++++++++++++++++++
Dcontent/physcomp-notes/Images.html | 4----
Acontent/physcomp-notes/Images.md | 19+++++++++++++++++++
Dcontent/physcomp-notes/Morphological operations.html | 4----
Rcontent/physcomp-notes/Morphological operations.resources/screenshot.png -> content/physcomp-notes/Morphological operations/5823a4727aa3da50d8f101bd342ae265.png | 0
Rcontent/physcomp-notes/Morphological operations.resources/screenshot_1.png -> content/physcomp-notes/Morphological operations/f29ae7fc6a951cebc39d401c516693ca.png | 0
Acontent/physcomp-notes/Morphological operations/index.md | 30++++++++++++++++++++++++++++++
Dcontent/physcomp-notes/Neighborhood processing.html | 4----
Rcontent/physcomp-notes/Neighborhood processing.resources/screenshot_1.png -> content/physcomp-notes/Neighborhood processing/069d57e32b2bfb0f851662a3d4d6bab0.png | 0
Rcontent/physcomp-notes/Neighborhood processing.resources/screenshot.png -> content/physcomp-notes/Neighborhood processing/6c9075b1d48e76a1b390804c138dcfe1.png | 0
Acontent/physcomp-notes/Neighborhood processing/index.md | 27+++++++++++++++++++++++++++
Dcontent/physcomp-notes/Pervasive computing system.html | 4----
Acontent/physcomp-notes/Pervasive computing system.md | 11+++++++++++
Dcontent/physcomp-notes/Point processing.html | 4----
Rcontent/physcomp-notes/Point processing.resources/screenshot.png -> content/physcomp-notes/Point processing/14678296b2a9a5e2df1e024513278322.png | 0
Rcontent/physcomp-notes/Point processing.resources/screenshot_1.png -> content/physcomp-notes/Point processing/3324add035d108efc1dc36a5adc53f9f.png | 0
Rcontent/physcomp-notes/Point processing.resources/screenshot_2.png -> content/physcomp-notes/Point processing/4f05a88f44157b44d30209a3fa3a1941.png | 0
Acontent/physcomp-notes/Point processing/index.md | 38++++++++++++++++++++++++++++++++++++++
Dcontent/physcomp-notes/Sound processing.html | 4----
Dcontent/physcomp-notes/Sound processing.resources/screenshot_1.png | 0
Dcontent/physcomp-notes/Sound processing.resources/screenshot_10.png | 0
Dcontent/physcomp-notes/Sound processing.resources/screenshot_3.png | 0
Dcontent/physcomp-notes/Sound processing.resources/screenshot_6.png | 0
Dcontent/physcomp-notes/Sound processing.resources/screenshot_9.png | 0
Rcontent/physcomp-notes/Sound processing.resources/screenshot_4.png -> content/physcomp-notes/Sound processing/2f8ad86778ebb0b91e9ebc527decb0d4.png | 0
Rcontent/physcomp-notes/Sound processing.resources/screenshot_8.png -> content/physcomp-notes/Sound processing/5a9081f841b448d241811917f4eea3e3.png | 0
Rcontent/physcomp-notes/Sound processing.resources/screenshot_2.png -> content/physcomp-notes/Sound processing/8ecb6e39f786a6738ceaea52c1640948.png | 0
Rcontent/physcomp-notes/Sound processing.resources/screenshot_7.png -> content/physcomp-notes/Sound processing/e90248e66991c5183a713e851b9fbda8.png | 0
Rcontent/physcomp-notes/Sound processing.resources/screenshot.png -> content/physcomp-notes/Sound processing/fb0360fdcbdf2c0fa8c15ce7ddbe6670.png | 0
Rcontent/physcomp-notes/Sound processing.resources/screenshot_5.png -> content/physcomp-notes/Sound processing/fe629573739f7ff022dd7c5ae666c281.png | 0
Acontent/physcomp-notes/Sound processing/index.md | 89+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Dcontent/physcomp-notes/Systems engineering.html | 4----
Rcontent/physcomp-notes/Systems engineering.resources/Photo.png -> content/physcomp-notes/Systems engineering/76a8d624923f4b72c0bfb09fd42428a4.png | 0
Rcontent/physcomp-notes/Systems engineering.resources/Scannable Document on 28 Nov 2017 at 11_17_47.png -> content/physcomp-notes/Systems engineering/8ca002b539212b99d661276d2fd11d9f.png | 0
Acontent/physcomp-notes/Systems engineering/index.md | 20++++++++++++++++++++
Acontent/physcomp-notes/_index.md | 18++++++++++++++++++
Dcontent/physcomp-notes/index.html | 56--------------------------------------------------------
Dcontent/physcomp-notes/sitewide.css | 32--------------------------------
58 files changed, 582 insertions(+), 222 deletions(-)

diff --git a/content/_index.md b/content/_index.md @@ -42,7 +42,7 @@ title = "Alex's university course notes" * [Introduction to Programming in C++](cpp-notes) * [Computational thinking](compthink-notes) * [Systems architecture](https://thezeroalpha.github.io/sysarch-notes) -* [Physical Computing](https://thezeroalpha.github.io/physcomp-notes) +* [Physical Computing](physcomp-notes) * [Logic & sets](logicsets-notes/) * [Web tech](webtech-notes) * [Computer Networks](compnet-notes/) diff --git a/content/physcomp-notes/Audio signals.html b/content/physcomp-notes/Audio signals.html @@ -1,3 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html><head><link rel="stylesheet" href="sitewide.css" type="text/css"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/><meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)"/><meta name="altitude" content="-0.9381439089775085"/><meta name="author" content="Alex Balgavy"/><meta name="created" content="2017-10-31 10:14:23 +0000"/><meta name="latitude" content="52.33479884700187"/><meta name="longitude" content="4.866838046268644"/><meta name="source" content="desktop.mac"/><meta name="updated" content="2017-12-19 19:32:31 +0000"/><title>Audio signals</title></head><body><div><span style="font-weight: bold;">Representation</span></div><div>patterns of variations that represent/encode information</div><div><br/></div><div>expressed in terms of waves — sinusoidal, sawtooth, triangle, square</div><div>waves have period, amplitude, frequency, wavelength</div><div><br/></div><div>period: T (sec)</div><div>frequency: 1/T (per sec)</div><div>wavelength: velocity/frequency (m)</div><div><br/></div><div><span style="font-weight: bold;">As functions</span></div><div>a function of time and volume (amplitude) — time t =&gt; s(t)</div><div>continuous if there is a volume for each point in time</div><div><br/></div><div>speech is one-dimensional — only changes in time</div><div>an image is two-dimensional — has x and y</div><div><br/></div><div><span style="font-weight: bold;">Digitisation of signals</span></div><div>real signals are analog signals that are continuous in all dimensions</div><div>a computer has limited space and can’t process them</div><div>therefore, digitise — sampling + quantisation</div><div><br/></div><div>Sampling</div><div><ul><li>has period/frequency, result in samples at specific points in time</li><li>x axis is now discrete</li></ul><div><br/></div></div><div>Quantisation</div><div><ul><li>representation of real numbers with finite numbers of bits</li><li>the more bits, the more information you can store</li></ul><div><br/></div></div><div><span style="font-weight: bold;">Converting analog and digital</span></div><div>analog to digital converter (ADC) — converts from analog (continuous) to digital (discrete) signal</div><div>takes input analog and reference voltage, outputs the fraction of input voltage in reference voltage</div><div><br/></div><div>digital-to-analog converter — ‘reconstruction'</div><div><br/></div><div><span style="font-weight: bold;">Digital representation</span></div><div>Ts — sampling period</div><div>fs — sampling frequency</div><div>a discrete signal is represented by a sequence of samples s[n]</div><div><br/></div><div>s[n] = s(nTs)</div><div><br/></div><div><span style="font-weight: bold;">Shannon (Nyquist) theorem</span></div><div>the sampling rate must be at least twice 
the highest frequency</div><div>the highest useful frequency from an FFT is half of the sampling frequency</div><div>if it’s not obeyed and your sample rate is too low, you get aliasing (false data)</div><div><br/></div></body></html>- \ No newline at end of file diff --git a/content/physcomp-notes/Audio signals.md b/content/physcomp-notes/Audio signals.md @@ -0,0 +1,61 @@ ++++ +title = 'Audio signals' ++++ +# Audio signals +## Representation +patterns of variations that represent/encode information + +expressed in terms of waves — sinusoidal, sawtooth, triangle, square + +waves have period, amplitude, frequency, wavelength +- period: T (sec) +- frequency: 1/T (per sec) +- wavelength: velocity/frequency (m) + +## As functions +a function of time and volume (amplitude) — time t => s(t) + +continuous if there is a volume for each point in time + +speech is one-dimensional — only changes in time + +an image is two-dimensional — has x and y + +## Digitisation of signals +real signals are analog signals that are continuous in all dimensions + +a computer has limited space and can’t process them + +therefore, digitise — sampling + quantisation + +Sampling + +- has period/frequency, result in samples at specific points in time +- x axis is now discrete + +Quantisation + +- representation of real numbers with finite numbers of bits +- the more bits, the more information you can store + +## Converting analog and digital +analog to digital converter (ADC) — converts from analog (continuous) to digital (discrete) signal + +takes input analog and reference voltage, outputs the fraction of input voltage in reference voltage + +digital-to-analog converter — ‘reconstruction' + +## Digital representation +- Ts — sampling period +- fs — sampling frequency + +a discrete signal is represented by a sequence of samples s[n] + +s[n] = s(nTs) + +## Shannon (Nyquist) theorem +the sampling rate must be at least twice the highest frequency + +the highest useful frequency from an FFT is half of the sampling frequency + +if it’s not obeyed and your sample rate is too low, you get aliasing (false data) diff --git a/content/physcomp-notes/Classification 2.html b/content/physcomp-notes/Classification 2.html @@ -1,3 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html><head><link rel="stylesheet" href="sitewide.css" type="text/css"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/><meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)"/><meta name="altitude" content="1.285988807678223"/><meta name="author" content="Alex Balgavy"/><meta name="created" content="2017-11-27 13:56:22 +0000"/><meta name="latitude" content="52.33301811320457"/><meta name="longitude" content="4.865534911775513"/><meta name="source" content="desktop.mac"/><meta name="updated" content="2017-11-28 10:06:34 +0000"/><title>Classification 2</title></head><body><div>uncertainty is everywhere. 
probabilistic models look at learning as process of reducing uncertainty.</div><div>probability can be for single variables, but also conditional/posterior — how existing beliefs change in light of new evidence</div><div><br/></div><div><span style="font-weight: bold;">Naive Bayes classifier</span></div><div>given a set of classes, use Bayes rule to get posterior probabilities that object with features belongs to class.</div><div>the class with highest posterior probability is most likely class.</div><div>naive — assuming that elements in feature vector are conditionally independent</div><div><img src="Classification%202.resources/screenshot_1.png" height="75" width="303"/><br/></div><div><br/></div><div><span style="font-weight: bold;">Hidden Markov Model Classifier</span></div><div>works on a set of temporal data (when time is important)</div><div>each clock tick, the system moves to new state (can be the previous one)</div><div>we do not know these states (hidden), but we see observations</div><div>steps:</div><div><ul><li>Train by calculating:</li><ul><li>probability that person is in state x</li><li>transition probability P(xj | xi)</li><li>observation probability P(yi | xi)</li></ul><li>Use HMM as classifier</li><ul><li>given observation y, use Bayes to calculate P(xi | y)</li><li>class with highest P wins</li></ul></ul><div><br/></div></div><div><span style="font-weight: bold;">Unsupervised learning</span></div><div>do not have training sets, explore data and search for naturally occurring patterns and clusters</div><div>once clusters are found we make decisions</div><div>two inputs cluster if their vectors are similar (they are close to each other in feature space)</div><div><br/></div><div><img src="Classification%202.resources/screenshot_3.png" height="344" width="433"/><br/></div><div><br/></div><div><span style="font-weight: bold;">Evaluating classifiers</span></div><div>predictive accuracy — proportion of new, unseen instances that classifies correctly</div><div>classification error — correctly classified or not</div><div>error rate — # of classification errors / # of classifications attempted</div><div>true positives/negatives VS false positives/negatives — false negatives can be most dangerous!</div><div>true positive rate (hit rate) — proportion of positive instances that are correctly classified as positive (TP/(TP+FN))</div><div>false positive rate — negative instances that are erroneously classified as positive (FP/(FP+TN))</div><div>accuracy — percent of correct classifications</div><div><br/></div><div>confusion matrix gives info on how frequently instances were correctly/incorrectly classified. 
the diagonal is what’s important.</div><div>when writing a report, it’s best to explicitly give the confusion matrix</div><div><img src="Classification%202.resources/screenshot.png" height="244" width="475"/><br/></div><div><br/></div><div>receiver operating characteristics (ROC) graphs</div><div>useful for organising classifiers and visualising their performance</div><div>depict tradeoff between hit rates and false alarm rates over noisy channel</div><div><img src="Classification%202.resources/screenshot_4.png" height="326" width="472"/><img src="Classification%202.resources/screenshot_2.png" height="356" width="405"/><br/></div><div><br/></div><div><br/></div></body></html>- \ No newline at end of file diff --git a/content/physcomp-notes/Classification 2.resources/screenshot_4.png b/content/physcomp-notes/Classification 2/41a367424533a6e08fb95638b9c2b11e.png Binary files differ. diff --git a/content/physcomp-notes/Classification 2.resources/screenshot_3.png b/content/physcomp-notes/Classification 2/6690bab9dc8c17cf8396b94cd09f3d6f.png Binary files differ. diff --git a/content/physcomp-notes/Classification 2.resources/screenshot_1.png b/content/physcomp-notes/Classification 2/9ae5328bcc3da5561497f7ac32aff1a6.png Binary files differ. diff --git a/content/physcomp-notes/Classification 2.resources/screenshot_2.png b/content/physcomp-notes/Classification 2/e57408d5aa6439eabd8137bda295d117.png Binary files differ. diff --git a/content/physcomp-notes/Classification 2.resources/screenshot.png b/content/physcomp-notes/Classification 2/f43f65c9be0fe3566b08e933c48e957a.png Binary files differ. diff --git a/content/physcomp-notes/Classification 2/index.md b/content/physcomp-notes/Classification 2/index.md @@ -0,0 +1,67 @@ ++++ +title = 'Classification 2' +template = 'page-math.html' ++++ +# Classification 2 +uncertainty is everywhere. probabilistic models look at learning as process of reducing uncertainty. + +probability can be for single variables, but also conditional/posterior — how existing beliefs change in light of new evidence + +## Naive Bayes classifier + +given a set of classes, use Bayes rule to get posterior probabilities that object with features belongs to class. + +the class with highest posterior probability is most likely class. 
+naive — assuming that elements in feature vector are conditionally independent + +$P(C_{i} | X) = \frac{P(X | C_{i}) \times P(C_{i})}{P(X)}$ + +## Hidden Markov Model Classifier +works on a set of temporal data (when time is important) +each clock tick, the system moves to new state (can be the previous one) +we do not know these states (hidden), but we see observations +steps: + +- Train by calculating: + - probability that person is in state x + - transition probability P(xj | xi) + - observation probability P(yi | xi) +- Use HMM as classifier + - given observation y, use Bayes to calculate P(xi | y) + - class with highest P wins + +## Unsupervised learning + +do not have training sets, explore data and search for naturally occurring patterns and clusters + +once clusters are found we make decisions + +two inputs cluster if their vectors are similar (they are close to each other in feature space) + +![screenshot.png](6690bab9dc8c17cf8396b94cd09f3d6f.png) + +## Evaluating classifiers + +predictive accuracy — proportion of new, unseen instances that classifies correctly + +classification error — correctly classified or not +error rate — # of classification errors / # of classifications attempted + +true positives/negatives VS false positives/negatives — false negatives can be most dangerous! + +true positive rate (hit rate) — proportion of positive instances that are correctly classified as positive (TP/(TP+FN)) + +false positive rate — negative instances that are erroneously classified as positive (FP/(FP+TN)) + +accuracy — percent of correct classifications + +confusion matrix gives info on how frequently instances were correctly/incorrectly classified. the diagonal is what’s important. + +when writing a report, it’s best to explicitly give the confusion matrix +![screenshot.png](f43f65c9be0fe3566b08e933c48e957a.png) + +receiver operating characteristics (ROC) graphs +useful for organising classifiers and visualising their performance +depict tradeoff between hit rates and false alarm rates over noisy channel + +![screenshot.png](41a367424533a6e08fb95638b9c2b11e.png)![screenshot.png](e57408d5aa6439eabd8137bda295d117.png) diff --git a/content/physcomp-notes/Classification.html b/content/physcomp-notes/Classification.html @@ -1,3 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html><head><link rel="stylesheet" href="sitewide.css" type="text/css"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/><meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)"/><meta name="altitude" content="0"/><meta name="author" content="Alex Balgavy"/><meta name="created" content="2017-11-19 12:13:40 +0000"/><meta name="latitude" content="52.37362670126313"/><meta name="longitude" content="4.836090082173948"/><meta name="source" content="desktop.mac"/><meta name="updated" content="2017-11-27 13:55:46 +0000"/><title>Classification</title></head><body><div>a pattern is an entity vaguely defined, that could be given a name.</div><div>recognition is identification of pattern as member of a category</div><div><br/></div><div><span style="font-weight: bold;">Types of pattern recognition (classification) systems:</span></div><div><ul><li>speech recognition: </li></ul></div><div><ul><ol><li>PC card converts analog waves from mic into digital format</li><li>acoustical model breaks the word into phonemes</li><li>language model compares phonemes to words in built-in 
dictionary</li><li>software decides on what spoken word was and displays best match</li></ol><li>brain-computer interface that acquires signals directly from the brain</li><li>gesture recognition using acceleration magnitude from watch</li><li>image recognition</li></ul><div><br/></div></div><div><span style="font-weight: bold;">Classification (known categories)</span></div><div><ul><li>given a few classes, each item belongs to one class</li><li>objects are described by features</li><li>system needs a training set (both positive and negative examples)</li><li>if a new item comes, its features are measured and the system decides which class it belongs to</li></ul></div><div><img src="Classification.resources/screenshot.png" height="625" width="920"/><br/></div><div><br/></div><div>Components:</div><div><ul><li>Sensing module</li><li>Preprocessing mechanism</li><li>Feature extraction mechanism</li><li>Classifier</li><li>Training set of already classified examples</li></ul><div><br/></div></div><div><span style="font-weight: bold;">Building a pattern recognition system:</span></div><div><ol><li>Choose features, define classes (e.g. coins 10 cent, 20 cent, 50 cent, 1$, 2$)</li><ul><li>features need to have discriminative power</li><li>not too many, but enough to reliably separate classes based on them</li><li>e.g. coins colour and diameter</li><li>algorithms</li><ul><li>simple: rule-based activity recognition (If…And/Or…Then)</li><li>complicated: machine learning decision trees, HMM, neural networks</li></ul></ul><li>Extract features</li><ul><li>image recognition</li><ul><li>shape decriptors</li><ul><li>form factor (round object has 1, others smaller)</li><li>Euler number (number of objects minus number of holes in objects)</li><li>perimeter, area, roundness ratio…</li></ul><li>preprocessing</li><ul><li>binarisation, morphological operators, segmentation</li></ul><li>extract features (e.g. area, coordinates of centre of mass)</li><li>optical character recognition (OCR)</li><ul><li>converts image into machine readable text</li><li>uses statistical moments (total mass, centroid, elliptical parameters, etc.)</li><li>invariant moments of Hu</li></ul></ul><li>sound recognition</li><ul><li>features</li><ul><li>frequency spectrum</li><li>spectrograms</li><li>Mel cepstrum coefficients — FFT to Log(|x|) to IFFT results in cepstrum</li></ul><li>vowels recognition: second formant vs first formant frequency for vowels (significant freqs)</li></ul></ul><li>Train the classifier</li><li>Evaluate the performance of classification</li></ol></div><div><br/></div><div><b>Classifiers</b></div><div><u>Rule-based:</u> if-then-else</div><div><ul><li>exhaustive, mutually exclusive rules</li><li>works well if there aren’t too many features</li></ul><div><br/></div></div><div><u>Template-matching:</u> a set of reference patterns is available, match an unknown using nearest-neighbour</div><div>get a fingerprint for a specific signal, using FFT (freq. 
spectrum) or Mel cepstrum coefficients</div><div>train with various words, store fingerprints, and then apply</div><div>two approaches:</div><div><ul><li>maximum correlation</li><li>minimum error — calculate Euclidian distance between vectors</li></ul></div><div><br/></div><div><u>Neural networks:</u></div><div>synapses are weights</div><div>output is binary, depends on comparison between weighted sum of inputs and threshold θ</div><div>a neuron has:</div><div><ul><li>set of weighted inputs — dendrites+synapses</li><li>an adder — soma</li><li>an activation function to decide whether or not the neuron fires</li></ul><div><br/></div></div><div>a neuron cannot learn, but a perceptron can. by changing the weights which are adjustable.</div><div>neural networks are collections of artificial neurons, and have hidden layers.</div><div>they learn by testing output against desired output and adjusting weights accordingly.</div><div><img src="Classification.resources/screenshot_1.png" height="121" width="216"/></div><div><br/></div></body></html>- \ No newline at end of file diff --git a/content/physcomp-notes/Classification.resources/screenshot_1.png b/content/physcomp-notes/Classification/4f6a745e243088e8189c73f7eca571a0.png Binary files differ. diff --git a/content/physcomp-notes/Classification.resources/screenshot.png b/content/physcomp-notes/Classification/b7159d61e48e0091c9bcc952dbdd9472.png Binary files differ. diff --git a/content/physcomp-notes/Classification/index.md b/content/physcomp-notes/Classification/index.md @@ -0,0 +1,106 @@ ++++ +title = 'Classification' ++++ +# Classification +a pattern is an entity vaguely defined, that could be given a name. +recognition is identification of pattern as member of a category + +## Types of pattern recognition (classification) systems + +- speech recognition: + + 1. PC card converts analog waves from mic into digital format + 2. acoustical model breaks the word into phonemes + 3. language model compares phonemes to words in built-in dictionary + 4. software decides on what spoken word was and displays best match + +- brain-computer interface that acquires signals directly from the brain +- gesture recognition using acceleration magnitude from watch +- image recognition + +## Classification (known categories) + +- given a few classes, each item belongs to one class +- objects are described by features +- system needs a training set (both positive and negative examples) +- if a new item comes, its features are measured and the system decides which class it belongs to + +![screenshot.png](b7159d61e48e0091c9bcc952dbdd9472.png) + +Components: + +- Sensing module +- Preprocessing mechanism +- Feature extraction mechanism +- Classifier +- Training set of already classified examples + +### Building a pattern recognition system + +1. Choose features, define classes (e.g. coins 10 cent, 20 cent, 50 cent, 1$, 2$) + + - features need to have discriminative power + - not too many, but enough to reliably separate classes based on them + - e.g. coins colour and diameter + - algorithms + - simple: rule-based activity recognition (If…And/Or…Then) + - complicated: machine learning decision trees, HMM, neural networks + +2. Extract features + + - image recognition + - shape decriptors + - form factor (round object has 1, others smaller) + - Euler number (number of objects minus number of holes in objects) + - perimeter, area, roundness ratio… + - preprocessing + - binarisation, morphological operators, segmentation + - extract features (e.g. 
area, coordinates of centre of mass) + - optical character recognition (OCR) + - converts image into machine readable text + - uses statistical moments (total mass, centroid, elliptical parameters, etc.) + - invariant moments of Hu + - sound recognition + - features + - frequency spectrum + - spectrograms + - Mel cepstrum coefficients — FFT to Log(|x|) to IFFT results in cepstrum + - vowels recognition: second formant vs first formant frequency for vowels (significant freqs) + +3. Train the classifier +4. Evaluate the performance of classification + +### Classifiers +#### Rule-based: if-then-else + +- exhaustive, mutually exclusive rules +- works well if there aren’t too many features + +#### Template-matching: a set of reference patterns is available, match an unknown using nearest-neighbour + +get a fingerprint for a specific signal, using FFT (freq. spectrum) or Mel cepstrum coefficients + +train with various words, store fingerprints, and then apply +two approaches: + +- maximum correlation +- minimum error — calculate Euclidian distance between vectors + +#### Neural networks: +synapses are weights + +output is binary, depends on comparison between weighted sum of inputs and threshold θ + +a neuron has: + +- set of weighted inputs — dendrites+synapses +- an adder — soma +- an activation function to decide whether or not the neuron fires + +a neuron cannot learn, but a perceptron can. by changing the weights which are adjustable. + +neural networks are collections of artificial neurons, and have hidden layers. + +they learn by testing output against desired output and adjusting weights accordingly. + +![screenshot.png](4f6a745e243088e8189c73f7eca571a0.png) diff --git a/content/physcomp-notes/Control systems.html b/content/physcomp-notes/Control systems.html @@ -1,3 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html><head><link rel="stylesheet" href="sitewide.css" type="text/css"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/><meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)"/><meta name="altitude" content="2.2164626121521"/><meta name="author" content="Alex Balgavy"/><meta name="created" content="2017-11-06 12:38:55 +0000"/><meta name="latitude" content="52.33300980610338"/><meta name="longitude" content="4.865525386192134"/><meta name="source" content="desktop.mac"/><meta name="updated" content="2017-12-19 19:46:43 +0000"/><title>Control systems</title></head><body><div>Sensors: receive signals (microphones, cameras…)</div><div>Actuators: actually do stuff in the 'real world’ (LEDs, motors, speakers, displays, lamps…)</div><div>Controller: “the brain”, the intelligent unit. provide hardware &amp; software that makes system autonomous by using sensor input etc.</div><div><br/></div><div>any pervasive computing system executes sense-control-act sequence in a loop</div><div><br/></div><div>Control:</div><div><ul><li>deliberative: think hard, act later</li><ul><li>planning — look ahead at outcomes of possible actions</li><li>searching — looking for sequence of actions leading to desired goal</li><li>use internal representation of the environment — a map, for example</li><li>for decisions, use for example shortest path from one node to another in a map</li><ul><li>uses Dijkstra’s or A* algorithm</li><li>GPS nav systems use this</li></ul></ul><li>reactive: don’t think, react!</li><ul><li>e.g. 
smart curtains, thermostat, obstacle handling, landmark navigation</li><li>don’t use internal representation, just direct mapping between sensors and effectors</li><li>rules:</li><ul><li>if dark outside, then close the curtains</li></ul><li>control type</li><ul><li>open-loop control — input signal to controller, actuator, output controlled variable</li><ul><li>examples: microwave, automatic lights, automatic water faucets</li></ul><li>closed-loop control — get feedback, check if everything was executed right</li><ul><li>uses a comparator that gets feedback from output of actuator</li><li>comparator outputs error to controller, which then tries to minimise error</li><li>example: heater</li></ul></ul><li>obstacle handling</li><ul><li>simple — contact (touch sensor)</li><li>better — proximity, but don’t know distance (whiskers)</li><li>best — ranging</li><ul><li>sonar, with reflected sound waves. echolocation (distance = speed × time)</li><li>LIDAR, using a laser swept across FOV</li></ul></ul><li>landmark navigation</li><ul><li>follow a line, a wall</li><li>feedback control — turn always same angle, turn proportionally (P), proportional derivative (PD, rate of change), or proportional derivative and integral (PID, rate of change and time)</li></ul></ul><li>others: hybrid, behaviour-based</li></ul><div><br/></div></div><div><br/></div></body></html>- \ No newline at end of file diff --git a/content/physcomp-notes/Control systems.md b/content/physcomp-notes/Control systems.md @@ -0,0 +1,43 @@ ++++ +title = 'Control systems' ++++ +# Control systems +Sensors: receive signals (microphones, cameras…) + +Actuators: actually do stuff in the 'real world’ (LEDs, motors, speakers, displays, lamps…) + +Controller: “the brain”, the intelligent unit. provide hardware & software that makes system autonomous by using sensor input etc. + +any pervasive computing system executes sense-control-act sequence in a loop + +Control: + +- deliberative: think hard, act later + - planning — look ahead at outcomes of possible actions + - searching — looking for sequence of actions leading to desired goal + - use internal representation of the environment — a map, for example + - for decisions, use for example shortest path from one node to another in a map + - uses Dijkstra’s or A* algorithm + - GPS nav systems use this +- reactive: don’t think, react! + - e.g. smart curtains, thermostat, obstacle handling, landmark navigation + - don’t use internal representation, just direct mapping between sensors and effectors + - rules: + - if dark outside, then close the curtains + - control type + - open-loop control — input signal to controller, actuator, output controlled variable + - examples: microwave, automatic lights, automatic water faucets + - closed-loop control — get feedback, check if everything was executed right + - uses a comparator that gets feedback from output of actuator + - comparator outputs error to controller, which then tries to minimise error + - example: heater + - obstacle handling + - simple — contact (touch sensor) + - better — proximity, but don’t know distance (whiskers) + - best — ranging + - sonar, with reflected sound waves. 
echolocation (distance = speed × time) + - LIDAR, using a laser swept across FOV + - landmark navigation + - follow a line, a wall + - feedback control — turn always same angle, turn proportionally (P), proportional derivative (PD, rate of change), or proportional derivative and integral (PID, rate of change and time) +- others: hybrid, behaviour-based diff --git a/content/physcomp-notes/Image processing.html b/content/physcomp-notes/Image processing.html @@ -1,89 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html> - -<head><link rel="stylesheet" href="sitewide.css" type="text/css"> - <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /> - <meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)" /> - <meta name="altitude" content="-0.9683055877685547" /> - <meta name="author" content="Alex Balgavy" /> - <meta name="created" content="2017-11-07 11:04:35 +0000" /> - <meta name="latitude" content="52.33476618838052" /> - <meta name="longitude" content="4.866821194517779" /> - <meta name="source" content="desktop.mac" /> - <meta name="updated" content="2017-12-19 22:01:20 +0000" /> - <title>Image processing</title> -</head> - -<body> - <div>Problems:</div> - <div> - <ol> - <li>Acquired signals are often noisy</li> - <li>Context information is often hidden</li> - </ol> - <div><br /></div> - </div> - <div>Below are options for image processing.</div> - <div><br /></div> - <div><a href="Point processing.html"">Point processing</a></div> - <div><a href="Neighborhood processing.html"">Neighborhood processing</a></div> - <div><a href="Morphological operations.html">Morphological operations</a></div> - <div><br /></div> - <div><span style="font-weight: bold;">Windowing</span></div> - <div>Select a region of interest — user defined region within the image.</div> - <div>Then crop (resize) the image.</div> - <div><br /></div> - <div><span style="font-family: &quot;Helvetica Neue&quot;; font-weight: bold;">Segmentation</span></div> - <div><span style="font-family: &quot;Helvetica Neue&quot;;">Partitioning an image into separate objects, main vs background.</span></div> - <div><span style="font-family: &quot;Helvetica Neue&quot;;">Can be done by:</span></div> - <div> - <ul> - <li><span style="font-family: &quot;Helvetica Neue&quot;;">Edge detection</span></li> - <ul> - <li><span style="font-family: &quot;Helvetica Neue&quot;;">edge — local discontinuity in pixel value that exceeds given threshold</span></li> - <li><span style="font-family: &quot;Helvetica Neue&quot;;">consists of creating binary image where non-background pixel values are object boundaries</span></li> - </ul> - </ul> - </div> - <blockquote style="margin: 0 0 0 40px; border: none; padding: 0px;"> - <blockquote style="margin: 0 0 0 40px; border: none; padding: 0px;"> - <div> - <font face="Helvetica Neue"><img src="Image%20processing.resources/screenshot.png" height="245" width="467" /><br /></font> - </div> - </blockquote> - </blockquote> - <div> - <ul> - <ul> - <li><span style="font-family: &quot;Helvetica Neue&quot;;">can be obtained by correlation with a kernel</span></li> - <ul> - <li><span style="font-family: &quot;Helvetica Neue&quot;;">Prewitt</span></li> - <li><span style="font-family: &quot;Helvetica Neue&quot;;">Canny</span></li> - <li><span style="font-family: &quot;Helvetica Neue&quot;;"> Sobel</span></li> - </ul> - </ul> - </ul> - </div> - <blockquote style="margin: 0 0 0 
40px; border: none; padding: 0px;"> - <blockquote style="margin: 0 0 0 40px; border: none; padding: 0px;"> - <div><img src="Image%20processing.resources/screenshot_2.png" height="292" width="474" /><br /></div> - </blockquote> - </blockquote> - <div><br /></div> - <div><br /></div> - <div><br /></div> - <div><br /></div> - <div><span style="font-weight: bold;">BLOB analysis</span></div> - <div>BLOB — binary large object (group of connected pixels in binary image)</div> - <div>connectivity decides which pixels are neighbours (4-connectivity, 8 connectivity)</div> - <div>set of allpixels which are connected to a given pixel is a connected component or BLOB</div> - <div><br /></div> - <div><img src="Image%20processing.resources/screenshot_1.png" height="107" width="367" /><br /></div> - <div><br /></div> - <div>extracting connected components is called labelling, each connected component is given a label.</div> - <div>0 is background, then 1,2,3,…</div> - <div>in MATLAB, <span style="font-family: &quot;Courier New&quot;;">bwlabel </span><span style="font-family: &quot;Helvetica Neue&quot;;">returns matrix with labels and number of BLOBs</span></div> -</body> - -</html> diff --git a/content/physcomp-notes/Image processing.resources/screenshot_1.png b/content/physcomp-notes/Image processing/0fd98b013f8e2be2b21982bb51ab196d.png Binary files differ. diff --git a/content/physcomp-notes/Image processing.resources/screenshot_2.png b/content/physcomp-notes/Image processing/8f49a40caa697b2f19f15994ac58ee51.png Binary files differ. diff --git a/content/physcomp-notes/Image processing.resources/screenshot.png b/content/physcomp-notes/Image processing/ae1cea520f55cfdc0379fdf98f0651cd.png Binary files differ. diff --git a/content/physcomp-notes/Image processing/index.md b/content/physcomp-notes/Image processing/index.md @@ -0,0 +1,52 @@ ++++ +title = 'Image processing' ++++ +# Image processing +Problems: +1. Acquired signals are often noisy +2. Context information is often hidden + +Below are options for image processing. + +[Point processing](./point-processing) + +[Neighborhood processing](./neighborhood-processing) + +[Morphological operations](./morphological-operations) + + +## Windowing +Select a region of interest — user defined region within the image. +Then crop (resize) the image. + +## Segmentation +Partitioning an image into separate objects, main vs background. +Can be done by: + +- Edge detection + - edge — local discontinuity in pixel value that exceeds given threshold + - consists of creating binary image where non-background pixel values are object boundaries + + ![screenshot.png](ae1cea520f55cfdc0379fdf98f0651cd.png) + + - can be obtained by correlation with a kernel + - Prewitt + - Canny + - Sobel + + ![screenshot.png](8f49a40caa697b2f19f15994ac58ee51.png) + +## BLOB analysis +BLOB — binary large object (group of connected pixels in binary image) + +connectivity decides which pixels are neighbours (4-connectivity, 8 connectivity) + +set of allpixels which are connected to a given pixel is a connected component or BLOB + +![screenshot.png](0fd98b013f8e2be2b21982bb51ab196d.png) + +extracting connected components is called labelling, each connected component is given a label. 
+ +0 is background, then 1,2,3,… + +in MATLAB, bwlabel returns matrix with labels and number of BLOBs diff --git a/content/physcomp-notes/Images.html b/content/physcomp-notes/Images.html @@ -1,3 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html><head><link rel="stylesheet" href="sitewide.css" type="text/css"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/><meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)"/><meta name="altitude" content="-0.0002259806205984205"/><meta name="author" content="Alex Balgavy"/><meta name="created" content="2017-12-18 14:52:17 +0000"/><meta name="latitude" content="52.37359849166707"/><meta name="longitude" content="4.836369663993605"/><meta name="source" content="desktop.mac"/><meta name="updated" content="2017-12-19 19:37:35 +0000"/><title>Images</title></head><body><div><span style="font-weight: bold;">Image conversion</span></div><div>Image (spatial) sampling</div><ul><li>sampling is done in both space and colour </li><li>brightness is sampled at points and stored in matrix of numbers</li><li>each element is a pixel</li><li>spacial resolution — size in pixels in each direction (e.g. 1920 x 1080, 1280 x 720)</li></ul><div><br/></div><div>Colour quantisation</div><ul><li>each pixel has a digital value represented in bits</li><li>that’s analog brightness</li><li>with 1 bit, not much is done — binary image</li><li>with 8 bits, greyscale (0-255 per pixel, with white at 255)</li><li>with 24 bits, colour (0-255, 0-255, 0-255 per each pixel in RGB)</li></ul><div><br/></div></body></html>- \ No newline at end of file diff --git a/content/physcomp-notes/Images.md b/content/physcomp-notes/Images.md @@ -0,0 +1,19 @@ ++++ +title = 'Images' ++++ +# Images +## Image conversion +Image (spatial) sampling + +- sampling is done in both space and colour +- brightness is sampled at points and stored in matrix of numbers +- each element is a pixel +- spacial resolution — size in pixels in each direction (e.g. 
1920 x 1080, 1280 x 720) + +Colour quantisation + +- each pixel has a digital value represented in bits +- that’s analog brightness +- with 1 bit, not much is done — binary image +- with 8 bits, greyscale (0-255 per pixel, with white at 255) +- with 24 bits, colour (0-255, 0-255, 0-255 per each pixel in RGB) diff --git a/content/physcomp-notes/Morphological operations.html b/content/physcomp-notes/Morphological operations.html @@ -1,3 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html><head><link rel="stylesheet" href="sitewide.css" type="text/css"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/><meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)"/><meta name="altitude" content="0.3445475697517395"/><meta name="author" content="Alex Balgavy"/><meta name="created" content="2017-12-19 20:06:13 +0000"/><meta name="latitude" content="52.37343104980074"/><meta name="longitude" content="4.836051095350311"/><meta name="source" content="desktop.mac"/><meta name="updated" content="2017-12-19 20:12:30 +0000"/><title>Morphological operations</title></head><body><div><span style="font-weight: bold;">Morphologic operations</span></div><div>Useful for analysing shapes in images</div><div>Used on binary images</div><div>Operates by applying a kernel (structuring element of 0 and 1) to each pixel in input</div><div>there is a designated center pixel</div><div>instead of multiplication/addition, it is applied using a hit (dilation) or fit (erosion) operation</div><div><br/></div><div>procedure:</div><ol><li>Structuring element is placed on top of image</li><li>Center of structuring element is placed at position of pixel in focus</li><li>The value of that pixel is calculated by applying structuring element</li></ol><div><br/></div><div>operations</div><ul><li>dilation (increase in size) — center structuring element on each 0 pixel in image. if any in neighbourhood is 1, pixel is switched to 1.</li></ul><blockquote style="margin: 0 0 0 40px; border: none; padding: 0px;"><div><img src="Morphological%20operations.resources/screenshot_1.png" height="316" width="488"/><br/></div></blockquote><ul><li>erosion (decrease in size) — center structuring element on each 1 pixel in image. if any in neighbourhood is 0, pixel is switched to 0.</li></ul><blockquote style="margin: 0 0 0 40px; border: none; padding: 0px;"><div><img src="Morphological%20operations.resources/screenshot.png" height="328" width="500"/><br/></div></blockquote><ul><li>opening (erosion =&gt; dilation) — removes small, isolated noisy objects</li><li>closure (dilation =&gt; erosion) — removes small hole and join narrow isthmuses between objects</li></ul><div><br/></div></body></html>- \ No newline at end of file diff --git a/content/physcomp-notes/Morphological operations.resources/screenshot.png b/content/physcomp-notes/Morphological operations/5823a4727aa3da50d8f101bd342ae265.png Binary files differ. diff --git a/content/physcomp-notes/Morphological operations.resources/screenshot_1.png b/content/physcomp-notes/Morphological operations/f29ae7fc6a951cebc39d401c516693ca.png Binary files differ. 
diff --git a/content/physcomp-notes/Morphological operations/index.md b/content/physcomp-notes/Morphological operations/index.md @@ -0,0 +1,30 @@ ++++ +title = 'Morphological operations' ++++ +# Morphological operations +Useful for analysing shapes in images +Used on binary images + +Operates by applying a kernel (structuring element of 0 and 1) to each pixel in input + +there is a designated center pixel + +instead of multiplication/addition, it is applied using a hit (dilation) or fit (erosion) operation + +procedure: +1. Structuring element is placed on top of image +2. Center of structuring element is placed at position of pixel in focus +3. The value of that pixel is calculated by applying structuring element + +operations + +- dilation (increase in size) — center structuring element on each 0 pixel in image. if any in neighbourhood is 1, pixel is switched to 1. + +![screenshot.png](f29ae7fc6a951cebc39d401c516693ca.png) + +- erosion (decrease in size) — center structuring element on each 1 pixel in image. if any in neighbourhood is 0, pixel is switched to 0. + +![screenshot.png](5823a4727aa3da50d8f101bd342ae265.png) + +- opening (erosion => dilation) — removes small, isolated noisy objects +- closure (dilation => erosion) — removes small hole and join narrow isthmuses between objects diff --git a/content/physcomp-notes/Neighborhood processing.html b/content/physcomp-notes/Neighborhood processing.html @@ -1,3 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html><head><link rel="stylesheet" href="sitewide.css" type="text/css"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/><meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)"/><meta name="altitude" content="0.3140121698379517"/><meta name="author" content="Alex Balgavy"/><meta name="created" content="2017-12-19 19:57:01 +0000"/><meta name="latitude" content="52.37337526752695"/><meta name="longitude" content="4.836118090339202"/><meta name="source" content="desktop.mac"/><meta name="updated" content="2017-12-19 20:10:29 +0000"/><title>Neighborhood processing</title></head><body><div>Value of pixel in output is determined by value of same pixel in input and the neighbours.</div><div>Use small neighbourhood of pixel in input to get new brightness value in output.</div><div><br/></div><div><img src="Neighborhood%20processing.resources/screenshot.png" height="181" width="396"/><br/></div><div><br/></div><div><br/></div><div><span style="font-weight: bold;">Filtering (applying filter or mask to entire image):</span></div><ul><li>Mean filter — replace noise pixel by mean value of neighbours, including itself</li><li>Median filter — order values in increasing order including itself, find median, set that as new value</li></ul><div><br/></div><div><span style="font-weight: bold;">Correlation/convolution in an image:</span></div><ul><li>works by scanning through image and applying mask (“kernel”) to each pixel</li><li>kernel is filled with numbers, not always equal to one (“kernel coefficients”)</li><li>coefficients weigh pixel value they are covering</li><li>output of correlation is sum of weighted pixel values</li><li>kernel alters center pixel</li><li>decreases size of image!</li></ul><div><br/></div><div>example kernels:</div><div><img src="Neighborhood%20processing.resources/screenshot_1.png" height="210" width="578"/><br/></div><div><br/></div><div><br/></div></body></html>- \ No 
newline at end of file diff --git a/content/physcomp-notes/Neighborhood processing.resources/screenshot_1.png b/content/physcomp-notes/Neighborhood processing/069d57e32b2bfb0f851662a3d4d6bab0.png Binary files differ. diff --git a/content/physcomp-notes/Neighborhood processing.resources/screenshot.png b/content/physcomp-notes/Neighborhood processing/6c9075b1d48e76a1b390804c138dcfe1.png Binary files differ. diff --git a/content/physcomp-notes/Neighborhood processing/index.md b/content/physcomp-notes/Neighborhood processing/index.md @@ -0,0 +1,27 @@ ++++ +title = 'Neighborhood processing' ++++ +# Neighborhood processing +Value of pixel in output is determined by value of same pixel in input and the neighbours. + +Use small neighbourhood of pixel in input to get new brightness value in output. + +![screenshot.png](6c9075b1d48e76a1b390804c138dcfe1.png) + +## Filtering (applying filter or mask to entire image): + +- Mean filter — replace noise pixel by mean value of neighbours, including itself +- Median filter — order values in increasing order including itself, find median, set that as new value + +## Correlation/convolution in an image: + +- works by scanning through image and applying mask (“kernel”) to each pixel +- kernel is filled with numbers, not always equal to one (“kernel coefficients”) +- coefficients weigh pixel value they are covering +- output of correlation is sum of weighted pixel values +- kernel alters center pixel +- decreases size of image! + +example kernels: + +![screenshot.png](069d57e32b2bfb0f851662a3d4d6bab0.png) diff --git a/content/physcomp-notes/Pervasive computing system.html b/content/physcomp-notes/Pervasive computing system.html @@ -1,3 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html><head><link rel="stylesheet" href="sitewide.css" type="text/css"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/><meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)"/><meta name="altitude" content="1.660305500030518"/><meta name="author" content="Alex Balgavy"/><meta name="created" content="2017-11-06 12:44:10 +0000"/><meta name="latitude" content="52.33301065856576"/><meta name="longitude" content="4.865528096260713"/><meta name="source" content="desktop.mac"/><meta name="updated" content="2017-12-18 14:49:28 +0000"/><title>Pervasive computing system</title></head><body><div>Properties:</div><div><ul><li>Autonomous — the ability to make one’s decisions and act on them (total or partial)</li><li>Distributed — many computers are networked in a transparent way</li><li>Context-aware</li><li>i-HCI — implicit human-computer interaction, in a multimodal way</li><li>Intelligent — exhibits some form of artificial intelligence</li></ul></div><div><br/></div></body></html>- \ No newline at end of file diff --git a/content/physcomp-notes/Pervasive computing system.md b/content/physcomp-notes/Pervasive computing system.md @@ -0,0 +1,11 @@ ++++ +title = 'Pervasive computing system' ++++ +# Pervasive computing system +Properties: + +- Autonomous — the ability to make one’s decisions and act on them (total or partial) +- Distributed — many computers are networked in a transparent way +- Context-aware +- i-HCI — implicit human-computer interaction, in a multimodal way +- Intelligent — exhibits some form of artificial intelligence diff --git a/content/physcomp-notes/Point processing.html b/content/physcomp-notes/Point processing.html @@ 
-1,3 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html><head><link rel="stylesheet" href="sitewide.css" type="text/css"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/><meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)"/><meta name="altitude" content="-0.4378171265125275"/><meta name="author" content="Alex Balgavy"/><meta name="created" content="2017-12-19 19:53:28 +0000"/><meta name="latitude" content="52.37364976172622"/><meta name="longitude" content="4.836290468415996"/><meta name="source" content="desktop.mac"/><meta name="updated" content="2017-12-19 19:57:57 +0000"/><title>Point processing</title></head><body><div>A simple function applied to each value — e.g. to lighten, y = x+C<br/></div><div>Ex: change brightness of every pixel in the same way</div><div><br/></div><div><img src="Point%20processing.resources/screenshot_1.png" height="199" width="407"/>    </div><div><br/></div><div><span style="font-family: &quot;Courier New&quot;;">b = imread(‘filename.bmp’);</span></div><div><span style="font-family: &quot;Courier New&quot;;">b1 = imadd(b,128);</span></div><div><br/></div><div><b>Type conversion</b></div><div>From RGB to grayscale (24 bits -&gt; 8 bits)</div><div><br/></div><div><span style="font-family: &quot;Courier New&quot;;">I = imread(‘filename.bmp’);</span></div><div><span style="font-family: &quot;Courier New&quot;;">J = rgb2gray(I);</span></div><div><br/></div><div><span style="font-family: &quot;Helvetica Neue&quot;;"><b>Histogram stretching</b></span></div><div><span style="font-family: &quot;Helvetica Neue&quot;;">A histogram graphs the amount of times each value from 0 to 255 occurs in an image.</span></div><div><span style="font-family: &quot;Helvetica Neue&quot;;">You can stretch a histogram to equalise the image</span></div><div><br/></div><div><img src="Point%20processing.resources/screenshot_2.png" height="254" width="328"/><br/></div><div><br/></div><div><span style="font-family: &quot;Helvetica Neue&quot;;"><b>Thresholding</b></span></div><ul><li><span style="font-family: &quot;Helvetica Neue&quot;;">a gray scale image can be converted into binary (B&amp;W)</span></li><li><span style="font-family: &quot;Helvetica Neue&quot;;">choose a grey level T (threshold), change each pixel based on relation to T</span></li><li><span style="font-family: &quot;Helvetica Neue&quot;;">white if &gt;T, black if ≤T</span></li></ul><div><br/></div><div><img src="Point%20processing.resources/screenshot.png" height="275" width="664"/><br/></div><div><br/></div></body></html>- \ No newline at end of file diff --git a/content/physcomp-notes/Point processing.resources/screenshot.png b/content/physcomp-notes/Point processing/14678296b2a9a5e2df1e024513278322.png Binary files differ. diff --git a/content/physcomp-notes/Point processing.resources/screenshot_1.png b/content/physcomp-notes/Point processing/3324add035d108efc1dc36a5adc53f9f.png Binary files differ. diff --git a/content/physcomp-notes/Point processing.resources/screenshot_2.png b/content/physcomp-notes/Point processing/4f05a88f44157b44d30209a3fa3a1941.png Binary files differ. diff --git a/content/physcomp-notes/Point processing/index.md b/content/physcomp-notes/Point processing/index.md @@ -0,0 +1,38 @@ ++++ +title = 'Point processing' ++++ +# Point processing +A simple function applied to each value — e.g. 
to lighten, y = x+C + +Ex: change brightness of every pixel in the same way + +![screenshot.png](3324add035d108efc1dc36a5adc53f9f.png) + +``` +b = imread('filename.bmp'); +b1 = imadd(b,128); +``` + +## Type conversion +From RGB to grayscale (24 bits -> 8 bits) + +``` +I = imread('filename.bmp'); +J = rgb2gray(I); +``` + +## Histogram stretching + +A histogram graphs the number of times each value from 0 to 255 occurs in an image. + +You can stretch a histogram to equalise the image. + +![screenshot.png](4f05a88f44157b44d30209a3fa3a1941.png) + +## Thresholding + +- a grayscale image can be converted into binary (B&W) +- choose a grey level T (threshold), change each pixel based on its relation to T +- white if >T, black if ≤T + +![screenshot.png](14678296b2a9a5e2df1e024513278322.png) diff --git a/content/physcomp-notes/Sound processing.html b/content/physcomp-notes/Sound processing.html @@ -1,3 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html><head><link rel="stylesheet" href="sitewide.css" type="text/css"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/><meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)"/><meta name="altitude" content="1.400665044784546"/><meta name="author" content="Alex Balgavy"/><meta name="created" content="2017-11-13 12:50:06 +0000"/><meta name="latitude" content="52.33301919689172"/><meta name="longitude" content="4.865518015192567"/><meta name="source" content="desktop.mac"/><meta name="updated" content="2017-11-13 19:31:10 +0000"/><title>Sound processing</title></head><body><div><span style="font-weight: bold;">Problems with sound capture:</span></div><div><ol><li>Acquired signals are very noisy</li><li>Context information is hidden</li></ol><div><br/></div></div><div>How do we process sound to classify and extract information?</div><div><br/></div><div><span style="font-weight: bold;">Basic features of sound</span></div><div><ul><li>Volume (air pressure) or loudness (dB) — amplitude of wave</li><li>Frequency (Hz) or pitch — frequency of wave</li></ul><div><br/></div></div><div><span style="font-weight: bold;">Periodic signal — Fourier</span></div><div>When the signal is sinusoidal, it’s simple to calculate the frequency with a physics formula.</div><div>But if it’s not sinusoidal, what do you do? Analyse frequency spectrum.
Enter Fourier.</div><div><br/></div><div>Fourier: almost every signal can be broken down into multiple sinusoidal waves with different frequencies and amplitudes.</div><div><br/></div><div>Instead of having signal amplitude as function of time, represent it by function of frequencies.</div><div><br/></div><div><img src="Sound%20processing.resources/screenshot_4.png" height="320" width="429"/><br/></div><div><br/></div><div>Then you end up with a Fourier series — sum of simple sinusoidal waves with frequencies kf₀, amplitudes A<span style="vertical-align: sub;">k</span> and phase shifts  𝜑<span style="vertical-align: sub;">k:</span></div><div><img src="Sound%20processing.resources/screenshot_10.png" height="88" width="407"/><br/></div><div><br/></div><div>The periodic signal has a frequency spectrum of various harmonics:</div><div><br/></div><div><img src="Sound%20processing.resources/screenshot_2.png" height="242" width="514"/><br/></div><div><br/></div><div>Component frequencies are a multiple of the fundamental frequency, called harmonics.</div><div><br/></div><div>You can calculate amplitudes A<span style="vertical-align: sub;">k</span> with an algorithm called FFT (Fast Fourier Transform), in a vector.</div><div>You put in the vector of samples and the number of samples N, and you get out a vector of amplitudes of length N+1</div><div><ul><li>First element is DC component with frequency 0</li><li>You can really only use the first half of the vector</li></ul><div><br/></div></div><div>Formulas:</div><div><table style="border-collapse: collapse; min-width: 100%;"><colgroup><col style="width: 130px;"/><col style="width: 130px;"/><col style="width: 130px;"/><col style="width: 130px;"/></colgroup><tbody><tr><td style="border: 1px solid rgb(219, 219, 219); width: 130px; padding: 8px;"><div>Frequency step</div></td><td style="border: 1px solid rgb(219, 219, 219); width: 130px; padding: 8px;"><div>Frequency at amplitude</div></td><td style="border: 1px solid rgb(219, 219, 219); width: 130px; padding: 8px;"><div>Nyquist frequency</div></td><td style="border: 1px solid rgb(219, 219, 219); width: 130px; padding: 8px;"><div>Last useful amplitude</div></td></tr><tr><td style="border: 1px solid rgb(219, 219, 219); width: 130px; padding: 8px;"><div><img src="Sound%20processing.resources/screenshot_6.png" height="107" width="141"/></div></td><td style="border: 1px solid rgb(219, 219, 219); width: 130px; padding: 8px;"><div><img src="Sound%20processing.resources/screenshot_3.png" height="92" width="248"/></div></td><td style="border: 1px solid rgb(219, 219, 219); width: 130px; padding: 8px;"><div><img src="Sound%20processing.resources/screenshot_9.png" height="37" width="67"/></div></td><td style="border: 1px solid rgb(219, 219, 219); width: 130px; padding: 8px;"><div><img src="Sound%20processing.resources/screenshot_1.png" height="40" width="206"/></div></td></tr></tbody></table><div><br/></div></div><div><br/></div><div>Nyquist frequency (fc): maximum freq. detected using FFT; half sampling rate Fs. </div><div><br/></div><div><span style="font-weight: bold;">Not periodic — short time analysis</span></div><div>some sound signals are periodic for a very short time</div><div><img src="Sound%20processing.resources/screenshot_8.png" height="337" width="488"/><img src="Sound%20processing.resources/screenshot.png" height="318" width="534"/><br/></div><div><br/></div><div>Cut the speech in segments (frames). 
Then you can apply FFT on those pieces.</div><div>This is called segmentation or windowing.</div><div><br/></div><div><span style="font-weight: bold;">Spectrogram</span></div><div>Freq. spectrum varies in time</div><div>Graph with time on x-axis, frequency on y-axis and colour being amplitude of each frequency</div><div><br/></div><div><img src="Sound%20processing.resources/screenshot_5.png" height="151" width="509"/><br/></div><div><br/></div><div><img src="Sound%20processing.resources/screenshot_7.png" height="401" width="525"/><br/></div><div><br/></div><div><br/></div><div><span style="font-weight: bold;">Digital filtering</span></div><div>Time domain: moving average filter</div><div>Frequency domain:</div><div><ul><li>Low-pass</li><li>High-pass</li><li>Band-pass — allow only a certain frequency band</li><li>Band-reject (notch-filter) — allow everything but a certain frequency band</li><ul><li>sample signal, compute spectrum using FFT, set to zero portions of spectrum that are just noise, and inverse FFT to synthesise improved signal</li></ul></ul></div><div><br/></div><div><br/></div></body></html>- \ No newline at end of file diff --git a/content/physcomp-notes/Sound processing.resources/screenshot_1.png b/content/physcomp-notes/Sound processing.resources/screenshot_1.png Binary files differ. diff --git a/content/physcomp-notes/Sound processing.resources/screenshot_10.png b/content/physcomp-notes/Sound processing.resources/screenshot_10.png Binary files differ. diff --git a/content/physcomp-notes/Sound processing.resources/screenshot_3.png b/content/physcomp-notes/Sound processing.resources/screenshot_3.png Binary files differ. diff --git a/content/physcomp-notes/Sound processing.resources/screenshot_6.png b/content/physcomp-notes/Sound processing.resources/screenshot_6.png Binary files differ. diff --git a/content/physcomp-notes/Sound processing.resources/screenshot_9.png b/content/physcomp-notes/Sound processing.resources/screenshot_9.png Binary files differ. diff --git a/content/physcomp-notes/Sound processing.resources/screenshot_4.png b/content/physcomp-notes/Sound processing/2f8ad86778ebb0b91e9ebc527decb0d4.png Binary files differ. diff --git a/content/physcomp-notes/Sound processing.resources/screenshot_8.png b/content/physcomp-notes/Sound processing/5a9081f841b448d241811917f4eea3e3.png Binary files differ. diff --git a/content/physcomp-notes/Sound processing.resources/screenshot_2.png b/content/physcomp-notes/Sound processing/8ecb6e39f786a6738ceaea52c1640948.png Binary files differ. diff --git a/content/physcomp-notes/Sound processing.resources/screenshot_7.png b/content/physcomp-notes/Sound processing/e90248e66991c5183a713e851b9fbda8.png Binary files differ. diff --git a/content/physcomp-notes/Sound processing.resources/screenshot.png b/content/physcomp-notes/Sound processing/fb0360fdcbdf2c0fa8c15ce7ddbe6670.png Binary files differ. diff --git a/content/physcomp-notes/Sound processing.resources/screenshot_5.png b/content/physcomp-notes/Sound processing/fe629573739f7ff022dd7c5ae666c281.png Binary files differ. diff --git a/content/physcomp-notes/Sound processing/index.md b/content/physcomp-notes/Sound processing/index.md @@ -0,0 +1,89 @@ ++++ +title = 'Sound processing' +template = 'page-math.html' ++++ +# Sound processing +Problems with sound capture: +1. Acquired signals are very noisy +2. Context information is hidden + +How do we process sound to classify and extract information? 
+ +## Basic features of sound + +- Volume (air pressure) or loudness (dB) — amplitude of wave +- Frequency (Hz) or pitch — frequency of wave + +## Periodic signal — Fourier + +When the signal is sinusoidal, it’s simple to calculate the frequency with a physics formula. + +But if it’s not sinusoidal, what do you do? Analyse the frequency spectrum. Enter Fourier. + +Fourier: almost every signal can be broken down into multiple sinusoidal waves with different frequencies and amplitudes. + +Instead of having the signal amplitude as a function of time, represent it as a function of frequency. + +![screenshot.png](2f8ad86778ebb0b91e9ebc527decb0d4.png) + +Then you end up with a Fourier series — a sum of simple sinusoidal waves with frequencies $k f_{0}$, amplitudes $A_{k}$ and phase shifts $\phi_{k}$: + +$x(t) = A_{0} + \sum_{k=1}^N A_{k} \sin (2 \pi k f_{0} t + \phi_{k})$ + +The periodic signal has a frequency spectrum of various harmonics: + +![screenshot.png](8ecb6e39f786a6738ceaea52c1640948.png) + +The component frequencies are multiples of the fundamental frequency $f_{0}$ and are called harmonics. + +You can calculate the amplitudes $A_{k}$ with an algorithm called the FFT (Fast Fourier Transform). + +You put in the vector of samples and the number of samples N, and you get out a vector of amplitudes of length N+1. + +- The first element is the DC component with frequency 0 +- You can really only use the first half of the vector + +Formulas: + +<table> +<tr><td>Frequency step</td> +<td>Frequency of amplitude $A_{k}$</td> +<td>Nyquist frequency</td> +<td>Last useful amplitude</td> +</tr> +<tr> +<td>$\Delta f = \frac{F_s}{N}$</td> +<td>$f_{k} = k \Delta f = \frac{kF_{s}}{N}$</td> +<td>$F_{s}/2$</td> +<td>$f_{N/2} = (N/2) \Delta f$</td> +</tr> +</table> + +Nyquist frequency ($f_{c}$): the maximum frequency detected using the FFT; half the sampling rate $F_{s}$. + +## Not periodic — short time analysis +Some sound signals are periodic only for a very short time. + +![screenshot.png](5a9081f841b448d241811917f4eea3e3.png)![screenshot.png](fb0360fdcbdf2c0fa8c15ce7ddbe6670.png) + +Cut the speech into segments (frames). Then you can apply the FFT to those pieces. +This is called segmentation or windowing.
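+
+To make the frame-wise FFT concrete, here is a minimal MATLAB sketch of how one could compute the one-sided amplitude spectrum of a single frame. The file name, the frame length of 1024 samples and the variable names are illustrative assumptions, not from the lecture; `audioread` and `fft` are standard MATLAB functions.
+
+```
+% Sketch: amplitude spectrum of one short frame (assumes a mono signal).
+[x, Fs] = audioread('speech.wav');   % hypothetical input file: samples x, sampling rate Fs
+N = 1024;                            % frame length in samples (illustrative)
+frame = x(1:N);                      % cut out one segment (frame)
+X = fft(frame);                      % complex spectrum, length N
+A = abs(X) / N;                      % amplitudes
+A = A(1:N/2+1);                      % keep DC up to the Nyquist frequency Fs/2
+A(2:end-1) = 2 * A(2:end-1);         % fold in the mirrored half of the spectrum
+f = (0:N/2) * Fs / N;                % frequency axis: f_k = k * (Fs/N)
+plot(f, A); xlabel('Frequency (Hz)'); ylabel('Amplitude');
+```
+
+Sliding this over consecutive frames and stacking the resulting spectra side by side is what the spectrogram below visualises.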
+ +### Spectrogram +The frequency spectrum varies over time. + +A graph with time on the x-axis, frequency on the y-axis, and colour indicating the amplitude of each frequency. + +![screenshot.png](fe629573739f7ff022dd7c5ae666c281.png) + +![screenshot.png](e90248e66991c5183a713e851b9fbda8.png) + +### Digital filtering +Time domain: moving average filter +Frequency domain: + +- Low-pass +- High-pass +- Band-pass — allow only a certain frequency band +- Band-reject (notch filter) — allow everything but a certain frequency band + - sample the signal, compute its spectrum using the FFT, set to zero the portions of the spectrum that are just noise, and apply the inverse FFT to synthesise an improved signal diff --git a/content/physcomp-notes/Systems engineering.html b/content/physcomp-notes/Systems engineering.html @@ -1,3 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html><head><link rel="stylesheet" href="sitewide.css" type="text/css"><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"/><meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)"/><meta name="altitude" content="-1.862802863121033"/><meta name="author" content="Alex Balgavy"/><meta name="created" content="2017-11-28 10:06:37 +0000"/><meta name="latitude" content="52.33477121549987"/><meta name="longitude" content="4.866823814718402"/><meta name="source" content="desktop.mac"/><meta name="updated" content="2017-11-29 08:35:41 +0000"/><title>Systems engineering</title></head><body><div><span style="font-weight: bold;">Use case specifications</span></div><div><br/></div><div><span style="font-weight: bold;">State transition diagrams</span></div><div><ul><li><br/></li><li>changes in state happen as result of events</li></ul><div><img src="Systems%20engineering.resources/Photo.png" height="927" width="1600"/><br/></div></div><div><br/></div><div><br/></div><div>e.g. robot that avoids obstacles:</div><div><img src="Systems%20engineering.resources/Scannable%20Document%20on%2028%20Nov%202017%20at%2011_17_47.png" height="970" width="1600"/><br/></div><div><br/></div><div><br/></div><div>Activity diagrams</div><div><ul><li>rounded rect for events</li><li>diamonds for decisions</li><li>black ovals for start/stop</li><li><br/></li></ul></div><div><br/></div></body></html>- \ No newline at end of file diff --git a/content/physcomp-notes/Systems engineering.resources/Photo.png b/content/physcomp-notes/Systems engineering/76a8d624923f4b72c0bfb09fd42428a4.png Binary files differ. diff --git a/content/physcomp-notes/Systems engineering.resources/Scannable Document on 28 Nov 2017 at 11_17_47.png b/content/physcomp-notes/Systems engineering/8ca002b539212b99d661276d2fd11d9f.png Binary files differ. diff --git a/content/physcomp-notes/Systems engineering/index.md b/content/physcomp-notes/Systems engineering/index.md @@ -0,0 +1,20 @@ ++++ +title = 'Systems engineering' ++++ +# Systems engineering + +State transition diagrams + +- changes in state happen as result of events + +![Photo.png](76a8d624923f4b72c0bfb09fd42428a4.png) + +e.g. robot that avoids obstacles: + +![Scannable Document on 28 Nov 2017 at 11_17_47.png](8ca002b539212b99d661276d2fd11d9f.png) + +Activity diagrams + +- rounded rect for events +- diamonds for decisions +- black ovals for start/stop diff --git a/content/physcomp-notes/_index.md b/content/physcomp-notes/_index.md @@ -0,0 +1,18 @@ ++++ +title = 'Physical Computing' ++++ +# Physical Computing +1.
[Properties of a pervasive computing system](pervasive-computing-system) +2. [Control](control-systems) +3. Signals + 1. [Audio (1-dimensional)](audio-signals) + 2. [Images (2-dimensional)](images) +4. Processing + - [Audio](sound-processing) + - [Images](image-processing) + - [Point processing](point-processing) + - [Neighborhood processing](neighborhood-processing) + - [Morphological operations](morphological-operations) +5. Classification + - [Classification](classification) + - [Classification 2](classification-2) diff --git a/content/physcomp-notes/index.html b/content/physcomp-notes/index.html @@ -1,56 +0,0 @@ -<?xml version="1.0" encoding="UTF-8"?> -<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> -<html> - -<head><link rel="stylesheet" href="sitewide.css" type="text/css"> - <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" /> - <meta name="exporter-version" content="Evernote Mac 7.5.2 (457164)" /> - <meta name="altitude" content="-0.1214089840650558" /> - <meta name="author" content="Alex Balgavy" /> - <meta name="created" content="2017-12-18 14:39:56 +0000" /> - <meta name="latitude" content="52.37353909340794" /> - <meta name="longitude" content="4.83621105353759" /> - <meta name="source" content="desktop.mac" /> - <meta name="updated" content="2017-12-19 22:03:14 +0000" /> - <title>TOC Physical Computing</title> -</head> - -<body> - <nav> -<a href="http://thezeroalpha.github.io">Homepage</a> -</nav> - - <h1>Physical Computing Notes</h1> - <h3>Alex Balgavy</h3> - <div> - <ol> - <li><a href="Pervasive computing system.html">Properties of a pervasive computing system</a></li> - <li><a href="Control systems.html">Control</a></li> - <li>Signals</li> - <ol> - <li><a href="Audio signals.html">Audio (1-dimensional)</a></li> - <li><a href="Images.html">Images (2-dimensional)</a></li> - </ol> - <li>Processing</li> - <ul> - <li><a href="Sound processing.html">Audio</a></li> - <li><a href="Image processing.html">Images</a></li> - </ul> - <ol> - <ul> - <li><a href="Point processing.html">Point processing</a></li> - <li><a href="Neighborhood processing.html">Neighborhood processing</a></li> - <li><a href="Morphological operations.html">Morphological operations</a></li> - </ul> - </ol> - <li>Classification</li> - <ul> - <li><a href="Classification.html">Classification</a></li> - <li><a href="Classification 2.html">Classification 2</a></li> - </ul> - </ol> - </div> - <div><br /></div> -</body> - -</html> diff --git a/content/physcomp-notes/sitewide.css b/content/physcomp-notes/sitewide.css @@ -1,32 +0,0 @@ -@charset 'UTF-8'; -@font-face{font-family:'FontAwesome';src:url('font/fontawesome-webfont.eot?v=4.0.1');src:url('font/fontawesome-webfont.eot?#iefix&v=4.0.1') format('embedded-opentype'),url('font/fontawesome-webfont.woff?v=4.0.1') format('woff'),url('font/fontawesome-webfont.ttf?v=4.0.1') format('truetype'),url('font/fontawesome-webfont.svg?v=4.0.1#fontawesomeregular') format('svg');font-weight:normal;font-style:normal} - -body { - margin: 0px; - padding: 1em; - background: #f3f2ed; - font-family: 'Lato', sans-serif; - font-size: 12pt; - font-weight: 300; - color: #8A8A8A; - padding-left: 50px; -} -h1, h2, h3 { - margin: 0px; - padding: 0px; - font-weight: 300; - text-align: center; -} -h3 { - font-style: italic; -} -a { - color: #D1551F; - } -a:hover { - color: #AF440F; -} - strong { - font-weight: 700; - color: #2A2A2A; - }