+++
title = "Ethics of AI"
template = 'page-math.html'
+++
# Ethics of AI
three main questions:
* how do we encode ethical behavior?
* how should we behave towards AI?
* how does the existence of AI affect our daily lives?

> "Ethics begins when elements of a moral system conflict."

Fundamental ethics: moral absolutism; certain actions are simply not allowed, e.g. for religious reasons.

Pragmatic ethics: humans always have a choice; you have the freedom of choice at any point in time.

## Sci-fi ethics (problems down the road)
Asimov's laws:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

The trolley problem is a good example of an ethical dilemma, and can be extended to self-driving cars (should it kill the driver or bystanders?).

How do we treat AI? How should we?

## Today's problems
* Autonomous weapons: weapons that decide what to do by themselves
  * what are we allowing these systems to do?
  * the Dutch government said it's fine "if there's a human in the wider loop", but this is very vague: what counts as the wider loop?
* Privacy
  * big companies have a bunch of data about people
  * often, people give this data away for free.
* Profiling (e.g. racial)
  * e.g. a black person was stopped while driving an expensive car because the system assumed he could only be driving it if he had stolen it.

Prosecutor's fallacy:
* confusing a conditional probability with its inverse: $P(\text{black} | \text{uses drugs}) \neq P(\text{uses drugs} | \text{black})$
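
The two conditionals are related by Bayes' rule, so they only coincide when the base rates happen to line up (the numeric example below uses invented values, purely for illustration):

$$P(\text{uses drugs} | \text{black}) = \frac{P(\text{black} | \text{uses drugs}) \cdot P(\text{uses drugs})}{P(\text{black})}$$

For example, if $P(\text{black} | \text{uses drugs}) = 0.3$, $P(\text{uses drugs}) = 0.05$ and $P(\text{black}) = 0.15$, then $P(\text{uses drugs} | \text{black}) = \frac{0.3 \cdot 0.05}{0.15} = 0.1$, which is very different from $0.3$.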