= Ethics of AI =
Three main questions:
  * how do we encode ethical behavior?
  * how should we behave towards AI?
  * how does the existence of AI affect our daily lives?

    "Ethics begins when elements of a moral system conflict."

Fundamental ethics: moral absolutism; you are not allowed to do something because of e.g. religion.

Pragmatic ethics: humans always have a choice; you have the freedom of choice at any point in time.
== Sci-fi ethics (problems down the road) ==
Asimov's laws:
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given to it by human beings except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.

The trolley problem is a good example of an ethical dilemma, and it can be extended to self-driving cars (should the car kill the driver or the bystanders?).

How do we treat AI? How should we?
== Today's problems ==
* Autonomous weapons: weapons that decide what to do by themselves
  * what are we allowing these systems to do?
  * the Dutch government said it's fine "if there's a human in the wider loop", but this is very vague: what counts as the wider loop?
* Privacy
  * big companies have a lot of data about people
  * people often give this data away for free
* Profiling (e.g. racial)
  * e.g. a black person was stopped while driving an expensive car because the system assumed he could only be driving it if he had stolen it

Prosecutor's fallacy:
  * using probabilities incorrectly: $P(\text{black} | \text{uses drugs}) \neq P(\text{uses drugs} | \text{black})$
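
A quick worked sketch of why the two conditionals differ (the numbers below are made up purely for illustration): by Bayes' rule,

    $P(\text{uses drugs} | \text{black}) = \frac{P(\text{black} | \text{uses drugs}) \cdot P(\text{uses drugs})}{P(\text{black})}$

With hypothetical values $P(\text{black} | \text{uses drugs}) = 0.3$, $P(\text{uses drugs}) = 0.05$, and $P(\text{black}) = 0.15$, this gives $0.3 \cdot 0.05 / 0.15 = 0.1$: flipping the conditional silently changes the probability from 0.3 to 0.1.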