|Brockman I 243
Trolley Problem/Church, George M.: (…) we have to address the ethical rules that should be built in, learned, or probabilistically chosen for increasingly intelligent and diverse machines. We have a whole series of Trolley Problems. At what number of people in line for death should the computer decide to shift a moving trolley to one person? Ultimately this might be a deep-learning problem—one in which huge databases of facts and contingencies can be taken into account, some seemingly far from the ethics at hand.
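Church's question "at what number of people should the computer shift the trolley?" can be read as asking where a decision rule places its threshold. The following is a toy sketch, not anything from Church's text; the function name and the `threshold` parameter are hypothetical illustrations of how such a cutoff might be parameterized.

```python
def switch_trolley(people_on_main: int, people_on_side: int,
                   threshold: float = 1.0) -> bool:
    """Return True if the system diverts the trolley to the side track.

    `threshold` is a hypothetical tuning parameter: the system diverts
    only when the count on the main track exceeds `threshold` times the
    count on the side track. threshold=1.0 is bare utilitarian counting.
    """
    return people_on_main > threshold * people_on_side

# With threshold=1.0 the rule diverts for any net saving of lives;
# raising the threshold encodes reluctance to actively redirect harm.
print(switch_trolley(5, 1))                 # prints True (divert)
print(switch_trolley(5, 1, threshold=6.0))  # prints False (do not divert)
```

The point of the sketch is only that the ethically loaded content sits in the choice of `threshold`, which is exactly the kind of parameter a deep-learning system would end up fixing implicitly from its training data rather than by explicit argument.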
Brockman I 244
If one of [the] problem descriptions seems paradoxical or illogical, it may be that the authors of the Trolley Problem have adjusted the weights on each side of the balance such that hesitant indecision is inevitable. Alternatively, one can use misdirection to rig the system, such that the error modes are not at the level of attention. For example, in the Trolley Problem, the real ethical decision was made years earlier (…).
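The remark about adjusting the weights "such that hesitant indecision is inevitable" can be illustrated with a minimal sketch (hypothetical, not from the text): a decision rule with an abstention band will always return "undecided" once the two sides are tuned to near-equality. The function and the `margin` parameter are invented for illustration.

```python
def decide(harm_main: float, harm_side: float, margin: float = 0.1) -> str:
    """Divert only when one option is clearly worse than the other."""
    if harm_main - harm_side > margin:
        return "divert"
    if harm_side - harm_main > margin:
        return "stay"
    return "undecided"  # the balance is too close to call

print(decide(5.0, 1.0))   # prints "divert": a clear case
print(decide(1.0, 1.05))  # prints "undecided": adversarially balanced inputs
```

Under this reading, an author of a dilemma does not need to break the decision rule at all; it suffices to place the inputs inside the abstention band, which is what makes the resulting indecision feel inevitable rather than accidental.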
Questions that at first seem alien and troubling, like “Who owns the new minds, and who pays for their mistakes?” are similar to well-established laws about who owns and pays for the sins of a corporation. >Robot rights/Church, George M., >Laws of Robotics/Church, George M.
Church, George M. “The Rights of Machines”, in: Brockman, John (ed.) 2019. Possible Minds: Twenty-Five Ways of Looking at AI. New York: Penguin Press.
_____________
Explanation of symbols: Roman numerals indicate the source; Arabic numerals indicate the page number. The corresponding books are listed on the right-hand side. ((s) …): Comment by the sender of the contribution. The note [Author1]Vs[Author2] or [Author]Vs[term] is an addition from the Dictionary of Arguments. If a German edition is specified, the page numbers refer to that edition.
The Calculi of Lambda Conversion (AM-6, Annals of Mathematics Studies). Princeton 1985.
Possible Minds: Twenty-Five Ways of Looking at AI. New York 2019.