Dictionary of Arguments


Philosophical and Scientific Issues in Dispute
 

 

Find counter arguments by entering NameVs… or …VsName.



The author or concept searched is found in the following 11 entries.
Disputed term/author/ism | Author | Entry | Reference
Brain/Brain State Pinker I 40
Brain/Pinker: slower than silicon, but it can compare larger patterns.
I 43
It supplies missing information; a universal problem solver is possible only with total information. Example: snow is sometimes darker than coal. >Thinking, >Mind, >Memory, >Comparison, >Comparability, >Symbol processing, >General Problem Solver, >Cognition, >Information processing.

Pi I
St. Pinker
How the Mind Works, New York 1997
German Edition:
Wie das Denken im Kopf entsteht München 1998

Computer Model Sellars Rorty VI 184
Machine/Sellars: (according to Rorty): there is no big difference between a machine and a human - at most one concerning the complexity of the activity. (NagelVs, SearleVs).
>Artificial Intelligence, >Artificial General Intelligence, >Artificial Consciousness, >Human Level AI, >Robots, >Formalization, >Computation,
>Problem solving/Psychology, >General Problem Solver,
>Cognition, >Thinking.

Sellars I
Wilfrid Sellars
The Myth of the Given: Three Lectures on the Philosophy of Mind, University of London 1956 in: H. Feigl/M. Scriven (eds.) Minnesota Studies in the Philosophy of Science 1956
German Edition:
Der Empirismus und die Philosophie des Geistes Paderborn 1999

Sellars II
Wilfrid Sellars
Science, Perception, and Reality, London 1963
In
Wahrheitstheorien, Gunnar Skirbekk Frankfurt/M. 1977


Rorty I
Richard Rorty
Philosophy and the Mirror of Nature, Princeton/NJ 1979
German Edition:
Der Spiegel der Natur Frankfurt 1997

Rorty II
Richard Rorty
Philosophie & die Zukunft Frankfurt 2000

Rorty II (b)
Richard Rorty
"Habermas, Derrida and the Functions of Philosophy", in: R. Rorty, Truth and Progress. Philosophical Papers III, Cambridge/MA 1998
In
Philosophie & die Zukunft, Frankfurt/M. 2000

Rorty II (c)
Richard Rorty
"Analytic and Conversational Philosophy", conference paper for "Philosophy and the other humanities", Stanford Humanities Center 1998
In
Philosophie & die Zukunft, Frankfurt/M. 2000

Rorty II (d)
Richard Rorty
Justice as a Larger Loyalty, in: Ronald Bontekoe/Marietta Stepanians (eds.) Justice and Democracy. Cross-cultural Perspectives, University of Hawaii 1997
In
Philosophie & die Zukunft, Frankfurt/M. 2000

Rorty II (e)
Richard Rorty
Spinoza, Pragmatismus und die Liebe zur Weisheit, Revised Spinoza Lecture April 1997, University of Amsterdam
In
Philosophie & die Zukunft, Frankfurt/M. 2000

Rorty II (f)
Richard Rorty
"Sein, das verstanden werden kann, ist Sprache", keynote lecture for Gadamer's 100th birthday, University of Heidelberg
In
Philosophie & die Zukunft, Frankfurt/M. 2000

Rorty II (g)
Richard Rorty
"Wild Orchids and Trotsky", in: Wild Orchids and Trotsky: Messages from American Universities, ed. Mark Edmundson, New York 1993
In
Philosophie & die Zukunft, Frankfurt/M. 2000

Rorty III
Richard Rorty
Contingency, Irony, and Solidarity, Cambridge/MA 1989
German Edition:
Kontingenz, Ironie und Solidarität Frankfurt 1992

Rorty IV (a)
Richard Rorty
"Is Philosophy a Natural Kind?", in: R. Rorty, Objectivity, Relativism, and Truth. Philosophical Papers Vol. I, Cambridge/MA 1991, pp. 46-62
In
Eine Kultur ohne Zentrum, Stuttgart 1993

Rorty IV (b)
Richard Rorty
"Non-Reductive Physicalism" in: R. Rorty, Objectivity, Relativism, and Truth. Philosophical Papers Vol. I, Cambridge/MA 1991, pp. 113-125
In
Eine Kultur ohne Zentrum, Stuttgart 1993

Rorty IV (c)
Richard Rorty
"Heidegger, Kundera and Dickens" in: R. Rorty, Essays on Heidegger and Others. Philosophical Papers Vol. 2, Cambridge/MA 1991, pp. 66-82
In
Eine Kultur ohne Zentrum, Stuttgart 1993

Rorty IV (d)
Richard Rorty
"Deconstruction and Circumvention" in: R. Rorty, Essays on Heidegger and Others. Philosophical Papers Vol. 2, Cambridge/MA 1991, pp. 85-106
In
Eine Kultur ohne Zentrum, Stuttgart 1993

Rorty V (a)
Richard Rorty
"Solidarity or Objectivity", Howison Lecture, University of California, Berkeley, January 1983
In
Solidarität oder Objektivität?, Stuttgart 1988

Rorty V (b)
Richard Rorty
"Freud and Moral Reflection", Edith Weigert Lecture, Forum on Psychiatry and the Humanities, Washington School of Psychiatry, Oct. 19th 1984
In
Solidarität oder Objektivität?, Stuttgart 1988

Rorty V (c)
Richard Rorty
The Priority of Democracy to Philosophy, in: John P. Reeder & Gene Outka (eds.), Prospects for a Common Morality. Princeton University Press. pp. 254-278 (1992)
In
Solidarität oder Objektivität?, Stuttgart 1988

Rorty VI
Richard Rorty
Truth and Progress, Cambridge/MA 1998
German Edition:
Wahrheit und Fortschritt Frankfurt 2000
Computer Model Weizenbaum I 234
Computer Model/General Problem Solver/GPS/Artificial Intelligence/Newell/Simon/Weizenbaum: (described in A. Newell and H. A. Simon 1972(1)). General Problem Solver/Weizenbaum: is basically nothing more than a programming language in which you can write programs for certain highly specialized tasks.
>Computer languages, >Problem solving.
I 236
The General Problem Solver (GPS) is a frame within which the logical theory program runs. To solve problems, you have to work with very general symbolic structures that represent objects, operators, properties of objects and differences between objects, and one also has to create a method catalog. But even then, GPS does not allow you to draw conclusions from such "principles". >Principles.
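The mechanism described here can be sketched in miniature (a hypothetical illustration, not Newell and Simon's original code): states are sets of symbolic facts, the method catalog pairs operators with the differences they reduce, and planning proceeds by repeatedly reducing the difference between the current state and the goal.

```python
# Minimal means-ends analysis in the spirit of GPS (illustrative sketch only).
# States are frozensets of symbolic facts; each operator in the "method
# catalog" lists the facts it adds and the facts it presupposes.

def achieve(state, goal, operators, depth=10):
    """Return (new_state, plan) reaching goal from state, or None."""
    if goal <= state:
        return state, []                 # no difference left to reduce
    if depth == 0:
        return None                      # bounded search: give up
    for diff in goal - state:            # pick an outstanding difference
        for op in operators:
            if diff not in op["adds"]:
                continue                 # operator does not reduce this difference
            sub = achieve(state, op["pre"], operators, depth - 1)
            if sub is None:
                continue                 # preconditions unreachable
            mid_state, pre_plan = sub
            after = mid_state | op["adds"]
            rest = achieve(after, goal, operators, depth - 1)
            if rest is not None:
                final, rest_plan = rest
                return final, pre_plan + [op["name"]] + rest_plan
    return None

# Toy method catalog (operator names invented for illustration).
ops = [
    {"name": "get-ladder", "pre": frozenset(), "adds": frozenset({"has-ladder"})},
    {"name": "climb", "pre": frozenset({"has-ladder"}), "adds": frozenset({"on-roof"})},
]
result = achieve(frozenset(), frozenset({"on-roof"}), ops)
# result[1] is the plan ["get-ladder", "climb"]
```

What makes this GPS-like is that operator selection is driven by the difference between state and goal rather than by blind enumeration; nothing in it, however, draws conclusions from the "principles" it manipulates, which is Weizenbaum's point below.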
WeizenbaumVsSimon/WeizenbaumVsNewell: the statement that the General Problem Solver (GPS) is in every sense an embodiment of human problem solving is tantamount to the statement that secondary-school algebra is also such an embodiment.
I 237
Problem: that says nothing about the psychology of human problem solving. Outside world/Newell/Simon: particular attention should be paid to the restrictions on GPS's access to the outside world. What the explicit commands given to GPS presuppose was acquired by humans long before, when building up their vocabulary.
>Language, >Language use, >Language community, >Knowledge, >World/Thinking.
I 238
WeizenbaumVsSimon/WeizenbaumVsNewell: this is where the true facts are bypassed. In reality, the question is what happens to the whole person as he or she builds up his or her vocabulary. >Language acquisition.
How is his/her understanding of what a "problem" is shaped by the experiences that are an inseparable part of his/her vocabulary acquisition?
>Problems, >H.A. Simon, >A. Newell.

1. A. Newell and H. A. Simon, Human Problem Solving, Englewood Cliffs, NJ 1972, Ch. 9: Logic, GPS and Human Behavior, pp. 455-554.

Weizenbaum I
Joseph Weizenbaum
Computer Power and Human Reason. From Judgment to Calculation, W. H. Freeman & Comp. 1976
German Edition:
Die Macht der Computer und die Ohnmacht der Vernunft Frankfurt/M. 1978

Forms of Thinking Dennett I 275
Limits/Unger: There must always be a pair of x on both sides of the border; this is required by our conventions. InwagenVsUnger: so much the worse for the conventions!
Jackendoff: Candidates on the border are forced into one or the other category.
Dennett: a good trick, but not a forced move! Darwin shows us that nature does not need what we say we need to think; nature copes well with gradual variations.
I 277
((s) Limits are necessary for thinking, but not for nature.) General/Particular/Artificial Intelligence/Dennett: Donald Symons(1): there is no "general problem solver", because there are no general problems, only particular problems. >General Problem Solver.
I 691
DennettVsSymons: There is also no general wound, but only particular wounds. Nevertheless, there is a general wound healing process.

1. Symons, D. 1992. "On the Use and Misuse of Darwinism in the Study of Human Behavior." In: Barkow, Cosmides, and Tooby, 1992, pp. 137-62.

Dennett I
D. Dennett
Darwin’s Dangerous Idea, New York 1995
German Edition:
Darwins gefährliches Erbe Hamburg 1997

Dennett II
D. Dennett
Kinds of Minds, New York 1996
German Edition:
Spielarten des Geistes Gütersloh 1999

Dennett III
Daniel Dennett
"COG: Steps towards consciousness in robots"
In
Bewusstsein, Thomas Metzinger Paderborn/München/Wien/Zürich 1996

Dennett IV
Daniel Dennett
"Animal Consciousness. What Matters and Why?", in: D. C. Dennett, Brainchildren. Essays on Designing Minds, Cambridge/MA 1998, pp. 337-350
In
Der Geist der Tiere, D. Perler/M. Wild Frankfurt/M. 2005

Generality Dennett I 691
Generality/Particularity/Artificial Intelligence/Dennett: Donald Symons(1): There is no "general problem solver", because there is no general problem, only particular problems. DennettVsSymons: There is no general wound, only particular wounds; nevertheless, there is a general wound healing process. >General Problem Solver.

1. Symons, D. 1992. "On the Use and Misuse of Darwinism in the Study of Human Behavior." In: Barkow, Cosmides, and Tooby, 1992, pp. 137-62.

Dennett I
D. Dennett
Darwin’s Dangerous Idea, New York 1995
German Edition:
Darwins gefährliches Erbe Hamburg 1997

Dennett II
D. Dennett
Kinds of Minds, New York 1996
German Edition:
Spielarten des Geistes Gütersloh 1999

Dennett III
Daniel Dennett
"COG: Steps towards consciousness in robots"
In
Bewusstsein, Thomas Metzinger Paderborn/München/Wien/Zürich 1996

Dennett IV
Daniel Dennett
"Animal Consciousness. What Matters and Why?", in: D. C. Dennett, Brainchildren. Essays on Designing Minds, Cambridge/MA 1998, pp. 337-350
In
Der Geist der Tiere, D. Perler/M. Wild Frankfurt/M. 2005

Goals Gärdenfors I 63
Goals/Intention/Intent/Language acquisition/Semantics/Gärdenfors: to represent intentions, the goal must already be represented. >Representation.
I 64
Conceptual space/semantic domain: can be a product space of physical space with itself. The goal is then a vector with the endpoints agent and target object, or their localization. >Conceptual Space.
Vectors: target vectors can be more abstract than motion vectors. They can be defined in all semantic domains. The classic case is Newell and Simon's (1972)(1) General Problem Solver. The target spaces can be viewed as metaphorical transfers of physical space, with the key concept still being distance.
Spatial metaphors: are omnipresent in our everyday language. See Lakoff & Johnson (1980)(2).
>Metaphors.
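The vector picture above can be made concrete with a toy computation (agent and target coordinates invented for illustration): the goal is the vector with endpoints at the agent's and the target object's locations, and the key quantity is the remaining distance.

```python
import math

# Illustrative only: a goal as a vector in a product space of physical
# space with itself; the coordinates below are invented.
agent = (0.0, 0.0)
target = (3.0, 4.0)

# Goal vector: from the agent's location to the target object.
goal_vector = tuple(t - a for a, t in zip(agent, target))  # (3.0, 4.0)

# The key concept, even in metaphorical target spaces, is distance.
distance = math.hypot(*goal_vector)  # 5.0
```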

1. Newell, A., & Simon, H. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.
2. Lakoff, G., & Johnson, M. (1980). Metaphors we live by. Chicago: University of Chicago Press.

Gä I
P. Gärdenfors
The Geometry of Meaning Cambridge 2014

Intelligence Newell, A./Simon, H. Münch III 57ff
Intelligence/Newell/Simon: there is as little a "principle of intelligence" as there is a "principle of life" that explains the essence of life from its very nature. But that does not mean that there are no structural requirements for intelligence. Cf. >Principles.
Münch III 69
General Problem Solver/Newell/Simon: (GPS) general mechanisms, schemes, for performing different tasks. Distinction nets, pattern recognition mechanisms, syntax analysis. >General Problem Solver, >Distinctions, >Networks, >Artificial Neural Networks, >Syntax, >Analysis, >Pattern Recognition, >Machine Learning, >Artificial Intelligence.
Münch III 76
Definition Intelligence/Newell/Simon: a system with limited processing capacity must make wise decisions about what to do next. Prerequisite: the solution distribution must not be completely random! Pure insertion and testing is not intelligent.
>Inserting.
The origin of intelligence is nothing mystical: it comes from search trees.
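The contrast drawn here, pure insertion-and-testing versus wise decisions about what to do next, can be illustrated with a toy state space (invented for this sketch): an uninformed search expands nodes blindly, level by level, while an informed search uses knowledge of the goal to choose the next step and so expands far fewer nodes.

```python
from collections import deque

# Toy search tree: states are integers; from n you may step to n+1 or n+3.
def successors(n):
    return [n + 1, n + 3]

def breadth_first(start, goal):
    """Uninformed search: pure insertion and testing, level by level."""
    frontier, expanded = deque([(start, [start])]), 0
    while frontier:
        n, path = frontier.popleft()
        expanded += 1
        if n == goal:
            return path, expanded
        for s in successors(n):
            if s <= goal:                # never overshoot the goal
                frontier.append((s, path + [s]))
    return None, expanded

def greedy(start, goal):
    """Informed search: always take the child closest to the goal."""
    n, path, expanded = start, [start], 0
    while n != goal:
        expanded += 1
        candidates = [s for s in successors(n) if s <= goal]
        if not candidates:
            return None, expanded
        n = min(candidates, key=lambda s: goal - s)
        path.append(n)
    return path, expanded
```

Running both from 0 to 10, the greedy search expands only 4 nodes, while the blind search expands many more for the same goal; the "wisdom" lives entirely in the choice of which branch of the tree to pursue.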

Allen Newell/Herbert Simon, "Computer Science as Empirical Inquiry: Symbols and Search", Communications of the Association for Computing Machinery 19 (1976), pp. 113-126


Mü III
D. Münch (Ed.)
Kognitionswissenschaft Frankfurt 1992
Planning Norvig Norvig I 156
Planning/artificial intelligence/Norvig/Russell: The unpredictability and partial observability of real environments were recognized early on in robotics projects that used planning techniques, including Shakey (Fikes et al., 1972)(1) and (Michie, 1974)(2). The problems received more attention after the publication of McDermott’s (1978a) influential article, Planning and Acting(3). >Belief states/Norvig.
Norvig I 366
Problems: [a simple] problem-solving agent (…) can find sequences of actions that result in a goal state. But it deals with atomic representations of states and thus needs good domain-specific heuristics to perform well. [A] hybrid propositional logical agent (…) can find plans without domain-specific heuristics because it uses domain-independent heuristics based on the logical structure of the problem. But it relies on ground (variable-free) propositional inference, which means that it may be swamped when there are many actions and states.
Norvig I 367
Planning researchers have settled on a factored representation - one in which a state of the world is represented by a collection of variables. We use a language called PDDL, the Planning Domain Definition Language, that allows us to express all 4Tn² actions with one action schema. Each state is represented as a conjunction of fluents that are ground, functionless atoms. Database semantics is used: the closed-world assumption means that any fluents that are not mentioned are false, and the unique names assumption means that [x]1 and [x]2 are distinct. Actions are described by a set of action schemas that implicitly define the ACTIONS(s) and RESULT(s, a) functions needed to do a problem-solving search. >Frame Problem. Classical planning concentrates on problems where most actions leave most things unchanged.
Actions: A set of ground (variable-free) actions can be represented by a single action schema.
The schema is a lifted representation—it lifts the level of reasoning from propositional logic to a restricted subset of first-order logic.
Action schema: The schema consists of the action name, a list of all the variables used in the schema, a precondition and an effect.
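Such a schema (name, variable list, precondition, effect) can be sketched as plain data; the Fly-style action below is a hypothetical rendering in Python rather than actual PDDL syntax, with the atoms invented for illustration.

```python
from dataclasses import dataclass

# Hypothetical rendering of a PDDL-style action schema as plain data
# (not actual PDDL syntax; atoms are invented for illustration).
@dataclass(frozen=True)
class ActionSchema:
    name: str                 # action name
    variables: tuple          # all variables used in the schema
    precondition: frozenset   # conjunction of fluents that must hold
    effect: frozenset         # fluents added or deleted by the action

fly = ActionSchema(
    name="Fly",
    variables=("?p", "?from", "?to"),
    precondition=frozenset({"At(?p, ?from)", "Plane(?p)"}),
    effect=frozenset({"At(?p, ?to)", "not At(?p, ?from)"}),
)
```

The schema is "lifted" in the sense described above: one such object stands for every ground action obtained by substituting constants for the `?`-variables.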
Norvig I 367
Forward/backward (progression/regression) state-space search: Cf. >Forward chaining, >backward chaining.
Norvig I 376
Heuristics for planning: a heuristic function h(s) estimates the distance from a state s to the goal; if we can derive an admissible heuristic for this distance - one that does not overestimate - then we can use A* search to find optimal solutions. Representation: Planning uses a factored representation for states and action schemas. That makes it possible to define good domain-independent heuristics and for programs to automatically apply a good domain-independent heuristic for a given problem. Think of a search problem as a graph where the nodes are states and the edges are actions. The problem is to find a path connecting the initial state to a goal state. There are two ways we can relax this problem to make it easier: by adding more edges to the graph, making it strictly easier to find a path, or by grouping multiple nodes together, forming an abstraction of the state space that has fewer states, and thus is easier to search.
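A toy instance of the A* idea above (graph, costs, and heuristic values all invented for illustration): nodes are expanded in order of g + h, and because h never overestimates the true remaining cost, the first goal state popped is reached by an optimal path.

```python
import heapq

# Toy state space: edge costs between nodes, plus a heuristic h that
# never overestimates the true remaining cost (hence admissible).
graph = {"S": {"A": 1, "B": 4}, "A": {"B": 2, "G": 5}, "B": {"G": 1}, "G": {}}
h = {"S": 4, "A": 3, "B": 1, "G": 0}

def a_star(start, goal):
    """Best-first search ordered by f = g + h."""
    frontier = [(h[start], 0, start, [start])]
    closed = set()
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g            # first goal popped is optimal
        if node in closed:
            continue
        closed.add(node)
        for nxt, cost in graph[node].items():
            heapq.heappush(frontier, (g + cost + h[nxt], g + cost, nxt, path + [nxt]))
    return None, float("inf")

path, cost = a_star("S", "G")  # ["S", "A", "B", "G"] with cost 4
```

Relaxation enters when constructing h: solving an easier version of the problem (extra edges, or states merged by abstraction) yields distances that cannot exceed the true ones, which is exactly the admissibility condition.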
Norvig I 377
State abstraction: Many planning problems have 10^100 states or more, and relaxing the actions does nothing to reduce the number of states. Therefore, we now look at relaxations that decrease the number of states by forming a state abstraction - a many-to-one mapping from states in the ground representation of the problem to the abstract representation. The easiest form of state abstraction is to ignore some fluents.
Norvig I 378
Heuristics: A key idea in defining heuristics is decomposition: dividing a problem into parts, solving each part independently, and then combining the parts. The subgoal independence assumption is that the cost of solving a conjunction of subgoals is approximated by the sum of the costs of solving each subgoal independently.
Norvig I 390
Planning as constraint satisfaction: >Constraint satisfaction problems.
Norvig I 393
History of AI planning: AI planning arose from investigations into state-space search, theorem proving, and control theory and from the practical needs of robotics, scheduling, and other domains. STRIPS (Fikes and Nilsson, 1971)(4), the first major planning system, illustrates the interaction of these influences.
General problem solver/GPS: the General Problem Solver (Newell and Simon, 1961)(5), [was] a state-space search system that used means–ends analysis. The control structure of STRIPS was modeled on that of GPS.
Norvig I 394
Language: The Problem Domain Description Language, or PDDL (Ghallab et al., 1998)(6), was introduced as a computer-parsable, standardized syntax for representing planning problems and has been used as the standard language for the International Planning Competition since 1998. There have been several extensions; the most recent version, PDDL 3.0, includes plan constraints and preferences (Gerevini and Long, 2005)(7). Subproblems: Problem decomposition was achieved by computing a subplan for each subgoal and then stringing the subplans together in some order. This approach, called linear planning by Sacerdoti (1975)(8), was soon discovered to be incomplete. It cannot solve some very simple problems (…). A complete planner must allow for interleaving of actions from different subplans within a single sequence. The notion of serializable subgoals (Korf, 1987)(9) corresponds exactly to the set of problems for which noninterleaved planners are complete. One solution to the interleaving problem was goal-regression planning, a technique in which steps in a totally ordered plan are reordered so as to avoid conflict between subgoals. This was introduced by Waldinger (1975)(10) and also used by Warren's (1974)(11) WARPLAN.
Partial ordering: The ideas underlying partial-order planning include the detection of conflicts (Tate, 1975a)(12) and the protection of achieved conditions from interference (Sussman, 1975)(13). The construction of partially ordered plans (then called task networks) was pioneered by the NOAH planner (Sacerdoti, 1975(8), 1977(14)) and by Tate’s (1975b(15), 1977(16)) NONLIN system. Partial-order planning dominated the next 20 years of research (…).
State-space planning: The resurgence of interest in state-space planning was pioneered by Drew McDermott's UNPOP program (1996)(17), which was the first to suggest the ignore-delete-list heuristic (…). Bonet and Geffner's Heuristic Search Planner (HSP) and its later derivatives (Bonet and Geffner, 1999(18); Haslum et al., 2005(19); Haslum, 2006(20)) were the first to make
Norvig I 395
state-space search practical for large planning problems. The most successful state-space searcher to date is FF (Hoffmann, 2001(21); Hoffmann and Nebel, 2001(22); Hoffmann, 2005(23)), winner of the AIPS 2000 planning competition. LAMA (Richter and Westphal, 2008)(24), a planner based on FAST DOWNWARD with improved heuristics, won the 2008 competition. >Environment/world/planning/Norvig. See also McDermott (1985)(25).

1. Fikes, R. E., Hart, P. E., and Nilsson, N. J. (1972). Learning and executing generalized robot plans. AIJ,3(4), 251-288
2. Michie, D. (1974). Machine intelligence at Edinburgh. In On Intelligence, pp. 143–155. Edinburgh
University Press.
3. McDermott, D. (1978a). Planning and acting. Cognitive Science, 2(2), 71-109.
4. Fikes, R. E. and Nilsson, N. J. (1993). STRIPS, a retrospective. AIJ, 59(1–2), 227-232.
5. Newell, A. and Simon, H. A. (1961). GPS, a program that simulates human thought. In Billing, H.
(Ed.), Lernende Automaten, pp. 109-124. R. Oldenbourg.
6. Ghallab, M., Howe, A., Knoblock, C. A., and McDermott, D. (1998). PDDL—The planning domain definition language. Tech. rep. DCS TR-1165, Yale Center for Computational Vision and Control
7. Gerevini, A. and Long, D. (2005). Plan constraints and preferences in PDDL3. Tech. rep., Dept. of Electronics for Automation, University of Brescia, Italy
8. Sacerdoti, E. D. (1975). The nonlinear nature of plans. In IJCAI-75, pp. 206-214.
9. Korf, R. E. (1987). Planning as search: A quantitative approach. AIJ, 33(1), 65-88
10. Waldinger, R. (1975). Achieving several goals simultaneously. In Elcock, E. W. and Michie, D.
(Eds.), Machine Intelligence 8, pp. 94-138. Ellis Horwood
11. Warren, D. H. D. (1974). WARPLAN: A System for Generating Plans. Department of Computational
Logic Memo 76, University of Edinburgh
12. Tate, A. (1975a). Interacting goals and their use. In IJCAI-75, pp. 215-218.
13. Sussman, G. J. (1975). A Computer Model of Skill Acquisition. Elsevier/North-Holland.
14. Sacerdoti, E. D. (1977). A Structure for Plans and Behavior. Elsevier/North-Holland.
15. Tate, A. (1975b). Using Goal Structure to Direct Search in a Problem Solver. Ph.D. thesis, University of Edinburgh.
16. Tate, A. (1977). Generating project networks. In IJCAI-77, pp. 888-893.
17. McDermott, D. (1996). A heuristic estimator for means-ends analysis in planning. In ICAPS-96, pp.
142-149.
18. Bonet, B. and Geffner, H. (1999). Planning as heuristic search: New results. In ECP-99, pp. 360-372.
19. Haslum, P., Bonet, B., and Geffner, H. (2005). New admissible heuristics for domain-independent planning. In AAAI-05.
20. Haslum, P. (2006). Improving heuristics through relaxed search – An analysis of TP4 and HSP*a in the
2004 planning competition. JAIR, 25, 233-267.
21. Hoffmann, J. (2001). FF: The fast-forward planning system. AIMag, 22(3), 57-62.
22. Hoffmann, J. and Nebel, B. (2001). The FF planning system: Fast plan generation through heuristic search. JAIR, 14, 253-302.
23. Hoffmann, J. (2005). Where “ignoring delete lists” works: Local search topology in planning benchmarks. JAIR, 24, 685-758
24. Richter, S. and Westphal, M. (2008). The LAMA planner. In Proc. International Planning Competition at ICAPS.
25. McDermott, D. (1985). Reasoning about plans. In Hobbs, J. and Moore, R. (Eds.), Formal theories of the commonsense world. Intellect Books.

Norvig I
Peter Norvig
Stuart J. Russell
Artificial Intelligence: A Modern Approach Upper Saddle River, NJ 2010

Planning Russell Norvig I 156
Planning/artificial intelligence/Norvig/Russell: The unpredictability and partial observability of real environments were recognized early on in robotics projects that used planning techniques, including Shakey (Fikes et al., 1972)(1) and (Michie, 1974)(2). The problems received more attention after the publication of McDermott’s (1978a) influential article, Planning and Acting(3). >Belief states/Norvig.
Norvig I 366
Problems: [a simple] problem-solving agent (…) can find sequences of actions that result in a goal state. But it deals with atomic representations of states and thus needs good domain-specific heuristics to perform well. [A] hybrid propositional logical agent (…) can find plans without domain-specific heuristics because it uses domain-independent heuristics based on the logical structure of the problem. But it relies on ground (variable-free) propositional inference, which means that it may be swamped when there are many actions and states.
Norvig I 367
Planning researchers have settled on a factored representation - one in which a state of the world is represented by a collection of variables. We use a language called PDDL, the Planning Domain Definition Language, that allows us to express all 4Tn² actions with one action schema. Each state is represented as a conjunction of fluents that are ground, functionless atoms. Database semantics is used: the closed-world assumption means that any fluents that are not mentioned are false, and the unique names assumption means that [x]1 and [x]2 are distinct. Actions are described by a set of action schemas that implicitly define the ACTIONS(s) and RESULT(s, a) functions needed to do a problem-solving search. >Frame Problem.
Classical planning concentrates on problems where most actions leave most things unchanged.
Actions: A set of ground (variable-free) actions can be represented by a single action schema.
The schema is a lifted representation—it lifts the level of reasoning from propositional logic to a restricted subset of first-order logic.
Action schema: The schema consists of the action name, a list of all the variables used in the schema, a precondition and an effect.
Norvig I 367
Forward/backward (progression/regression) state-space search: Cf. >Forward chaining, >backward chaining.
Norvig I 376
Heuristics for planning: a heuristic function h(s) estimates the distance from a state s to the goal; if we can derive an admissible heuristic for this distance - one that does not overestimate - then we can use A* search to find optimal solutions. Representation: Planning uses a factored representation for states and action schemas. That makes it possible to define good domain-independent heuristics and for programs to automatically apply a good domain-independent heuristic for a given problem. Think of a search problem as a graph where the nodes are states and the edges are actions. The problem is to find a path connecting the initial state to a goal state. There are two ways we can relax this problem to make it easier: by adding more edges to the graph, making it strictly easier to find a path, or by grouping multiple nodes together, forming an abstraction of the state space that has fewer states, and thus is easier to search.
Norvig I 377
State abstraction: Many planning problems have 10^100 states or more, and relaxing the actions does nothing to reduce the number of states. Therefore, we now look at relaxations that decrease the number of states by forming a state abstraction - a many-to-one mapping from states in the ground representation of the problem to the abstract representation. The easiest form of state abstraction is to ignore some fluents.
Norvig I 378
Heuristics: A key idea in defining heuristics is decomposition: dividing a problem into parts, solving each part independently, and then combining the parts. The subgoal independence assumption is that the cost of solving a conjunction of subgoals is approximated by the sum of the costs of solving each subgoal independently.
Norvig I 390
Planning as constraint satisfaction: >Constraint satisfaction problems.
Norvig I 393
History of AI planning: AI planning arose from investigations into state-space search, theorem proving, and control theory and from the practical needs of robotics, scheduling, and other domains. STRIPS (Fikes and Nilsson, 1971)(4), the first major planning system, illustrates the interaction of these influences.
General problem solver/GPS: the General Problem Solver (Newell and Simon, 1961)(5), [was] a state-space search system that used means–ends analysis. The control structure of STRIPS was modeled on that of GPS.
Norvig I 394
Language: The Problem Domain Description Language, or PDDL (Ghallab et al., 1998)(6), was introduced as a computer-parsable, standardized syntax for representing planning problems and has been used as the standard language for the International Planning Competition since 1998. There have been several extensions; the most recent version, PDDL 3.0, includes plan constraints and preferences (Gerevini and Long, 2005)(7). Subproblems: Problem decomposition was achieved by computing a subplan for each subgoal and then stringing the subplans together in some order. This approach, called linear planning by Sacerdoti (1975)(8), was soon discovered to be incomplete. It cannot solve some very simple problems (…). A complete planner must allow for interleaving of actions from different subplans within a single sequence. The notion of serializable subgoals (Korf, 1987)(9) corresponds exactly to the set of problems for which noninterleaved planners are complete. One solution to the interleaving problem was goal-regression planning, a technique in which steps in a totally ordered plan are reordered so as to avoid conflict between subgoals. This was introduced by Waldinger (1975)(10) and also used by Warren's (1974)(11) WARPLAN.
Partial ordering: The ideas underlying partial-order planning include the detection of conflicts (Tate, 1975a)(12) and the protection of achieved conditions from interference (Sussman, 1975)(13). The construction of partially ordered plans (then called task networks) was pioneered by the NOAH planner (Sacerdoti, 1975(8), 1977(14)) and by Tate’s (1975b(15), 1977(16)) NONLIN system. Partial-order planning dominated the next 20 years of research (…).
State-space planning: The resurgence of interest in state-space planning was pioneered by Drew McDermott's UNPOP program (1996)(17), which was the first to suggest the ignore-delete-list heuristic (…). Bonet and Geffner's Heuristic Search Planner (HSP) and its later derivatives (Bonet and Geffner, 1999(18); Haslum et al., 2005(19); Haslum, 2006(20)) were the first to make
Norvig I 395
state-space search practical for large planning problems. The most successful state-space searcher to date is FF (Hoffmann, 2001(21); Hoffmann and Nebel, 2001(22); Hoffmann, 2005(23)), winner of the AIPS 2000 planning competition. LAMA (Richter and Westphal, 2008)(24), a planner based on FAST DOWNWARD with improved heuristics, won the 2008 competition. >Environment/world/planning/Norvig. See also McDermott (1985)(25).
1. Fikes, R. E., Hart, P. E., and Nilsson, N. J. (1972). Learning and executing generalized robot plans. AIJ,3(4), 251-288
2. Michie, D. (1974). Machine intelligence at Edinburgh. In On Intelligence, pp. 143–155. Edinburgh
University Press.
3. McDermott, D. (1978a). Planning and acting. Cognitive Science, 2(2), 71-109.
4. Fikes, R. E. and Nilsson, N. J. (1993). STRIPS, a retrospective. AIJ, 59(1–2), 227-232.
5. Newell, A. and Simon, H. A. (1961). GPS, a program that simulates human thought. In Billing, H.
(Ed.), Lernende Automaten, pp. 109-124. R. Oldenbourg.
6. Ghallab, M., Howe, A., Knoblock, C. A., and McDermott, D. (1998). PDDL—The planning domain definition language. Tech. rep. DCS TR-1165, Yale Center for Computational Vision and Control
7. Gerevini, A. and Long, D. (2005). Plan constraints and preferences in PDDL3. Tech. rep., Dept. of Electronics for Automation, University of Brescia, Italy
8. Sacerdoti, E. D. (1975). The nonlinear nature of plans. In IJCAI-75, pp. 206-214.
9. Korf, R. E. (1987). Planning as search: A quantitative approach. AIJ, 33(1), 65-88
10. Waldinger, R. (1975). Achieving several goals simultaneously. In Elcock, E. W. and Michie, D.
(Eds.), Machine Intelligence 8, pp. 94-138. Ellis Horwood
11. Warren, D. H. D. (1974). WARPLAN: A System for Generating Plans. Department of Computational
Logic Memo 76, University of Edinburgh
12. Tate, A. (1975a). Interacting goals and their use. In IJCAI-75, pp. 215-218.
13. Sussman, G. J. (1975). A Computer Model of Skill Acquisition. Elsevier/North-Holland.
14. Sacerdoti, E. D. (1977). A Structure for Plans and Behavior. Elsevier/North-Holland.
15. Tate, A. (1975b). Using Goal Structure to Direct Search in a Problem Solver. Ph.D. thesis, University of Edinburgh.
16. Tate, A. (1977). Generating project networks. In IJCAI-77, pp. 888-893.
17. McDermott, D. (1996). A heuristic estimator for means-ends analysis in planning. In ICAPS-96, pp.
142-149.
18. Bonet, B. and Geffner, H. (1999). Planning as heuristic search: New results. In ECP-99, pp. 360-372.
19. Haslum, P., Bonet, B., and Geffner, H. (2005). New admissible heuristics for domain-independent planning. In AAAI-05.
20. Haslum, P. (2006). Improving heuristics through relaxed search – An analysis of TP4 and HSP*a in the
2004 planning competition. JAIR, 25, 233-267.
21. Hoffmann, J. (2001). FF: The fast-forward planning system. AIMag, 22(3), 57-62.
22. Hoffmann, J. and Nebel, B. (2001). The FF planning system: Fast plan generation through heuristic search. JAIR, 14, 253-302.
23. Hoffmann, J. (2005). Where “ignoring delete lists” works: Local search topology in planning benchmarks. JAIR, 24, 685-758
24. Richter, S. and Westphal, M. (2008). The LAMA planner. In Proc. International Planning Competition at ICAPS.
25. McDermott, D. (1985). Reasoning about plans. In Hobbs, J. and Moore, R. (Eds.), Formal theories of the commonsense world. Intellect Books.

Russell I
B. Russell/A.N. Whitehead
Principia Mathematica Frankfurt 1986

Russell II
B. Russell
The ABC of Relativity, London 1958, 1969
German Edition:
Das ABC der Relativitätstheorie Frankfurt 1989

Russell IV
B. Russell
The Problems of Philosophy, Oxford 1912
German Edition:
Probleme der Philosophie Frankfurt 1967

Russell VI
B. Russell
"The Philosophy of Logical Atomism", in: B. Russell, Logic and Knowledge, ed. R. Ch. Marsh, London 1956, pp. 200-202
German Edition:
Die Philosophie des logischen Atomismus
In
Eigennamen, U. Wolf (Hg) Frankfurt 1993

Russell VII
B. Russell
On the Nature of Truth and Falsehood, in: B. Russell, The Problems of Philosophy, Oxford 1912 - Dt. "Wahrheit und Falschheit"
In
Wahrheitstheorien, G. Skirbekk (Hg) Frankfurt 1996


Norvig I
Peter Norvig
Stuart J. Russell
Artificial Intelligence: A Modern Approach Upper Saddle River, NJ 2010
Stimuli Pinker I 196
Stimulus/response/Pinker: reflexes in humans take much longer than, say, in insects. This suggests that thought processes intervene. >Thinking, >Spirit, >Mind, >Memory, >Comparisons, >Comparability, >Problem solving, >General Problem Solver, >Cognition, >Information processing, >Symbol processing.

Pi I
St. Pinker
How the Mind Works, New York 1997
German Edition:
Wie das Denken im Kopf entsteht München 1998

System Theory Weizenbaum I 322
System theory/Forrester/Weizenbaum: J. W. Forrester of MIT, the intellectual father of "cybernetic system theory", testified before a US Congressional committee (J. W. Forrester(1)): Thesis: human thinking is not suited to explaining the behavior of social systems. >Cybernetics, >Behavior, >Explanation.
WeizenbaumVsForrester: he claims that the way Plato, Spinoza, Hume, Mill, Gandhi and many others have thought about social systems is inferior to the method of systems analysis. According to Forrester, the problem is that human thinking is based on thought models.
Forrester: a model of thought is unclear. It is incomplete. It is inaccurately worded. In addition, a thought model in an individual changes with time and even in the course of a talk ... The goals are different and remain unspoken.
I 324
Forrester/Weizenbaum: claims that computer systems, in contrast to social systems, eliminate uncertainty completely. But some behaviors are "more desirable" than others. How are they made possible? Forrester: they are probably only possible if we have a proper understanding of the theory of dynamic systems and are prepared to submit to self-discipline and endure the constraints that must accompany the desired behavior.
WeizenbaumVsForrester/WeizenbaumVsSkinner/WeizenbaumVsSimon: in the context in which Forrester uses the expressions "system" and "dynamic", the only way to gain the understanding that alone leads to "desirable behaviors" is the method of "scientific analysis" in Forrester's sense (or Skinner's, or that of the General Problem Solver). >A. Newell, >H. A. Simon, >Behaviorism.
I 325
WeizenbaumVsForrester: for Forrester, the world literally consists of feedback loops.
I 327
Meaning/System Theory/WeizenbaumVsForrester: in the systems we have investigated, it has been clearly shown that meaning has been completely transformed into function. >Behavior, >Society, >Actions, cf. >Anomalous monism.

1. J. W. Forrester, Testimony before the Subcommittee on Urban Growth of the Committee on Banking and Currency of the United States House of Representatives, Washington, D.C., Oct. 7, 1971, 91st Congress, 2nd Session, Part III, pp. 205-265.

Weizenbaum I
Joseph Weizenbaum
Computer Power and Human Reason. From Judgment to Calculation, W. H. Freeman & Comp. 1976
German Edition:
Die Macht der Computer und die Ohnmacht der Vernunft Frankfurt/M. 1978


The author or concept searched is found in the following 2 controversies.
Disputed term/author/ism Author Vs Author
Entry
Reference
Artificial Intelligence Chomsky Vs Artificial Intelligence Dennett I 540
Language/ChomskyVsArtificial Intelligence: the child later only needs to set a switch for whether it is learning Chinese or English; it is not a "general problem solver". Even "slow" children "learn" to speak well! They do not "learn" it, just as birds do not learn their feathers.
I 541
Dennett per Chomsky. But if he is right, the phenomena of language are much more difficult to explore.

Chomsky I
Noam Chomsky
"Linguistics and Philosophy", in: Language and Philosophy, (Ed) Sidney Hook New York 1969 pp. 51-94
In
Linguistik und Philosophie, G. Grewendorf/G. Meggle Frankfurt/M. 1974/1995

Chomsky II
Noam Chomsky
"Some empirical assumptions in modern philosophy of language" in: Philosophy, Science, and Method, Essays in Honor of E. Nagel (Eds. S. Morgenbesser, P. Suppes and M- White) New York 1969, pp. 260-285
In
Linguistik und Philosophie, G. Grewendorf/G. Meggle Frankfurt/M. 1974/1995

Chomsky IV
N. Chomsky
Aspects of the Theory of Syntax, Cambridge/MA 1965
German Edition:
Aspekte der Syntaxtheorie Frankfurt 1978

Chomsky V
N. Chomsky
Language and Mind Cambridge 2006

Dennett I
D. Dennett
Darwin’s Dangerous Idea, New York 1995
German Edition:
Darwins gefährliches Erbe Hamburg 1997

Dennett II
D. Dennett
Kinds of Minds, New York 1996
German Edition:
Spielarten des Geistes Gütersloh 1999

Dennett III
Daniel Dennett
"COG: Steps towards consciousness in robots"
In
Bewusstsein, Thomas Metzinger Paderborn/München/Wien/Zürich 1996

Dennett IV
Daniel Dennett
"Animal Consciousness. What Matters and Why?", in: D. C. Dennett, Brainchildren. Essays on Designing Minds, Cambridge/MA 1998, pp. 337-350
In
Der Geist der Tiere, D Perler/M. Wild Frankfurt/M. 2005
Various Authors Dennett Vs Various Authors I 87
DennettVsDavies, Paul: ("God's plan"): the human mind cannot be an unimportant byproduct. Dennett: why should it be unimportant or trivial merely because it is a byproduct? That is a fallacy. Why can the most important thing of all not be something that emerged from something unimportant?
I 192
DennettVsSnow: was wrong when he compared scientific discoveries with Shakespeare: Shakespeare belongs only to himself; scientific achievement belongs to all. E.g. why is there no copyright on the successful multiplication of two numbers?
I 244
DennettVsSmolin/Parallel Universes: Problem: there are too few limitations on what should be described as obvious variations and why.
I 333
GhiselinVsPangloss Principle: is bad because it asks the wrong question: the question of what is good. Instead, we should ask "What happened?".
I 692
DennettVsGhiselin: he deceived himself: there is never a clear answer to this question that does not greatly depend on what we like! General/Particular/AI/Dennett: Donald Symons: there is no "general problem solver", because there are no general problems, only particular problems. DennettVsSymons: What was that? Neither is there a general wound, but only particular wounds. Nevertheless, there is a general healing process.
II 23/24
Consciousness/Language/Dennett: there is a view that certain beings could possess consciousness but, due to their lack of language, cannot inform us about it. DennettVs: why should that be a problem? E.g. a computer can also function if no printer is connected. Our royal road to getting to know the minds of others is language. It does not reach all the way to them, but that is just a limitation of our knowledge, not a limitation of their minds.
Sai V 77
Identity/Sainsbury: no vague relation. DennettVsSainsbury: identity is not a relation!

Dennett I
D. Dennett
Darwin’s Dangerous Idea, New York 1995
German Edition:
Darwins gefährliches Erbe Hamburg 1997