Disputed term/author/ism | Author |
Entry |
Reference |
---|---|---|---|
Explanation | Nozick | II 10 Explanation/Nozick: not based on arguments - and not on evidence - because evidence alone provides no understanding. >Understanding, >Argumentation, >Proofs, >Provability, >Evidence. Hypotheses that are needed in an explanation need not be known to be true. >Hypotheses, >Knowledge. II 12 Explanation/Nozick: locates something in actuality. >Actuality. Understanding: locates something in the space of possibilities. >Possibility, >Truth conditions, cf. >Understanding/Dummett. II 115 Existence/explanation/Leibniz/Nozick: any factor that is supposed to explain why there is anything at all will itself be part of what needs to be explained. Cf. >Existence/Leibniz. Explanation: always happens in terms of something else - one cannot explain everything, but nothing is inexplicable in principle. >Concepts, >Description levels, >Levels/order. II 116 Explanation/Nozick: is irreflexive, asymmetric and transitive: - irreflexive: nothing explains itself. Asymmetric: if X explains Y, then Y does not explain X (not reversible). II 117 Transitive: if X explains Y and Y explains Z, then X explains Z. - This establishes a strict partial order (a small formal check follows this entry). >Partial order. II 118f Explanation/existence/Nozick: another possibility: explanation from laws or theories. >Laws, >Theories. Question: why are there such theories and laws in the first place? Ultimate justification/self-explanation: could one last law subsume itself? >Ultimate justification. Last law: would have to say that every law with a certain characteristic C holds - and itself have C, thus subsuming all other laws and itself. Problem: truth is not proven from form. >Truth, >Proofs, >Provability. II 120 Explanation/level/stage/Nozick: some authors: the explaining statement must be deeper than what is explained. KripkeVs: new theory: statements themselves seek the appropriate level - the highest level/stage/Kripke: the one at which the sentence applies to its referent. >Truth/Kripke, >S.A. Kripke, >Fixed points/Kripke. Nozick: then P, when used in a deduction, has to be one level lower than its instance - a deduced statement is thus lower when it subsumes something than when it is subsumed. >Deduction. II 120 Self-explanation/Nozick: self-subsumption in quantificational logic explains itself. - Otherwise: explanation is irreflexive, that means nothing can explain itself. Bare facts/Nozick: a) something that cannot be explained at all, b) weaker: something that cannot be explained by something else. Then explanatory self-subsumption is a bare fact (in the weaker sense) that explains itself. >Bare facts. II 305 Explanation/Nozick: it is said that an explanation should not have less (for example, semantic) depth than what it explains. >Semantics, >Semantic facts. II 308 Causation/Descartes: the cause cannot be less deep than the effect (principle). >Cause, >Effect, >Description levels, >Levels/order, >Principles. |
No I R. Nozick Philosophical Explanations Oxford 1981 No II R. Nozick The Nature of Rationality 1994 |
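The strict-partial-order claim at II 116f lends itself to a quick mechanical check. The following Python snippet is only an illustrative sketch: the explanation relation, its members and their names are invented for the example and are not Nozick's.

```python
from itertools import product

# Hypothetical toy relation: (x, y) means "x explains y". The three facts are
# placeholders chosen only to illustrate Nozick II 116-117.
explains = {("law_L", "regularity_R"), ("regularity_R", "event_E"), ("law_L", "event_E")}
domain = {x for pair in explains for x in pair}

def is_strict_partial_order(rel, dom):
    """Check irreflexivity, asymmetry and transitivity of an 'explains' relation."""
    irreflexive = all((x, x) not in rel for x in dom)                      # nothing explains itself
    asymmetric = all((y, x) not in rel for (x, y) in rel)                  # not reversible
    transitive = all((x, z) in rel
                     for (x, y), (w, z) in product(rel, rel) if y == w)    # chains carry through
    return irreflexive and asymmetric and transitive

print(is_strict_partial_order(explains, domain))                          # True for this toy relation

# A self-subsuming "last law" would add the pair ("law_L", "law_L"),
# breaking irreflexivity - the tension Nozick discusses at II 120.
print(is_strict_partial_order(explains | {("law_L", "law_L")}, domain))   # False
```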
Order | Simons | I 25 Partial order: the partial order is reflexive and transitive. "Less than": is irreflexive, because nothing can be less than itself (this also applies to total order). Full classical mereology corresponds to a complete Boolean algebra without a zero element. >Mereology, >Parts, >Part-of-relation, >Partial order. (The order axioms are sketched after this entry.) |
Simons I P. Simons Parts. A Study in Ontology Oxford New York 1987 |
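For readers who want the axioms spelled out, here is a standard textbook formulation in LaTeX of the contrast the entry draws between the reflexive, transitive ≤ and the irreflexive <; it is a generic rendering, not a quotation of Simons' own axiom numbering.

```latex
\begin{align*}
&\forall x\, (x \le x) && \text{reflexivity of } \le \\
&\forall x\,\forall y\,\forall z\, \big((x \le y \wedge y \le z) \rightarrow x \le z\big) && \text{transitivity of } \le \\
&\forall x\, \neg (x < x) && \text{irreflexivity of } <\text{: nothing is less than itself} \\
&\forall x\,\forall y\, \big(x < y \leftrightarrow (x \le y \wedge \neg\, y \le x)\big) && \text{one common way of defining } < \text{ from } \le
\end{align*}
```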
Planning | Norvig | Norvig I 156 Planning/artificial intelligence/Norvig/Russell: The unpredictability and partial observability of real environments were recognized early on in robotics projects that used planning techniques, including Shakey (Fikes et al., 1972)(1) and (Michie, 1974)(2). The problems received more attention after the publication of McDermott’s (1978a) influential article, Planning and Acting(3). >Belief states/Norvig. Norvig I 366 Problems: [a simple] problem-solving agent (…) can find sequences of actions that result in a goal state. But it deals with atomic representations of states and thus needs good domain-specific heuristics to perform well. [A] hybrid propositional logical agent (…) can find plans without domain-specific heuristics because it uses domain-independent heuristics based on the logical structure of the problem. But it relies on ground (variable-free) propositional inference, which means that it may be swamped when there are many actions and states. Norvig I 367 Planning researchers have settled on a factored representation - one in which a state of the world is represented by a collection of variables. We use a language called PDDL, the Planning Domain Definition Language, that allows us to express all 4Tn² actions with one action schema. Each state is represented as a conjunction of fluents that are ground, functionless atoms. Database semantics is used: the closed-world assumption means that any fluents that are not mentioned are false, and the unique names assumption means that [x]1 and [x]2 are distinct. Actions are described by a set of action schemas that implicitly define the ACTIONS(s) and RESULT(s, a) functions needed to do a problem-solving search. >Frame Problem. Classical planning concentrates on problems where most actions leave most things unchanged. Actions: A set of ground (variable-free) actions can be represented by a single action schema. The schema is a lifted representation - it lifts the level of reasoning from propositional logic to a restricted subset of first-order logic. Action schema: The schema consists of the action name, a list of all the variables used in the schema, a precondition and an effect (a minimal sketch of such schemas follows this entry). Norvig I 367 Forward/backward (progression/regression) state-space search: Cf. >Forward chaining, >Backward chaining. Norvig I 376 Heuristics for planning: a heuristic function h(s) estimates the distance from a state s to the goal; if we can derive an admissible heuristic for this distance - one that does not overestimate - then we can use A∗ search to find optimal solutions. Representation: Planning uses a factored representation for states and action schemas. That makes it possible to define good domain-independent heuristics and for programs to automatically apply a good domain-independent heuristic for a given problem. Think of a search problem as a graph where the nodes are states and the edges are actions. The problem is to find a path connecting the initial state to a goal state. There are two ways we can relax this problem to make it easier: by adding more edges to the graph, making it strictly easier to find a path, or by grouping multiple nodes together, forming an abstraction of the state space that has fewer states, and thus is easier to search. Norvig I 377 State abstraction: Many planning problems have 10^100 states or more, and relaxing the actions does nothing to reduce the number of states. 
Therefore, we now look at relaxations that decrease the number of states by forming a state abstraction - a many-to-one mapping from states in the ground representation of the problem to the abstract representation. The easiest form of state abstraction is to ignore some fluents. Norvig I 378 Heuristics: A key idea in defining heuristics is decomposition: dividing a problem into parts, solving each part independently, and then combining the parts. The subgoal independence assumption is that the cost of solving a conjunction of subgoals is approximated by the sum of the costs of solving each subgoal independently. Norvig I 390 Planning as constraint satisfaction: >Constraint satisfaction problems. Norvig I 393 History of AI planning: AI planning arose from investigations into state-space search, theorem proving, and control theory and from the practical needs of robotics, scheduling, and other domains. STRIPS (Fikes and Nilsson, 1971)(4), the first major planning system, illustrates the interaction of these influences. General problem solver/GPS: the General Problem Solver (Newell and Simon, 1961)(5) [was] a state-space search system that used means–ends analysis. The control structure of STRIPS was modeled on that of GPS. Norvig I 394 Language: The Planning Domain Definition Language, or PDDL (Ghallab et al., 1998)(6), was introduced as a computer-parsable, standardized syntax for representing planning problems and has been used as the standard language for the International Planning Competition since 1998. There have been several extensions; the most recent version, PDDL 3.0, includes plan constraints and preferences (Gerevini and Long, 2005)(7). Subproblems: Problem decomposition was achieved by computing a subplan for each subgoal and then stringing the subplans together in some order. This approach, called linear planning by Sacerdoti (1975)(8), was soon discovered to be incomplete. It cannot solve some very simple problems (…). A complete planner must allow for interleaving of actions from different subplans within a single sequence. The notion of serializable subgoals (Korf, 1987)(9) corresponds exactly to the set of problems for which noninterleaved planners are complete. One solution to the interleaving problem was goal-regression planning, a technique in which steps in a totally ordered plan are reordered so as to avoid conflict between subgoals. This was introduced by Waldinger (1975)(10) and also used by Warren’s (1974)(11) WARPLAN. Partial ordering: The ideas underlying partial-order planning include the detection of conflicts (Tate, 1975a)(12) and the protection of achieved conditions from interference (Sussman, 1975)(13). The construction of partially ordered plans (then called task networks) was pioneered by the NOAH planner (Sacerdoti, 1975(8), 1977(14)) and by Tate’s (1975b(15), 1977(16)) NONLIN system. Partial-order planning dominated the next 20 years of research (…). State-space planning: The resurgence of interest in state-space planning was pioneered by Drew McDermott’s UNPOP program (1996)(17), which was the first to suggest the ignore-delete-list heuristic (…). Bonet and Geffner’s Heuristic Search Planner (HSP) and its later derivatives (Bonet and Geffner, 1999(18); Haslum et al., 2005(19); Haslum, 2006(20)) were the first to make Norvig I 395 state-space search practical for large planning problems. 
The most successful state-space searcher to date is FF (Hoffmann, 2001(21); Hoffmann and Nebel, 2001(22); Hoffmann, 2005(23)), winner of the AIPS 2000 planning competition. LAMA (Richter and Westphal, 2008)(24), a planner based on FASTDOWNWARD with improved heuristics, won the 2008 competition. >Environment/world/planning/Norvig. See also McDermott (1985)(25). 1. Fikes, R. E., Hart, P. E., and Nilsson, N. J. (1972). Learning and executing generalized robot plans. AIJ, 3(4), 251-288. 2. Michie, D. (1974). Machine intelligence at Edinburgh. In On Intelligence, pp. 143-155. Edinburgh University Press. 3. McDermott, D. (1978a). Planning and acting. Cognitive Science, 2(2), 71-109. 4. Fikes, R. E. and Nilsson, N. J. (1993). STRIPS, a retrospective. AIJ, 59(1-2), 227-232. 5. Newell, A. and Simon, H. A. (1961). GPS, a program that simulates human thought. In Billing, H. (Ed.), Lernende Automaten, pp. 109-124. R. Oldenbourg. 6. Ghallab, M., Howe, A., Knoblock, C. A., and McDermott, D. (1998). PDDL - The planning domain definition language. Tech. rep. DCS TR-1165, Yale Center for Computational Vision and Control. 7. Gerevini, A. and Long, D. (2005). Plan constraints and preferences in PDDL3. Tech. rep., Dept. of Electronics for Automation, University of Brescia, Italy. 8. Sacerdoti, E. D. (1975). The nonlinear nature of plans. In IJCAI-75, pp. 206-214. 9. Korf, R. E. (1987). Planning as search: A quantitative approach. AIJ, 33(1), 65-88. 10. Waldinger, R. (1975). Achieving several goals simultaneously. In Elcock, E. W. and Michie, D. (Eds.), Machine Intelligence 8, pp. 94-138. Ellis Horwood. 11. Warren, D. H. D. (1974). WARPLAN: A System for Generating Plans. Department of Computational Logic Memo 76, University of Edinburgh. 12. Tate, A. (1975a). Interacting goals and their use. In IJCAI-75, pp. 215-218. 13. Sussman, G. J. (1975). A Computer Model of Skill Acquisition. Elsevier/North-Holland. 14. Sacerdoti, E. D. (1977). A Structure for Plans and Behavior. Elsevier/North-Holland. 15. Tate, A. (1975b). Using Goal Structure to Direct Search in a Problem Solver. Ph.D. thesis, University of Edinburgh. 16. Tate, A. (1977). Generating project networks. In IJCAI-77, pp. 888-893. 17. McDermott, D. (1996). A heuristic estimator for means-ends analysis in planning. In ICAPS-96, pp. 142-149. 18. Bonet, B. and Geffner, H. (1999). Planning as heuristic search: New results. In ECP-99, pp. 360-372. 19. Haslum, P., Bonet, B., and Geffner, H. (2005). New admissible heuristics for domain-independent planning. In AAAI-05. 20. Haslum, P. (2006). Improving heuristics through relaxed search - An analysis of TP4 and HSP*a in the 2004 planning competition. JAIR, 25, 233-267. 21. Hoffmann, J. (2001). FF: The fast-forward planning system. AIMag, 22(3), 57-62. 22. Hoffmann, J. and Nebel, B. (2001). The FF planning system: Fast plan generation through heuristic search. JAIR, 14, 253-302. 23. Hoffmann, J. (2005). Where "ignoring delete lists" works: Local search topology in planning benchmarks. JAIR, 24, 685-758. 24. Richter, S. and Westphal, M. (2008). The LAMA planner. In Proc. International Planning Competition at ICAPS. 25. McDermott, D. (1985). Reasoning about plans. In Hobbs, J. and Moore, R. (Eds.), Formal theories of the commonsense world. Intellect Books. |
Norvig I Peter Norvig Stuart J. Russell Artificial Intelligence: A Modern Approach Upper Saddle River, NJ 2010 |
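To make the factored representation (ground fluents, ACTIONS(s), RESULT(s, a)) and the relaxation idea concrete, here is a minimal, self-contained Python sketch of STRIPS/PDDL-style ground actions and an ignore-delete-list-flavoured heuristic. It is written under the closed-world assumption described above; the logistics-style fluent and action names are hypothetical and the code is an illustration, not Russell and Norvig's.

```python
from dataclasses import dataclass

# A state is the set of ground, functionless fluents that hold
# (closed-world assumption: anything not listed is false).
State = frozenset

@dataclass(frozen=True)
class Action:
    name: str
    precond: frozenset   # fluents that must hold before the action
    add: frozenset       # fluents the action makes true
    delete: frozenset    # fluents the action makes false

def actions(state, ground_actions):
    """ACTIONS(s): the ground actions whose preconditions are satisfied in s."""
    return [a for a in ground_actions if a.precond <= state]

def result(state, action):
    """RESULT(s, a): remove the delete list, then add the add list."""
    return State((state - action.delete) | action.add)

def relaxed_layers(state, goal, ground_actions):
    """Ignore-delete-list relaxation: count the layers of action application
    needed until all goal fluents are reachable (no deletions applied)."""
    reached, layers = set(state), 0
    while not goal <= reached:
        new = set()
        for a in ground_actions:
            if a.precond <= reached:
                new |= a.add
        if new <= reached:          # goal unreachable even in the relaxed problem
            return float("inf")
        reached |= new
        layers += 1
    return layers

# Hypothetical toy domain: move a package from A to B with one truck.
ground_actions = [
    Action("LoadAtA", frozenset({"PkgAtA", "TruckAtA"}), frozenset({"PkgInTruck"}), frozenset({"PkgAtA"})),
    Action("DriveAB", frozenset({"TruckAtA"}), frozenset({"TruckAtB"}), frozenset({"TruckAtA"})),
    Action("UnloadAtB", frozenset({"PkgInTruck", "TruckAtB"}), frozenset({"PkgAtB"}), frozenset({"PkgInTruck"})),
]
start = State({"PkgAtA", "TruckAtA"})
goal = frozenset({"PkgAtB"})

print([a.name for a in actions(start, ground_actions)])   # ['LoadAtA', 'DriveAB']
print(result(start, ground_actions[1]))                   # the truck has driven to B
print(relaxed_layers(start, goal, ground_actions))        # 2 (the real plan needs 3 actions)
```

For this toy domain the relaxed layer count (2) stays below the real plan cost (3); planners such as HSP and FF build far more refined estimates on the same relaxation idea.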
Planning | Russell | Norvig I 156 Planning/artificial intelligence/Norvig/Russell: The unpredictability and partial observability of real environments were recognized early on in robotics projects that used planning techniques, including Shakey (Fikes et al., 1972)(1) and (Michie, 1974)(2). The problems received more attention after the publication of McDermott’s (1978a) influential article, Planning and Acting(3). >Belief states/Norvig. Norvig I 366 Problems: [a simple] problem-solving agent (…) can find sequences of actions that result in a goal state. But it deals with atomic representations of states and thus needs good domain-specific heuristics to perform well. [A] hybrid propositional logical agent (…) can find plans without domain-specific heuristics because it uses domain-independent heuristics based on the logical structure of the problem. But it relies on ground (variable-free) propositional inference, which means that it may be swamped when there are many actions and states. Norvig I 367 Planning researchers have settled on a factored representation - one in which a state of the world is represented by a collection of variables. We use a language called PDDL, the Planning Domain Definition Language, that allows us to express all 4Tn² actions with one action schema. Each state is represented as a conjunction of fluents that are ground, functionless atoms. Database semantics is used: the closed-world assumption means that any fluents that are not mentioned are false, and the unique names assumption means that [x]1 and [x]2 are distinct. Actions are described by a set of action schemas that implicitly define the ACTIONS(s) and RESULT(s, a) functions needed to do a problem-solving search. >Frame Problem. Classical planning concentrates on problems where most actions leave most things unchanged. Actions: A set of ground (variable-free) actions can be represented by a single action schema. The schema is a lifted representation - it lifts the level of reasoning from propositional logic to a restricted subset of first-order logic. Action schema: The schema consists of the action name, a list of all the variables used in the schema, a precondition and an effect. Norvig I 367 Forward/backward (progression/regression) state-space search: Cf. >Forward chaining, >Backward chaining. Norvig I 376 Heuristics for planning: a heuristic function h(s) estimates the distance from a state s to the goal; if we can derive an admissible heuristic for this distance - one that does not overestimate - then we can use A∗ search to find optimal solutions. Representation: Planning uses a factored representation for states and action schemas. That makes it possible to define good domain-independent heuristics and for programs to automatically apply a good domain-independent heuristic for a given problem. Think of a search problem as a graph where the nodes are states and the edges are actions. The problem is to find a path connecting the initial state to a goal state. There are two ways we can relax this problem to make it easier: by adding more edges to the graph, making it strictly easier to find a path, or by grouping multiple nodes together, forming an abstraction of the state space that has fewer states, and thus is easier to search. Norvig I 377 State abstraction: Many planning problems have 10^100 states or more, and relaxing the actions does nothing to reduce the number of states. 
Therefore, we now look at relaxations that decrease the number of states by forming a state abstraction - a many-to-one mapping from states in the ground representation of the problem to the abstract representation. The easiest form of state abstraction is to ignore some fluents (a brief sketch of this mapping follows this entry). Norvig I 378 Heuristics: A key idea in defining heuristics is decomposition: dividing a problem into parts, solving each part independently, and then combining the parts. The subgoal independence assumption is that the cost of solving a conjunction of subgoals is approximated by the sum of the costs of solving each subgoal independently. Norvig I 390 Planning as constraint satisfaction: >Constraint satisfaction problems. Norvig I 393 History of AI planning: AI planning arose from investigations into state-space search, theorem proving, and control theory and from the practical needs of robotics, scheduling, and other domains. STRIPS (Fikes and Nilsson, 1971)(4), the first major planning system, illustrates the interaction of these influences. General problem solver/GPS: the General Problem Solver (Newell and Simon, 1961)(5) [was] a state-space search system that used means–ends analysis. The control structure of STRIPS was modeled on that of GPS. Norvig I 394 Language: The Planning Domain Definition Language, or PDDL (Ghallab et al., 1998)(6), was introduced as a computer-parsable, standardized syntax for representing planning problems and has been used as the standard language for the International Planning Competition since 1998. There have been several extensions; the most recent version, PDDL 3.0, includes plan constraints and preferences (Gerevini and Long, 2005)(7). Subproblems: Problem decomposition was achieved by computing a subplan for each subgoal and then stringing the subplans together in some order. This approach, called linear planning by Sacerdoti (1975)(8), was soon discovered to be incomplete. It cannot solve some very simple problems (…). A complete planner must allow for interleaving of actions from different subplans within a single sequence. The notion of serializable subgoals (Korf, 1987)(9) corresponds exactly to the set of problems for which noninterleaved planners are complete. One solution to the interleaving problem was goal-regression planning, a technique in which steps in a totally ordered plan are reordered so as to avoid conflict between subgoals. This was introduced by Waldinger (1975)(10) and also used by Warren’s (1974)(11) WARPLAN. Partial ordering: The ideas underlying partial-order planning include the detection of conflicts (Tate, 1975a)(12) and the protection of achieved conditions from interference (Sussman, 1975)(13). The construction of partially ordered plans (then called task networks) was pioneered by the NOAH planner (Sacerdoti, 1975(8), 1977(14)) and by Tate’s (1975b(15), 1977(16)) NONLIN system. Partial-order planning dominated the next 20 years of research (…). State-space planning: The resurgence of interest in state-space planning was pioneered by Drew McDermott’s UNPOP program (1996)(17), which was the first to suggest the ignore-delete-list heuristic (…). Bonet and Geffner’s Heuristic Search Planner (HSP) and its later derivatives (Bonet and Geffner, 1999(18); Haslum et al., 2005(19); Haslum, 2006(20)) were the first to make Norvig I 395 state-space search practical for large planning problems. 
The most successful state-space searcher to date is FF (Hoffmann, 2001(21); Hoffmann and Nebel, 2001(22); Hoffmann, 2005(23)), winner of the AIPS 2000 planning competition. LAMA (Richter and Westphal, 2008)(24), a planner based on FASTDOWNWARD with improved heuristics, won the 2008 competition. >Environment/world/planning/Norvig. See also McDermott (1985)(25). 1. Fikes, R. E., Hart, P. E., and Nilsson, N. J. (1972). Learning and executing generalized robot plans. AIJ, 3(4), 251-288. 2. Michie, D. (1974). Machine intelligence at Edinburgh. In On Intelligence, pp. 143-155. Edinburgh University Press. 3. McDermott, D. (1978a). Planning and acting. Cognitive Science, 2(2), 71-109. 4. Fikes, R. E. and Nilsson, N. J. (1993). STRIPS, a retrospective. AIJ, 59(1-2), 227-232. 5. Newell, A. and Simon, H. A. (1961). GPS, a program that simulates human thought. In Billing, H. (Ed.), Lernende Automaten, pp. 109-124. R. Oldenbourg. 6. Ghallab, M., Howe, A., Knoblock, C. A., and McDermott, D. (1998). PDDL - The planning domain definition language. Tech. rep. DCS TR-1165, Yale Center for Computational Vision and Control. 7. Gerevini, A. and Long, D. (2005). Plan constraints and preferences in PDDL3. Tech. rep., Dept. of Electronics for Automation, University of Brescia, Italy. 8. Sacerdoti, E. D. (1975). The nonlinear nature of plans. In IJCAI-75, pp. 206-214. 9. Korf, R. E. (1987). Planning as search: A quantitative approach. AIJ, 33(1), 65-88. 10. Waldinger, R. (1975). Achieving several goals simultaneously. In Elcock, E. W. and Michie, D. (Eds.), Machine Intelligence 8, pp. 94-138. Ellis Horwood. 11. Warren, D. H. D. (1974). WARPLAN: A System for Generating Plans. Department of Computational Logic Memo 76, University of Edinburgh. 12. Tate, A. (1975a). Interacting goals and their use. In IJCAI-75, pp. 215-218. 13. Sussman, G. J. (1975). A Computer Model of Skill Acquisition. Elsevier/North-Holland. 14. Sacerdoti, E. D. (1977). A Structure for Plans and Behavior. Elsevier/North-Holland. 15. Tate, A. (1975b). Using Goal Structure to Direct Search in a Problem Solver. Ph.D. thesis, University of Edinburgh. 16. Tate, A. (1977). Generating project networks. In IJCAI-77, pp. 888-893. 17. McDermott, D. (1996). A heuristic estimator for means-ends analysis in planning. In ICAPS-96, pp. 142-149. 18. Bonet, B. and Geffner, H. (1999). Planning as heuristic search: New results. In ECP-99, pp. 360-372. 19. Haslum, P., Bonet, B., and Geffner, H. (2005). New admissible heuristics for domain-independent planning. In AAAI-05. 20. Haslum, P. (2006). Improving heuristics through relaxed search - An analysis of TP4 and HSP*a in the 2004 planning competition. JAIR, 25, 233-267. 21. Hoffmann, J. (2001). FF: The fast-forward planning system. AIMag, 22(3), 57-62. 22. Hoffmann, J. and Nebel, B. (2001). The FF planning system: Fast plan generation through heuristic search. JAIR, 14, 253-302. 23. Hoffmann, J. (2005). Where "ignoring delete lists" works: Local search topology in planning benchmarks. JAIR, 24, 685-758. 24. Richter, S. and Westphal, M. (2008). The LAMA planner. In Proc. International Planning Competition at ICAPS. 25. McDermott, D. (1985). Reasoning about plans. In Hobbs, J. and Moore, R. (Eds.), Formal theories of the commonsense world. Intellect Books. |
Russell I B. Russell/A.N. Whitehead Principia Mathematica Frankfurt 1986 Russell II B. Russell The ABC of Relativity, London 1958, 1969 German Edition: Das ABC der Relativitätstheorie Frankfurt 1989 Russell IV B. Russell The Problems of Philosophy, Oxford 1912 German Edition: Probleme der Philosophie Frankfurt 1967 Russell VI B. Russell "The Philosophy of Logical Atomism", in: B. Russell, Logic and Knowledge, ed. R. Ch. Marsh, London 1956, pp. 200-202 German Edition: Die Philosophie des logischen Atomismus In Eigennamen, U. Wolf (Ed.) Frankfurt 1993 Russell VII B. Russell On the Nature of Truth and Falsehood, in: B. Russell, The Problems of Philosophy, Oxford 1912 German Edition: "Wahrheit und Falschheit" In Wahrheitstheorien, G. Skirbekk (Ed.) Frankfurt 1996 Norvig I Peter Norvig Stuart J. Russell Artificial Intelligence: A Modern Approach Upper Saddle River, NJ 2010 |
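Since the entry above duplicates the planning material, only the state-abstraction point (Norvig I 377f) gets a complementary sketch here: a many-to-one mapping obtained by ignoring some fluents. The fluent names are invented for the example; this is an illustration rather than code from the book.

```python
# State abstraction by ignoring fluents: project each concrete state onto the
# fluents we decide to keep. Several concrete states collapse into one abstract state.
def abstract(state, kept_fluents):
    return frozenset(state) & frozenset(kept_fluents)

concrete_states = [
    {"PkgAtA", "TruckAtA", "FuelFull"},
    {"PkgAtA", "TruckAtA", "FuelLow"},   # differs from the first only in an ignored fluent
    {"PkgAtB", "TruckAtB", "FuelLow"},
]
kept = {"PkgAtA", "PkgAtB"}              # ignore truck position and fuel level

abstract_states = {abstract(s, kept) for s in concrete_states}
print(len(concrete_states), "concrete states ->", len(abstract_states), "abstract states")
# 3 concrete states -> 2 abstract states: the mapping is many-to-one, so the
# abstract space is smaller and easier to search.
```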
Set Theory | Simons | I 12 Set Theory: separate (disjoint) sets still have something in common: the empty set. Mereology: there is no such null individual in mereology. >Mereology. Partial order: here, a common part is a lower bound. >Partial order. Product: the product is the greatest lower bound, i.e. the individual that x and y have in common (= intersection in set theory, the lens in a Venn diagram). Stronger: binary sum: the binary sum is the individual that something overlaps iff it overlaps at least one of x or y (Venn diagram: both circles including the lens), e.g. a broom is the sum of handle and head. Any two individuals always have a sum. >Mereological sum. (A toy illustration of product and sum follows this entry.) I 279 Set Theory/modality/necessity/Simons: the element relationship is rigid: a class can have no other elements in any possible world than it has in the actual world. This is analogous to mereological essentialism for subsets. >Element relation. I 332 Set Theory/mereology/elements/(s): elements are not interchangeable, but parts are. >Parts. |
Simons I P. Simons Parts. A Study in Ontology Oxford New York 1987 |
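A toy rendering of the product and binary-sum notions above: purely for illustration, individuals are modelled as non-empty sets of atoms (a set-theoretic crutch Simons does not rely on). The point is only the contrast between greatest lower bound and sum, and the absence of a null individual.

```python
# Individuals modelled, for illustration only, as non-empty frozensets of atoms.
def mereological_product(x, y):
    """Greatest lower bound: what x and y have in common (the 'lens' of the Venn diagram).
    There is no null individual, so an empty overlap means: no product exists."""
    common = x & y
    return common if common else None

def binary_sum(x, y):
    """The individual that something overlaps iff it overlaps at least one of x or y
    (Venn diagram: both circles including the lens)."""
    return x | y

handle = frozenset({"h1", "h2"})
head = frozenset({"b1"})
broom = binary_sum(handle, head)           # e.g. a broom as the sum of handle and head
print(sorted(broom))                       # ['b1', 'h1', 'h2']
print(mereological_product(handle, head))  # None: disjoint individuals have no common part
```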
Similarity Metrics | Lewis | V 10 Similarity metric/Possible worlds/Po.wo./Similarity/Lewis: order assumption: weak order: whenever two worlds are accessible from the world i in question, one of them is at least as similar to world i as the other (comparability). - Decreasing or increasing similarity is transitive. - In contrast, partial order: not all pairs are comparable. >Possible world/Lewis. V 11 Compatibility/Possible world/Lewis: B is compatible with A in world i if an A-world is closer to i than any non-B-world. - (Reversal of "rather true") - then A were>>would C is true if C follows from A together with auxiliary hypotheses B1...Bn. - E.g. natural laws are either compatible or completely incompatible with every assumption - thesis: laws of nature are then generalizations of what we consider to be particularly important. - Then conformity with laws of nature should be important for the similarity relation between possible worlds - (> Similarity metrics). V 12f Similarity metric/Possible worlds/Lewis: sphere/similarity sphere: e.g. an S-sphere around the world i exists if some S-world is accessible from i and closer than any ~S-world. Permitting A: a sphere permits A if it contains an A-world. - Degree: spheres represent degrees (comparative, unlike neighborhoods in topology). Compatibility/Compatible/(s): B is compatible with A if there is an A-world in the B-sphere. - Definition: A were>>would C is true if A⊃C holds throughout an A-permitting sphere around i, if such a sphere exists. >Implication. (A small sketch of this sphere evaluation follows this entry.) V 13 Definition: A were>>might C is true if A∧C holds (at some world) in every A-permitting sphere around i ((s): conjunction). - Definition A-impossible worlds: >Impossible World. V 42 Similarity metric/Similarity/Possible world/Lewis: It is not about any particular similarity relation that you happen to have in mind. - Problem: if some aspects do not count at all, the centering assumption would be violated. - I.e. worlds that differ in a disregarded aspect would be identical with the actual world. - Lewis: but such worlds do not exist. - Similarity relations: must be distinguished: a) those for explicit judgments - b) those for counterfactual judgments. V 150 Revision/Possible world/Similarity metrics/Stalnaker/Lewis: every revision will select the most similar antecedent world. --- Schwarz I 160 Lewis: E.g. a single particle changes its charge: then it behaves differently. - Because a possible world in which not only the charge but also the role were exchanged would be much less similar (> closest world). |
Lewis I David K. Lewis Die Identität von Körper und Geist Frankfurt 1989 Lewis I (a) David K. Lewis An Argument for the Identity Theory, in: Journal of Philosophy 63 (1966) In Die Identität von Körper und Geist, Frankfurt/M. 1989 Lewis I (b) David K. Lewis Psychophysical and Theoretical Identifications, in: Australasian Journal of Philosophy 50 (1972) In Die Identität von Körper und Geist, Frankfurt/M. 1989 Lewis I (c) David K. Lewis Mad Pain and Martian Pain, Readings in Philosophy of Psychology, Vol. 1, Ned Block (ed.) Harvard University Press, 1980 In Die Identität von Körper und Geist, Frankfurt/M. 1989 Lewis II David K. Lewis "Languages and Language", in: K. Gunderson (Ed.), Minnesota Studies in the Philosophy of Science, Vol. VII, Language, Mind, and Knowledge, Minneapolis 1975, pp. 3-35 In Handlung, Kommunikation, Bedeutung, Georg Meggle Frankfurt/M. 1979 Lewis IV David K. Lewis Philosophical Papers Bd I New York Oxford 1983 Lewis V David K. Lewis Philosophical Papers Bd II New York Oxford 1986 Lewis VI David K. Lewis Convention. A Philosophical Study, Cambridge/MA 1969 German Edition: Konventionen Berlin 1975 LewisCl Clarence Irving Lewis Collected Papers of Clarence Irving Lewis Stanford 1970 LewisCl I Clarence Irving Lewis Mind and the World Order: Outline of a Theory of Knowledge (Dover Books on Western Philosophy) 1991 Schw I W. Schwarz David Lewis Bielefeld 2005 |
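The sphere clauses at V 12f can be run on a toy model. In the sketch below, propositions are encoded as sets of worlds and spheres as nested sets ordered by similarity to i; the worlds, propositions and this particular encoding are assumptions made for the example, not Lewis's own text.

```python
# Worlds are labels; a proposition is the set of worlds where it holds; the spheres around i
# are nested sets of worlds, with smaller spheres containing the worlds more similar to i.
spheres = [{"i"}, {"i", "w1"}, {"i", "w1", "w2"}]   # nested similarity spheres around world i
A = {"w1", "w2"}                                     # antecedent proposition
C = {"w1"}                                           # consequent proposition

def would(A, C, spheres):
    """A were>>would C: the material conditional A -> C holds throughout some
    A-permitting sphere (vacuously true if no sphere permits A)."""
    permitting = [s for s in spheres if s & A]
    return True if not permitting else any((s & A) <= C for s in permitting)

def might(A, C, spheres):
    """A were>>might C: some sphere permits A, and every A-permitting sphere
    contains a world where A and C both hold."""
    permitting = [s for s in spheres if s & A]
    return bool(permitting) and all(s & A & C for s in permitting)

print(would(A, C, spheres))       # True: in the closest A-permitting sphere every A-world is a C-world
print(might(A, {"w2"}, spheres))  # False: the closest A-permitting sphere contains no A-and-C world
```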
Disputed term/author/ism | Author Vs Author |
Entry |
Reference |
---|---|---|---|
Extensionality | Simons Vs Extensionality | I 116 Extensionality/Simons: we give up extensionality with the rejection of ≤. I 117 ≤: the relation ≤ is not antisymmetric; it is a partial order in the sense that it is reflexive and transitive. In terms of it one can define a symmetric predicate: Def coincidence of parts/mereology/notation/Simons: SD16: x ≤≥ y ≡ x ≤ y ∧ y ≤ x (a toy example follows this entry). Coinciding individuals are perceptually indistinguishable for their period of coincidence. They are in superposition. Def superposition/mereology/Simons: they occupy the same place at the same time. Question (see below): do all superposed objects coincide mereologically? By rejecting the proper parts principle we gain a wealth of descriptive and explanatory power. SimonsVsExtensionality: extensionality is too ascetic for mereology. I 251 Part/SimonsVsExtensionality/VsCEM/VsExtensional Mereology/Simons: we see how much we have to give up if we want to remain extensional: we now have three concepts of part instead of one, which the SSP (strong supplementation principle) lumps together, and there may be even more. CEM/Extensional Mereology/Simons: extensional mereology is actually a substantive thesis: individuals that consist of the same matter are identified. Coincidence principle/Simons: 1. For the two stronger coincidence concepts, identity and strong coincidence, we reject it. 2. For weak coincidence we allow it, provided we consider only superposed material individuals. Strictly weak inclusion: e.g. there is no reason to deny that Caesar's heart is weakly included in the matter of Caesar. |
Simons I P. Simons Parts. A Study in Ontology Oxford New York 1987 |
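A minimal sketch of SD16 under the non-extensional reading above: with a ≤ that is reflexive and transitive but not antisymmetric, two distinct individuals can stand in mutual parthood and thus coincide (be superposed) without being identical. The relation and the statue/clay labels are invented for the illustration.

```python
# A toy parthood relation that is reflexive and transitive but not antisymmetric:
# "statue" and "clay" are each a part of the other while remaining distinct objects.
parts_of = {("statue", "statue"), ("clay", "clay"),
            ("statue", "clay"), ("clay", "statue")}

def leq(x, y):
    return (x, y) in parts_of

def coincide(x, y):
    """SD16: x <=>= y  iff  x <= y and y <= x (mutual parthood, identity not required)."""
    return leq(x, y) and leq(y, x)

print(coincide("statue", "clay"))   # True: they coincide, i.e. are superposed ...
print("statue" == "clay")           # ... False: and yet they are not identical
```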
Various Authors | Simons Vs Various Authors | I 89 Sharvy/quasi-mereology/Simons: Sharvy is actually concerned with a general approach to reference and definite descriptions for all types of nouns: singular count, plural count and mass terms ("count singular, plural count and mass"). Sharvy: considers quasi-mereology only for domains which are the extensions of predicates corresponding to such nouns. I 90 Part-Relation/Sharvy: N.B.: accumulation, group. Simons: with that we have a non-accidental similarity to the ontological functor <, which can also be read as "are some of". Part-Relation: a part-relation may in the limiting case be identity, e.g. if the only part of an object is the object itself (e.g. English "boot"?). Mass term/part-relation: parts of water are themselves water, but that is not trivial, because they can also be hydrogen. Quasi-mereology/solution/Sharvy: the part-relation or "some-of" relation is relativized (to a predicate). It is thus more fundamental than identity! Existence/SharvyVsQuine: instead of "no entity without identity", we assume quasi-mereology. Identity/Sharvy: identity thus becomes a special case of the part-of relation. SimonsVsSharvy: this fails: if the partial order that is supposed to correspond to a predicate is identity, then a subset of the extension of the predicate has a least upper bound (l.u.b.) only if it contains a single ("singular") element; the order is then not a quasi-mereology, for which the least upper bound has to exist for all subsets of the extension of the predicate (see the sketch after this entry). I 330 Unity/integrity/whole/completeness/Simons: it comes in variants because of the systematic ambiguity of the predicate "part". I 331 This can lead to individuals, collections or masses. Whole: it is not clear here whether a whole formed of individuals is itself an individual. It could also be a collection, because the elements form a partition, since the element relation is a special case of the part-relation for collections. SimonsVsSociology: an undifferentiated concept of a "whole" composed of individuals is in turn incorrectly assumed to be an individual (something supra-personal). |
Simons I P. Simons Parts. A Study in Ontology Oxford New York 1987 |
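The least-upper-bound objection reported in the SimonsVsSharvy passage can be put compactly; the following LaTeX lines are a reconstruction of the argument as the entry states it, not a quotation from Simons or Sharvy.

```latex
% Suppose the "part" order on the extension E of a predicate is identity:
\begin{align*}
x \le y \;&:\Leftrightarrow\; x = y \qquad (x, y \in E)\\
\sup S \text{ exists for } \emptyset \neq S \subseteq E \;&\Leftrightarrow\; S \text{ contains exactly one element (which is then its own sup).}
\end{align*}
% A quasi-mereology, however, requires sup S to exist for all such subsets S of E,
% so an identity order on an extension with more than one element is not a quasi-mereology.
```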