## Psychology Dictionary of Arguments


**Dempster-Shafer Theory**: The Dempster-Shafer Theory (DST) is a mathematical framework for reasoning under uncertainty. It is a generalization of probability theory that allows for the representation of partial or incomplete knowledge. Belief functions can be combined using Dempster's rule to produce a new belief function that reflects the combined evidence from multiple sources.

_____________
Annotation: The above characterizations of concepts are neither definitions nor exhaustive presentations of the problems related to them. Instead, they are intended to give a short introduction to the contributions below. – Lexicon of Arguments.
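Dempster's rule of combination, mentioned above, can be sketched in code (a minimal illustration, not taken from the source; the mass functions and hypothesis labels below are invented for the example). Masses are assigned to sets of hypotheses; two sources are combined by multiplying the masses of intersecting sets and renormalizing away the conflicting (empty-intersection) mass:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Each mass function maps frozensets of hypotheses to masses summing to 1.
    """
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        c = a & b
        if c:
            combined[c] = combined.get(c, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass falling on the empty intersection
    if conflict >= 1.0:
        raise ValueError("the two sources are completely conflicting")
    norm = 1.0 - conflict  # renormalize the non-conflicting mass
    return {s: w / norm for s, w in combined.items()}

# Hypothetical example: two sources on whether a coin is fair (F) or biased (B).
# Mass left on the whole set {F, B} represents ignorance, not belief in either.
m1 = {frozenset("F"): 0.6, frozenset("FB"): 0.4}
m2 = {frozenset("F"): 0.5, frozenset("FB"): 0.5}
m12 = combine(m1, m2)  # → {frozenset({'F'}): 0.8, frozenset({'F','B'}): 0.2}
```

Note that combining the two sources shifts mass toward the more specific set {F}, which is the characteristic behavior of Dempster's rule.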


### Peter Norvig on Dempster-Shafer Theory - Dictionary of Arguments

Norvig I 547
Dempster-Shafer Theory/AI Research/Norvig/Russell: uses interval-valued degrees of belief to represent an agent's knowledge of the probability of a proposition.

Norvig I 549
The Dempster–Shafer theory is designed to deal with the distinction between uncertainty and ignorance. Rather than computing the probability of a proposition, it computes the probability that the evidence supports the proposition. This measure of belief is called a belief function, written Bel(X).

The mathematical underpinnings of Dempster–Shafer theory have a similar flavor to those of probability theory; the main difference is that, instead of assigning probabilities to possible worlds, the theory assigns masses to sets of possible worlds, that is, to events. The masses must still add to 1 over all possible events. Bel(A) is defined as the sum of masses for all events that are subsets of (i.e., that entail) A, including A itself. With this definition, Bel(A) and Bel(¬A) sum to at most 1, and the gap (the interval between Bel(A) and 1 − Bel(¬A)) is often interpreted as bounding the probability of A.

VsDempster-Shafer theory: Problems: As with default reasoning, there is a problem in connecting beliefs to actions. Whenever there is a gap in the beliefs, a decision problem can be defined such that a Dempster–Shafer system is unable to make a decision. In fact, the notion of utility in the Dempster–Shafer model is not yet well understood, because the meanings of masses and beliefs themselves have yet to be understood. Pearl (1988)(1) has argued that Bel(A) should be interpreted not as a degree of belief in A but as the probability assigned to all the possible worlds (now interpreted as logical theories) in which A is provable. While there are cases in which this quantity might be of interest, it is not the same as the probability that A is true.
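The definitions above (masses on sets of possible worlds, Bel(A) as the sum of masses of subsets of A, and the interval between Bel(A) and 1 − Bel(¬A)) can be illustrated with a short sketch. The mass function here is a made-up coin example, not one given by Norvig and Russell:

```python
def bel(m, event):
    """Bel(A): sum of the masses of all focal sets that are subsets of A."""
    a = frozenset(event)
    return sum(w for s, w in m.items() if s <= a)

def belief_interval(m, event):
    """[Bel(A), 1 - Bel(not A)], often read as bounds on the probability of A."""
    universe = frozenset().union(*m)  # all hypotheses mentioned by any focal set
    a = frozenset(event)
    return bel(m, a), 1.0 - bel(m, universe - a)

# Hypothetical mass function over outcomes {H, T}: some evidence for heads,
# with the remaining mass left on {H, T} to represent ignorance.
m = {frozenset("H"): 0.5, frozenset("HT"): 0.5}
lo, hi = belief_interval(m, "H")  # → (0.5, 1.0)
```

The gap between 0.5 and 1.0 reflects the uncommitted mass: the evidence supports heads to degree 0.5 but says nothing against it, so a Dempster–Shafer agent cannot narrow the probability of heads beyond that interval.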
A Bayesian analysis of the coin-flipping example would suggest that no new formalism is necessary to handle such cases. The model would have two variables: the Bias of the coin (a number between 0 and 1, where 0 is a coin that always shows tails and 1 a coin that always shows heads) and the outcome of the next Flip.

Cf. >Fuzzy Logic, >Vagueness/Philosophical theories, >Sorites/Philosophical theories.

1. Pearl, J. (1988). Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference. Morgan Kaufmann.

_____________
Explanation of symbols: Roman numerals indicate the source, Arabic numerals indicate the page number. The corresponding books are indicated on the right-hand side. ((s)…): Comment by the sender of the contribution. Translations: Dictionary of Arguments.
The notes [Concept/Author], [Author1]Vs[Author2] or [Author]Vs[term], as well as "problem:"/"solution:", "old:"/"new:" and "thesis:", are additions from the Dictionary of Arguments. If a German edition is specified, the page numbers refer to this edition.
Norvig I
Peter Norvig / Stuart J. Russell, Artificial Intelligence: A Modern Approach, Upper Saddle River, NJ 2010
