Philosophy Dictionary of Arguments


 
Probability theory: Probability theory is the branch of mathematics that deals with the analysis of random phenomena. It is used to model the uncertainty of random events. The probability of any event lies between 0 and 1, and the probabilities of all elementary outcomes of a sample space sum to 1. Probability theory is used in mathematics, statistics, physics, and engineering. See also Probability, Probability distribution, Probability functions, Predictions, Method, Knowledge.
_____________
Annotation: The above characterizations of concepts are neither definitions nor exhausting presentations of problems related to them. Instead, they are intended to give a short introduction to the contributions below. – Lexicon of Arguments.

 

Gerhard Schurz on Probability Theory - Dictionary of Arguments

I 110
Probability theory/theorems/Schurz:
a) unconditional probability (objective and subjective):
(T1) p(~A) = 1 – p(A) (complementary probability)
(T2) p(A) ≤ 1 (upper bound)
(T3) p(A u ~A) = 0 (contradiction)
(T4) p(A1 v A2) = p(A1) + p(A2) – p(A1 u A2) (general law of addition).
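Reading "u" as conjunction (intersection) and "v" as disjunction (union), theorems (T1)–(T4) can be checked directly on a toy finite sample space. A minimal Python sketch; the two-coin space and all numbers are invented for illustration:

```python
from itertools import product

# Toy sample space: two fair coin flips, each outcome with weight 1/4.
space = {o: 0.25 for o in product("HT", repeat=2)}

def p(event):
    """Probability of an event, given as a set of outcomes."""
    return sum(w for o, w in space.items() if o in event)

A = {o for o in space if o[0] == "H"}   # first flip heads
B = {o for o in space if o[1] == "H"}   # second flip heads
not_A = set(space) - A

assert abs(p(not_A) - (1 - p(A))) < 1e-12               # T1: complement
assert p(A) <= 1                                         # T2: upper bound
assert p(A & not_A) == 0                                 # T3: contradiction
assert abs(p(A | B) - (p(A) + p(B) - p(A & B))) < 1e-12  # T4: addition law
```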

b) conditional probability (for formulas X in antecedent position)

(PT1) If B > A is exhaustive, then p(A I B) = 1. The converse does not hold.
(PT2) p(A u B) = p(A I B) times p(B) (general law of multiplication)
(PT3) For every partition B1,...,Bn: p(A) = ∑ 1≤i≤n p(A I Bi) times p(Bi) (law of total probability)
(PT4) Def Bayes' theorem, 1st version:
p(A I B) = p(B I A) times p(A)/p(B)

(PT5) Def Bayes' theorem, 2nd version: for every partition A1,...,An:
p(Ai I B) = p(B I Ai) times p(Ai) / ∑ 1≤j≤n p(B I Aj) times p(Aj).

(PT6) Symmetry of probabilistic dependence:
p(A I B) > p(A) iff p(B I A) > p(B) iff p(B I A) > p(B I ~A) (analogously for ≥).
Def Partition/Schurz: exhaustive disjunction (of pairwise exclusive alternatives).
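The partition theorems (PT3) and (PT5) can be illustrated on a toy two-cell partition. A minimal Python sketch; the prior weights and likelihoods are invented for illustration:

```python
# Partition B1, B2 with prior weights p(Bi) and likelihoods p(A|Bi); numbers invented.
prior = {"B1": 0.4, "B2": 0.6}
likelihood = {"B1": 0.7, "B2": 0.2}   # p(A | Bi)

# PT3 (law of total probability): p(A) = sum_i p(A|Bi) * p(Bi)
p_A = sum(likelihood[b] * prior[b] for b in prior)
assert abs(p_A - 0.40) < 1e-9

# PT5 (Bayes' theorem, 2nd version): p(Bi|A) = p(A|Bi) p(Bi) / sum_j p(A|Bj) p(Bj)
posterior = {b: likelihood[b] * prior[b] / p_A for b in prior}
assert abs(posterior["B1"] - 0.7) < 1e-9
assert abs(sum(posterior.values()) - 1) < 1e-9   # posteriors over a partition sum to 1
```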
I 110
Consequence relation/probability/consequence/probability theory/Schurz: the probability-theoretic inference relation can be characterized as follows: a probability statement A follows probabilistically from a set D of probability statements iff A follows logically from D together with the Kolmogorov axioms (plus mathematical definitions).
>Probability.

I 112
Probability theory/Schurz: still unsolved problems:
(a) objective probability: definitional problems.
Definition of statistical probability: problem: with one random experiment one can potentially produce infinitely many, indefinitely growing sequences of results. Why should they all have the same frequency limit? Why should they have one at all?
Problem: even worse: from a given sequence of results one can always construct, by arbitrary rearrangement or place selection, a sequence with an arbitrarily deviating frequency limit.
I 113
Law of large numbers/Schurz ("naive statistical theory"): is supposed to solve this problem: the assertion "p(Fx) = r" then does not say that the frequency limit is r in all random sequences, but only that it is r with probability 1.
StegmüllerVs/KutscheraVs: this is circular! In the definiens of the expression "the probability of Fx is r", the expression "with probability 1" occurs again. Thus probability is not reduced to frequency limits, but again to probability.
>Circularity.
Rearrangement/(s): only a problem with infinite sets, not with finite ones.
Mises/solution: "statistical collective":
1. every possible outcome E has a frequency limit in the sequence g, which is identified with the probability p(E), and
2. this limit is insensitive to place selection.
From this follows the general
product rule/statistics: the probability of a conjunction equals the product of the individual probabilities: p(Fx1 u Gx2) = p(Fx1) times p(Gx2).
Probability/propensity/Mises: this result of Mises is empirical, not a priori! It is a substantive dispositional statement about the real nature of the random experiment (>Ontology/Statistics). Mises probability is also called propensity.
>Propensity.
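Mises' two conditions on a collective can be illustrated by simulation: in a long pseudo-random Bernoulli sequence the relative frequency settles near p(E), and a blind place selection (here: every 7th trial) leaves that frequency roughly unchanged. A rough numerical sketch with invented parameters, an illustration rather than a proof:

```python
import random

random.seed(0)  # reproducible illustration

# Simulated random experiment: Bernoulli trials with p(E) = 0.3 (invented value).
n = 100_000
g = [1 if random.random() < 0.3 else 0 for _ in range(n)]

# Condition 1: the relative frequency stabilizes near p(E).
freq = sum(g) / n

# Condition 2: a blind place selection (every 7th trial)
# yields roughly the same frequency.
selection = g[::7]
freq_sel = sum(selection) / len(selection)

assert abs(freq - 0.3) < 0.01
assert abs(freq - freq_sel) < 0.02
```

A selection that peeks at the outcomes (e.g. "pick exactly the trials where E occurred") would of course change the frequency; the insensitivity claim concerns only selections made without knowledge of the results.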
Singular Propensity/Single Case Probability/Single Probability/Popper: many Vs.
Probability theory/Schurz: problem: what is the empirical content of a statistical hypothesis, and how is it tested? There is no observation statement that logically follows from such a hypothesis.
>Verification.
That a random sequence has a certain frequency limit r is compatible, for any n no matter how large, with any frequency value hn unequal to r reached up to that point.
Bayes/Schurz: Bayesians often raise this as an objection, but it merely expresses the fact that no observation statements follow from statistical hypotheses.
I 115
Verification/Statistics/Schurz: Statistical hypotheses are not deductively testable, but they are probabilistically testable, by sampling.

I 115
Principal Principle/PP/Statistics/Schurz: subjective probabilities, if objective probabilities are known, must be consistent with them.
Lewis (1980): singular PP: subjectivist. Here "objective" singular propensities are simply postulated.
>Propensities.
SchurzVsPropensity/SchurzVsPopper: it remains unclear what property a singular propensity should correspond to in the first place.
Solution/de Finetti: one can also accept the objective notion of probability at the same time.
Conditionalization/Statistics/Schurz: conditionalization on an arbitrary experience datum E(b1,...,bn) about other individuals b1,...,bn is important for deriving two further versions of PP:
1. the PP for random samples, which is needed for the subjective justification of the statistical likelihood intuition, and
2. the conditional PP, needed for the principle of the closest reference class and the inductive-statistical specialization inference.
PP: w(Fa I p(Fx) = r u E(b1,...bn)) = r
PP for random samples: w(hn(Fx) = k/n I p(Fx) = r) = (n choose k) times r^k times (1 − r)^(n−k).
Conditional PP: w(Fa I Ga u p(Fx I Gx) = r u E(b1,...bn)) = r.
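The PP for random samples is just the binomial formula, so concrete values can be computed directly. A minimal Python sketch; the fair-coin numbers are an invented example:

```python
from math import comb

def pp_sample(n, k, r):
    """PP for random samples: probability of observing relative frequency k/n
    in a sample of size n, given statistical probability r (binomial formula)."""
    return comb(n, k) * r**k * (1 - r)**(n - k)

# E.g. a fair coin (r = 1/2), 10 flips, exactly 5 heads:
print(round(pp_sample(10, 5, 0.5), 4))  # 0.2461

# Sanity check: the probabilities of all possible sample frequencies sum to 1.
assert abs(sum(pp_sample(10, k, 0.5) for k in range(11)) - 1) < 1e-12
```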
Principal principle: is only meaningful for subjective a priori probability, i.e. the degrees of belief of a subject who has not yet had any experience.
Actual degree of belief: for it the principle does not hold in general: e.g., if the coin already shows heads (= Fa), then the actual degree of belief in Fa is of course 1, although one knows that p(Fx) = 1/2.
A priori probability function: here all background knowledge W must be written explicitly into the antecedent of a conditional probability statement w(– I W).
Actual = personalistic.
A priori probability: connection with actual probability:
Strict conditionalization/Schurz: let w0 be the a priori probability (the probability at t0), let w1 be the actual probability at t1, and let
I 116
Wt be the knowledge acquired between t0 and t1. Then for any A:
w1(A) = w0(A I Wt).
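Strict conditionalization can be sketched as an operation on a credence function: zero out the worlds excluded by the acquired knowledge Wt and renormalize. A minimal Python illustration with an invented die example:

```python
def conditionalize(w0, evidence):
    """Strict conditionalization: the new credence in each world is the old
    credence conditional on the evidence, i.e. w1(A) = w0(A | Wt)."""
    p_evidence = sum(p for world, p in w0.items() if evidence(world))
    return {world: (p / p_evidence if evidence(world) else 0.0)
            for world, p in w0.items()}

# A priori credence: a fair die, each face 1..6 equally likely.
w0 = {i: 1 / 6 for i in range(1, 7)}

# Acquired knowledge Wt between t0 and t1: the outcome is even.
w1 = conditionalize(w0, lambda i: i % 2 == 0)

assert abs(w1[6] - 1 / 3) < 1e-12   # credence concentrates on the even faces
assert w1[3] == 0.0                 # excluded worlds drop to zero
```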

Closest reference class/principle/Schurz: can be justified as follows: for a given event Fa, the individual a can belong to very many reference classes that assign very different probabilities to Fx. Then we would get contradictory predictions.
Question: but why should the appropriate reference class be the closest one? Because it can be proved that it maximizes the frequency limit of correct predictions.

_____________
Explanation of symbols: Roman numerals indicate the source, arabic numerals indicate the page number. The corresponding books are indicated on the right hand side. ((s)…): Comment by the sender of the contribution. Translations: Dictionary of Arguments
The note [Concept/Author], [Author1]Vs[Author2] or [Author]Vs[term] resp. "problem:"/"solution:", "old:"/"new:" and "thesis:" is an addition from the Dictionary of Arguments. If a German edition is specified, the page numbers refer to this edition.

Schu I
G. Schurz
Einführung in die Wissenschaftstheorie Darmstadt 2006


Ed. Martin Schulz, access date 2024-04-28