The symposium discusses philosophically relevant features of conditionalization and analyzes specific issues in which conditional probability plays a crucial role, where conditionalization is defined not by Bayes' rule but by conditional expectations determined by Boolean algebras ("Kolmogorov conditionalization"). The symposium talks will (i) argue that philosophically significant uses of conditional probability (in causal modeling, confirmation theory, and decision theory) in fact come with a designated conditioning algebra, not just a conditioning event, and that the defining features of Kolmogorov conditionalization are crucial in such applications; (ii) describe general features of Bayesian learning based on Kolmogorov conditionalization; in particular, results are presented on the size of the set of probability measures that cannot be obtained as conditioned probabilities from a given prior; (iii) formulate and analyze, in terms of Kolmogorov conditionalization, the Reflection Principle, which requires that future estimates of a quantity cohere with its current estimate; and (iv) consider a generalization of Kolmogorov conditionalization to accommodate, in a Bayesian framework, the phenomenon of "misplaced certainty": cases in which one assigns credence 1 to a falsehood.
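For readers unfamiliar with the notion, the standard textbook definition of Kolmogorov conditionalization can be sketched as follows (this is the general measure-theoretic definition, not a summary of any particular symposium talk): given a probability space $(\Omega, \mathcal{F}, P)$ and a sub-$\sigma$-algebra $\mathcal{A} \subseteq \mathcal{F}$, the conditional expectation $E[X \mid \mathcal{A}]$ of an integrable random variable $X$ is the (almost surely unique) $\mathcal{A}$-measurable random variable satisfying

$$\int_B E[X \mid \mathcal{A}] \, dP \;=\; \int_B X \, dP \qquad \text{for all } B \in \mathcal{A}.$$

Conditioning a probability on the algebra $\mathcal{A}$ is then given by $P(A \mid \mathcal{A}) = E[\chi_A \mid \mathcal{A}]$, where $\chi_A$ is the indicator of the event $A$. Bayes' rule emerges as the special case in which $\mathcal{A} = \{\emptyset, B, B^c, \Omega\}$ is generated by a single event $B$ with $0 < P(B) < 1$: on $B$, the conditional probability takes the familiar value $P(A \cap B)/P(B)$.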
Capitol Hill (Third Floor). PSA2018: The 26th Biennial Meeting of the Philosophy of Science Association.