Conditionalization via Conditional Expectations

Session Information

The symposium discusses philosophically relevant features of conditionalization and analyzes specific issues in which conditional probability plays a crucial role, where conditionalization is defined not by Bayes' rule but by conditional expectations determined by Boolean algebras ("Kolmogorov conditionalization"). The symposium talks will (i) argue that philosophically significant uses of conditional probability (in causal modeling, confirmation theory, and decision theory) in fact come with a designated conditioning algebra, not just a conditioning event, and that the defining features of Kolmogorov conditionalization are crucial in such applications; (ii) describe general features of Bayesian learning based on Kolmogorov conditionalization; in particular, results are presented on the size of the set of probability measures that cannot be obtained as conditioned probabilities from a given prior; (iii) formulate and analyze, in terms of Kolmogorov conditionalization, the Reflection Principle, which requires that future estimates of a quantity cohere with its current estimate; and (iv) consider a generalization of Kolmogorov conditionalization to accommodate, in a Bayesian framework, the phenomenon of "misplaced certainty": cases where one assigns credence 1 to a falsehood.

02 Nov 2018, 03:45 PM - 05:45 PM (America/Los_Angeles)
Venue : Capitol Hill (Third Floor)
PSA2018: The 26th Biennial Meeting of the Philosophy of Science Association

Presentations

Partition-Sensitivity for Conditional Probabilities

Philosophy of Science 03:45 PM - 04:15 PM (America/Los_Angeles) 2018/11/02 22:45:00 UTC - 2018/11/02 23:15:00 UTC
Kenny Easwaran (Texas A&M University)
There are two major families of proposals for probabilities conditional on events of measure zero. One family (associated with Popper, De Finetti, Dubins, and their followers) fixes a conditional probability for every pair of events. The other (associated with Kolmogorov, and standard in mathematical probability theory) requires a third argument of a partition from which the conditioning event is drawn. I argue that partition sensitivity is not a problem, because every context in which conditional probability is relevant has a natural partition, whether in learning, confirmation theory, decision theory, causal modeling, or otherwise.
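A minimal finite illustration of the Kolmogorov-style, three-argument form of conditional probability described in the abstract (my sketch, not the talk's): the conditional probability of an event is computed relative to a cell drawn from a designated partition. In the finite case the ratio formula within a cell suffices; partition sensitivity proper only bites for measure-zero cells, but the example shows how the partition supplies the conditioning cell.

```python
# Sketch: conditional probability with a designated partition.
# The same event gets different conditional probabilities depending
# on which partition (and hence which cell) is in play.
from fractions import Fraction

omega = range(6)  # a fair die, outcomes 0..5
p = {w: Fraction(1, 6) for w in omega}

def cond_prob(event, cell):
    """P(event | cell) by the ratio formula within one partition cell."""
    num = sum(p[w] for w in event if w in cell)
    den = sum(p[w] for w in cell)
    return num / den

# Two partitions whose first cells both contain outcome 0:
parity = [{0, 2, 4}, {1, 3, 5}]
low_high = [{0, 1, 2}, {3, 4, 5}]

event = {0, 4}
print(cond_prob(event, parity[0]))    # 2/3
print(cond_prob(event, low_high[0]))  # 1/3
```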
Presenters Kenny Easwaran
Texas A&M University

Conditional Expectation and the Reflection Principle

Philosophy of Science 04:15 PM - 04:45 PM (America/Los_Angeles) 2018/11/02 23:15:00 UTC - 2018/11/02 23:45:00 UTC
Simon Huttegger (University of California, Irvine)
Kolmogorov's concept of conditional expectation can be thought of as a very general way of updating on new information, including Bayesian conditioning as a special case. This role of conditional expectation can be explored in terms of the reflection principle and martingale conditions, which in turn can be justified in a variety of ways, such as dynamic coherence arguments, accuracy arguments, and arguments based on the value of evidence. This paper studies these approaches in a fully general measure theoretic framework.
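A finite sketch of the reflection/martingale idea the abstract refers to (an illustration under my own toy assumptions, not the paper's framework): the current estimate of a quantity equals the prior expectation of the future estimate E[X | cell], whichever evidence cell ends up being learned.

```python
# Reflection in miniature: E[X] equals the prior average of the
# conditional expectations E[X | cell] over an evidence partition
# (the tower property of conditional expectation).
from fractions import Fraction

outcomes = range(6)                      # a fair die
p = {w: Fraction(1, 6) for w in outcomes}
X = {w: w for w in outcomes}             # the quantity: the face shown

evidence = [{0, 1}, {2, 3}, {4, 5}]      # partition the agent will learn

def cond_exp(cell):
    """E[X | cell]: the p-weighted average of X within one cell."""
    return sum(p[w] * X[w] for w in cell) / sum(p[w] for w in cell)

current = sum(p[w] * X[w] for w in outcomes)
expected_future = sum(sum(p[w] for w in cell) * cond_exp(cell)
                      for cell in evidence)
print(current, expected_future)  # the two estimates coincide
```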
Presenters
Simon Huttegger
UC Irvine

Features of Bayesian Learning Based on Conditioning Using Conditional Expectations

Philosophy of Science 04:45 PM - 05:15 PM (America/Los_Angeles) 2018/11/02 23:45:00 UTC - 2018/11/03 00:15:00 UTC
Miklos Redei (London School of Economics), Zalan Gyenis (Jagiellonian University and Eotvos University)
Conditional expectations define a "Bayes accessibility" relation among probability measures on a Boolean algebra. If a probability measure is Bayes accessible from another, then a Bayesian agent can learn that probability from the evidence represented by the other. The Bayes Blind Spot is the set of probability measures on a Boolean algebra that cannot be learned by a single conditionalization from any evidence. It is shown that the Bayes Blind Spot is uncountably infinite in standard probability spaces and that it is a large set (in cardinality, in measure, and topologically) when the Boolean algebra is finite.
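A toy enumeration (my sketch, not the paper's construction) of the posteriors that are Bayes accessible from a prior on a small finite algebra by a single conditionalization on an event; every probability measure not in this finite set lies in the prior's Bayes Blind Spot, which is why the Blind Spot is large when the algebra is finite.

```python
# Enumerate the posteriors reachable from a prior by one
# conditionalization p(. | E) over all nonempty events E.
from fractions import Fraction
from itertools import chain, combinations

atoms = [0, 1, 2]
prior = {0: Fraction(1, 2), 1: Fraction(1, 3), 2: Fraction(1, 6)}

def events(atoms):
    """All nonempty subsets of the atoms."""
    return chain.from_iterable(combinations(atoms, r)
                               for r in range(1, len(atoms) + 1))

accessible = set()
for E in events(atoms):
    pE = sum(prior[a] for a in E)
    post = tuple(prior[a] / pE if a in E else Fraction(0) for a in atoms)
    accessible.add(post)

print(len(accessible))  # 7: one posterior per nonempty event here; all
                        # other measures are in the prior's Bayes Blind Spot
```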
Presenters
Zalan Gyenis
Jagiellonian University and Eotvos University
Miklos Redei
London School of Economics

Conditioning on a Probability Zero Event

Philosophy of Science 05:15 PM - 05:45 PM (America/Los_Angeles) 2018/11/03 00:15:00 UTC - 2018/11/03 00:45:00 UTC
Michael Rescorla (University of California, Los Angeles)
Kolmogorov's theory of conditional probability generalizes the ratio formula so as to define conditional probabilities when the conditioning proposition has probability 0. Recently, several authors have suggested that Kolmogorov's theory can illuminate conditionalization on propositions with initial probability 0. I pursue a version of this strategy, articulating a diachronic norm that I call Non-factive Kolmogorov Conditionalization (NKC). NKC encompasses numerous scientifically and philosophically important scenarios, including many where the agent becomes certain of a false proposition that has initial probability 0. I highlight key advantages that NKC offers over all rival credal update rules.
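As background to the abstract (standard textbook material, not the paper's own formulation): Kolmogorov replaces the ratio formula with a defining integral identity relative to a sub-sigma-algebra, which remains well defined when individual conditioning propositions have probability 0.

```latex
% Ratio formula, defined only when P(B) > 0:
\[
  P(A \mid B) \;=\; \frac{P(A \cap B)}{P(B)}.
\]
% Kolmogorov instead fixes a sub-sigma-algebra \mathcal{F} and takes
% P(A \mid \mathcal{F}) to be any \mathcal{F}-measurable function with
\[
  \int_B P(A \mid \mathcal{F})\, dP \;=\; P(A \cap B)
  \quad \text{for all } B \in \mathcal{F},
\]
% which agrees with the ratio formula on cells of positive probability
% but is still defined (up to a null set) when cells have probability 0.
```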
Presenters
Michael Rescorla
UCLA