Apr
7
Fri
Koellner: Gödel’s Disjunction @ 716 Philosophy Hall
Apr 7 @ 5:00 pm

Gödel’s disjunction asserts that either “the mind cannot be mechanized” or “there are absolutely undecidable statements.” Arguments for and against each disjunct are examined in the context of precise frameworks governing the notions of absolute provability and truth. The focus is on Penrose’s new argument, which interestingly involves type-free truth. In order to reconstruct Penrose’s argument, a system, DKT, is devised for absolute provability and type-free truth. It turns out that in this setting there are actually two versions of the disjunction and its disjuncts. The first, fully general versions end up being (provably) indeterminate. The second, restricted versions end up being (provably) determinate, so in this case there is at least an initial prospect of success. However, it will be seen that although the disjunction itself is provable, neither disjunct is provable or refutable in the framework.
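Schematically, writing K for “absolutely provable,” the disjunction that frameworks of this kind aim to adjudicate is often rendered as follows (a generic rendering for orientation only; the official DKT formulation may differ):

\[
\neg \exists e\, \forall \varphi \,\bigl( K\varphi \leftrightarrow \mathrm{Prov}_e(\varphi) \bigr)
\;\lor\;
\exists \varphi \,\bigl( \neg K\varphi \wedge \neg K \neg\varphi \bigr)
\]

That is: either no algorithm e captures exactly what is absolutely provable (the mind cannot be mechanized), or some statement is neither absolutely provable nor absolutely refutable.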


UNIVERSITY SEMINAR ON LOGIC, PROBABILITY, AND GAMES
Gödel’s Disjunction
Peter Koellner (Harvard University)
5:00 pm, Friday, April 7th, 2017
716 Philosophy Hall, Columbia University

Apr
8
Sat
Columbia Workshop on Probability and Learning @ 716 Philosophy Hall
Apr 8 all-day

Gordon Belot (Michigan) – Typical!, 10am
Abstract. This talk falls into three short stories. The over-arching themes are: (i) that the notion of typicality is protean; (ii) that Bayesian technology is both more and less rigid than is sometimes thought.

Simon Huttegger (Irvine LPS) – Schnorr Randomness and Lévy’s Martingale Convergence Theorem, 11:45am
Abstract. Much recent work in algorithmic randomness concerns characterizations of randomness in terms of the almost everywhere behavior of suitably effectivized versions of functions from analysis or probability. In this talk, we take a look at Lévy’s Martingale Convergence Theorem from this perspective. Lévy’s theorem is of fundamental importance to Bayesian epistemology. We note that much of Pathak, Rojas, and Simpson’s work on Schnorr randomness and the Lebesgue Differentiation Theorem in the Euclidean context carries over to Lévy’s Martingale Convergence Theorem in the Cantor space context. We discuss the methodological choices one faces in choosing the appropriate mode of effectivization and the potential bearing of these results on Schnorr’s critique of Martin-Löf. We also discuss the consequences of our result for the Bayesian model of learning.
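As a toy illustration of why Lévy’s theorem matters for Bayesian learning (this sketch is mine, not the speaker’s; the two-hypothesis coin model is an invented example), the posterior probability of a hypothesis is a martingale under the prior predictive, and it converges almost surely:

```python
# Minimal sketch: the posterior probability of the true hypothesis in a
# two-hypothesis Bernoulli model is a martingale and converges a.s.
# (here: toward 1, since the truth is in the support of the prior).
import random

def posterior_path(true_p=0.7, p0=0.5, p1=0.7, prior1=0.5, n=2000, seed=0):
    rng = random.Random(seed)
    post1 = prior1                       # current P(H1 | evidence so far)
    path = []
    for _ in range(n):
        flip = 1 if rng.random() < true_p else 0
        like0 = p0 if flip else 1 - p0   # P(flip | H0: bias p0)
        like1 = p1 if flip else 1 - p1   # P(flip | H1: bias p1)
        post1 = post1 * like1 / (post1 * like1 + (1 - post1) * like0)
        path.append(post1)
    return path

path = posterior_path()
print(path[9], path[99], path[-1])       # climbs toward 1 with evidence
```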

Deborah Mayo (VA Tech) – Probing With Severity: Beyond Bayesian Probabilism and Frequentist Performance, 2:45pm
Abstract. Getting beyond today’s most pressing controversies revolving around statistical methods and irreproducible findings requires scrutinizing underlying statistical philosophies. Two main philosophies about the roles of probability in statistical inference are probabilism and performance (in the long run). The first assumes that we need a method of assigning probabilities to hypotheses; the second assumes that the main function of statistical method is to control long-run performance. I offer a third goal: controlling and evaluating the probativeness of methods. A statistical inference, in this conception, takes the form of inferring hypotheses to the extent that they have been well or severely tested. A report of poorly tested claims must also be part of an adequate inference. I show how the “severe testing” philosophy clarifies and avoids familiar criticisms and abuses of significance tests and cognate methods (e.g., confidence intervals). Severity may be threatened in three main ways: fallacies of rejection and non-rejection, unwarranted links between statistical and substantive claims, and violations of model assumptions. I illustrate with some controversies surrounding the use of significance tests in the discovery of the Higgs particle in high energy physics.
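For concreteness, here is a hedged sketch of a severity calculation in the standard one-sided Normal testing example (the setup and numbers are mine, not from the talk): after observing a sample mean x̄, the severity of the claim μ > μ₁ is the probability of a result no larger than x̄ were that claim false, i.e. Φ((x̄ − μ₁)/(σ/√n)).

```python
# Sketch of a severity calculation for H0: mu <= mu0 vs H1: mu > mu0
# with known sigma. High severity for "mu > mu1" means: had mu been as
# small as mu1, so large a sample mean would very probably not occur.
from math import sqrt, erf

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1 + erf(z / sqrt(2)))

def severity(xbar, mu1, sigma, n):
    return Phi((xbar - mu1) / (sigma / sqrt(n)))

# Example (made up): xbar = 0.4, sigma = 1, n = 100.
for mu1 in (0.0, 0.2, 0.4, 0.6):
    print(f"SEV(mu > {mu1}) = {severity(0.4, mu1, 1.0, 100):.3f}")
# SEV falls from ~1.0 to ~0.023 as the claim inferred gets stronger.
```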

Teddy Seidenfeld (CMU) – Radically Elementary Imprecise Probability Based on Extensive Measurement, 4:30pm
Abstract. This presentation begins with motivation for “precise” non-standard probability. Using two old challenges — involving (i) symmetry of probabilistic relevance and (ii) respect for weak dominance — I contrast the following three approaches to conditional probability given a (non-empty) “null” event and their three associated decision theories.
Approach #1 – Full Conditional Probability Distributions (Dubins, 1975) conjoined with Expected Utility.
Approach #2 – Lexicographic Probability conjoined with Lexicographic Expected Value (e.g., Blume et al., 1991).
Approach #3 – Non-standard Probability and Expected Utility based on Non-Archimedean Extensive Measurement (Narens, 1974).
The second part of the presentation discusses progress we’ve made using Approach #3 within a context of Imprecise Probability.
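To fix ideas about Approach #2 (a sketch of mine, not the speaker’s; the states and numbers are invented), a lexicographic probability system is a sequence of measures, and acts are compared level by level, so an event that is null at the first level can still break ties and sustain weak dominance:

```python
# Toy lexicographic expected value. Python compares tuples
# lexicographically, which matches the intended preference order.
states = ["s1", "s2", "s3"]
p_primary   = {"s1": 0.5, "s2": 0.5, "s3": 0.0}  # s3 is "null" at level 1
p_secondary = {"s1": 0.0, "s2": 0.0, "s3": 1.0}  # level 2 still sees s3

def lex_value(act, measures):
    return tuple(sum(p[s] * act[s] for s in states) for p in measures)

act_A = {"s1": 1, "s2": 0, "s3": 1}  # weakly dominates act_B (better on s3)
act_B = {"s1": 1, "s2": 0, "s3": 0}

measures = [p_primary, p_secondary]
print(lex_value(act_A, measures))                               # (0.5, 1.0)
print(lex_value(act_B, measures))                               # (0.5, 0.0)
print(lex_value(act_A, measures) > lex_value(act_B, measures))  # True
```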

Dec
8
Fri
The Price of Broadminded Probabilities and the Limitation of Science – Haim Gaifman (Columbia) @ Faculty House, Columbia U
Dec 8 @ 4:10 pm

A subjective probability function is broadminded to the extent that it assigns positive probabilities to conjectures that could possibly be true. Assigning such a conjecture the value 0 amounts to ruling out a priori the possibility of confirming the conjecture to any extent by growing evidence. A positive value leaves open, in principle, the possibility of learning from the evidence. In general, broadmindedness is not an absolute notion, but a graded one, and there is a price for it: the more broadminded the probability, the more complicated it is, because it has to assign non-zero values to more complicated conjectures. The framework suggested in the old Gaifman–Snir paper is suitable for phrasing this claim precisely and proving it. The technique by which the claim is established is to assume a definable probability function, and to state, within the same language, a conjecture that could possibly be true but whose probability is 0.

The complexity of the conjecture depends on the complexity of the probability, i.e., the complexity of the formulas that are used in defining it. In the Gaifman–Snir paper we used the arithmetical hierarchy as a measure of complexity. It is possible, however, to establish similar results with respect to more “down to earth” measures, defined in terms of the time it takes to calculate the probabilities to given precisions.

A claim of this form, for a rather simple setup, was first proven by Hilary Putnam in his paper “‘Degree of Confirmation’ and Inductive Logic,” published in the 1963 Schilpp volume dedicated to Carnap. The proof uses, in a probabilistic context, a diagonalization technique of the kind used in set theory and in computer science. In the talk I shall present Putnam’s argument and show how diagonalization can be applied in considerably richer setups.
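The flavor of the diagonal construction can be conveyed in a few lines (my simplified reconstruction, not Putnam’s exact setting): given any computable predictor of the next bit of a binary sequence, define the sequence that always does what the predictor deems less likely; the predictor then never assigns the true next bit probability above 1/2, and so never confirms this perfectly recursive hypothesis.

```python
# Diagonalizing against a computable predictor (simplified sketch).
def diagonal_sequence(predict, n):
    """predict(history) -> probability that the next bit is 1."""
    history = []
    for _ in range(n):
        bit = 0 if predict(history) >= 0.5 else 1  # do the unlikely thing
        history.append(bit)
    return history

# Example predictor: Laplace's rule of succession.
laplace = lambda h: (sum(h) + 1) / (len(h) + 2)
print(diagonal_sequence(laplace, 12))  # a recursive sequence it never learns
```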

The second part of the talk is rather speculative. I shall point out the possibility that there might be epistemic limitations to what human science can achieve, imposed by certain pragmatic factors, such as the criterion of repeatable experiments. All of this would recommend a skeptical attitude.

Feb
16
Fri
Schervish: Finitely-Additive Decision Theory @ Faculty House, Columbia U
Feb 16 @ 4:10 pm

We examine general decision problems with loss functions that are bounded below. We allow the loss function to assume the value ∞. No other assumptions are made about the action space, the types of data available, the types of non-randomized decision rules allowed, or the parameter space. By allowing prior distributions and the randomizations in randomized rules to be finitely-additive, we find very general complete class and minimax theorems. Specifically, under the sole assumption that the loss function is bounded below, every decision problem has a minimal complete class and all admissible rules are Bayes rules. Also, every decision problem has a minimax rule and a least-favorable distribution and every minimax rule is Bayes with respect to the least-favorable distribution. Some special care is required to deal properly with infinite-valued risk functions and integrals taking infinite values. This talk will focus on some examples and the major differences between finitely-additive and countably-additive decision theory. This is joint work with Teddy Seidenfeld, Jay Kadane, and Rafael Stern.
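The finitely-additive subtleties only bite in infinite problems, but the shape of the minimax/Bayes correspondence can be seen by brute force in a finite toy example (mine, not from the talk): the minimax randomized rule equalizes the risks across states, and it is Bayes against the least-favorable prior.

```python
# Finite toy decision problem: two states, two actions, loss L[state][action].
L = [[0.0, 4.0],   # state 0
     [3.0, 1.0]]   # state 1

def risk(state, q):            # q = probability of choosing action 1
    return (1 - q) * L[state][0] + q * L[state][1]

def bayes_risk(prior, q):      # prior = probability of state 1
    return (1 - prior) * risk(0, q) + prior * risk(1, q)

grid = [i / 1000 for i in range(1001)]
# Minimax randomized rule: minimize worst-case risk over states.
q_star = min(grid, key=lambda q: max(risk(0, q), risk(1, q)))
# Least-favorable prior: maximize the best achievable Bayes risk.
p_star = max(grid, key=lambda p: min(bayes_risk(p, q) for q in grid))
print(q_star, max(risk(0, q_star), risk(1, q_star)))  # 0.5, value 2.0
print(p_star)  # ~2/3: q_star is Bayes against this least-favorable prior
```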


UNIVERSITY SEMINAR ON LOGIC, PROBABILITY, AND GAMES
Finitely-Additive Decision Theory
Mark Schervish (Carnegie Mellon)
4:10 pm, Friday, February 16th, 2018
Faculty House, Columbia University

Mar
3
Sat
Recent Work in Decision Theory and Epistemology Workshop @ Philosophy Hall rm 716
Mar 3 all-day

Speakers:

Jennifer Carr (University of California, San Diego)
Ryan Doody (Hebrew University of Jerusalem)
Harvey Lederman (Princeton University)
Chris Meacham (University of Massachusetts, Amherst)

Organizer:

Melissa Fusco (Columbia University)

9:30 – 10:00 Breakfast (716 Philosophy Hall)
SESSION I Chair: Melissa Fusco
10:00 – 11:30 Jennifer Carr: “Can Accuracy Motivate Modesty?”
11:30 – 11:45 Coffee Break I
SESSION II Chair: Jessica John Collins
11:45 – 1:15 Ryan Doody: “Hard Choices Made Harder”
1:15 – 2:30 Lunch
SESSION III Chair: Jennifer Carr
2:30 – 4:00 Harvey Lederman: “Verbalism”
4:00 – 4:30 Coffee Break II
SESSION IV Chair: Ryan Doody
4:30 – 6:00 Chris Meacham: “Decision in Cases of Infinitely Many Utility Contributions”
6:00 Drinks
Apr
13
Fri
Icard: On the Rational Role of Randomization @ Faculty House, Columbia U
Apr 13 @ 4:10 pm

Randomized acts play a marginal role in traditional Bayesian decision theory, essentially only that of tie-breaking. Meanwhile, rationales for randomized decisions have been offered in a number of areas, including game theory, experimental design, and machine learning. A common and plausible way of accommodating some (but not all) of these ideas from a Bayesian perspective is by appeal to a decision maker’s bounded computational resources. Making this suggestion both precise and compelling is surprisingly difficult. We propose a distinction between interesting and uninteresting cases where randomization can help a decision maker, with the eventual aim of achieving a unified story about the rational role of randomization. The interesting cases, we claim, all arise from constraints on memory.
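One vivid baseline case (a toy of mine, not from the talk): an agent with no memory playing repeated matching pennies against an opponent who knows its strategy. Every deterministic memoryless strategy is predicted and beaten on each round, while the uniform coin secures an expected win rate of 1/2 against any opponent.

```python
# Matching pennies against a strategy-aware adversary (toy sketch).
import random

def win_rate(strategy, rounds=10000, randomized=False, seed=0):
    rng = random.Random(seed)
    wins = 0
    for _ in range(rounds):
        move = strategy(rng) if randomized else strategy(None)
        # The adversary matches a known deterministic move; against a fair
        # coin it can do no better than guess.
        adversary = rng.choice([0, 1]) if randomized else move
        wins += (move != adversary)
    return wins / rounds

det = lambda _: 0                        # e.g., always play heads
mixed = lambda rng: rng.choice([0, 1])   # fair coin
print(win_rate(det))                     # 0.0: fully exploited
print(win_rate(mixed, randomized=True))  # ~0.5: guaranteed in expectation
```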

UNIVERSITY SEMINAR ON LOGIC, PROBABILITY, AND GAMES
On the Rational Role of Randomization
Thomas Icard (Stanford)
4:10 pm, Friday, April 13th, 2018
Faculty House, Columbia University

Nov
16
Fri
Nielsen: Speed-optimal Induction and Dynamic Coherence @ Faculty House, Columbia U
Nov 16 @ 4:10 pm – 6:10 pm

A standard way to challenge convergence-based accounts of inductive success is to claim that they are too weak to constrain inductive inferences in the short run. We respond to such a challenge by answering some questions raised by Juhl (1994). When it comes to predicting limiting relative frequencies in the framework of Reichenbach, we show that speed-optimal convergence—a long-run success condition—induces dynamic coherence in the short run. This is joint work with Eric Wofsey.
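For readers unfamiliar with the Reichenbach setting (this simulation is mine, not from the paper): the “straight rule” estimates the limiting relative frequency by the observed relative frequency, and converges whenever that limit exists; speed-optimality concerns how quickly such methods can be guaranteed to converge.

```python
# The straight rule watching an i.i.d. 0/1 stream with limiting
# relative frequency 0.3 (toy stand-in for a Reichenbach-style sequence).
import random

def straight_rule_path(true_freq=0.3, n=100000, seed=1):
    rng = random.Random(seed)
    ones, path = 0, []
    for t in range(1, n + 1):
        ones += rng.random() < true_freq
        path.append(ones / t)              # current estimate of the limit
    return path

path = straight_rule_path()
print(path[9], path[999], path[-1])        # estimates settling near 0.3
```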

Michael Nielsen (Columbia University).
4:10 pm, Friday, November 16th, 2018
Faculty House, Columbia University

Dec
7
Fri
Actual Causality: A Survey, Joseph Halpern (Cornell) @ Faculty House, Columbia U
Dec 7 @ 4:10 pm

What does it mean that an event C “actually caused” event E? The problem of defining actual causation goes beyond mere philosophical speculation.  For example, in many legal arguments, it is precisely what needs to be established in order to determine responsibility.   (What exactly was the actual cause of the car accident or the medical problem?) The philosophy literature has been struggling with the problem of defining causality since the days of Hume, in the 1700s. Many of the definitions have been couched in terms of counterfactuals. (C is a cause of E if, had C not happened, then E would not have happened.) In 2001, Judea Pearl and I introduced a new definition of actual cause, using Pearl’s notion of structural equations to model counterfactuals.  The definition has been revised twice since then, extended to deal with notions like “responsibility” and “blame”, and applied in databases and program verification.  I survey the last 15 years of work here, including joint work with Judea Pearl, Hana Chockler, and Chris Hitchcock. The talk will be completely self-contained.
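To make the structural-equations idea concrete, here is a minimal sketch (mine; the full Halpern–Pearl definitions are considerably more involved) using the standard disjunctive forest-fire example, where the naive but-for test fails because the fire is overdetermined:

```python
# Tiny structural-equations model with interventions (but-for test only).
def evaluate(equations, exogenous, interventions=None):
    """Solve an acyclic model; `equations` must be listed in causal order."""
    vals = dict(exogenous)
    vals.update(interventions or {})
    for var, f in equations.items():
        if interventions is None or var not in interventions:
            vals[var] = f(vals)
    return vals

equations = {
    "L":  lambda v: v["U_L"],            # lightning strikes
    "MD": lambda v: v["U_MD"],           # arsonist drops a match
    "FF": lambda v: v["L"] or v["MD"],   # forest fire (disjunctive model)
}
u = {"U_L": 1, "U_MD": 1}                # both potential causes occur

actual = evaluate(equations, u)
no_lightning = evaluate(equations, u, {"L": 0})  # intervene on L
print(actual["FF"], no_lightning["FF"])  # 1 1: L is not a but-for cause
```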

Feb
8
Fri
Logic, Probability, and Games Seminar @ Faculty House, Columbia U
Feb 8 @ 4:00 pm

The seminar is concerned with applying formal methods to fundamental issues, with an emphasis on probabilistic reasoning, decision theory and games. In this context “logic” is broadly interpreted as covering applications that involve formal representations. The topics of interest have been researched within a very broad spectrum of different disciplines, including philosophy (logic and epistemology), statistics, economics, and computer science. The seminar is intended to bring together scholars from different fields of research so as to illuminate problems of common interest from different perspectives. Throughout each academic year, talks are regularly presented by members of the seminar and by distinguished guest speakers.

Details TBA.

02/08/2019 Faculty House, Columbia University
4:00 PM

03/22/2019 Faculty House, Columbia University
4:00 PM

04/19/2019 Faculty House, Columbia University
4:00 PM

Feb
22
Fri
Buddha versus Popper: Do we live in the present or do we plan for the future? Rohit Parikh (CUNY) @ Faculty House, Columbia U
Feb 22 @ 4:10 pm

There are two approaches to life. The first one, which we identify with Sir Karl Popper, is to think before we act and to let our hypotheses die in our stead when the overall outcome is likely to be negative. We act now for a better future, and we think now about which action will bring the best future. Both decision theory and backward induction are technical versions of this train of thought. The second approach, which we identify with the Buddha, is to live in the present and not allow the future to pull us away from living in the ever-present Now. The Buddha’s approach is echoed in many others who came after him, Jelaluddin Rumi, Kahlil Gibran, and even perhaps Jesus. It occurs in many contemporary teachers like Eckhart Tolle and Thich Nhat Hanh. We may call Popper’s approach “futurism” and the Buddha’s approach “presentism.”

In this talk, we will discuss various aspects of the discourse on presentism and futurism. The purpose is to contrast one with the other. We will not attempt to side with one against the other, and instead leave it as a future project to find a prescriptive action-guiding choice between the two. We merely conjecture that a better optimal choice between these two positions may be somewhere in between. (This is joint work with Jongjin Kim.)