Latinx Philosophers Conference @ 716 Philosophy Hall, Columbia U.
Apr 28 – Apr 29 all-day

The 1st Latinx Philosophers Conference is an initiative of some Latin American PhD candidates in the Columbia Philosophy Department. We hope to initiate a tradition of annual conferences to serve the following ends. First, to foster the creation and development of a Latinx Philosophers Network in the United States. This network, in turn, will help us provide a space for camaraderie and collaborative work, as well as identify and pursue the common interests of Latinx Philosophers in the U.S. Second, to provide a space for discussing issues of particular relevance to Latinx from a philosophical perspective.

The conference will take place on April 29-30 and will be organized around two clusters of topics. The first day will be devoted to issues in Epistemology, Logic, Metaphysics and Philosophy of Science. The second day will focus on Ethics, Social and Political Philosophy, Feminist Philosophy, Philosophy of Race, and Latin American Philosophy. We are happy to announce that Otavio Bueno and Jorge Gracia will be our keynote speakers for each day, respectively.

We invite graduate students who identify as Latinx or who are interested in forming part of the Latinx Philosophers Network to submit papers on any of the topics mentioned above. We especially encourage submissions by women, as well as submissions that discuss issues relevant to the Latinx experience.

Papers should not exceed 4000 words (or the equivalent of a 30-minute presentation). They should be prepared for blind review and sent as a PDF file to latinophilosophersnetwork@gmail.com. In a separate PDF attachment, please include your name, academic affiliation, email address, telephone number, paper title, and an abstract of no more than 250 words. Any questions can be directed to César Cabezas (cgc2125@columbia.edu), or Ignacio Ojea (ignacio.ojea@columbia.edu). Acceptances will be announced by March 15.

This event is supported by:

MAP (Minorities and Philosophy), and

The Center for Race, Philosophy and Social Justice at Columbia University

Rethinking Philosophy’s Past 1300 – 1800 @ Heyman Center for the Humanities, Second Floor Common Room
Feb 17 – Feb 18 all-day

Rethinking Philosophy’s Past, 1300–1800: The Philosophy Department and the Center for Science and Society at Columbia University invite you to “Rethinking Philosophy’s Past, 1300–1800” (February 17–18). Distinguished historians will share recent scholarship on women and other understudied figures in the history of philosophy, to encourage more accurate accounts of philosophy’s past and more inclusive teaching. Sessions rethink standard stories and offer practical ideas about how to incorporate understudied figures into our philosophy courses, both historical and non-historical.

http://philosophy.columbia.edu/events/events/events/conferences

SCHEDULE

Workshop day 1: Friday, February 17, 2017

2:00–2:20  Welcome and Introduction. Christia Mercer (Philosophy, Columbia)

Session 1: Creating the Modern Self: Medieval Women on Authority, Will and Self-knowledge
2:20–3:00  Part I. Medieval Women and the Struggle for Authority. Chair: Christia Mercer (Philosophy, Columbia). Elizabeth Castelli (Religion, Barnard)
3:00–3:45  Holly Flora (Art History, Tulane)
3:45–4:00  Coffee Break
4:00–5:30  Part II. Medieval Women on Authority, Will, and Self-knowledge. Chair: Achille Varzi (Philosophy, Columbia). Peter King (Philosophy, University of Toronto) and Christina Van Dyke (Philosophy and Gender Studies, Calvin College)

Session 2
5:30–6:30  Panel Discussion: Making Our Philosophy Courses More Inclusive. Chair: Alison Simmons (Philosophy, Harvard). Andy Arlig (Philosophy, Brooklyn College), Colin Chamberlain (Philosophy, Temple University), Don Garrett (Philosophy, NYU), Justin Steinberg (Philosophy, Brooklyn College) and Julie Walsh (Philosophy, Wellesley)

Participants and invited guests are invited to dinner at Symposium, 544 W. 113th St (between Broadway and Amsterdam).

Workshop day 2: Saturday, February 18, 2017

8:30–9:00  Light Breakfast

Session 3: Education and Women’s Epistemic Authority, 1500–1800
9:00–10:00  Chair: John Collins (Philosophy, Columbia). Marguerite Deslauriers (Philosophy, McGill) and Lisa Shapiro (Philosophy, Simon Fraser)
10:05–11:10  Karen Detlefsen (Philosophy and Education, Penn) and Sandrine Berges (Philosophy, Bilkent)

11:15–11:30  Coffee Break

Session 4
11:30–1:15  Reconsidering the Standard Narrative about Early Modern Philosophy: Cavendish, du Châtelet, and Shepherd on the Metaphysics of Nature. Chair: Andrew Janiak (Philosophy, Duke). David Cunning (Philosophy, Iowa), Marcy Lascano (Philosophy, CSULB) and Antonia Lolordo (Philosophy, UVA)

1:15–2:15  Lunch and Break-out Sessions
2:15–3:40  Results of Break-out Discussions and Further Discussion of Papers

Session 5
3:45–6:15  Reflective Methodology: From Medieval Meditations to Early Modern Science. Chair: Daniel Garber (Philosophy, Princeton). Bachir Diagne (French, Columbia), Christia Mercer (Philosophy, Columbia), Alan Stewart (English and Comparative Literature, Columbia), and Matthew Jones (History, Columbia)

Dinner for participants hosted by Christia Mercer.
Columbia Workshop on Probability and Learning @ 716 Philosophy Hall
Apr 8 all-day

Gordon Belot (Michigan) – Typical!, 10am
Abstract. This talk falls into three short stories. The over-arching themes are: (i) that the notion of typicality is protean; (ii) that Bayesian technology is both more and less rigid than is sometimes thought.

Simon Huttegger (Irvine LPS) – Schnorr Randomness and Lévy’s Martingale Convergence Theorem, 11:45am
Abstract. Much recent work in algorithmic randomness concerns characterizations of randomness in terms of the almost-everywhere behavior of suitably effectivized versions of functions from analysis or probability. In this talk, we take a look at Lévy’s Martingale Convergence Theorem from this perspective. Lévy’s theorem is of fundamental importance to Bayesian epistemology. We note that much of Pathak, Rojas, and Simpson’s work on Schnorr randomness and the Lebesgue Differentiation Theorem in the Euclidean context carries over to Lévy’s Martingale Convergence Theorem in the Cantor space context. We discuss the methodological choices one faces in choosing the appropriate mode of effectivization and the potential bearing of these results on Schnorr’s critique of Martin-Löf. We also discuss the consequences of our result for the Bayesian model of learning.

Deborah Mayo (VA Tech) – Probing With Severity: Beyond Bayesian Probabilism and Frequentist Performance, 2:45pm
Abstract. Getting beyond today’s most pressing controversies revolving around statistical methods and irreproducible findings requires scrutinizing underlying statistical philosophies. Two main philosophies about the roles of probability in statistical inference are probabilism and performance (in the long-run). The first assumes that we need a method of assigning probabilities to hypotheses; the second assumes that the main function of statistical method is to control long-run performance. I offer a third goal: controlling and evaluating the probativeness of methods. A statistical inference, in this conception, takes the form of inferring hypotheses to the extent that they have been well or severely tested. A report of poorly tested claims must also be part of an adequate inference. I show how the “severe testing” philosophy clarifies and avoids familiar criticisms and abuses of significance tests and cognate methods (e.g., confidence intervals). Severity may be threatened in three main ways: fallacies of rejection and non-rejection, unwarranted links between statistical and substantive claims, and violations of model assumptions. I illustrate with some controversies surrounding the use of significance tests in the discovery of the Higgs particle in high energy physics.
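The severity calculation is easy to state in the simplest case. As a rough stand-alone illustration (an editor's sketch, not part of the abstract): in the one-sided test of H0: mu <= 0 against H1: mu > 0 with known sigma, after a rejection, the severity with which the claim "mu > mu1" has passed is the probability of a result less extreme than the one observed, were mu only mu1.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def severity_mu_greater(mu1, xbar, sigma, n):
    """Severity for the claim 'mu > mu1' after rejecting H0: mu <= mu0
    in a one-sided normal test with known sigma: the probability of
    observing a sample mean no larger than xbar, were mu equal to mu1."""
    se = sigma / math.sqrt(n)
    return norm_cdf((xbar - mu1) / se)

# Example: n = 100, sigma = 10 (so the standard error is 1); the observed
# mean 2.0 rejects H0: mu <= 0 at the 0.025 level (z = 2.0).
for mu1 in (0.0, 0.5, 1.0, 1.5, 2.0):
    print(f"SEV(mu > {mu1}) = {severity_mu_greater(mu1, 2.0, 10.0, 100):.3f}")
```

The printed curve falls from about 0.977 at mu1 = 0 to exactly 0.5 at mu1 = 2.0, illustrating the point of the abstract: the same rejection warrants "mu > 0" severely but "mu > 2" not at all, blocking fallacies of rejection.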

Teddy Seidenfeld (CMU) – Radically Elementary Imprecise Probability Based on Extensive Measurement, 4:30pm
Abstract. This presentation begins with motivation for “precise” non-standard probability. Using two old challenges — involving (i) symmetry of probabilistic relevance and (ii) respect for weak dominance — I contrast the following three approaches to conditional probability given a (non-empty) “null” event and their three associated decision theories.
Approach #1 – Full Conditional Probability Distributions (Dubins, 1975) conjoined with Expected Utility.
Approach #2 – Lexicographic Probability conjoined with Lexicographic Expected Value (e.g., Blume et al., 1991).
Approach #3 – Non-standard Probability and Expected Utility based on Non-Archimedean Extensive Measurement (Narens, 1974).
The second part of the presentation discusses progress we’ve made using Approach #3 within a context of Imprecise Probability.
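A minimal sketch of Approach #2 (an editor's illustration, not drawn from the talk; all names are made up): a lexicographic probability system is a sequence of measures, and acts are compared by their tuples of expected values in order, which is exactly Python's tuple comparison.

```python
# Three states; the "null" event {s2} gets probability 0 under the primary
# measure p0 but probability 1 under the secondary measure p1, so it can
# still make a difference at the second lexicographic level.
states = ["s0", "s1", "s2"]
p0 = {"s0": 0.5, "s1": 0.5, "s2": 0.0}   # primary measure: s2 is "null"
p1 = {"s0": 0.0, "s1": 0.0, "s2": 1.0}   # secondary measure breaks ties

def lex_value(act):
    """Tuple of expected utilities under (p0, p1), compared lexicographically."""
    return tuple(sum(p[s] * act[s] for s in states) for p in (p0, p1))

act_a = {"s0": 1.0, "s1": 1.0, "s2": 0.0}   # pays nothing on the null event
act_b = {"s0": 1.0, "s1": 1.0, "s2": 5.0}   # weakly dominates act_a

# Ordinary expected utility (the first coordinate alone) cannot separate
# the two acts; the lexicographic value respects the weak dominance.
assert lex_value(act_a)[0] == lex_value(act_b)[0]
assert lex_value(act_b) > lex_value(act_a)
```

This is the sense in which such theories "respect weak dominance" on null events, one of the two challenges mentioned above.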

Entropy and Insufficient Reason – Anubav Vasudevan (University of Chicago) @ Faculty House, Columbia U
Nov 10 @ 4:00 pm – 6:00 pm

One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen (1981). The problem turns on the apparently puzzling fact that, on the basis of information about an event’s conditional probability, the maximum entropy distribution will almost always assign to the conditioning event a probability strictly less than the one assigned to it by the uniform distribution. In this paper, I present an analysis of the Judy Benjamin problem that can help to make sense of this seemingly odd feature of maximum entropy inference. My analysis is based on the claim that, in applying the principle of maximum entropy, Judy Benjamin is not acting out of a concern to maximize uncertainty in the face of new evidence, but is rather exercising a certain brand of epistemic charity towards her informant. This charity takes the form of an assumption on the part of Judy Benjamin that her informant’s evidential report leaves out no relevant information. I will explain how this single assumption suffices to rationalize Judy Benjamin’s behavior. I will then explain how such a re-conceptualization of the motives underlying Judy Benjamin’s appeal to the principle of maximum entropy can further our understanding of the relationship between this principle and the principle of insufficient reason. I will conclude with a discussion of the foundational significance for probability theory of ergodic theorems (e.g., de Finetti’s theorem) describing the asymptotic behavior of measure-preserving transformation groups. In particular, I will explain how these results, which serve as the basis of maximum entropy inference, can provide a unified conceptual framework in which to justify both a priori and a posteriori probabilistic reasoning.
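The phenomenon can be reproduced in a few lines (an editor's sketch with made-up names; the setup follows van Fraassen's original story). Judy Benjamin's prior is 1/2 on Blue territory and 1/4 on each Red subregion, and she learns only that P(Headquarters | Red) = 3/4. Updating by minimizing KL divergence from the prior, the standard implementation of the maximum entropy principle under constraints, pushes P(Red) below its prior value of 1/2:

```python
import math

# Prior over the three cells of the Judy Benjamin story.
PRIOR = {"blue": 0.5, "red_hq": 0.25, "red_2nd": 0.25}

# The new information P(red_hq | red) = 3/4 forces the posterior to have
# the form q(red_hq) = 3t, q(red_2nd) = t, q(blue) = 1 - 4t for some t.
def kl_to_prior(t):
    """KL divergence D(q(t) || PRIOR) for the constrained posterior."""
    q = {"blue": 1 - 4 * t, "red_hq": 3 * t, "red_2nd": t}
    return sum(q[c] * math.log(q[c] / PRIOR[c]) for c in q)

def minimize(f, lo, hi, iters=200):
    """Golden-section search for the minimum of a unimodal f on (lo, hi)."""
    phi = (math.sqrt(5) - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c, d = b - phi * (b - a), a + phi * (b - a)
        if f(c) < f(d):
            b = d
        else:
            a = c
    return (a + b) / 2

t = minimize(kl_to_prior, 1e-9, 0.25 - 1e-9)
p_red = 4 * t
print(f"P(Red)  after updating: {p_red:.4f}")      # ~0.467, below the prior 0.5
print(f"P(Blue) after updating: {1 - p_red:.4f}")  # ~0.533, above the prior 0.5
```

Although the report concerned only the internal division of Red territory, the update lowers P(Red), which is exactly the puzzle the paper addresses.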

We will be having dinner right after the meeting at the faculty house. Please let Robby (jrf2162@columbia.edu) know if you will be joining us so that he can make an appropriate reservation (please be advised that at this point the university only agrees to cover the expenses of the speaker and the rapporteur and that the cost for all others is $30, payable by cash or check).

https://philevents.org/event/show/37746

The Price of Broadminded Probabilities and the Limitation of Science – Haim Gaifman (Columbia) @ Faculty House, Columbia U
Dec 8 @ 4:10 pm

A subjective probability function is broadminded to the extent that it assigns positive probabilities to conjectures that could possibly be true. Assigning such a conjecture the value 0 amounts to ruling out, a priori, the possibility of confirming the conjecture to any extent by the growing evidence. A positive value leaves open, in principle, the possibility of learning from the evidence. In general, broadmindedness is not an absolute notion but a graded one, and there is a price for it: the more broadminded the probability, the more complicated it is, because it has to assign non-zero values to more complicated conjectures. The framework suggested in the old Gaifman-Snir paper is suitable for phrasing this claim in a precise way and proving it. The technique by which the claim is established is to assume a definable probability function, and to state within the same language a conjecture that could possibly be true but whose probability is 0.

The complexity of the conjecture depends on the complexity of the probability, i.e., the complexity of the formulas used in defining it. In the Gaifman-Snir paper we used the arithmetical hierarchy as a measure of complexity. It is possible, however, to establish similar results with respect to more “down to earth” measures, defined in terms of the time it takes to calculate the probabilities to a given precision.

A claim of this form, for a rather simple setup, was first proven by Hilary Putnam in his paper “‘Degree of Confirmation’ and Inductive Logic”, published in the 1963 Schilpp volume dedicated to Carnap. The proof uses, in a probabilistic context, a diagonalization technique of the kind used in set theory and in computer science. In the talk I shall present Putnam’s argument and show how diagonalization can be applied in considerably richer setups.
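The flavor of the diagonal construction can be conveyed in a toy setting (an editor's simplification, not the paper's actual argument): given any computable forecaster for binary sequences, build the sequence that at every step does the opposite of what the forecaster leans toward; the forecaster then never assigns the actual next bit a probability above 1/2, so it never learns the sequence.

```python
def diagonal_sequence(forecaster, length):
    """Build the bit sequence that falsifies the forecaster's lean at each step."""
    bits = []
    for _ in range(length):
        p = forecaster(bits)              # forecaster's P(next bit = 1 | history)
        bits.append(0 if p >= 0.5 else 1) # pick the bit the forecaster disfavors
    return bits

def laplace(bits):
    """Laplace's rule of succession, as a sample computable forecaster."""
    return (sum(bits) + 1) / (len(bits) + 2)

seq = diagonal_sequence(laplace, 1000)

# Replay the history: at every step the probability the forecaster gave to
# the bit that actually occurred is at most 1/2.
history = []
for b in seq:
    p = laplace(history)
    assert (p if b == 1 else 1 - p) <= 0.5
    history.append(b)
```

The diagonal sequence is itself computable from the forecaster, which is the germ of the conflict between definability and broadmindedness in the richer setups of the talk.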

The second part of the talk is rather speculative. I shall point out the possibility that there are epistemic limitations on what human science can achieve, imposed by certain pragmatic factors ‒ such as the criterion of repeatable experiments. All of which would recommend a skeptical attitude.