Nov
10
Tue
Appetite for Distraction: Social Media and Today’s Attention-Economy @ Wolff Conference Room, Albert and Vera List Academic Center, 1103
Nov 10 @ 5:00 pm – 8:00 pm

The Liberal Studies department at the New School for Social Research and the Culture & Media Department at Eugene Lang College are pleased to jointly present “Appetite for Distraction: Social Media and Today’s Attention-Economy,” an evening lecture by Chair and faculty member Dominic Pettman, which also marks the publication of his forthcoming book Infinite Distraction (Polity Press, 2016).

It is often argued that contemporary media homogenize our thoughts and actions, without us being fully aware of the restrictions they impose. But what if the problem is not that we are all synchronized to the same motions or moments, but rather dispersed into countless different emotional micro-experiences? What if the effect of so-called social media is to calibrate the interactive spectacle so that we never fully feel the same way as other potential allies at the same time? While one person is fuming about economic injustice or climate change denial, another is giggling at a cute cat video. And, two hours later, vice versa. The nebulous indignation which constitutes the very fuel of true social change can be redirected safely around the network, avoiding any dangerous surges of radical activity.

Infinite Distraction examines the deliberate deployment of what Pettman calls hypermodulation, as a key strategy encoded into the contemporary media environment. His account challenges the various narratives that portray social media as a sinister space of synchronized attention, in which we are busily clicking ourselves to death. This critical reflection on the unprecedented power of the Internet requires us to rethink the potential for infinite distraction that our latest technologies now allow.

A Q&A will follow the lecture and refreshments will be on hand.

Dec
9
Fri
Elizabeth Miller (Yale), Jonathan Bain (NYU): What Explains the Spin-Statistics Connection? @ NYU Philosophy Dept. rm 101
Dec 9 @ 2:30 pm – 4:30 pm

Metro Area Philosophy of Science Presents:

Elizabeth Miller (Yale)

Title: TBA.

Jonathan Bain (NYU)

What Explains the Spin-Statistics Connection?

The spin-statistics connection plays an essential role in explanations of non-relativistic phenomena associated with both field-theoretic and non-field-theoretic systems (for instance, it explains the electronic structure of solids and the behavior of Bose-Einstein condensates and superconductors). However, it is only derivable within the context of relativistic quantum field theory (RQFT) in the form of the Spin-Statistics Theorem; and moreover, there are multiple, mutually incompatible ways of deriving it. This essay attempts to determine the sense in which the spin-statistics connection can be said to be an essential property in RQFT, and how it is that an essential property of one type of theory can figure into fundamental explanations offered by other, inherently distinct theories.
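
For readers outside physics, the connection the abstract refers to can be stated compactly. The following is a schematic textbook formulation, not a quotation from the talk:

```latex
% Spin-statistics connection: integer-spin fields are quantized with
% commutators (bosons), half-integer-spin fields with anticommutators
% (fermions), which is what underwrites Pauli exclusion.
s \in \{0, 1, 2, \dots\} \;\Longrightarrow\; [a_{\mathbf{k}}, a^{\dagger}_{\mathbf{k}'}] = \delta_{\mathbf{k}\mathbf{k}'} \quad \text{(Bose--Einstein statistics)}
\\
s \in \{\tfrac{1}{2}, \tfrac{3}{2}, \dots\} \;\Longrightarrow\; \{a_{\mathbf{k}}, a^{\dagger}_{\mathbf{k}'}\} = \delta_{\mathbf{k}\mathbf{k}'} \quad \text{(Fermi--Dirac statistics)}
```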

Mar
15
Wed
Agenda Setting and the Media @ Setauket Neighborhood House
Mar 15 @ 7:30 pm

This week we focus our lens on the Media by exploring Agenda Setting. Agenda setting occurs when a media outlet covers an issue more frequently than its actual relevance warrants; as a result of the repeated exposure, the audience comes to believe the issue is important. The concern is compounded when other media outlets jump on the bandwagon. We’ll explore this little-discussed topic and more! Can’t wait to hear your thoughts!

Please read Agenda Setting, pages 147–161.

Please remember to bring $3 for the Setauket Neighborhood House.

Apr
8
Sat
Columbia Workshop on Probability and Learning @ 716 Philosophy Hall
Apr 8 all-day

Gordon Belot (Michigan) – Typical!, 10am
Abstract. This talk falls into three short stories. The over-arching themes are: (i) that the notion of typicality is protean; (ii) that Bayesian technology is both more and less rigid than is sometimes thought.

Simon Huttegger (Irvine LPS) – Schnorr Randomness and Lévy’s Martingale Convergence Theorem, 11:45am
Abstract. Much recent work in algorithmic randomness concerns characterizations of randomness in terms of the almost everywhere behavior of suitably effectivized versions of functions from analysis or probability. In this talk, we take a look at Lévy’s Martingale Convergence Theorem from this perspective. Lévy’s theorem is of fundamental importance to Bayesian epistemology. We note that much of Pathak, Rojas, and Simpson’s work on Schnorr randomness and the Lebesgue Differentiation Theorem in the Euclidean context carries over to Lévy’s Martingale Convergence Theorem in the Cantor space context. We discuss the methodological choices one faces in choosing the appropriate mode of effectivization and the potential bearing of these results on Schnorr’s critique of Martin-Löf. We also discuss the consequences of our result for the Bayesian model of learning.
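
For reference, the classical (non-effectivized) form of the theorem at issue is the following standard statement, not quoted from the talk:

```latex
% Lévy's (upward) martingale convergence theorem: for integrable X and a
% filtration (F_n) with F_infinity = sigma( union_n F_n ),
\mathbb{E}[X \mid \mathcal{F}_n] \;\longrightarrow\; \mathbb{E}[X \mid \mathcal{F}_\infty]
\quad \text{almost surely and in } L^1 .
```

The effectivization program the abstract describes then asks at which individual (algorithmically random) points such almost-everywhere convergence holds.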

Deborah Mayo (VA Tech) – Probing With Severity: Beyond Bayesian Probabilism and Frequentist Performance, 2:45pm
Abstract. Getting beyond today’s most pressing controversies revolving around statistical methods and irreproducible findings requires scrutinizing underlying statistical philosophies. Two main philosophies about the roles of probability in statistical inference are probabilism and performance (in the long run). The first assumes that we need a method of assigning probabilities to hypotheses; the second assumes that the main function of statistical method is to control long-run performance. I offer a third goal: controlling and evaluating the probativeness of methods. A statistical inference, in this conception, takes the form of inferring hypotheses to the extent that they have been well or severely tested. A report of poorly tested claims must also be part of an adequate inference. I show how the “severe testing” philosophy clarifies and avoids familiar criticisms and abuses of significance tests and cognate methods (e.g., confidence intervals). Severity may be threatened in three main ways: fallacies of rejection and non-rejection, unwarranted links between statistical and substantive claims, and violations of model assumptions. I illustrate with some controversies surrounding the use of significance tests in the discovery of the Higgs particle in high energy physics.
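
As a rough illustration of the severity idea, here is a sketch for a textbook one-sided Normal test; the function and parameter names are my own assumptions, not Mayo’s code:

```python
# A minimal sketch of a severity assessment for the one-sided test
# T+: H0: mu <= mu0 vs H1: mu > mu0, with known sigma and sample size n.
from scipy.stats import norm

def severity(mu1, xbar, mu0=0.0, sigma=1.0, n=100):
    """SEV(mu > mu1): the probability of a sample mean no larger than the
    one actually observed, were mu exactly mu1. High severity means the
    inference 'mu > mu1' has survived a stringent probe."""
    se = sigma / n ** 0.5
    return norm.cdf((xbar - mu1) / se)

# With xbar = 0.2 (two standard errors above mu0 = 0):
print(severity(0.0, xbar=0.2))   # ~0.98: 'mu > 0' is severely tested
print(severity(0.2, xbar=0.2))   # 0.5: 'mu > 0.2' is poorly tested
```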

Teddy Seidenfeld (CMU) – Radically Elementary Imprecise Probability Based on Extensive Measurement, 4:30pm
Abstract. This presentation begins with motivation for “precise” non-standard probability. Using two old challenges — involving (i) symmetry of probabilistic relevance and (ii) respect for weak dominance — I contrast the following three approaches to conditional probability given a (non-empty) “null” event and their three associated decision theories.
Approach #1 – Full Conditional Probability Distributions (Dubins, 1975) conjoined with Expected Utility.
Approach #2 – Lexicographic Probability conjoined with Lexicographic Expected Value (e.g., Blume et al., 1991).
Approach #3 – Non-standard Probability and Expected Utility based on Non-Archimedean Extensive Measurement (Narens, 1974).
The second part of the presentation discusses progress we’ve made using Approach #3 within a context of Imprecise Probability.
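
For orientation, the central notion of Approach #1 can be glossed briefly; this is the standard formulation of a full (coherent) conditional probability, my summary rather than the speaker’s text:

```latex
% A full conditional probability (Dubins 1975): P(. | B) is defined for
% every non-empty conditioning event B, even when P(B) = 0; each P(. | B)
% is a probability measure with P(B | B) = 1; and the product rule holds:
P(A \cap B \mid C) \;=\; P(A \mid B \cap C)\, P(B \mid C)
\qquad \text{for all non-empty } B \cap C .
```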

May
13
Sat
Between Philosophy and Rhetoric: NYU Spring Workshop in Ancient Philosophy @ Depts. of Philosophy & Classics
May 13 – May 14 all-day

Even though ancient philosophy and rhetoric have many overlapping interests (education, persuasion, politics, etc.), their relationship has long been a contentious subject, especially among ancient philosophers. Contemporary scholarship on the topic is equally divided. Philosophers tend to approach the topic primarily through the works of Plato and Aristotle, and regard rhetoric (and rhetorical compositions) as a second-rate discipline with little to contribute to philosophically relevant questions about human nature and society. Classicists, meanwhile, study oratorical compositions to get a better understanding of Greek prose style, historical details and context, but often shy away from the philosophical questions that the texts might hint at. This workshop aims to bring together scholars working on ancient rhetoric and argumentative techniques, on the one hand, and scholars working on ancient philosophy, on the other, in order to open up a space for constructive engagement between philosophy and rhetoric, one which might enrich our understanding of ancient texts as well as the contexts in which they were produced.

Confirmed speakers: Jamie Dow (Leeds), Richard Hunter (Cambridge), Joel Mann (St Norbert), Jessica Moss (NYU), Usha Nathan (Columbia), James Porter (Berkeley), Edward Schiappa (MIT), Nancy Worman (Barnard). All papers will be followed by a response and general discussion.

Attending the workshop is free, but in order to have an idea of numbers it would be greatly appreciated if those interested in participating in the event would email the organizers, Laura Viidebaum and Toomas Lott.

This Workshop is generously sponsored by the Department of Philosophy (NYU), Department of Classics (NYU) and NYU Center for Ancient Studies.

Oct
2
Mon
Phenomenology of Probability, Noah Greenstein (ME!) @ CUNY Grad Center, rm 3209
Oct 2 @ 4:15 pm – 6:15 pm

An account of fairness and probability is given using Game Theoretical Semantics to schematize fairness as a “draw” result of a logical game. The two concepts of probability — objective frequency vs. subjective belief — are then described as differences in game strategy. Lastly, the logical machinery is used to potentially bridge the gap between the two, giving perspective on the problem of induction.

Logic and Metaphysics Workshop Fall 2017:

September 11 Lovett, NYU

September 18 Skiles, NYU

September 25 Jago, Nottingham

October 2 Greenstein, Private Scholar

October 9 GC Closed. No meeting

October 16 Ripley, UConn

October 23 Mares, Wellington

October 30 Woods, Bristol

November 6 Hamkins, GC

November 13 Silva, Alagoas

November 20 Yi, Toronto

November 27 Malink, NYU

December 4 Kivatinos, GC

Oct
20
Fri
Literature as an Ark: on the Stylistic and Ethical Aspects of Zoopoetics @ Maison Française East Gallery, Buell Hall
Oct 20 @ 12:30 pm – 1:30 pm

A talk by Anne Simon, moderated by Eliza Zingesser

Zoopoetics aims to highlight the plurality of stylistic, linguistic and narrative tools used by writers to express the plurality of animal activities, affects and worlds, as well as the intricacies of the interactions between humans and animals. Such an approach helps us to understand that all life forms are in a relationship of dependence with an archè (Husserl)—an origin, a reason, a refuge, a dwelling, the Earth—and that animals are more stylistic or rhetorical beings than we usually take them to be. Evolution and biomorphic logics allow us to intuitively understand other species related to us, to share many of their emotions and expressions, and to account for them through specific human means, such as evocative and figurative language. The lecture will show that perspectivism, metamorphosis and hybridity are universal patterns and experiences that literature embodies in different ways.

Anne Simon is a Research Director at the Centre National de la Recherche Scientifique and a member of the École des Hautes Études en Sciences Sociales (Paris), where she leads the project “Animots”; she is the author of Trafics de Proust (2016) and La rumeur des distances traversées (forthcoming, 2018). Her research focuses on the troubled relationships between philosophy and literature, and on zoopoetics.

Oct
27
Fri
“Probabilistic Knowledge and Legal Proof” Sarah Moss (Univ. of Michigan) @ NYU Philosophy Dept. rm 202
Oct 27 @ 3:30 pm – 5:30 pm

Abstract: Traditional theories of knowledge often focus on the epistemic status of full beliefs. In Probabilistic Knowledge (forthcoming), I argue that like full beliefs, credences and other probabilistic beliefs can constitute knowledge. This talk applies probabilistic knowledge to problems in legal and moral philosophy. I begin by arguing that legal standards of proof require knowledge of probabilistic contents. For instance, proof by a preponderance of the evidence requires the factfinder to have greater than .5 credence that a defendant is liable, and also requires this probabilistic belief to be knowledge. Proof of guilt beyond a reasonable doubt requires knowledge of a significantly stronger content. The fact that legal proof requires knowledge explains why merely statistical evidence is insufficient to license a legal verdict of liability or guilt. In addition to explaining the limited value of statistical evidence, probabilistic knowledge enables us to articulate epistemic norms that are violated by acts of racial and other profiling. According to these norms, it can be epistemically wrong to infer from statistics that a person of Mexican ancestry is likely undocumented, for instance, even when inferring parallel facts about ordinary objects is perfectly okay.
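
Schematically, and with the caveat that this is my rendering of the abstract rather than Moss’s own notation (the threshold t for proof beyond a reasonable doubt is deliberately left unspecified):

```latex
% Preponderance: knowledge of the probabilistic content that liability
% is more likely than not; BRD: knowledge of a significantly stronger
% content, with threshold t unspecified.
\text{preponderance:}\;\; K\big(\Pr(\text{liable}) > 0.5\big),
\qquad
\text{BRD:}\;\; K\big(\Pr(\text{guilty}) > t\big),\;\; t \gg 0.5 .
```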

Reception to follow in 6th floor lounge.

Nov
10
Fri
Entropy and Insufficient Reason – Anubav Vasudevan (University of Chicago) @ Faculty House, Columbia U
Nov 10 @ 4:00 pm – 6:00 pm

One well-known objection to the principle of maximum entropy is the so-called Judy Benjamin problem, first introduced by van Fraassen (1981). The problem turns on the apparently puzzling fact that, on the basis of information relating an event’s conditional probability, the maximum entropy distribution will almost always assign to the event conditionalized upon a probability strictly less than the one assigned to it by the uniform distribution. In this paper, I present an analysis of the Judy Benjamin problem that can help to make sense of this seemingly odd feature of maximum entropy inference. My analysis is based on the claim that, in applying the principle of maximum entropy, Judy Benjamin is not acting out of a concern to maximize uncertainty in the face of new evidence, but is rather exercising a certain brand of epistemic charity towards her informant. This charity takes the form of an assumption on the part of Judy Benjamin that her informant’s evidential report leaves out no relevant information. I will explain how this single assumption suffices to rationalize Judy Benjamin’s behavior. I will then explain how such a re-conceptualization of the motives underlying Judy Benjamin’s appeal to the principle of maximum entropy can further our understanding of the relationship between this principle and the principle of insufficient reason. I will conclude with a discussion of the foundational significance for probability theory of ergodic theorems (e.g., de Finetti’s theorem) describing the asymptotic behavior of measure preserving transformation groups. In particular, I will explain how these results, which serve as the basis of maximum entropy inference, can provide a unified conceptual framework in which to justify both a priori and a posteriori probabilistic reasoning.
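
To see the numbers behind the puzzle, here is a small numerical sketch of the standard Judy Benjamin setup (four equiprobable cells, with the usual 3:1 constraint from van Fraassen’s example); the code is my illustration, not the speaker’s:

```python
# Judy Benjamin, numerically: the map has four equal cells, Blue-HQ,
# Blue-2nd, Red-HQ, Red-2nd (uniform prior 1/4 each). Judy then learns
# only that P(Red-HQ | Red) = 3/4. Maximum entropy updating picks, among
# all distributions satisfying that constraint, the one with the largest
# Shannon entropy -- and it assigns P(Red) < 1/2.
import numpy as np
from scipy.optimize import minimize

def neg_entropy(p):
    p = np.clip(p, 1e-12, 1.0)
    return float(np.sum(p * np.log(p)))  # minimizing this maximizes entropy

constraints = [
    {"type": "eq", "fun": lambda p: p.sum() - 1.0},      # normalization
    {"type": "eq", "fun": lambda p: p[2] - 3.0 * p[3]},  # P(RHQ | Red) = 3/4
]
p0 = np.full(4, 0.25)  # uniform starting point
res = minimize(neg_entropy, p0, bounds=[(0, 1)] * 4, constraints=constraints)
p = res.x
print("P(Red) =", round(p[2] + p[3], 3))  # ~0.467, below the uniform 0.5
```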

We will be having dinner right after the meeting at the faculty house. Please let Robby (jrf2162@columbia.edu) know if you will be joining us so that he can make an appropriate reservation (please be advised that at this point the university only agrees to cover the expenses of the speaker and the rapporteur and that the cost for all others is $30, payable by cash or check).

https://philevents.org/event/show/37746

Dec
8
Fri
The Price of Broadminded Probabilities and the Limitation of Science – Haim Gaifman (Columbia) @ Faculty House, Columbia U
Dec 8 @ 4:10 pm

A subjective probability function is broadminded to the extent that it assigns positive probabilities to conjectures that can possibly be true. Assigning to such a conjecture the value 0 amounts to a priori ruling out the possibility of confirming the conjecture to any extent by the growing evidence. A positive value leaves, in principle, the possibility of learning from the evidence. In general, broadmindedness is not an absolute notion, but a graded one, and there is a price for it: the more broadminded the probability, the more complicated it is, because it has to assign non-zero values to more complicated conjectures. The framework suggested in the old Gaifman-Snir paper is suitable for phrasing this claim in a precise way and proving it. The technique by which this claim is established is to assume a definable probability function, and to state within the same language a conjecture that can possibly be true, whose probability is 0.
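
The point about the value 0 is just conditionalization at work: a zero prior is unrevisable, since for any evidence e with positive probability,

```latex
P(h \mid e) \;=\; \frac{P(e \mid h)\,P(h)}{P(e)} \;=\; 0
\qquad \text{whenever } P(h) = 0 \text{ and } P(e) > 0 .
```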

The complexity of the conjecture depends on the complexity of the probability, i.e., the complexity of the formulas that are used in defining it. In the Gaifman-Snir paper we used the arithmetical hierarchy as a measure of complexity. It is possible, however, to establish similar results with respect to more “down to earth” measures, defined in terms of the time that it takes to calculate the probabilities, with given precisions.

A claim of this form, for a rather simple setup, was first proven by Hilary Putnam in his paper “‘Degree of Confirmation’ and Inductive Logic”, which was published in the 1963 Schilpp volume dedicated to Carnap. The proof uses, in a probabilistic context, a diagonalization technique of the kind used in set theory and in computer science. In the talk I shall present Putnam’s argument and show how diagonalization can be applied in considerably richer setups.
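
The flavor of the diagonal argument can be conveyed in a few lines; this is my illustration of the general technique, with an invented forecaster, and Putnam’s own construction differs in detail:

```python
# Diagonalization against a forecaster: given any computable function `c`
# mapping a finite 0/1 history to the probability it assigns to the next
# bit being 1, the "contrarian" sequence below is itself computable, yet
# `c` assigns probability <= 1/2 to every one of its bits -- so the
# forecaster never comes to confirm the (possibly true) hypothesis
# describing that sequence.
def diagonal_sequence(c, length):
    history = []
    for _ in range(length):
        p_one = c(history)                        # forecaster's P(next = 1)
        history.append(0 if p_one > 0.5 else 1)   # do what is deemed less likely
    return history

# Illustrative forecaster: Laplace's rule of succession.
def laplace(history):
    return (sum(history) + 1) / (len(history) + 2)

print(diagonal_sequence(laplace, 10))  # [1, 0, 1, 0, ...] -- always surprises
```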

The second part of the talk is rather speculative. I shall point out the possibility that there might be epistemic limitations to what human science can achieve, limitations imposed by certain pragmatic factors, such as the criterion of repeatable experiments. All of this would recommend a skeptical attitude.