J. Brian Pitts (Cambridge) gives a talk at the MCMP Colloquium (6 February, 2013) titled "How Almost Everything in Space-time Theory Is Illuminated by Simple Particle Physics: The Neglected Case of Massive Scalar Gravity". Abstract: Both particle physics from the 1920s-30s and the 1890s Seeliger-Neumann modification of Newtonian gravity suggest considering a “mass term,” an additional algebraic term in the gravitational potential. The “graviton mass” gives gravity a finite range. The smooth massless limit implies underdetermination. In 1914 Nordström generalized Newtonian gravity to fit Special Relativity. Why not do to Nordström what Seeliger and Neumann did to Newton? Einstein made a start when he set up a (faulty!) analogy for his cosmological constant Λ. Scalar gravities, though not empirically viable since the 1919 bending-of-light observations, provide a useful test bed for tensor theories like General Relativity. Massive scalar gravity, though not completed in a timely way, sheds philosophical light on most issues in contemporary and 20th-century space-time theory. A mass term shrinks the symmetry group to that of Special Relativity and violates Einstein's principles (general covariance, general relativity, equivalence and Mach) in empirically small but conceptually large ways. Geometry is a poor guide to massive scalar gravities in comparison with detailed study of the field equation or Lagrangian. Matter sees a conformally flat metric because gravity distorts volumes while leaving the speed of light alone, but gravity sees the whole flat metric due to the mass term. Largely with Poincaré (pace Eddington), one can contemplate a “true” flat geometry differing from what material rods and clocks disclose. But questions about “true” geometry need no answer and tend to block inquiry. Presumptively one should expect analogous results for the tensor (massive spin-2) case modifying Einstein’s equations. A case to the contrary was made only in 1970-72, when an apparently fatal dilemma emerged: either instability or empirical falsification. But dark-energy measurements since 1999 cast some doubt on General Relativity (massless spin 2) at long distances. Recent calculations (2000s, some from 2010) show that the instability can be avoided and that empirical falsification likely can be as well, making massive spin-2 gravity a serious rival for GR. Particle physics can thus let philosophers proportion belief to evidence over time, rather than suffering from unconceived alternatives.
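To make the idea of a mass term concrete (a standard textbook sketch in units where the graviton mass m is an inverse length, not Pitts's own notation), the Seeliger-Neumann modification turns the Poisson equation into a screened one:

\[ \nabla^2 \phi - m^2 \phi = 4\pi G \rho, \qquad \phi(r) = -\frac{GM}{r}\, e^{-m r} \quad \text{(point source)}. \]

The potential now falls off exponentially beyond the length scale 1/m, which is the “finite range” mentioned in the abstract, and it smoothly recovers the Newtonian 1/r form as m → 0, which is the “smooth massless limit” driving the underdetermination worry.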
Luc Bovens (LSE) gives a talk at the 6th Munich-Sydney-Tilburg Conference on "Models and Decisions" (10-12 April, 2013) titled "Evaluating Risky Prospects: The Distribution View". Abstract: Policy analysts need to rank policies with risky outcomes. Such policies can be thought of as prospects. A prospect is a matrix of utilities. In the rows we list the people who are affected by the policy. In the columns we list alternative states of the world and specify a probability distribution over the states. I provide a taxonomy of various ex ante and ex post distributional concerns that enter into such policy evaluations and construct a general method that reflects these concerns, integrates the ex ante and ex post calculus, and generates orderings over policies. I show that Parfit’s Priority View is a special case of the Distribution View.
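For illustration (a toy prospect with invented numbers, not taken from the talk), consider two people and two equiprobable states:

\[ \begin{array}{c|cc} & s_1\ (p = 0.5) & s_2\ (p = 0.5) \\ \hline \text{person } 1 & 4 & 0 \\ \text{person } 2 & 0 & 4 \end{array} \]

Ex ante the prospect looks perfectly fair, since each person's expected utility is 0.5·4 + 0.5·0 = 2; ex post, whichever state obtains, the realized utilities (4, 0) or (0, 4) are maximally unequal. Toy cases of this kind show why ex ante and ex post distributional concerns can pull apart, and hence why a method integrating both calculi is needed.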
Daniel Wohlfarth (Bonn) gives a talk at the MCMP Colloquium (30 January, 2013) titled "On the Conception of Fundamentality of Time-Asymmetries in Physics". Abstract: The goal of my talk is to argue for two connected proposals. Firstly, I shall show that a new conceptual understanding of the term ‘fundamentality’ - in the context of time-asymmetries - is applicable to cosmology and in fact shows that classical and semi-classical cosmology should be understood as time-asymmetric theories. Secondly, I will show that the proposed conceptual understanding of ‘fundamentality’, applied to cosmological models with a hyperbolically curved spacetime structure, provides a new understanding of the origin of the (quantum) thermodynamic time-asymmetry. On the proposed understanding, a ‘quantum version’ of the second law can be formulated. This version is explicitly time-asymmetric (entropy decreases with decreasing time coordinates and vice versa). Moreover, the physical effectiveness of the time-asymmetry will be based on the Einstein equations and additional calculations in QFT, and is therefore independent of an ontic interpretation of ‘entropy’ itself. The whole account is located within semi-classical quantum cosmology (without any attempt to quantize gravity) and depends only on the definability of a cosmic time coordinate.
Luigi Scorzato (Roskilde) gives a talk at the MCMP Colloquium (16 January, 2013) titled "Simplicity and Measurability in Science". Abstract: Simple assumptions represent a decisive reason to prefer one theory to another in everyday scientific praxis. But this praxis has little philosophical justification, since there exist many notions of simplicity, and those that can be defined precisely depend strongly on the language in which the theory is formulated. Moreover, according to a common general argument, the simplicity of a theory is always trivial in a suitably chosen language. However, this "trivialization argument" is always either applied to toy models of scientific theories or applied with little regard for the empirical content of the theory. In this paper I show that the trivialization argument fails when one considers realistic theories and requires their empirical content to be preserved. In fact, the concepts that enable a very simple formulation are not, in general, necessarily measurable. Moreover, inspection of a theory describing a chaotic billiard shows that precisely those concepts that naturally make the theory extremely simple are provably not measurable. This suggests that, whenever a theory possesses sufficiently complex consequences, the constraint of measurability prevents overly simple formulations in any language. In this paper I propose a way to introduce the constraint of measurability into the formulation of a scientific theory such that the notion of simplicity acquires a general and sufficiently precise meaning. I argue that this explains why scientists often regard their assessments of simplicity as largely unambiguous.
Holger Andreas (MCMP/LMU) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Descriptivism about Theoretical Concepts Implies Ramsification or (Poincarean) Conventionalism".
Hannes Leitgeb (LMU/MCMP) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Theoretical Terms and Induction".
C. Ulises Moulines (LMU) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Causality and Theoretical Terms in Physics".
Xavier de Donato (USC) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Theoretical Terms, Ideal Objects and Zalta's Abstract Objects Theory".
Stathis Psillos (Athens) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Causal-descriptivism Revisited".
Michele Ginammi (Pisa) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Avoiding Reification".
Jeffrey Ketland (Oxford) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Leibniz Equivalence".
Demetra Christopoulou (Patras) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Implicitly defining mathematical terms".
Gauvain Leconte (Paris) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Definition, elimination and introduction of theoretical terms".
John Worrall (LSE) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Theoretical Terms, Ramsey Sentences and Structural Realism".
Charlotte Werndl (LSE) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "Typicality in Statistical Physics and Dynamical Systems Theory".
Georg Schiemer (LMU/MCMP) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "The epsilon-reconstruction of theories and scientific structuralism".
Sebastian Lutz (LMU/MCMP) gives a talk at the conference on "The Analysis of Theoretical Terms" (3-5 April, 2013) titled "The Criteria for the Empirical Significance of Terms".
Julian Nida-Rümelin (LMU) gives a talk at the 6th Munich-Sydney-Tilburg Conference on "Models and Decisions" (10-12 April, 2013) titled "Cooperation and (structural) Rationality". Abstract: Cooperation remains a challenge for the theory of rationality: rational agents, it is held, should not cooperate in one-shot prisoner's dilemmas. But they do, it seems. There is a reason why mainstream rational choice theory is at odds with cooperative agency: rational action is thought to be consequentialist, but this is wrong. If we give up consequentialism and adopt a structural account of rationality, the problem resolves, as will be shown. In the second part of my lecture I shall show that structural rationality can be combined with Bayesianism, contrary to what one may expect. Finally, I shall discuss some philosophical implications of structural rationality.
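For reference, the standard one-shot prisoner's dilemma (textbook payoff numbers, not taken from the talk) has the form:

\[ \begin{array}{c|cc} & \text{cooperate} & \text{defect} \\ \hline \text{cooperate} & (3,3) & (0,5) \\ \text{defect} & (5,0) & (1,1) \end{array} \]

Since 5 > 3 and 1 > 0, defecting strictly dominates cooperating for each player whatever the other does, so consequentialist rational choice recommends mutual defection at (1,1), even though both players prefer mutual cooperation at (3,3). This is the tension between orthodox theory and observed cooperative agency that the talk addresses.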
Michael Strevens (NYU) gives a talk at the 6th Munich-Sydney-Tilburg Conference on "Models and Decisions" (10-12 April, 2013) titled "Idealization, Prediction, Difference-Making". Abstract: Every model leaves out or distorts some factors that are causally connected to its target phenomena – the phenomena that it seeks to predict or explain. If we want to make predictions, and we want to base decisions on those predictions, what is it safe to omit or to simplify, and what ought a causal model to capture fully and correctly? A schematic answer: the factors that matter are those that make a difference to the target phenomena. There are several ways to understand the notion of difference-making. Which are the most useful to the forecaster, to the decision-maker? This paper advances a view.
Itzhak Gilboa (HEC) gives a talk at the 6th Munich-Sydney-Tilburg Conference on "Models and Decisions" (10-12 April, 2013) titled "Rationality and the Bayesian Paradigm". Abstract: It is claimed that rationality does not imply Bayesianism. We first define what is meant by the two terms, so that the statement is not tautologically false. Two notions of rationality are then discussed and related to two main approaches to statistical inference. This is followed by a brief survey of the arguments against the definition of rationality by Savage's axioms, as well as some alternative approaches to decision making.