MCMP – Philosophy of Mathematics

Mathematical Philosophy - the application of logical and mathematical methods in philosophy - is about to experience a tremendous boom in various areas of philosophy. At the new Munich Center for Mathematical Philosophy, which is funded mostly by the German Alexander von Humboldt Foundation, philosophical research will be carried out mathematically, that is, by means of methods that are very close to those used in the sciences. The purpose of doing philosophy in this way is not to reduce philosophy to mathematics or to natural science in any sense; rather, mathematics is applied in order to derive philosophical conclusions from philosophical assumptions, just as in physics mathematical methods are used to derive physical predictions from physical laws. Nor is the idea of mathematical philosophy to dismiss any of the ancient questions of philosophy as irrelevant or senseless: although modern mathematical philosophy owes a lot to the heritage of the Vienna and Berlin Circles of Logical Empiricism, unlike the Logical Empiricists most mathematical philosophers today are driven by the same traditional questions about truth, knowledge, rationality, the nature of objects, morality, and the like that drove the classical philosophers, and no area of traditional philosophy is taken to be intrinsically misguided or confused anymore. It is just that some of the traditional questions of philosophy can be made much clearer and much more precise in logical-mathematical terms; for some of these questions, answers can be given by means of mathematical proofs or models; and on this basis new and more concrete philosophical questions emerge. This may then lead to philosophical progress, and ultimately that is the goal of the Center.

Recent metamathematical wonders and the question of arithmetical realism

Andrey Bovykin (Bristol) gives a talk at the MCMP Colloquium (16 January, 2013) titled "Recent metamathematical wonders and the question of arithmetical realism". Abstract: Metamathematics is the study of what is possible or impossible in mathematics, the study of unprovability, limitations of methods, algorithmic undecidability and "truth". I would like to make my talk very historical and educational and will start with pre-Gödelean metamathematics and the first few metamathematical scenarios that emerged in the 19th century ("Parallel Worlds", "Insufficient Instruments", "Absence of a Uniform Solution", and "Needed Objects Are Yet Absent"). I will also mention some metamathematical premonitions from the era before Gauss that are the early historical hints that a mathematical question may lead to a metamathematical answer, rather than a "yes" or "no" answer. (These historical quotations have recently been sent to me by the historian Davide Crippa.) Then I am planning to sketch the history and evolution of post-Gödelean metamathematical ideas, or, rather, generations of ideas. I split the post-Gödelean history of ideas into five generations and explain what people of each generation believed in (what are the right objects of study? what are the right questions? what are the right methods?). The most spectacular and, so far, most important completed period in the history of metamathematics started in 1976 with the introduction of the Indicator Method by Jeff Paris and the working formulation of the Reverse Mathematics Programme by Harvey Friedman. I will describe this period (what people believed in that era) in some detail. We now live in a new period in metamathematics, quite distinct from the ideas of the 1980s. I will speak about the question of the universality of Friedman's machinery, the theory of templates, meaninglessnessisation (with an entertaining example of meaninglessnessisation), Weiermann's threshold theory and some new thoughts about the space of all possible strong theories. I will concentrate in most detail on the question of the Search for Arithmetical Splitting (the search for "equally good" arithmetical theories that contradict each other) and sketch some possible ways that may lead to the discovery of Arithmetical Splitting. There are people who strongly deny the possibility of Arithmetical Splitting, most often under the auspices of arithmetical realism: "there exists the true standard model of arithmetic". They usually cite Gödel's theorem as establishing the existence of true but unprovable assertions and concede only some epistemological difficulties in reaching the "arithmetical truth". I will mention seven common arguments usually quoted by anti-Arithmetical-Splitting people and explain why it seems that none of these arguments stands. Historically, and very tentatively, I would like to suggest that anti-Arithmetical-Splitting views should perhaps be considered not a viable anti-Pluralist philosophy of mathematics, but rather a conservative resistance to the inevitable new generation of metamathematical wonders. So anti-Arithmetical-Splitting views are more in line with such historical movements as the "rejection of complex numbers", the "resistance to non-Euclidean geometries in the 1830s-1850s", the "refusal to accept concepts of abstract mathematics in the second half of the 19th century" and other conservative movements of the past. Arithmetical Splitting has not been reached yet, but we are well equipped to discuss it before its arrival.
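
As background to the realist appeal to "true but unprovable" sentences mentioned in the abstract, here is the standard statement of Gödel's first incompleteness theorem (added for orientation; not part of the abstract):

```latex
% Gödel's first incompleteness theorem (in Rosser's form): for any
% consistent, recursively axiomatizable theory T extending elementary
% arithmetic, there is a sentence G_T that T neither proves nor refutes:
\[
T \nvdash G_T \qquad\text{and}\qquad T \nvdash \neg G_T .
\]
% The realist reading criticized in the talk adds that G_T is
% nevertheless true in "the" standard model: \(\mathbb{N} \models G_T\).
```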

04-18
01:02:16

The Univalence Axiom

Steve Awodey (CMU) gives a talk at the MCMP Colloquium (16 July, 2014) titled "The Univalence Axiom". Abstract: In homotopy type theory, the Univalence Axiom is a new principle of reasoning which implies that isomorphic structures can be identified. I will explain this axiom and consider its background and consequences, both mathematical and philosophical.
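
For orientation (not part of the abstract), the axiom is standardly formulated as follows: for types A and B in a univalent universe, the canonical map sending identities to equivalences is itself an equivalence, so that identity of types coincides with equivalence of types:

```latex
% The Univalence Axiom: the canonical map
%   idtoeqv : (A =_U B) -> (A ≃ B)
% is an equivalence; informally,
\[
(A =_{\mathcal{U}} B) \;\simeq\; (A \simeq B),
\]
% so equivalent (in particular, isomorphic) structures are identified,
% which is the reasoning principle mentioned in the abstract.
```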

04-18
56:22

In Good Company? On Hume's Principle and the assignment of numbers to infinite concepts.

Paolo Mancosu (UC Berkeley) gives a talk at the MCMP Colloquium (8 May, 2014) titled "In Good Company? On Hume's Principle and the assignment of numbers to infinite concepts". Abstract: In a recent article (Review of Symbolic Logic 2009), I have explored the historical, mathematical, and philosophical issues related to the new theory of numerosities. The theory of numerosities provides a context in which to assign numerosities to infinite sets of natural numbers in such a way as to preserve the part-whole principle, namely if a set A is properly included in B then the numerosity of A is strictly less than the numerosity of B. Numerosity assignments differ from the standard assignment of size provided by Cantor’s cardinality assignments. In this talk, I generalize some specific worries emerging from the theory of numerosities to a line of thought resulting in what I call a ‘good company’ objection to Hume’s Principle (HP). The talk has four main parts. The first takes a historical look at nineteenth-century attributions of equality of numbers in terms of one-one correlations and argues that there was no agreement as to how to extend such determinations to infinite sets of objects. This leads to the second part, where I show that there are countably infinitely many abstraction principles that are ‘good’, in the sense that they share the same virtues as HP and from which we can derive the axioms of second-order arithmetic. The third part connects this material to a debate on the Finite Hume Principle between Heck and MacBride and states the ‘good company’ objection. Finally, the last part gives a tentative taxonomy of possible neo-logicist responses to the ‘good company’ objection and makes a foray into the relevance of this material for the issue of cross-sortal identifications for abstractions.
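
For readers unfamiliar with the principles at issue, here is a sketch of Hume's Principle and of the part-whole principle preserved by numerosities (added for orientation; the numerosity notation is mine):

```latex
% Hume's Principle (HP): the number of Fs equals the number of Gs iff
% the Fs and the Gs stand in one-one correspondence:
\[
\#F = \#G \;\leftrightarrow\; F \approx G .
\]
% Part-whole principle for numerosities:
\[
A \subsetneq B \;\rightarrow\; \mathrm{num}(A) < \mathrm{num}(B).
\]
% On infinite sets the two come apart: the even numbers are properly
% included in the naturals yet one-one correlated with them, so HP gives
% both the same cardinal number while numerosities make the evens smaller.
```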

04-18
01:07:17

Learning Experiences, Expected Inaccuracy, and the Value of Knowledge

Simon Huttegger (UC Irvine) gives a talk at the MCMP Colloquium (8 May, 2014) titled "Learning Experiences, Expected Inaccuracy, and the Value of Knowledge". Abstract: I argue that van Fraassen's reflection principle is a principle of rational learning. First, I show that it follows if one wants to minimize expected inaccuracy. Second, the reflection principle is a consequence of a postulate describing genuine learning situations, which is related to the value of knowledge theorem in decision theory. Roughly speaking, this postulate says that a genuine learning experience cannot lead one to foreseeably make worse decisions after the learning experience than one could already have made before learning.
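
For orientation, van Fraassen's reflection principle in its usual formulation (added; the abstract presupposes it):

```latex
% Reflection: my credence now (time t) in A, on the supposition that my
% future credence (time t' > t) in A will be x, should itself be x:
\[
P_{t}\bigl(A \mid P_{t'}(A) = x\bigr) = x .
\]
```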

04-18
56:33

Anti-Mathematicism and Formal Philosophy

Eric Schliesser (Ghent) gives a talk at the MCMP Colloquium (25 June, 2014) titled "Anti-Mathematicism and Formal Philosophy". Abstract: Hannes Leitgeb rightly claims that "contemporary critics of mathematization of (parts of) philosophy do not so much put forward arguments as really express a feeling of uneasiness or insecurity vis-à-vis mathematical philosophy." (Leitgeb 2013: 271) This paper is designed to articulate arguments in the place of that feeling of uneasiness. The hope is that this will facilitate more informed discussion between partisans and critics of formal philosophy. In his (2013) paper Leitgeb articulates and refutes one argument from Kant against formal philosophy. This paper will show, first, that Kant's argument was part of a much wider eighteenth-century debate over formal methods in philosophy (prompted by the success of Newton and anxiety over Spinoza). Now, obviously, 'philosophy' has a broader scope in the period, so I will confine my discussion to arguments about formal methods in areas we still consider 'philosophical' (metaphysics, epistemology, moral philosophy, etc.). In order to facilitate discussion I offer the following taxonomy of arguments: (i) the global strategy, by which we mean that the epistemic authority and security of mathematical applications as such are challenged and de-privileged; (ii) the containment strategy, by which the successful application of mathematical technique is restricted to some limited domain; (iii) non-epistemic theories, by which the apparent popularity of mathematics within some domain of application is explained away in virtue of some non-truth-tracking features. The non-epistemic theory is generally part of a debunking strategy. I will offer examples from Hume and Mandeville of all three strategies. The second main aim is to articulate arguments that call attention to potential limitations of formal philosophy. By 'limitation' I do not have in mind formal, intrinsic limits (e.g., Gödel). I explore two: (A) First, formal philosophers often assume a kind of topic neutrality or generality when justifying their methods (even if this idea has been challenged from within; see Dutilh Novaes 2011). But this means that formal methods are not self-justifying (outside logic and mathematics, perhaps) and unable to ground their own worth; it follows straightforwardly that for such grounding formal approaches require substantive (often normative) non-formal premises. (B) Second, Leitgeb (2013: 274-5) insightfully discusses the ways in which formal approaches, like any other method, may be abused. But because abuses of esoteric techniques may be hard for bystanders (philosophical and otherwise) to detect, absent other means of control and containment there is a heavy responsibility on practitioners of formal philosophy to develop the kinds of institutional and moral safeguards against such abuses that are common in, say, engineering and the medical sciences. Absent such safeguards, formal philosophers require a strong collective self-policing ethos; it is unlikely that current incentives promote such safeguards.

04-18
49:32

Geometrical Roots of Model Theory: Duality and Relative Consistency

Georg Schiemer (Vienna/MCMP) gives a talk at the MCMP Colloquium (9 July, 2015) titled "Geometrical Roots of Model Theory: Duality and Relative Consistency". Abstract: Axiomatic geometry in Hilbert's Grundlagen der Geometrie (1899) is usually described as model-theoretic in character: theories are understood as theory schemata that implicitly define a number of primitive terms and that can be interpreted in different models. Moreover, starting with Hilbert's work, metatheoretic results concerning the relative consistency of axiom systems and the independence of particular axioms have come into the focus of geometric research. These results are also established in a model-theoretic way, i.e. by the construction of structures with the relevant geometrical properties. The present talk investigates the conceptual roots of this metatheoretic approach in modern axiomatics by looking at an important methodological development in projective geometry between 1810 and 1900: the systematic use of the "principle of duality", i.e. the fact that all theorems of projective geometry can be dualized. The aim here will be twofold: first, to assess whether the early contributions to duality (by Gergonne, Poncelet, Chasles, and Pasch among others) can already be described as model-theoretic in character. The discussion of this will be based on a closer examination of two existing justifications of the general principle, namely a transformation-based account and a (proto-)proof-theoretic account based on the axiomatic presentation of projective space. The second aim will be to see in what ways Hilbert's metatheoretic results in Grundlagen, in particular his relative consistency proofs, were influenced by the previous uses of duality in projective geometry.
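
A textbook instance of the duality principle discussed here (added for orientation): in the projective plane, interchanging "point" and "line" (and "lies on" with "passes through") turns every theorem into another theorem. The two basic incidence statements are each other's duals:

```latex
% Plane duality in projective geometry:
\[
\text{Any two distinct points lie on exactly one line.}
\]
\[
\text{Any two distinct lines meet in exactly one point.}
\]
% Since the duals of the axioms are themselves provable, every proof
% dualizes line by line, which is the (proto-)proof-theoretic
% justification mentioned in the abstract.
```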

07-14
01:09:25

A Hypothetical Conception of Mathematics in Practice

José Ferreirós (Sevilla) gives a talk at the MCMP Colloquium (11 June, 2015) titled "A Hypothetical Conception of Mathematics in Practice". Abstract: The aim of the talk will be to present some of the basic aspects of my approach to mathematical epistemology, developed in the forthcoming book Mathematical Knowledge and the Interplay of Practices (Princeton UP). The approach is agent-based, considering mathematical systems as frameworks that emerge in connection with practices of different kinds, giving rise to new practices. In particular, we shall consider the effects of placing the rather traditional thesis that advanced mathematics is hypothetical – based on 'constitutive,' not representational, hypotheses – in the setting of a web of interrelated practices. Insistence on the coexistence of a plurality of practices, I claim, modifies substantially that thesis and allows for the development of a novel epistemology.

06-30
57:01

On the Contingency of Predicativism

Sam Sanders (MCMP) gives a talk at the MCMP Colloquium (16 April, 2015) titled "On the Contingency of Predicativism". Abstract: Following his discovery of the paradoxes present in naive set theory, Russell proposed to ban vicious circularity, nowadays called impredicative definition, by which a set may be defined by referring to the totality of sets it belongs to. Russell's proposal was taken up by Weyl and Feferman in their development of the foundational programme of predicativist mathematics. The fifth `Big Five' system from Reverse Mathematics (resp. arithmetical comprehension, the third Big Five system) is a textbook example of impredicative (resp. predicative) mathematics. In this talk, we show that the fifth Big Five system can be viewed as an instance of nonstandard arithmetical comprehension. We similarly prove that the impredicative notion of bar recursion can be viewed as the predicative notion of primitive recursion with nonstandard numbers. In other words, predicativism seems to be contingent on whether the framework at hand accommodates Nonstandard Analysis, arguably an undesirable feature for a foundational philosophy.
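
For orientation, the 'Big Five' subsystems of second-order arithmetic from Reverse Mathematics, in increasing strength (added; the abstract refers to the third and the fifth):

```latex
% The Big Five, ordered by increasing logical strength:
\[
\mathsf{RCA}_0 \;<\; \mathsf{WKL}_0 \;<\; \mathsf{ACA}_0 \;<\;
\mathsf{ATR}_0 \;<\; \Pi^1_1\text{-}\mathsf{CA}_0 .
\]
% ACA_0 (arithmetical comprehension, the third system) is the textbook
% example of predicative mathematics; Pi^1_1-CA_0 (the fifth system) of
% impredicative mathematics.
```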

05-11
49:02

A Computational Perspective on Metamathematics

Vasco Brattka (UniBwM Munich) gives a talk at the MCMP Colloquium (29 January, 2015) titled "A Computational Perspective on Metamathematics". Abstract: By metamathematics we understand the study of mathematics itself using methods of mathematics in a broad sense (not necessarily based on any formal system of logic). In the evolution of mathematics certain steps of abstraction have led from numbers to sets of numbers, from sets to functions and eventually to function spaces. Another meaningful step in this line is the step to spaces of theorems. We present one such approach to a space of theorems that is based on a computational perspective. Theorems as individual points in this space are related to each other in an order theoretic sense that reflects the computational content of the related theorems. The entire space is called the Weihrauch lattice and carries the order theoretic structure of a lattice enriched by further algebraic operations. This space yields a mathematical framework that allows one to classify theorems according to their complexity and the results can be essentially seen as a uniform and somewhat more resource sensitive refinement of what is known as reverse mathematics. In addition to what reverse mathematics delivers, a Weihrauch degree of a theorem yields something like a full "spectrum" of a theorem that allows one to determine basically all types of computational properties of that theorem that one would typically be interested in. Moreover, the Weihrauch lattice is formally a refinement of the Borel hierarchy, which provides a well-known topological complexity measure (and the relation of the Weihrauch lattice to the Borel hierarchy is very much like the relation between the many-one or Turing semi-lattice and the arithmetical hierarchy). Well known classes of functions that have been studied in algorithmic learning theory or theoretical computer science have meaningful and very succinct characterizations in the Weihrauch lattice, which underlines that this lattice yields a very natural model. Since the Weihrauch lattice is defined using a concrete model, the lattice itself and theorems as points in it can also be studied directly using methods of topology, descriptive set theory, computability theory and lattice theory. Hence, in a very true and direct sense the Weihrauch lattice provides a way to study metamathematics without any detour over formal systems and models of logic.
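
For orientation, the reducibility underlying the Weihrauch lattice can be sketched as follows (added; a standard simplified formulation for multi-valued functions, with the details about represented spaces and realizers suppressed):

```latex
% f is Weihrauch reducible to g, written f <=_W g, if computable maps
% K (pre-processing) and H (post-processing, with access to the original
% input) turn any solver for g into a solver for f:
\[
f \leq_{\mathrm{W}} g \;:\Longleftrightarrow\;
\exists\,\text{computable } H, K\;\;
\forall x \in \operatorname{dom}(f)\;\;
\forall z \in g(K(x)):\; H(x, z) \in f(x).
\]
% A theorem of the form (for all x)(exists y) A(x, y) is identified with
% the multi-valued function sending each instance x to its solutions y;
% its degree in the lattice then records the computational content that
% the abstract describes as the theorem's "spectrum".
```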

02-10
01:02:58

Quantified Probability Logics: How Boolean Algebras Met Real-Closed Fields

Stanislav O. Speranski (Sobolev Institute of Mathematics) gives a talk at the MCMP Colloquium (4 December, 2014) titled "Quantified probability logics: how Boolean algebras met real-closed fields". Abstract: This talk is devoted to one interesting probability logic with quantifiers over events — henceforth denoted by QPL. That is to say, the quantifiers in QPL are intended to range over all events of the probability space at hand. Here I will be concerned with fundamental questions about the expressive power and algorithmic content of QPL. Note that these two aspects are closely connected, and in fact there exists a trade-off between them: whenever a logic has a high computational complexity, we expect some natural higher-order properties to be definable in the corresponding language. For each class of spaces, we introduce its theory, viz. the set of probabilistic sentences true in every space from the class. Then the theory of the class of all spaces turns out to be of a huge complexity, being computably equivalent to the true second-order arithmetic of natural numbers. On the other hand, many important properties of probability spaces — like those of being finite, discrete and atomless — are definable by suitable formulas of our language. Further, for any class of atomless spaces, there exists an algorithm for deciding whether a given probabilistic formula belongs to its theory or not; moreover, this continues to hold even if we extend our formalism by adding quantifiers over reals. To sum up, the proposed logic and its variations, being attractive from the viewpoint of probability theory and having nice algebraic features, prove to be very useful for investigating connections between probability and logic. In the final part of the talk I will compare QPL with other probabilistic languages (including Halpern's first-order logics of probability and Keisler's model-theoretic logics with probability quantifiers), discuss analogies with research in Boolean algebras and show, in particular, how QPL can serve for classification of probability spaces.
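 
As an illustration of the definability claims (added; the concrete syntax is my own rendering, assuming event quantifiers, an inclusion relation on events, and a probability function symbol P): atomlessness, for instance, is expressible by a single sentence:

```latex
% Atomlessness: every event of positive probability has a sub-event of
% strictly intermediate probability:
\[
\forall x\,\bigl( P(x) > 0 \;\rightarrow\;
\exists y\,( y \sqsubseteq x \;\wedge\; 0 < P(y) < P(x) ) \bigr).
\]
```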

02-10
51:28

Symmetry and Mathematicians' Aesthetic Preferences: a Case Study

Irina Starikova (Sao Paulo) gives a talk at the MCMP Colloquium (8 January, 2015) titled "Symmetry and Mathematicians' Aesthetic Preferences: a Case Study". Abstract: Symmetry plays an important role in some areas of mathematics and has traditionally been regarded as a factor of visual beauty. In this talk I explore the ways that symmetry contributes to mathematicians’ aesthetic judgments about mathematical entities and representations. I discuss an example from algebraic graph theory. Comparing two isomorphic drawings of the Petersen graph, I argue that we need to refine the question by distinguishing between perceptual and intellectual beauty and by noting that some mathematical symmetries are revealed to us in diagrams while others are hidden.
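
The contrast between symmetries revealed in a drawing and symmetries hidden by it can be made concrete computationally (a minimal sketch, not from the talk; assumes the networkx library):

```python
# The standard pentagon-inside-pentagram drawing of the Petersen graph
# displays only a dihedral symmetry of order 10, but the abstract graph
# has 120 automorphisms (its automorphism group is the symmetric group S_5).
import networkx as nx
from networkx.algorithms.isomorphism import GraphMatcher

G = nx.petersen_graph()

# Count automorphisms by enumerating self-isomorphisms of G.
n_automorphisms = sum(1 for _ in GraphMatcher(G, G).isomorphisms_iter())
print(n_automorphisms)  # prints 120: far more symmetry than any single drawing reveals
```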

01-16
44:10

An Aristotelian continuum

Stewart Shapiro (Ohio) gives a talk at the MCMP Colloquium (18 December, 2014) titled "An Aristotelian continuum". Abstract: Geoffrey Hellman and I are working on a point-free account of the continuum. The current version is “gunky” in that it does not recognize points as parts of regions, but it does make essential use of actual infinity. The purpose of this paper is to produce a more Aristotelian theory, eschewing both the actual existence of points and infinite sets, pluralities, or properties. There are three parts to the talk. The first is to show how to modify the original gunky theory to avoid the use of (actual) infinity. It is interesting that there are a number of theorems in the original theory (such as the existence of bisections and differences, and the Archimedean property) that have to be added as axioms. The second part of the talk is to take the “potential” nature of the usual operations seriously, by using a modal language. The idea is that each “world” is finite; the usual operations are understood as possibilities. This part builds on some recent work on set theory by Øystein Linnebo. The third part is an attempt to recapture points, by taking the notion of a potentially infinite sequence seriously.

12-31
46:56

Neuropsychology of numbers

Hourya Benis-Sinaceur (Paris I) gives a talk at the Workshop on Mathematics: Objectivity by Representation (11 November, 2014) titled "Neuropsychology of numbers". Abstract: How do we extract numbers from our perception of the surrounding world? Neurosciences and cognitive sciences provide us with a myriad of empirical findings that shed light on hypothesized primitive numerical processes in the brain and in the mind. Yet the hypotheses on which the experiments are based, and hence the results, depend strongly on sophisticated arithmetical models. These models are used to describe and explain neural data or cognitive representations that supposedly are the roots of primary arithmetical activity. I will give some examples of this petitio principii, which is involved in neuropsychological arguments, most of the time without any justification.

12-20
29:41

IF epistemic logic and mathematical knowledge

Manuel Rebuschi (Poincaré Archives, University of Lorraine, Nancy) gives a talk at the Workshop on Mathematics: Objectivity by Representation (11 November, 2014) titled "IF epistemic logic and mathematical knowledge". Abstract: Can epistemic logic state anything interesting about the epistemology of mathematics? That is one of Jaakko Hintikka’s claims. Hintikka was not only the founder of modal epistemic logic (1962); he also worked on the foundations of mathematics (1996). Using what he calls "second generation" epistemic logic (2003), i.e. independence-friendly (IF) epistemic logic, Hintikka revisits the epistemology of mathematics, and in particular the debate between classical and intuitionistic mathematics (2001). The aim of the talk is to show that Hintikka is right regarding IF epistemic logic, for such a logic enables us to account for interesting features of mathematical knowledge. However, the path is not as easy as Hintikka suggests. I will show that the well-known issue of logical omniscience directly threatens the understanding of intuitionism offered by IF epistemic logic.
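
For orientation, the logical omniscience problem mentioned at the end: standard epistemic logic validates closure of knowledge under known implication and under theoremhood (added; the standard axiom K and necessitation rule):

```latex
% Axiom K and the necessitation rule of standard epistemic logic:
\[
K_a\varphi \wedge K_a(\varphi \rightarrow \psi) \;\rightarrow\; K_a\psi
\qquad\qquad
\frac{\vdash \varphi}{\vdash K_a\varphi}
\]
% Together these make every agent know all logical consequences of what
% she knows, which is untenable when modelling a mathematician who knows
% the axioms but not yet the theorems.
```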

12-18
01:07:16

Natural numbers in philosophy of mathematics and in cognitive science

Paula Quinon (Lund) gives a talk at the MCMP Colloquium (27 November, 2014) titled "Natural numbers in philosophy of mathematics and in cognitive science". Abstract: Natural numbers are an object of study in various disciplines, two of which are the philosophy of mathematics and research in the developmental cognitive sciences. My current endeavor consists in studying the borders and possible mutual influences between these two. In my talk I compare the conceptual frameworks of the two disciplines to highlight similarities and differences. I provide examples of how the application of results from one discipline to the other may fail, and I use these to consider how interdisciplinarity on the subject of natural numbers can be made fruitful. I will suggest that methodological constraints may be prudent here.

12-18
57:13

On Mathematical Structuralism. A Theory of Unlabeled Graphs as Ante Rem Structures

Hannes Leitgeb (MCMP/LMU) gives a talk at the Workshop on Mathematics: Objectivity by Representation (11 November, 2014) titled "On Mathematical Structuralism. A Theory of Unlabeled Graphs as Ante Rem Structures". Abstract: There are different versions of structuralism in present-day philosophy of mathematics which all take as their starting point the structural turn that mathematics took in the last two centuries. In this talk, I will make one variant of structuralism—ante rem structuralism—precise in terms of an axiomatic theory of unlabeled graphs as ante rem structures. I will then use that axiomatic theory in order to address some of the standard objections to ante rem structuralism that one can find in the literature. Along the way, I will also discuss other versions of mathematical structuralism, and I will say something on how the emerging theory of ante rem structures relates to modern set theory.

12-18
01:12:59

What are the challenges of Benacerraf's Dilemma? A Reinterpretation

Marco Panza (Paris I) gives a talk at the Workshop on Mathematics: Objectivity by Representation (11 November, 2014) titled "What are the challenges of Benacerraf's Dilemma? A Reinterpretation". Abstract: Despite its enormous influence, Benacerraf's dilemma admits no standard, unanimously accepted version. This mainly depends on Benacerraf's having originally presented it in a quite colloquial way, avoiding any compact, somehow codified, but purportedly comprehensive formulation. But it also depends on Benacerraf's appealing, while expounding the dilemma, to so many conceptual ingredients as to spontaneously generate the feeling that most of them are in fact inessential for stating it. It is almost unanimously admitted that the dilemma is, as such, independent of the adoption of a causal conception of knowledge, though Benacerraf appealed to one. This apart, there has been, and still is, no agreement about which of these ingredients are to be conserved so as to get a sort of minimal version of the dilemma, and which others can rather be left aside (or should be, in keeping with an Ockhamist policy). My purpose is to come back to the discussion of this matter, with particular attention to Field's reformulation of the problem, so as to identify two converging and quite basic challenges, addressed by Benacerraf's dilemma to a platonist and to a combinatorialist (in Benacerraf's own sense) philosophy of mathematics, respectively. What I mean by dubbing these challenges 'converging' is both that they share a common kernel, which encompasses a challenge for any plausible philosophy of mathematics, and that they suggest (at least to me) a way out along similar lines. Roughing these lines out is the purpose of the last two parts of the talk.

12-18
56:23

Discernibility from a countable perspective

Kate Hodesdon (Nancy) gives a talk at the Workshop on Mathematics: Objectivity by Representation (11 November, 2014) titled "Discernibility from a countable perspective". Abstract: In this talk I discuss formal methods for discerning between uncountably many objects with a countable language, building on recent work of James Ladyman, Øystein Linnebo and Richard Pettigrew. In particular, I show how stability theory provides the resources to characterize theories in which this is possible, and discuss the limitations of the stability theoretic approach.
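
For orientation, the basic notion at issue (added): objects a and b of a structure M are indiscernible relative to a language L when no one-variable L-formula separates them:

```latex
% Absolute indiscernibility of a and b in M, relative to language L:
\[
\mathcal{M} \models \varphi(a) \leftrightarrow \varphi(b)
\quad\text{for every } L\text{-formula } \varphi(x).
\]
% Pairwise discerning a set of objects amounts to giving them pairwise
% distinct 1-types; a countable language supplies at most continuum many
% such types, and which theories realize enough of them is the kind of
% question stability theory is equipped to answer.
```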

12-18
32:32

Three ways in which logic might be normative

Florian Steinberger (MCMP/LMU) gives a talk at the Workshop on Mathematics: Objectivity by Representation (11 November, 2014) titled "Three ways in which logic might be normative". Abstract: Logic, the tradition has it, is, in some sense, normative for reasoning. Famously, the tradition was challenged by Gilbert Harman who argued that there is no interesting general normative link between facts about logical consequence and norms of belief formation and revision. A number of authors (e.g. John MacFarlane and Hartry Field) have sought to rehabilitate the traditional view of the normative status of logic against Harman. In this paper, I argue that the debate as a whole is marred by a failure of the disputing parties to distinguish three different types of normative assessment, and hence three distinct ways in which logic might be said to be normative. I show that as a result of their failure to appreciate this three-fold distinction, authors have been talking past one another. Finally, time-permitting, I show how each of the three types of normative assessments relate to broader epistemological commitments, specifically commitments within the internalism/externalism debate.
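
For orientation, the normativity claims under debate are usually cashed out as 'bridge principles' in roughly MacFarlane's style (added, schematic; the exact form is itself one of the disputed variables):

```latex
% A schematic bridge principle from consequence facts to doxastic norms:
\[
A_1, \ldots, A_n \models C \;\;\Longrightarrow\;\;
\mathrm{N}\bigl(\mathrm{Bel}(A_1), \ldots, \mathrm{Bel}(A_n),
\mathrm{Bel}(C)\bigr),
\]
% where variants differ over the deontic operator N (ought, permission,
% reason), its scope, and the type of normative assessment it expresses,
% which is the talk's focus.
```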

12-18
01:04:17

A useful method for obtaining alternative formulations of the analytical hierarchy

Stanislav O. Speranski (Sobolev Institute of Mathematics) gives a talk at the MCMP Colloquium (6 November, 2014) titled "A useful method for obtaining alternative formulations of the analytical hierarchy". Abstract: In mathematical philosophy one often employs various formal systems and structures for solving philosophical tasks. In particular, many important results in Kripke's theory of truth and the like rest on definability techniques from second-order arithmetic. With this in mind, I will present one useful method for obtaining alternative formulations of the analytical hierarchy. The latter plays a key role in the foundations of mathematics and the theory of computation, being the generally accepted classification of those undecidable problems which capture the truth predicate for the first-order arithmetic of natural numbers but whose computational complexity is less than that of second-order true arithmetic. In the course of the presentation I will mention some relevant contributions of J. Robinson, H. Putnam, J. Y. Halpern, I. Korec and others. Further applications, including those dealing with probabilistic logics, will be discussed in the final part of the talk.
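
For orientation, the analytical hierarchy referred to here classifies formulas (and the sets they define) of second-order arithmetic by alternations of set quantifiers in front of an arithmetical matrix (added):

```latex
% Sigma^1_n and Pi^1_n formulas, with phi containing no set quantifiers:
\[
\Sigma^1_n:\;\; \exists X_1 \forall X_2 \cdots X_n\,\varphi
\qquad\qquad
\Pi^1_n:\;\; \forall X_1 \exists X_2 \cdots X_n\,\varphi .
\]
% Delta^1_n = Sigma^1_n \cap Pi^1_n. The truth predicate for first-order
% arithmetic is Delta^1_1-definable, while second-order arithmetical
% truth lies strictly above every level of the hierarchy, which matches
% the abstract's description of its place.
```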

12-12
01:14:05
