# Courses

**Titles and abstracts of the courses**

These **50 course sessions**, oriented towards an **audience of non-specialist junior and senior researchers**, constitute the core of the thematic trimester. Through them, we aim to foster scientific interaction between mathematicians, computer scientists, and philosophers on various interface topics, such as the **discrete** and the **continuous** (more precisely, **combinatorics** and **topology**, **geometry** and **arithmetic**) and the notion of **modality**, with an openness to the **philosophical implications for the practice** of these fields.

The **program** (**dates**, **hours**, **locations**) is available in the **SCHEDULE** section.

You will find below the **abstracts** and the **available slides** of the courses; this material will be updated throughout 2016.

**BOOK OF ABSTRACTS** / **SLIDES**

Jeremy AVIGAD (Carnegie Mellon Univ.), Formal methods and mathematical understanding - 30/05; 01/06; 06/06; 10/06; 13/06.

*Mathematical Understanding*

I will present views of mathematical language, mathematical method, and mathematical understanding that are informed by formal methods and careful attention to the actual practice of mathematics. I will also identify important roles that philosophy can play in contributing to our understanding of these notions.

*Case study: the History of Dirichlet's Theorem*

In 1837, Dirichlet proved that there are infinitely many prime numbers in any arithmetic progression in which the first two terms have no common factor. Modern presentations of the proof are explicitly higher-order, allowing quantification and summation over certain types of functions known as "Dirichlet characters". I will discuss the history of the theorem, and explain how it illustrates profound ontological and methodological shifts in nineteenth century language and method.
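As a quick illustration of the statement (not part of the abstract), the theorem can be checked empirically for small cases. Here is a minimal Python sketch that enumerates the first few primes in such a progression; the function names are illustrative, not from the course:

```python
from math import gcd

def is_prime(n):
    """Trial-division primality test, adequate for small n."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def primes_in_progression(a, q, count):
    """First `count` primes in the progression a, a+q, a+2q, ...
    Dirichlet's theorem guarantees this loop terminates whenever gcd(a, q) = 1."""
    assert gcd(a, q) == 1, "first term and common difference must be coprime"
    found, n = [], a
    while len(found) < count:
        if is_prime(n):
            found.append(n)
        n += q
    return found

print(primes_in_progression(3, 4, 5))  # → [3, 7, 11, 19, 23]
```

Of course, no finite search establishes infinitude; that is exactly where Dirichlet's higher-order apparatus of characters comes in.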

*Formalization and Interactive Theorem Proving*

I will survey various formal languages and foundations for modern mathematics, and the modern practice of interactive theorem proving, which involves the use of computational proof assistants to verify the correctness of mathematical proofs. I will focus on pragmatic issues regarding the formal languages that are designed to express mathematical statements and proofs, with an eye to what that tells us about informal mathematical language.

*Automated Theorem Proving and Heuristics*

In this lecture, I will discuss formal and informal models of mathematical reasoning, as well as the practice of automated theorem proving, again with an eye as to what these fields have to tell us about informal mathematical language and reasoning.

*Modularity and Structure in Mathematics*

In this last lecture, I will return to the themes of the opening lecture, and use the contents of the intervening lectures to develop better philosophical models of mathematical language and practice. In particular, I will discuss ways in which mathematical language and methods support, and are supported by, a modular structuring of mathematical knowledge and understanding.

Christoph BENZMÜLLER (Freie Universität Berlin), On a (quite) universal theorem proving approach and its application in metaphysics - 16/06.

Classical higher-order logic is suited as a meta-logic in which a range of other logics can be elegantly embedded. Interactive and automated theorem provers for higher-order logic are therefore readily applicable. By employing this approach, the automation of a variety of ambitious logics has recently been pioneered, including variants of first-order and higher-order quantified multimodal logics and conditional logics. Moreover, the approach supports the automation of meta-level reasoning, and it sheds some new light on meta-theoretical results such as cut-elimination. Most importantly, however, the approach is relevant for practice: it has recently been successfully applied in a series of experiments in metaphysics in which higher-order theorem provers have actually contributed some new knowledge.

Jean-Benoît BOST (Université Paris Sud), Analogies and dictionaries - 11/04; 12/04.

Analogies between mathematical theories have played an important role in the development of many areas of pure mathematics since the beginning of the twentieth century. Some of them have evolved to become precise dictionaries, with remarkable predictive power, but with elusive rigorous justifications.

The lectures will present several of these analogies in a historical perspective and discuss their role in the current practice of mathematical research. The lectures will notably focus on the classical analogies between function fields and number fields.

Mic DETLEFSEN (Notre Dame University), Gödel's theorems and their applications - 04/04; 06/04; 08/04; 11/04.

This course is intended to provide an introduction to Gödel’s incompleteness theorems and their principal applications to the philosophy of mathematics. We will survey the proofs of the theorems and their generalizations. We will focus on the major historical application of these theorems, namely the one concerning Hilbert’s Program. As time permits, we will also consider the possible bearing of these theorems on such traditional questions as the created vs. discovered character of mathematics.

*Technical basics concerning Gödel’s theorems*

*Arithmetization as representation*

*Application of Gödel’s theorems to Hilbert’s Program*

*Application of Gödel’s theorems to the question of the mechanization of mind*

David FERNANDEZ-DUQUE (Mexico, ITAM & CIMI), Modal logic in the foundations of mathematics - 12/04; 14/04.

Kurt Gödel suggested treating provability as a modal operator, where the formula □A is interpreted as ‘A is derivable’ (within a formal system such as Peano arithmetic). Gödel’s incompleteness theorems together with later results by Löb suggested an axiomatization for the resulting logic of provability, which is now known as GL and was proven complete by Solovay in the 1970s. Later, Japaridze suggested an extension of GL with countably many modalities [0], [1], [2], [3], ... representing provability operators in theories of increasing strength. Japaridze's logic is known as GLP, for ‘Gödel-Löb polymodal’.
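For orientation, the axiomatizations just mentioned can be sketched as follows; this is a sketch following the standard Japaridze/Beklemishev presentation, not necessarily the course's own notation:

```latex
% GL: classical propositional logic, closed under modus ponens and
% necessitation (from A infer \Box A), with the axiom schemas
\begin{align*}
  & \Box(A \to B) \to (\Box A \to \Box B) && \text{(distribution)}\\
  & \Box(\Box A \to A) \to \Box A         && \text{(L\"ob's axiom)}
\end{align*}
% GLP: one modality [n] for each natural number n, each satisfying GL,
% together with the interaction schemas
\begin{align*}
  & [m]A \to [n]A                                 && (m \le n)\\
  & \langle m\rangle A \to [n]\langle m\rangle A  && (m < n)
\end{align*}
```

Löb's axiom is the modal reflection of Löb's theorem; the interaction schemas record that stronger provability operators prove everything weaker ones do, and that ⟨m⟩-consistency statements are provable at higher levels.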

Aside from being a fascinating field where modal logic, formal arithmetic and topology interact, the study of Japaridze's logic has important applications in the foundations of mathematics. In his doctoral thesis, Turing proposed a method for iterating the consistency of a theory U over an ordinal notation system. With this, one may measure the consistency strength of a second theory T using the least ordinal |T| such that T does not prove the |T|-iterated consistency of U. In the last decades, Beklemishev has shown how this ordinal may be computed using modal reasoning in GLP.

The goal of this course is to present a general overview of the polymodal logic of provability and its application to the analysis of formal theories. In our first session (12/04), we will discuss the logics GL and GLP as abstract modal logics and compare their arithmetic, Kripke and topological semantics. In our second session (14/04), we will outline Beklemishev's analysis of Peano arithmetic using the tools developed in the first session.

Brice HALIMI (Université Paris Ouest, IREPH), Application of fibered structures in logic - 31/05; 02/06.

Since Aristotle, logic constitutes a discipline anchored in philosophy. But its task, also since Aristotle, is to analyze mathematics and to bring out the unity shared by ordinary reasoning (as expressed in natural language) and mathematical reasoning. The privileged relationship that logic enjoys with mathematics explains why logic, by dint of formalizing mathematical theories, has partly become a mathematical discipline in its own right. Yet logic continues to act as the interface which makes it possible for philosophy and mathematics to meet, and this mediating role of logic is not altered by its division into "philosophical logic" and "mathematical logic."

The general working hypothesis of my talks is that the mathematical framework used by philosophical logic has been artificially sealed off, and that this is related to the fact that mathematical logic does not interact with other mathematical branches in a uniform way. More specifically, I would like to show that logic would benefit from importing fibered structures coming both from differential geometry and category theory. I will first set out a general picture of the possible applications of fibered structures in logic. I will then focus on two particular cases: modal logic (possible world semantics) and set theory. In the first case, I will resort to tangent bundles in order to formalize the concept of higher-order possible world. In the second case, I will elaborate on the connection made by algebraic set theory between set theory and descent theory.

André JOYAL (UQAM), Categorical aspects of homotopy type theory - 30/05; 02/06; 07/06; 08/06.

The rules of homotopy type theory can be described and explained by using category theory. We introduce the notion of a tribe for that purpose: it is a category equipped with a class of maps called fibrations satisfying a few axioms. The category of groupoids and the category of Kan complexes are examples. A tribe may, or may not, have internal products. Every tribe is a fibration category in the sense of Brown. The univalence axiom of Voevodsky can be formulated in a tribe. I will develop the algebraic theory and the homotopy theory of tribes.

*Homotopy type theory and the notion of tribe (part 1)*

*Homotopy type theory and the notion of tribe (part 2)*

*Homotopy theory in a tribe*

*The homotopy theory of tribes*

Dmitry KOZLOV (Universität Bremen), Combinatorics and topology - 23/05; 25/05; 27/05; 14/06; 16/06.

*Formal Introduction to Distributed Computing*

We give a brief, from-scratch introduction to theoretical distributed computing, focusing on those parts where topological and combinatorial methods have been especially successful. We discuss various communication models, such as the immediate-snapshot read/write shared-memory model. Furthermore, we look at various standard tasks that one studies in these models.

*Combinatorics and Topology in Computation*

We define various simplicial models in order to cast questions in distributed computing in a mathematical framework. First, we formulate in general what we mean by a distributed task and a distributed protocol. Second, we specify these concepts in the case of Weak Symmetry Breaking. We survey known results and outline a number of open questions.

Giuseppe LONGO (ENS Ulm), Counting vs. measuring: the foundational turn and some practical consequences - 27/05.

The focus on Arithmetic in the foundations of mathematics proposed remarkable new frames for knowledge and set the basis for the invention of the Logical Computing Machine, as Turing called his machine. The turn in the analysis of “access to data” will be discussed, as well as its relation to physical measurement, at the core of the new physics of the twentieth century (Geometry of Dynamical Systems, Relativity, Quantum Mechanics). Some consequences will be hinted at for the current perspective in knowledge construction, from the myths of Big Data to those of Molecular Biology.

Paolo MANCOSU (Berkeley University), Abstraction and infinity - 06/06; 09/06; 13/06; 14/06.

Neo-logicism is an attempt to revive Frege’s logicist program by claiming that important parts of mathematics, such as second-order arithmetic, can be shown to be analytic. The claim rests on a logico-mathematical theorem and a cluster of philosophical argumentations. The theorem is called *Frege’s theorem*: second-order logic with a single additional axiom, known as N= or Hume’s principle, deductively implies (modulo some appropriate definitions) the ordinary axioms for second-order arithmetic. The cluster of philosophical claims is related to the status (logical and epistemic) of N= or Hume’s principle. In the Fregean context the second-order systems have variables for concepts and objects (individuals). In addition, there is a functional symbol # that, when applied to concepts, yields objects as values. The intuitive meaning of # is as an operator that, when applied to a concept, yields an object corresponding to the cardinal number of the objects falling under the concept (numbers are thus construed as objects). Hume’s principle has the following form:

HP (∀B)(∀C) [#x:(Bx) = #x:(Cx) iff B ≈ C]

where B ≈ C is shorthand for one of the many equivalent formulas of pure second-order logic expressing that “there is a one-one correlation between the objects falling under B and those falling under C”. As remarked, the right-hand side of the equivalence can be stated in the pure terminology of second-order logic. The left-hand side gives a condition of numerical (cardinal) identity for concepts. The concepts B and C have the same (cardinal) number just in case there is a one-one correlation between the objects falling under them.
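For concreteness, one standard way to unpack B ≈ C in pure second-order logic (a sketch; many equivalent formulations exist) is via a one-one correlation R:

```latex
% B \approx C ("B and C are equinumerous") unpacked via a one-one
% correlation R between the objects falling under B and under C:
\[
B \approx C \;:\equiv\; \exists R \,\bigl[
  \forall x\,\bigl(Bx \to \exists! y\,(Cy \land Rxy)\bigr)
  \;\land\;
  \forall y\,\bigl(Cy \to \exists! x\,(Bx \land Rxy)\bigr)
\bigr]
\]
```

Note that only second-order quantification over R and the logical vocabulary are used, which is what makes the right-hand side of HP "pure".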

In my lectures, I will introduce the neologicist program and then present two new lines of investigation which are my original contributions to this foundational position. The first consists in rooting the use of abstraction principles in the mathematical practice of the nineteenth century. The second concerns alternative definitions for assigning numbers to infinite concepts (or sets). I will show that there are infinitely many abstraction principles that are alternatives to Hume's principle, in the sense that they share with Hume's principle the same epistemic virtues (from each of them one can derive the axioms for second-order arithmetic) but differ from it in the assignments of numbers to infinite concepts. In the lectures, I will investigate the epistemological, ontological and foundational consequences of these new investigations.

Jaroslav NESETRIL (Charles University, Prague) & Patrice OSSONA DE MENDEZ (EHESS, CAMS), Structural sparsity - 04/04; 05/04; 07/04.

The notion of sparsity certainly does not appear sparsely across the diverse fields of mathematics. However, it appears that this notion might be more elusive than expected.

A structure is sparse if it may be reconstructed from a small amount of information, in particular if it can be obtained as an interpretation of a sparse structure (in a more usual sense of the word sparsity). In this sense, a sparse structure has to be structured. By contrast, structures that are inherently dense are close to being random.

This notion of sparsity relates to the general model theoretic notions of independence property introduced by Shelah, to the notion of VC dimension, which arose in probability theory in the work of Vapnik and Chervonenkis, to tractability thresholds in theoretical computer science (first-order model checking, dual constraint satisfaction problems), as well as to the problem of structure approximation and of structural limits representation.

With this general picture in mind we shall discuss the dichotomy we introduced between nowhere dense and somewhere dense classes of graphs.

Frédéric PATRAS (CNRS, Nice), Philosophy of combinatorics - 24/05; 25/05; 26/05.

*The problem of combinatorics*

The first part of the course will aim to bring out, through the analysis of various examples, the conceptual and methodological problems tied to the combinatorial mode of thought. It will also try to indicate some important shifts in the use of combinatorial tools in the recent period.

*The example of G. C. Rota*

The second part will focus more specifically on the mathematical work (at once protean and strongly organized around a few guiding combinatorial principles) and on the philosophical thought of G. C. Rota.

*A philosophy of the discrete?*

The last part, more speculative, will attempt to develop a philosophy of combinatorics on systematic foundations.

Alex SIMPSON (University of Ljubljana), Sheaf models of probabilistic systems - 01/06

I shall describe how the mathematical technology of sheaves can be used to model probabilistic systems. The approach will be illustrated using examples involving the passing and hiding of probabilistic information. No previous knowledge of sheaf theory will be assumed.

Hourya SINACEUR (CNRS, IHPST), Philosophy of the neuropsychology of number - 07/06; 10/06.

How do the brain and the mind apprehend the quantitative and numerical dimension of events and phenomena in the world?

Cognitive neuroscience produces a myriad of experimental results that shed new light on the supposedly primitive numerical processes of the brain and the mind.

However, the hypotheses behind the experiments and the ways they are carried out, and hence their results, are conditioned by the mathematical and computational models used to gather and explain the data, and they are not immune to philosophical biases that are culturally constituted and spontaneously or deliberately assumed.

The aim of this course is to bring to light the technical and ideological infrastructures of the search for effective foundations, neural or cognitive, material or symbolic, for our elementary arithmetical abilities.

Ivahn SMADJA (Université Paris Diderot, SPHERE), Making Sense of Brahmagupta: Which Quadrilaterals? Which Proofs? - 13/04; 14/04.

As a result of Colebrooke’s 1817 English translations of Brahmagupta’s and Bhaskara’s mathematical works spreading across Europe among scholarly circles, the French geometer Michel Chasles pointed out the importance of a group of so-called *sutras* within the mathematical chapters of Brahmagupta’s *Brahma-sphuta-siddhanta*, *viz*. XII 21-38. He argued that in spite of the *sutras* being mere statements unsupported by individual proofs, the whole system of them made sense as providing a unified geometrical theory which, in his view, would solve completely, with precision and in all generality, one single question, *viz*. how to construct a cyclic quadrilateral whose sides, diagonals, perpendiculars, segments and area, as well as the diameter of the circumscribed circle, may be expressed in rational numbers.

A few years later, this interpretive puzzle and Chasles' purported solution to it aroused renewed interest in Germany. At about the same time he developed his theory of ideal numbers, E. E. Kummer fell under the spell of Brahmagupta's quadrilaterals. By connecting the problem of rational quadrilaterals – suitably transposed into an algebraic framework – with Eulerian techniques in Diophantine analysis, Kummer decisively established – through reconstructing Brahmagupta's methods – that Chasles had wrongly imputed full generality to Brahmagupta's theory. Kummer also tackled the problem anew with new tools so as to achieve the very generality presumably inaccessible to ancient methods. In so doing, he launched a new topic – “rational quadrilaterals” – leading up to results by L. J. Mordell in 1960. Kummer’s main argument against Chasles’ claim interestingly pertained to the comparative strength of ancient and modern methods.

Eventually, more than two decades later, by virtue of his conjoining philological precision and mathematical expertise, Hermann Hankel outlined an alternative interpretation of Brahmagupta’s quadrilateral theory, XII 21-38 – an interpretation which happened to do justice to Chasles' main insight, though in a minor key, while taking into account Kummer's welcome rectification. One will unfold this last reading against the backdrop of Chasles' and Kummer's previous attempts, thereby showing how it also qualifies as the cornerstone of Hankel's characterization of so-called Indian intuitive ‘proofs’, by contrast to Greek deductive ones.

Christophe SOULE (CNRS & IHES) & Jean-Jacques SZCZECINIARZ (Univ. Paris Diderot, SPHERE), The field with one element - 08/04.

Jan VAN EIJCK (Univ. Amsterdam), Epistemic logic meets probability - 05/04; 06/04; 07/04.

This mini-course considers various connections between epistemic/doxastic logic and probability, and connects the world of neighbourhood semantics for modal logic with that of Bayesian updating. The resulting rich model of multi-agent probabilistic epistemic logic allows the development of a toolkit for dynamic epistemic model checking that combines dynamic epistemic logic with Bayesian reasoning. The course will start with a quick overview of dynamic epistemic logic.
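Since the course combines epistemic models with Bayesian updating, here is a minimal Python sketch of the Bayesian-update step alone; the coin-bias scenario and all names are illustrative assumptions, not taken from the course:

```python
def bayes_update(prior, likelihood):
    """Return the posterior over hypotheses, given a prior and a map
    assigning each hypothesis the likelihood of the observed evidence."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(unnorm.values())
    return {h: p / total for h, p in unnorm.items()}

# Two hypotheses about a coin: fair, or biased 3:1 towards heads.
prior = {"fair": 0.5, "biased": 0.5}
likelihood_heads = {"fair": 0.5, "biased": 0.75}

# Observing heads shifts belief towards the biased hypothesis.
posterior = bayes_update(prior, likelihood_heads)
print(posterior)  # → {'fair': 0.4, 'biased': 0.6}
```

In the epistemic-logic setting, an update of this kind is performed pointwise across the worlds an agent considers possible, which is where the neighbourhood semantics mentioned in the abstract enters.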

Glynn WINSKEL (Cambridge University), Probabilistic Distributed Strategies - 08/06.

In a distributed game we imagine a team Player engaging a team Opponent in a distributed fashion. No longer can we assume that moves of Player and Opponent alternate. Rather, the history of a play more naturally takes the form of a partial order of dependency between occurrences of moves. How are we to define strategies within such a game, and how are we to adjoin probability to such a broad class of strategies? The answer yields a surprisingly rich language of probabilistic distributed strategies and the possibility of programming optimal strategies. Along the way we shall encounter solutions to the need to mix probability and nondeterminism, and to the problem of parallel causes, in which members of the same team can race to make the same move – a problem which leads us to invent a new model for the semantics of distributed systems.

Frank WOLTER (University of Liverpool), Ontologies in Computer Science - 23/05; 24/05; 26/05.

*Applications of Ontologies in Computer Science*

*Ontology Languages*

*Mathematical Logic and Ontologies*

We give an introduction to ontologies and the ontology languages that are developed and applied in Computer Science. In the final session, we discuss how notions and techniques developed in mathematical logic have become important tools for the development and maintenance of real-world ontologies.