Axis History and Philosophy of Mathematics

Seminar PhilMath Intersem 11.2021



PhilMath Intersem is jointly sponsored by the University of Notre Dame and the research unit SPHere (UMR7219) of the University of Paris.






PROGRAM: JUNE 2021

The PhilMath Intersem will have a special format this year. Professor Michael Detlefsen, one of its founders, has passed away, and we have decided to make this edition a tribute to him, on behalf of those who took part in the exchanges he initiated in recent years.


The sessions will take place on Zoom and at the university.
For the Zoom link, please contact Emmylou Haffner by email, with the subject line "Zoom-Intersem2021".
At the university: Room Malevitch (483A), Condorcet Building (10 rue Alice Domon & Léonie Duquet, 75013 Paris), EXCEPT on June 8 & 15: Room Mondrian (646A), on the campus of the University of Paris (ex-Diderot).
Sessions: Wednesday June 2, Thursday June 3, Monday June 7, Tuesday June 8, Monday June 14, Tuesday June 15; abstracts below.

W 2, 4pm – 6:30pm, Room Malevitch (483A) & Zoom

  • 4pm Opening
  • 4:15pm – 5:15pm
    Marco Panza (CNRS, IHPST, University Paris 1 Panthéon Sorbonne)
    Michael Detlefsen and Mark Luker on the Proof of the Four-Color Theorem: Empiricism, Platonism, or Both?
  • 5:30pm – 6:30pm
    Andrew Arana (CNRS, Archives Henri Poincaré - PReST, University of Lorraine)
    A vectorial conception of problem-solving
    In our paper “Purity of Methods” (Philosophers’ Imprint, 2011), Mic and I characterized one epistemic value of purity: purity better ensures the stability of problem-solutions through changes in the epistemic attitudes of the problem-solver than impurity does. In our account we adopted what we called, to each other informally at least, a “vectorial conception” of problem-solving. On this view, a mathematical problem represents a particular ignorance, and a solution to that problem can be more or less directed at relieving that ignorance rather than some other ignorance. In this talk, I would like to say some more about this vectorial conception, and about its relation to what we might roughly call the “constructivist” epistemologies of Poincaré and Brouwer in particular.



Th 3, 3pm – 6:30pm, Room Malevitch (483A) & Zoom

  • 3pm – 4pm
    Andrei Rodin (Saint Petersburg State University)
    Mic Detlefsen on the Frege-Hilbert Controversy
    The exchange of letters between Frege and Hilbert took place during an early stage of Hilbert’s work in the foundations of mathematics, and it stopped before Hilbert shaped what is commonly known today as Hilbert’s Program. As Mic Detlefsen convincingly shows in his 1986 monograph ‘Hilbert’s Program’, Frege’s challenges remain fully relevant to Hilbert’s foundational project at its later, mature stage. In this new context Detlefsen reconstructs Hilbert’s argument in support of his notion of formal, aka “ideal”, proof as follows. While on Frege’s view a formal derivation of formulas from other formulas does not constitute a proof unless all the formulas involved are semantically interpreted in an appropriate way, on Hilbert’s view at least some of these formulas may remain uninterpreted, being instead “metamathematically evaluated”. This opens the possibility of building metamathematical proofs of mathematical theorems; the needed join between mathematics and metamathematics is provided by what Detlefsen calls the Equivalency Principle. This principle, which Detlefsen attributes to Hilbert on the basis of available textual evidence, postulates that metamathematical reasoning about symbolic formulas has the same contentual character as elementary contentual arithmetical reasoning and thus equally belongs to the intuitively trusted core of mathematics.
    In my paper I describe how the method of formal ‘ideal’ proof as conceptualised by Detlefsen after Hilbert, i.e., the method of proof by ‘metamathematical replacement’, performs in today’s mathematical practice, and I demonstrate that it remains highly problematic and, generally, does not lead to a stable consensus in the mathematical community. To this end I briefly review the cases of the Continuum Hypothesis, the consistency of arithmetic, the Four Colour Theorem, and the Kepler Conjecture. On the basis of these examples I identify some general problems of formal methods and, finally, I show how these problems can be partly resolved with the homotopy-theoretic rendering of formal proofs used in the Univalent Foundations of mathematics.
  • 4:15pm – 5:15pm
    Paola Cantù (CNRS, Centre Gilles Gaston Granger, Aix-Marseille Université)
    Peano’s philosophical views between structuralism and logicism
    This paper is part of a larger project that aims at a comprehensive historical and philosophical evaluation of the contributions of the Peano School to mathematics, logic, and the foundations of mathematics, based on an investigation of the school’s axiomatic practices. A detailed analysis of several mathematical practices that can be considered markers of logicism is a fruitful way to reconstruct Peano’s philosophical views: the link between functions and relations, the role of metatheoretical investigations, the kind of semantics, the use of definitions by abstraction, and the foundational or non-foundational value of axiomatics. This paper will focus on Peano’s semantics, addressing an apparent contradiction in the Peano School’s approach to concept script. On the one hand, symbols stand for ideas. On the other hand, the linguistic expressions associated with the symbols are sometimes considered as meanings, sometimes as a way to read the symbols, and sometimes as a way to indicate the adopted interpretation. Besides, they are used in the analysis of the independence of the axioms of arithmetic, where they sometimes occur together with the notion of interpretation. Focusing on this aspect, and by means of a comparative investigation of Peano’s logico-mathematical, linguistic, and notational practices, we would like to assess 1) whether Peano’s symbolic language is a concept script and 2) whether Peano’s symbolization is a formalization in the modern sense. Peano’s view is characterized as a form of structural algebraism, which differs both from the algebra of logic tradition, which uses mathematical symbols to express logical calculi, and from Frege’s logical investigation, centred on the effort to understand the functional nature of predication.



M 7, 3pm – 6:30pm, Room Malevitch (483A) & Zoom

  • 3pm – 4pm
    Walt Dean (University of Warwick)
    A royal road to incompleteness?
    One of the goals of Detlefsen 1990 (“On an alleged refutation of Hilbert’s program using Gödel’s first incompleteness theorem”) was to refute an argument popularized by Kreisel, Smorynski, and Simpson that Gödel’s first theorem “effectively kills Hilbert’s programme”. The first goal of this talk will be to offer a reading of the second volume of Grundlagen der Mathematik (1939) which supports a key premise in Detlefsen’s argument, namely that Hilbert should not be understood as committed to the finitary decidability of real propositions, nor should his commitment to “real soundness” (i.e. that no sentence provable by an ideal theory can be refutable by real means) be understood as engendering a commitment to “real conservativity” (see the schematic note after this session’s listing). The second goal of the talk will be to trace the reception of the incompleteness theorems in Grundlagen der Mathematik, e.g. with regard to Hilbert & Bernays’s use of the Liar paradox as a framework for presenting Gödel’s results, the details of their formalization of the second incompleteness theorem, and its interaction with their formal truth definition for first-order arithmetic. Building on this and subsequent work on arithmetical incompleteness, I will finally illustrate a means by which these considerations connect with the second goal of Detlefsen 1990, i.e. highlighting the potential aptness of “consistency-minded” definitions of provability (e.g. in the manner of Feferman 1960) in the formulation of ideal theories.
  • 4:15pm – 5:15pm
    Emmylou Haffner (Institut de Mathématique d’Orsay, Université Paris-Saclay)
    Reassessing Dedekind’s ideal of rigor?
    Ideals of rigor have been an important interest of Mic Detlefsen’s. In this talk, I will start from Mic’s analysis of Dedekind’s ideal of rigor and ask to what extent this ideal is relevant to rigor in the making, as observed in Dedekind’s mathematical drafts. Indeed, considerations about rigor in mathematics often rely on questions of justification and/or verification of results, rather than on how those results were found. In this talk, I propose to shift the focus and look behind the scenes to consider the shaping of rigorous mathematics. This raises additional questions: To what extent does rigor support or guide mathematical research in its various phases? What are the consequences of such an ideal of rigor, if any, on mathematical research? To address these questions, I will use two examples. Firstly, I will consider the genesis of Dedekind’s late Dualgruppe theory (equivalent to our modern lattice theory). Focusing on a specific law of Dualgruppe theory, I will show that the elaboration of a rigorous piece of work can be the outcome of a process that is not itself necessarily rigorous. I will put forward the trial-and-error and inductive aspects of Dedekind’s research practices. Secondly, I will consider the genesis of Was sind und was sollen die Zahlen?, Dedekind’s famous essay on the natural numbers. Dedekind wrote several versions of this text, from the 1870s to 1888. I will be particularly interested in what seems to be an important step of mathematical writing in Dedekind’s drafts, namely arranging the order of propositions into a deductive hierarchy.
  • 5:30pm – 6:30pm
    Graham Leach-Krouse (Kansas State University)
    Coabstraction and the Continuum
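
Schematic note on “real soundness” and “real conservativity” (see Walt Dean’s abstract above): the following is a common gloss, offered for orientation and not drawn from the talk itself. Writing I for an ideal theory, R for the real (finitary) theory, and φ for a real sentence:

% Illustrative rendering; the symbols I, R, and \varphi are this note's own.
\[
  \text{Real soundness:}\qquad I \vdash \varphi \ \Longrightarrow\ R \nvdash \neg\varphi
\]
\[
  \text{Real conservativity:}\qquad I \vdash \varphi \ \Longrightarrow\ R \vdash \varphi
\]

The premise at issue in the abstract is that the first commitment need not engender the second.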


Tu 8, 3pm – 6:30pm, !! Room Mondrian (646A) !! & Zoom

  • 4:15pm – 5:15pm
    Sébastien Maronne (Institut de Mathématiques de Toulouse, Université de Toulouse III Paul Sabatier)
    The unreasonable effectiveness of infinite quantities in early modern geometry
  • 5:30pm – 6:30pm
    Sean Walsh (UCLA, Department of Philosophy)
    Infinitesimals, valued fields, and the orders of infinite smallness
    In the 1960s, Abraham Robinson famously used model theory to defend the coherence of the calculus as based on infinitesimals. In Appendix 2 to his 1974 paper "Differentials, Higher-Order Differentials and the Derivative in the Leibnizian Calculus," Bos argued that Robinson’s non-standard analysis did not take into account the distinct orders of infinite smallness present in the infinitesimals in the historical calculus. In this talk, we describe how incorporating a valuation —in the sense of valued fields— can help non-standard analysis to overcome this deficit. After describing the proposal, we test it out on the historical cases from Euler and Bernoulli to which Bos drew attention. This is based on joint work with Tim Button, and in particular Sections 4.5-4.6 of the book Philosophy and Model Theory (Oxford University Press, 2018).
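
For orientation, the notion of a valuation invoked above can be stated in the standard way. The display below is a minimal sketch of the general definition, not a summary of Button and Walsh’s own construction; the symbols K, Γ, v, and ε are illustrative.

% A valuation on a field K, with values in an ordered abelian group (Γ, +, <):
% v : K^{×} → Γ satisfies
\[
  v(xy) \;=\; v(x) + v(y), \qquad
  v(x+y) \;\geq\; \min\{\, v(x),\, v(y) \,\}.
\]
% An element x counts as infinitesimal when v(x) > 0, and ε² is of a strictly
% higher order of smallness than ε, since
\[
  v(\varepsilon^{2}) \;=\; 2\,v(\varepsilon) \;>\; v(\varepsilon) \;>\; 0 .
\]

Distinct positive values thus register the distinct orders of infinite smallness that Bos found missing from the unadorned non-standard framework.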


M 14, 3pm – 6:30pm, Room Malevitch (483A) & Zoom

  • 3pm – 4pm
    Iulian Toader (University of Vienna)
    Revisiting Weyl on Dedekind on Proof and Intuition
    I reconsider an intriguing case of normative disagreement in the history of philosophy of mathematics: Weyl’s criticism of Dedekind’s principle that “In science, what is provable ought not to be believed without proof.” The criticism is reconstructed as a series of three objections: (1) Dedekind’s rigorous proofs are epistemologically incorrect and, thus, incapable of justifying belief, (2) withholding belief until rigorous proofs are given is epistemologically perverse, and (3) the proof revisions demanded by Dedekind are driven by epistemologically unreasonable norms of reasoning. I discuss several questions that Weyl’s criticism, when properly understood, raises about normativity in mathematics, such as: What are the standards by which we assess higher-order norms of belief like Weyl’s correctness and Dedekind’s rigor? Are these norms categorical, i.e., binding on everyone, or hypothetical, i.e., depending on one’s goals?
  • 4:15pm – 5:15pm
    Gerhard Heinzmann (Archives Henri Poincaré - PReST, University of Lorraine)
    Poincaré against the logicians
    Poincaré did not believe that logical reasoning could express the essential structure of an extensive mathematical proof. Instead of accepting the non-invariance of mathematical reasoning with respect to its content, one has to grasp the architecture of the subject in question (Detlefsen 1992). Subsequently, the architecture involved was interpreted in different ways: as theorematic reasoning in Peirce’s sense (Heinzmann 1995), as an aesthetic structure (Heinzmann 1997), as the right insight into what was later expressed by Hintikka’s IF-logic (2012), and as insight into what can be expressed by a dialogical type-theoretical reconstruction of the Erlangen notion of a Constructive Language (Orthosprache; Rahman 2012). We will give an overview of these proposals and proceed to a critical analysis of the arguments regarding Poincaré’s general philosophical position.
  • 5:30pm – 6:30pm
    John Mumma (California State University of San Bernardino)
    Seeing an equation in a field of dots
    A well-known picture proof (discussed extensively in the work of Marcus Giaquinto) relies on conceiving the arithmetical equation 1 + 2 + ... + n = n(n+1)/2 in terms of a rectangular array of dots. The topic of my talk is the principle of arithmetical generality at play in the proof. The picture of the array does not, in fact, determine the path of reasoning to the general equation; there are (at least) two. In one, generality is secured via mathematical induction: successive instantiations of the equation are identified in the picture. In the other, a symmetry displayed by a sub-configuration of the rectangle is recognized as independent of the rectangle’s particular length. After describing both procedures for seeing the general equation, I reflect on the concept of number presupposed in the idea that a proof concerning it can be found in a picture of dots at all.
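
For reference, the equation and the two routes to generality described above can be written out as follows; this is a standard rendering added for the reader’s convenience, not material from the talk.

% Symmetry route: pairing the staircase of 1 + 2 + ... + n dots with a reflected
% copy fills an n × (n+1) rectangle.
\[
  2\,(1 + 2 + \cdots + n) \;=\; n(n+1)
  \qquad\Longrightarrow\qquad
  1 + 2 + \cdots + n \;=\; \frac{n(n+1)}{2}.
\]
% Inductive route: given the equation for n, adding the next row of n + 1 dots
% yields the case n + 1.
\[
  \frac{n(n+1)}{2} + (n+1) \;=\; \frac{(n+1)(n+2)}{2}.
\]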



Tu 15, 3pm – 6:30pm, !! Room Mondrian (646A) !! & Zoom

  • 3pm – 4pm
    Chris Porter (Drake University)
    Chaitin’s Omega and Information-Theoretic Incompleteness
    The aim of this talk is to examine the significance of an incompleteness result due to Gregory Chaitin, namely that no computably axiomatizable theory that extends Robinson’s arithmetic can provably determine more than finitely many bits of Omega, a specific formally random sequence (a standard statement of Omega’s definition is given in the note at the end of the program). The received interpretation of Chaitin’s result has been subject to quite severe criticism. I will add further criticism by identifying several issues with this received interpretation that have not been previously discussed in the literature on Chaitin’s theorem. In addition, no positive account of the role that Chaitin’s Omega plays in the broader theory of algorithmic randomness has been offered; here I present such an account.
  • 4:15pm – 5:15pm
    Karine Chemla (CNRS, SPHERE, University of Paris, & Radcliffe Institute, Harvard University)
    On Numbers as Formulas — A Second Attempt
    In March 2019, at the conference “Mathematics in Philosophy: Purity and Idealization”, in honor of Mic Detlefsen, I gave a talk titled “On Numbers as Formulas.” The talk gave rise to a written exchange with Mic, which leads me to return to the same issue from a different viewpoint. The key point will be to contrast contentual and non-contentual ways of computing, and from this perspective to highlight the respects in which some inscriptions of numbers have been used as a basis for formal work in mathematics.
  • 5:30pm – 6:30pm
    Roundtable, with Matteo Bianchetti, Ellen Lehet (University of Notre Dame), Mattia Petrolo (Universidade Federal do ABC), and Paul Tran-Hoang (Lone Star College – University Park)
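
Note on Chaitin’s Omega (see Chris Porter’s abstract above): the following standard definition is added for orientation and follows the usual algorithmic-randomness literature rather than the talk itself.

% Chaitin's halting probability for a fixed prefix-free universal machine U,
% summing over the programs p on which U halts:
\[
  \Omega_U \;=\; \sum_{p \,\in\, \operatorname{dom}(U)} 2^{-|p|} .
\]
% The incompleteness phenomenon described in the abstract: a computably
% axiomatizable theory extending Robinson's arithmetic can provably determine
% at most finitely many bits of the binary expansion of \Omega_U.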