


Research axis: History and Philosophy of the Natural Sciences

History and Philosophy of Physics

This seminar is conceived as a forum for exchange between historians of physics, philosophers of physics, physicists, and students in the relevant disciplines. Although this year’s program has no specific theme, it reflects the organizers’ interest in questions that push us across interdisciplinary boundaries: between physics and philosophy, between history and philosophy, between theoretical construction and experiment, between physics and the other sciences.

Organizers: Nadine de Courtenay, Olivier Darrigol, Sara Franceschelli, Jan Lacki

Archives: prior to 2008-2009, 2008-2009,
2009-2010, 2010-2011, 2011-2012,
2012-2013, 2013-2014, 2014-2015,
2015-2016, 2016-2017, 2017-2018,
2018-2019, 2019-2020, 2020-2021

PROGRAM 2021-2022

The program is currently being prepared and will be posted on this page. It will replace the 2020-21 program.

PROGRAM 2020-2021


As part of the Seminar on the History and Philosophy of Physics, we are offering a series of workshops on specific themes. The sessions (May 4, 11, and 18) will take place on Tuesday afternoons starting at 3:30 pm, by videoconference.
Talks will last about half an hour and will be followed by discussion.

Everyone is welcome to take part in these workshops.

The organizers: Nadine de Courtenay, Sara Franceschelli, Fabien Grégis, Olivier Darrigol


  • Johannes-Geert Hagmann (Deutsches Museum, Munich)
    Electronics preceding optics: The Shawanga Lodge conference in 1959
    Described by some as “the conference that changed the world,” the 1959 Shawanga Lodge meeting on “Quantum Electronics: Resonance Phenomena” was the first workshop to bring together a sizeable group of international scientists working in the emerging fields of maser and laser physics. As the term “quantum electronics” was coined at the gathering, we take a closer look at how the steering committee, led by Charles H. Townes (1915-2015), conceived the composition of this particular field of research for an invitation-only event. Analyzing the preparations for the meeting, this talk will situate the conference along two dimensions: first, as a meaningful snapshot of physical research on quantum phenomena that later extended to optics, and second, as a notable expression of US Cold War foreign policy on scientific exchange.
  • Gautier Depambour (Université de Paris–ED623, SPHere)
    The introduction of coherent states into quantum optics
    “Coherent states,” so named by Roy Glauber in 1963, play a fundamental role in quantum optics. They can also be found earlier, in other forms, in Schrödinger’s work (in 1926) as well as in that of quantum-electrodynamics theorists such as Feynman and Schwinger. In this talk, I will first try to explain how and why such states resurfaced in the context of optics. I will then show that coherent states made it possible to clarify a number of problems raised in the 1960s, such as the absence of the Hanbury Brown and Twiss effect in laser correlations, the link between the notions of phase and photon, and the comparison between the classical and quantum theories of coherence. Finally, I will present the priority dispute between Roy Glauber and George Sudarshan over the discovery of the diagonal representation, which makes it possible to express field states as a sum of projectors onto coherent states and to better understand the relation between classical and quantum optics.
  • Olival Freire Jr (Universidade Federal da Bahia, Brésil)
    The slow acceptance of quantum optics
    Courses at the Collège de France are a singular kind of teaching. Neither graduate nor undergraduate courses, they are a kind of thinking at work, both a presentation of the state of the art and of novel proposals in a given field. I will approach Claude Cohen-Tannoudji’s 1979–1980 course from this perspective. Its goal was “to attempt to answer, analyzing recent experimental tests, the following question: Could we dispense with the concept of photon, at least in the optical domain?” The course critically reviewed the semiclassical theories of matter-radiation interaction and presented some successes of these approaches in the explanation of phenomena such as the photoelectric effect and the Hanbury Brown and Twiss effect, in the semiclassical theory of the laser, and in nonlinear optics. However, the course was silent on the early experiments on Bell’s theorem in the optical domain, and on what information these experiments could bring to bear on the concept of photon. Thus, twenty years after Roy Glauber’s foundational work, there was still no consensus on the contents of quantum optics. I will propose a few conjectures about this slow acceptance of aspects of quantum optics.


  • Justin Gabriel (Université de Paris–ED623, SPHere)
    Unwanted particles: Early reception of the first experimental indications of "heavy mesons."
    A typical account of the history of strange particles begins as follows: George Rochester and Clifford Butler published two V-shaped tracks in 1947; then for three years nobody reported such tracks; finally, Carl David Anderson and his team found 30 such events in 1950, perhaps following an informal discussion with Rochester. This account is highly inaccurate, and it persists mainly because Rochester himself gave it many times at conferences on the history of particle physics. One of its main defects is that it conflates the history of strange particles with that of the one V-particle that Rochester and Butler were the first to observe. A second defect is that it leads us to believe that Rochester and Butler’s observation went unnoticed and had no consequences until 1950.
    On the contrary, Rochester and Butler’s observation was regularly cited before 1950, together with a number of other observations of so-called "heavy mesons," namely, particles of mass intermediate between the pi-meson and proton masses. The history of strange particles truly started with, or proceeded from, the history of heavy mesons. Experimentalists talked about and sometimes looked for such particles in the years 1947–1950, and theoreticians worked on them as early as 1948.
  • Arianna Borrelli (MECS Leuphana University Lüneburg and Technical University Berlin)
    The "strange" behaviour of new particles in the 1950s: Surprising fact or theoretical construction?
    The story of strange particles has often been presented as a paradigmatic example of how a simple but surprising observation can lead to the formulation of a theoretical model. According to these accounts, physicists in the early 1950s noted that new particles discovered in cosmic-ray experiments were produced fast and in great quantity, but decayed slowly. This "puzzling," "strange" behaviour was eventually explained by postulating the existence of a new particle property, which was given the name "strangeness."
    Was this really a straightforward story of bottom-up model-building? I will argue that the strange behaviour of the new particles was not a phenomenon immediately observed in cosmic-ray experiments, but rather a hypothesis that emerged from attempts at modeling experimental results. The surprising behaviour was a construction resulting from sketchy hypotheses by Enrico Fermi and Richard Feynman and from complex models by a number of Japanese theorists, among them Yoichiro Nambu, Kazuhiko Nishijima and Toichiro Kinoshita. Of particular importance for the emergence of the allegedly observed strange behaviour was a workshop held in Tokyo in July 1951, at which Japanese scientists from Tokyo, Kanazawa and Osaka met to discuss the new discoveries on cosmic rays made by US and European experimentalists.
  • Kent Staley (Philosophy, Saint Louis University, Missouri) et Hugo Beauchemin (Physics, Tufts University, Massachusetts)
    When no particle is unwanted: Exploring beyond the Standard Model at the LHC.
    One way to search for new physics at the Large Hadron Collider is to look not for signs of specific particles predicted by particular models, but for signatures that have features distinctive of a wide variety of beyond-Standard-Model physics models, and that may even reflect new physics not included in any model proposed thus far. These signature-based model-independent (SBMI) searches are the subject of this talk. Given the long drought in discoveries of new physics pointing beyond the Standard Model, any particle that SBMI searches reveal would not be unwanted, but might be unpredicted, or predicted by a large number of incompatible theories. Although in many ways SBMI searches have characteristics associated with exploratory experiments (as described by Steinle and extended by Karaca), we will describe how they involve theoretical assumptions and theory guidance at every level, all the way down to the identification within the data of signatures associated with Standard Model ontology, such as electrons.
    Crucial to the empirical soundness of SBMI searches is the use of theoretical assumptions apart from commitment to the correctness of those assumptions. What enables this non-committal use of theory is the evaluation of uncertainty in the context of definite epistemic objectives. Our discussion highlights how uncertainty evaluation itself can be understood as having an exploratory aspect. This line of analysis invites a restructuring of the epistemology of experimental physics along pragmatist lines. Instead of distinguishing exploratory from other types of experimentation, we distinguish experimental tasks (e.g., testing a hypothesis or exploring through parameter variation) that can be combined with other tasks (calibration, correction, etc.) in the pursuit of epistemic objectives. An open historical question remains: to what extent can possibly implicit practices of non-committal theory use, supported by uncertainty evaluation, shed light on earlier episodes in the history of physics?


  • Sarah Hijmans (SPHere, UMR 7219)
    The tantalum metals: Inorganic analysis and elementary nature in nineteenth-century chemistry.
    Historical studies of the concept of chemical element have mainly focused on (changes in) the definitions of this concept. Thus, because of the widespread acceptance of Lavoisier’s definition of the element as “the last point which analysis is capable of reaching” between 1789 and 1869, it is generally accepted that the element was seen as an indecomposable substance throughout this period. Yet, when claiming the discovery of a new element, chemists regularly disregarded this definition.
    Based on an analysis of the debates surrounding the elementary nature of tantalum, niobium and three other ‘tantalum metals’, which remained controversial for almost seventy years, I will argue that Lavoisier’s definition had little effect on the identification of elements using mineral analysis. This practice did not correspond to Lavoisier’s vision of analysis: it was concerned with identifying substances rather than decomposing them, and therefore the idea of a ‘last point’ of analysis was simply irrelevant. Using examples from this and other case studies, I will try to illustrate the need to study the ways in which substances were judged to be elementary in practice in order to increase our understanding of the history of the concept of element.
  • Marina Banchetti-Robino (Florida Atlantic University)
    John Dalton: Reconciling atomicity with elementarity
    The concepts of ‘atom’ and of ‘element’ have a long and venerable history both in the history of philosophy and that of chemistry. Although both ‘atomicity’ and ‘elementarity’ served to elucidate the fundamental nature of material substances both for philosophers and for chemists, these notions were never interchangeable since they were grounded in different conceptions of fundamentality.
    While, in its early history, the concept of element referred to empirical substances that were believed to constitute every other material substance, the notion of atom referred to absolutely fundamental entities that marked the limits of any possible reduction and that, as such, could never be experienced or measured.
    In spite of the early modern rehabilitation of atomicity as a chymical concept, this notion falls by the wayside once more during the Chemical Revolution, as Lavoisier’s desire for the development of a chemical science leads him to reject all metaphysical speculation about the nature of atoms and to focus chemical theory and experiment on the empirical isolation and quantitative description of ‘elementary’ substances, defined as the final products of chemical analysis.
    It is not until the 19th century and the work of Dalton that the concept of atom is reintroduced into chemistry and becomes inextricably linked to the notion of elementarity. Rather than defining elements in terms of the limits of analysis, as Lavoisier had done, Dalton defines an element as a substance composed entirely of atoms with identical properties. Referring to such atoms as ‘chemical atoms’ and considering them empirical and measurable entities, Dalton takes the primary determinable feature of chemical atoms to be their relative weight. He stipulates that, since elements are composed of atoms and since there are differences between elements, there must also be differences between the relative weights of the atoms that compose those elements.
    By thus reconciling the concepts of ‘atomicity’ and ‘elementarity’ and rendering both as empirical notions, amenable to measurement and quantitative description, Dalton makes one of the central goals of his chemical atomic theory the understanding of how the relative weight of chemical atoms determines the properties of elements, and of how the chemical atoms of different elements combine to form compound substances.