Taming the Uncertainty Monster: Lessons from Astrochemistry
6-7 July 2022
Salle des Thèses, Observatoire des Sciences de l'Univers (OSUR), Université de Rennes 1 - Rennes (France)
https://tumla.sciencesconf.org
The aim of TUMLA is to focus on recent developments in the field of astrochemistry and to analyse how scientists have evaluated their models in this area. Astrochemistry is a young discipline, which started with the detection of molecules in the interstellar medium (the medium where stars are born) in the 1940s, even though such an environment, with its low temperatures, low densities, and constant exposure to radiation and cosmic rays, had been considered too hostile to host molecules, let alone the complex organic molecules that have since been discovered there. In this field, models based on extrapolated data, which carry many sources of uncertainty, fail to reproduce observations. Observations, on the other hand, are known to be both incomplete and uncertain. Yet models are indispensable to make the most of observations: they help to better target future observation campaigns, to determine where better experimental data are needed, and to motivate further theoretical developments. All this scientific work, however, can only be done on the basis of a reliable evaluation of the accuracy of these models. It is thus of the utmost importance both to improve the adequacy of the models and to facilitate the interdisciplinary work this field requires in order to identify, quantify and manage the different types of uncertainties entering the composition of a chemical model.
Both the interpretation of observations and the building of chemical models require interdisciplinary work at its finest, as progress in the area relies on a combination of observational, experimental and theoretical skills. Consider the steps required before one is in a position to compare observations of molecules in space to chemical models. The process starts with radioastronomers, who use telescopes to search for molecules in astrophysical environments through their spectrum (hence the name molecular spectroscopy), i.e., the characteristic set of frequencies at which a given molecule emits light. The spectrum thus collected must then be analysed, for astronomers need to identify its lines. This is done thanks to data collected in laboratory spectroscopy, which make it possible to match the lines and their intensities to chemical species. Then, chemists provide rate coefficients that make it possible to determine the abundance of a given chemical species, and thus the composition of the astrophysical object observed by astronomers. Last but not least, these observations need to be interpreted by comparing them to chemical models of the (gas-phase) evolution of chemical processes.
Chemical models solve a system of differential equations of the type:

dn_i/dt = Σ_{j,l} k_{jl} n_j n_l − n_i Σ_j k_{ij} n_j

with n_i the number density of species i; the first sum gathers the reactions that form species i, the second those that destroy it.
That is, chemical numerical models compute the abundance of chemical species in a given environment as a function of time, given an initial composition provided by observations, for hundreds of species and thousands of reactions. But to build a chemical model of the evolution of chemical species in environments with temperatures as low as those found in the interstellar medium, one has to determine rate constants on the basis of reaction laws that have often never been tested in that range of temperatures. In order to build, say, a photochemical model of the atmosphere of Saturn's largest moon, Titan, astrochemists must track the chemical evolution of nitrogen and methane at temperatures as low as 50 to 200 K: the rate coefficients needed to build the models must be extrapolated on the basis of laws whose validity is not known at the low temperatures found in this environment. In other words, the rate coefficients are estimated, extrapolated, with an associated uncertainty; they are not measured. Remember that the conditions in the interstellar medium do not resemble those on Earth and are difficult to reproduce, which makes it hard to calculate the reactivity of chemical species on the basis of terrestrial experiments. Modellers could work hand in hand with theoreticians to calculate these reactions, or develop new tools to improve the accuracy of their extrapolations. Theoreticians, however, usually calculate reaction rates with a very high degree of precision on systems of very small size (often up to three atoms), whereas modellers work with extremely large systems and do not need such a high degree of precision: photochemical models of Titan's atmosphere must compute, for instance, the abundances of about 127 chemical species involved in 676 reactions and 69 dissociation processes.
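The kind of computation described above can be sketched in a few lines. The toy network below (a single reaction A + B → C, with a rate coefficient extrapolated to low temperature through a modified Arrhenius law, and a simple forward-Euler time step) is purely illustrative: the parameter values are not taken from any kinetic database, and real models integrate thousands of such reactions with stiff solvers.

```python
import math

def k_mod_arrhenius(T, alpha, beta, gamma):
    # Modified Arrhenius form k(T) = alpha * (T/300)^beta * exp(-gamma/T),
    # a law commonly used to extrapolate measured rate coefficients
    # far below the temperatures at which they were measured.
    return alpha * (T / 300.0) ** beta * math.exp(-gamma / T)

def integrate_network(n0, T, dt, steps):
    # Toy network: a single reaction A + B -> C.
    # n0 = (nA, nB, nC) are number densities (cm^-3);
    # dnA/dt = dnB/dt = -k nA nB and dnC/dt = +k nA nB,
    # integrated with a plain forward-Euler step for clarity.
    k = k_mod_arrhenius(T, alpha=1.0e-10, beta=-0.5, gamma=0.0)  # illustrative values
    nA, nB, nC = n0
    for _ in range(steps):
        rate = k * nA * nB  # reactions per cm^3 per second
        nA -= rate * dt
        nB -= rate * dt
        nC += rate * dt
    return nA, nB, nC

# A is consumed and C produced as the integration proceeds:
nA, nB, nC = integrate_network((1.0e4, 1.0e4, 0.0), T=100.0, dt=1.0, steps=1000)
```

A production model would replace the hand-rolled Euler loop with a stiff ODE solver, since astrochemical networks mix reactions with wildly different timescales.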
Thus, experimenters, modellers and theoreticians have different backgrounds and goals, use different measurement standards and study different systems, which makes the exchange of information neither easy nor effective. It is not rare, for instance, for chemists working in other areas to calculate rate coefficients at temperatures relevant for astrochemistry but to fail to share their work, simply for lack of awareness of its astrophysical interest. Likewise, astronomers plan observation campaigns to measure the abundance of a given species without asking for such data, as they do not know the impact rate coefficients may have on the precision of their models. The hypothesis underlying this conference is that placing the obstacles that may hinder this interdisciplinary joint work in a philosophical context could facilitate their identification and resolution.
This workshop will address these questions through the following perspective: can an interdisciplinary take on uncertainty management help both to reduce these uncertainties and to facilitate interdisciplinary collaborations? It will consist of four sessions, centred on the following themes: 1) how sensitivity analysis can help identify the main sources of uncertainty in astrochemical models, and the limits of this approach; 2) a philosophical perspective on errors, uncertainties and the validation of computational models; 3) how theoreticians, experimentalists and modellers can work together to improve the adequacy of astrophysical/chemical models; and finally 4) the interdisciplinary nature of astrochemistry and how philosophy can help to facilitate this work.
As defined by Curry and Webster, a monster is a being that occupies two mutually exclusive categories at once. The concept of uncertainty constitutes a monster in this sense: it is both a condition for our knowledge and an acknowledgement of our ignorance, an incentive to push back the limits of what we can know and an obligation to leave room for doubt. Thus, a good account of model evaluation must start by identifying which strategies for addressing the uncertainty monster are the most appropriate in different situations. This workshop will tackle this problem through the following questions:
- Should (or can) uncertainties be systematically reduced or eliminated?
- In which cases can we simplify the problem of model evaluation by quantifying uncertainties, or by studying how uncertainties in input variables propagate through the models, with the goal of determining when the level of uncertainty on the model side and on the observation side allows for a meaningful comparison?
- In which cases is the empirical agreement between a model and observations not sufficient to assess their adequacy?
- Finally, in which cases should the idea of comparing model outputs to observations be abandoned altogether, to leave room for innovative methods of model evaluation that are completely independent of matching observations?
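The uncertainty propagation mentioned in the questions above is often done by Monte Carlo sampling: rate coefficients are drawn from assumed log-normal distributions and the model is rerun for each draw, yielding a distribution of predicted abundances. The sketch below uses a deliberately minimal "model" (pseudo-first-order loss of one species on a fixed background) and illustrative parameter values of my own choosing, not values from any database.

```python
import math
import random

def model_abundance(k, t=1.0e5, nB=1.0e4, nA0=1.0):
    # Minimal stand-in for a chemical model: pseudo-first-order loss of
    # species A on a fixed background B, n_A(t) = n_A(0) * exp(-k * n_B * t).
    return nA0 * math.exp(-k * nB * t)

def propagate_uncertainty(k_nominal, F, n_samples=5000, seed=0):
    # Sample log-normally distributed rate coefficients: an uncertainty
    # factor F means log10(k) has standard deviation log10(F), i.e.
    # ln(k) has standard deviation ln(F).
    rng = random.Random(seed)
    sigma = math.log(F)
    outputs = sorted(
        model_abundance(k_nominal * math.exp(rng.gauss(0.0, sigma)))
        for _ in range(n_samples)
    )
    # Summarise the output distribution: median and a ~68% interval.
    median = outputs[n_samples // 2]
    lo = outputs[int(0.16 * n_samples)]
    hi = outputs[int(0.84 * n_samples)]
    return median, lo, hi

# An uncertainty factor of 2 on the rate coefficient translates into a
# spread of predicted abundances, which can then be compared with the
# error bars on the observed abundance:
median, lo, hi = propagate_uncertainty(1.0e-9, F=2.0)
```

Comparing the interval [lo, hi] with observational error bars is exactly the kind of "meaningful comparison" the second question envisages; sensitivity analysis then asks which input rate coefficients dominate the width of that interval.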
Scientific discipline:
Theoretical and/or physical chemistry - Astrophysics - Philosophy