Vital Fernández

Career Stage
Postdoctoral Researcher
Poster Abstract

This work presents a novel technique to fit the chemical parameter space of ionised gas. The methodology is based on a Bayesian framework which can fit both the recombination and collisionally excited lines. The current chemical model consists of fourteen dimensions: two electron temperatures, one electron density, the extinction coefficient, the optical depth of the HeI recombination lines and nine ionic species. To the best of the authors' knowledge, this is the largest chemical model available in the optical range. The results are in good agreement with those previously published using the traditional methodology. The probabilistic programming library PyMC3 was chosen to explore the parameter space via a NUTS sampler. These machine learning tools are based on neural networks, which provided excellent convergence quality and speed. The primordial helium abundance measured from a multivariable regression using oxygen, nitrogen and sulfur was Y_{P, O-N-S} = 0.243 ± 0.005. This result supports a standard Big Bang scenario.

Plain text summary
The rise of astronomical spectroscopy enabled scientists to discover the composition of the universe. It was soon learned that hydrogen represents the largest fraction, with X ≈ 0.74, followed by helium with Y ≈ 0.24. Finally, the remaining elements are commonly grouped by astronomers as "metals". Their combined fraction stands below Z ≈ 0.02 (where X + Y + Z = 1).
A common pattern emerged when comparing the composition of very old objects, such as globular clusters, with that of very young ones, such as local star-forming regions. The metal mass fraction was considerably lower in the former group. This was the expected behaviour, as the origin of metals and their evolution can be explained by stellar feedback. In contrast, the helium mass fraction hardly increased between old and young astronomical bodies. These facts led astronomers to support a Big Bang scenario in which the universe was born with a high Y fraction.
In 1974, Peimbert and collaborators proposed an empirical technique to measure this primordial helium mass fraction (YP) via a linear regression in a Y vs Z diagram. The point at which Z = 0 represents the Big Bang, where Y = YP. Moreover, these authors also suggested using the oxygen mass fraction as a tracer for the metallicity (O/H as a proxy for Z). By measuring YP with high accuracy and precision, researchers seek to constrain Big Bang cosmology. In this project, we propose a new methodology based on neural networks, which can fit larger and more complex chemical models.
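The extrapolation described above can be sketched as an ordinary least-squares fit whose intercept at Z = 0 is the YP estimate. The data points below are synthetic placeholders, not measurements from this or any published work:

```python
import numpy as np

# Toy illustration of the Peimbert-style approach: fit Y = YP + (dY/dZ) * Z
# and read YP off the intercept at Z = 0. All values here are synthetic.
rng = np.random.default_rng(0)
Z = np.array([0.002, 0.004, 0.008, 0.012, 0.016])   # metal mass fraction
Y = 0.245 + 1.5 * Z + rng.normal(0, 0.002, Z.size)  # helium mass fraction

slope, YP = np.polyfit(Z, Y, 1)  # linear fit: returns (slope, intercept)
print(f"YP = {YP:.3f}, dY/dZ = {slope:.2f}")
```

The intercept recovers the primordial value because, by construction, a galaxy with zero metals has undergone no chemical enrichment since the Big Bang.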
The first step in this project involved the acquisition of high-quality spectra from star-forming galaxies. These objects are known for their primitive behaviour: low metallicity, a high gas fraction and a stellar luminosity dominated by very bright, young and massive stars. A spectroscopic observation of these galaxies results in an emission line spectrum. Each of these intensity peaks represents the photons from an ionic transition. A simple physical model can be applied to these photon fluxes to establish the abundances of these ions, as well as their physical conditions. However, the model complexity increases surprisingly fast as more transitions and ions are considered.
In this project, we present a new methodology which can easily fit 21 emission fluxes for a 14-dimensional chemical model: two electron temperatures, one electron density, the logarithmic extinction coefficient, the optical depth of the helium lines and nine ionic species. To the best of the authors' knowledge, this is the largest chemical model available. The successful fitting of this parameter space was possible thanks to neural networks.
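One of the model dimensions named above, the logarithmic extinction coefficient, enters the fit through the standard dereddening relation. A minimal sketch, with illustrative numbers rather than values from this work:

```python
# Correcting an observed emission-line flux for dust extinction using the
# logarithmic extinction coefficient c(Hbeta). f_lambda is the reddening-curve
# value at the line's wavelength, relative to Hbeta; the numbers are examples.
def deredden(flux_obs, c_hbeta, f_lambda):
    """Intrinsic flux: F_int = F_obs * 10**(c_hbeta * f_lambda)."""
    return flux_obs * 10 ** (c_hbeta * f_lambda)

# Example: a line with f_lambda = 0.13, observed flux 1.0, c(Hbeta) = 0.2
print(deredden(1.0, 0.2, 0.13))
```

In the Bayesian framework, c(Hbeta) is a free parameter sampled alongside the temperatures, density and abundances, so its uncertainty propagates into every corrected flux.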
At the heart of the Data Science, Big Data and Artificial Intelligence fields lies the need to explore multi-dimensional arrays very fast. An example of a well-established sampler is the adaptive Metropolis algorithm. This follows a Markov Chain Monte Carlo scheme where each sampling step is chosen stochastically. In contrast, a NUTS (No-U-Turn Sampler) follows a Hamiltonian Monte Carlo logic, where each sampling step is guided by the derivatives of the theoretical model. This is accomplished in a neural network graph where the derivatives are transmitted via tensors. This provides a faster fitting for higher-dimensional models. In our case, this approach decreased the fitting time from several hours to a couple of minutes.
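The stochastic stepping of the Metropolis family can be sketched in a few lines. The example below is a generic random-walk Metropolis sampler for a one-dimensional Gaussian target, not the adaptive variant or the chemical model of this work; NUTS replaces these blind proposals with derivative-guided trajectories:

```python
import numpy as np

# Random-walk Metropolis: each step proposes a purely stochastic move and
# accepts it with probability min(1, target ratio). Target: standard normal.
def log_target(x):
    return -0.5 * x ** 2

def metropolis(n_steps=20000, step=1.0, seed=1):
    rng = np.random.default_rng(seed)
    x = 0.0
    chain = np.empty(n_steps)
    for i in range(n_steps):
        proposal = x + rng.normal(0, step)  # stochastic step, no gradients
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal                    # accept the move
        chain[i] = x                        # otherwise keep the old point
    return chain

chain = metropolis()
print(chain.mean(), chain.std())  # should approach 0 and 1
```

Because each proposal ignores the shape of the posterior, the number of steps needed grows quickly with dimensionality, which is why gradient-guided samplers such as NUTS win for a 14-dimensional model.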
This technique was presented in Fernández et al. (2019), and a comparison with the results from Fernández et al. (2018) showed very good agreement with the traditional technique. Finally, since all the parameters are fitted simultaneously, we proposed a multivariable linear regression using oxygen, nitrogen and sulfur, which yields YP, O-N-S = 0.243 ± 0.005. This result supports a standard Big Bang Cosmology.
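The multivariable regression generalises the Y vs Z diagram by fitting Y against several metallicity tracers at once, with the common intercept giving YP. A minimal sketch with entirely synthetic abundances (the coefficients and data below are placeholders, not the published fit):

```python
import numpy as np

# Synthetic toy: Y modelled as linear in the O, N and S abundance tracers;
# the intercept of the least-squares fit estimates YP at zero metallicity.
rng = np.random.default_rng(2)
n = 30
O = rng.uniform(1e-5, 6e-4, n)    # illustrative O/H values
N = rng.uniform(1e-6, 2e-5, n)    # illustrative N/H values
S = rng.uniform(1e-6, 1.5e-5, n)  # illustrative S/H values
Y = 0.243 + 40 * O + 200 * N + 300 * S + rng.normal(0, 0.002, n)

A = np.column_stack([np.ones(n), O, N, S])   # design matrix with intercept
coeffs, *_ = np.linalg.lstsq(A, Y, rcond=None)
YP = coeffs[0]
print(f"YP = {YP:.3f}")
```

Using three tracers simultaneously exploits the fact that the Bayesian fit delivers all nine ionic abundances with their full covariances in a single run.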
Poster Title
Primordial helium abundance measurement using neural networks
Tags
Astrophysics
Cosmochemistry
Data Science