Helen and Francisco (Paco) tell us about their recent work using neural networks to predict the masses of subhalos within simulations. They find that the neural network trained on a subset of the subhalos is very good at predicting subhalo masses for the rest of the data.
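As a rough illustration of the idea (not the authors' actual pipeline), here is a minimal Python sketch of the train-on-a-subset, predict-the-rest setup, using made-up subhalo features and a toy mass relation in place of whatever properties the real network is fed:

```python
# Minimal sketch: regress subhalo masses from a few features, training on
# half the catalogue and testing on the other half. All data here are
# synthetic stand-ins, purely for illustration.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5000
# Hypothetical subhalo features (e.g. velocity dispersion, Vmax, half-mass radius)
X = rng.lognormal(mean=0.0, sigma=0.5, size=(n, 3))
# Toy "true" log-masses loosely tied to the features, with some scatter
log_mass = 10 + 2.5 * np.log10(X[:, 0]) + 1.0 * np.log10(X[:, 1]) + rng.normal(0, 0.1, n)

# Train on a subset, predict the rest
X_train, X_test, y_train, y_test = train_test_split(X, log_mass, test_size=0.5, random_state=0)
scaler = StandardScaler().fit(X_train)

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(scaler.transform(X_train), y_train)
print("R^2 on held-out subhalos:", model.score(scaler.transform(X_test), y_test))
```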
A subtle cosmological symmetry & mirror dark sector might fix H0 (Cyr-Racine & Knox)
Francis-Yan Cyr-Racine and Lloyd Knox talk about their work with Fei Ge pointing out a symmetry present in most cosmological observables.
The symmetry involves rescaling (almost) *all* the densities and temperatures in the universe, leaving any dimensionless observables unchanged. If exploited, it might pave the way to solving the Hubble tension, as it allows one to change H0 without changing the predictions for other crucial cosmological measurements (most of which are, e.g., temperature or density *contrasts* rather than absolute measurements).
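Schematically (my notation, not necessarily the paper's): scaling every energy density by a constant factor, ρ_i → λ² ρ_i, rescales the expansion rate through the Friedmann equation,
\[ H^2 = \frac{8\pi G}{3}\sum_i \rho_i \quad\Rightarrow\quad H \to \lambda H , \]
so every cosmological length (sound horizon, damping scale, distances) shrinks by 1/λ and dimensionless ratios such as the angular scale θ_s = r_s/D_A are untouched, provided interaction rates like the Thomson scattering rate are also scaled by λ, which is where the extra free electrons and the mirror dark sector come in.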
Unitarity constraints on cosmological correlators (valid in *any* flat FLRW metric) | Goodhew & Lee
Harry Goodhew and Gordon Lee talk about their recent work on “cosmological correlators”.
Observationally, these would be power spectra, bispectra, etc.; however, on the theory side they find it easier to work with pieces of the “wavefunction of the universe”, which are closely related to the observational correlation functions.
They show constraints on the form these correlators can take that arise from imposing unitarity during inflation. Contrary to prior expectations, these constraints apply not just in space-times that are exactly de Sitter, but in fact in any flat FLRW space-time.
What is the best way to analyse galaxy clustering data? Panel event – Gil Marín, Simonović, and Tröster
This is a recording of a panel event run by the organisers of the Cosmology from Home conference series: https://cosmologyfromhome.com/
The topic was a comparison of the relative merits of “full shape” and “template” methods to analyse galaxy clustering data. Essentially, the difference comes down to whether you consider the entire power spectrum as a whole and fit to it in all its glory, or break it into separate pieces that encapsulate specific physics effects.
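To give a flavour of the template side (standard BAO/RSD practice, not anything specific to this panel): one typically fits a fixed template power spectrum to the data, with the cosmology entering mainly through dilation parameters such as
\[ \alpha_\perp = \frac{D_A(z)\, r_d^{\rm fid}}{D_A^{\rm fid}(z)\, r_d}, \qquad \alpha_\parallel = \frac{H^{\rm fid}(z)\, r_d^{\rm fid}}{H(z)\, r_d}, \]
(plus, e.g., fσ8 for redshift-space distortions), whereas a full-shape analysis predicts the whole P(k) directly from a cosmological model and fits that model's parameters.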
How much are local anisotropies biasing our measurements (e.g. H0)? (Heinesen and Macpherson)
Asta Heinesen and Hayley Macpherson tell us about their recent papers, which develop a formalism for measuring local cosmological parameters without assuming local isotropy (or homogeneity), and predict what we should expect for those parameters when we go beyond the isotropic FRW approximation.
Asta talks about her paper from last year, which developed the formalism and showed how a finite number of terms can capture all the expected behaviour of the anisotropic luminosity distance at each order in redshift.
Hayley then talks about how, together, they applied Asta’s formalism to Hayley’s fully relativistic simulations of cosmology.
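Very roughly, and with signs and conventions that vary between papers, the leading-order effect is that the Hubble parameter inferred in a direction e on the sky picks up anisotropic contributions,
\[ \mathcal{H}(\mathbf{e}) \simeq \tfrac{1}{3}\theta + \sigma_{\mu\nu} e^\mu e^\nu - a_\mu e^\mu , \]
where θ is the local volume expansion rate, σ_{μν} the shear and a^μ the observer's 4-acceleration; in the isotropic FRW limit only θ/3 = H0 survives. Higher orders in redshift bring in a finite set of similar kinematic and curvature terms.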
Eloisa Bentivegna – Evolution of a periodic eight-black-hole lattice in numerical relativity
Eloisa tells us about her work from 2012 (and the following years) constructing a model universe space-time out of lattices of black holes.
The motivation for this is to take a very bottom-up approach to cosmology. We know that around isolated objects the correct metric is close to the Schwarzschild metric, so in principle the full metric of the universe should be expressible as a patchwork of such metrics. On the other hand, the universe on large scales is statistically homogeneous and isotropic, and the Friedmann-Robertson-Walker metric appears to fit the data well.
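For concreteness, the two descriptions being patched together are (in standard notation)
\[ ds^2 = -\left(1 - \frac{2GM}{c^2 r}\right) c^2 dt^2 + \left(1 - \frac{2GM}{c^2 r}\right)^{-1} dr^2 + r^2 d\Omega^2 \]
around an isolated mass (Schwarzschild), and
\[ ds^2 = -c^2 dt^2 + a(t)^2\left(dx^2 + dy^2 + dz^2\right) \]
on average over large scales (spatially flat Friedmann-Robertson-Walker, with scale factor a(t)).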
What Eloisa and colleagues wanted to know is how these two descriptions fit together, and they have more or less found the answer.
Eloisa is not employed at a university, or at any other institute where we might normally expect to find a cosmologist: she works at IBM. She hasn't stopped doing cosmology research, however; IBM pays her to do numerical relativity and cosmology. In the video she talks a lot about how this is possible, what IBM wants from her as an employee, and why this isn't as unique as it might sound. In fact, she's not even IBM's first numerical relativist!
Eloisa: https://researcher.watson.ibm.com/researcher/view.php?person=ibm-Eloisa.Bentivegna
1st paper: https://arxiv.org/abs/1204.3568
2018 review article on the topic: https://arxiv.org/abs/1801.01083
Steffen Hagstotz – The Hubble parameter measured with Fast Radio Bursts
Steffen tells us how the dispersion measure of fast radio bursts (FRBs) can be used to measure the distance to them. Therefore, if we can identify the host galaxies of FRBs and measure their redshifts, we can use FRBs to measure the expansion rate (the Hubble parameter).
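Schematically (ignoring host-galaxy and Milky Way contributions and the scatter from large-scale structure), the cosmological part of the dispersion measure of an FRB at redshift z is
\[ {\rm DM}_{\rm cosmic}(z) \simeq c\,\bar{n}_{e,0} \int_0^z \frac{(1+z')\,dz'}{H(z')} , \]
where \(\bar{n}_{e,0}\) is the mean comoving free-electron density today. Since \(\bar{n}_{e,0}\) is fixed by the baryon density (known well from the CMB and BBN), the normalisation of the DM–redshift relation scales as 1/H0, which is what lets FRBs with known redshifts measure the Hubble parameter.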
And he and his collaborators have done just that. At the moment the uncertainty is relatively large, but they still obtain a result within 10% of the more precise measurements (and consistent with both the CMB and supernovae), indicating that they're doing the right thing.
In the near future (less than five years) we'll hopefully have more than 500 FRBs and a roughly percent-level measurement of H0. These are exciting times for FRBs!
Paper: https://arxiv.org/abs/2104.04538
Steffen: https://www.su.se/english/profiles/stha5722-1.400226
Shaun’s tweet thread about the talk: https://twitter.com/just_shaun/status/1385147665244037123
Dan Thomas – The first model independent cosmological simulations of modified gravity
Dan Thomas tells us about recent work first creating a framework for describing modified gravity in a model independent way on non-linear scales and then running N-body simulations in that framework.
The framework involves finding a correspondence between large scale linear theory where everything is under control and small scale non-linear post-Newtonian dynamics. After a lot of care and rigour it boils down to a modified Poisson equation – on both large and small scales (in a particular gauge).
The full generality of the modification to the Poisson equation essentially allows for a time- and space-dependent value of Newton's constant. For most modified gravity models, the first level of deviation from general relativity can be parameterised in this way (and we know that the deviations from general relativity are small, because so far we haven't found any!)
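Schematically (my notation, which may differ from the papers'), the dynamics on both large and small scales are then governed by a Poisson equation of the form
\[ \nabla^2 \Phi = 4\pi\, G_{\rm eff}(t, \mathbf{x})\, a^2\, \delta\rho , \]
i.e. general relativity with Newton's constant promoted to a function of time and position; setting G_eff = G recovers the standard ΛCDM behaviour.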
The cosmological simulations are then done with Newton's constant varying only in time (i.e. it is constant in space). This allows them to actually run some simulations; in future work they will go beyond this particular simplification.
They then compare the simulation results to semi-analytic models like Halofit and ReACT. Halofit is explicitly only applicable to the ΛCDM model, but it does surprisingly well. ReACT, however, still does much better at fitting, e.g., the matter power spectrum and modelled Euclid lensing observables.
Future work will examine more closely why ReACT fits so well, and will aim to improve the fit further, so that e.g. Euclid and/or the Vera C. Rubin Observatory (LSST) will be able to use this method to constrain modified gravity without needing to run a new simulation for every step of a Monte Carlo parameter fit.
Theory framework paper: https://arxiv.org/abs/2004.13051
Simulation paper: https://arxiv.org/abs/2103.05051
Unitarity, causality & locality: impacts on dark energy and gravity’s speed (Melville & Noller)
Johannes Noller and Scott Melville talk about their recent paper exploring the impacts of certain positivity bounds on cosmological parameters.
Positivity bounds are restrictions on the parameters of a low-energy effective theory that arise from requiring the full, high-energy fundamental theory to satisfy certain criteria. It is possible to show that if, e.g., all the interactions of the full theory satisfy unitarity (conservation of information/probability), causality and locality, then a specific class of low-energy theories must have the speed of light less than the speed of gravity.
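To give a flavour of where such bounds come from (a textbook-style example, not the specific bound derived in this paper): for a 2→2 scattering amplitude A(s,t), analyticity, unitarity and sufficiently soft high-energy behaviour imply, via a dispersion relation, conditions like
\[ \left.\frac{\partial^2 A(s, t=0)}{\partial s^2}\right|_{s \to 0} > 0 \]
for the pole-subtracted forward amplitude, which translate into inequalities on the coefficients of the low-energy effective theory.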
The specific interactions Johannes, Scott (and collaborator Claudia) used to show this are interactions between dark energy and standard model matter.
The boundary implied by this condition actually ends up lying right in the region that observations prefer for this model, effectively cutting the allowed parameter space in half.
Johannes: http://www.icg.port.ac.uk/author/nollerj/
Scott: http://www.scottamelville.com/
Paper: https://arxiv.org/abs/2103.06855
Supplementary video: https://www.youtube.com/watch?v=Z3Lx7VXB78E
Azadeh Malek-Nejad | A common origin for inflation, neutrino mass, baryogenesis *and* dark matter! 😲
Azadeh tells us about her recent work, presenting a framework that can simultaneously:
– provide a particle physics backbone to inflation
– give neutrinos mass
– generate a dark matter candidate
– and solve baryogenesis
thus linking all of these cosmological problems. The framework can also deal with various particle physics problems, such as the origin of the accidental B-L symmetry in the standard model, the strong CP problem, and the vacuum stability problem.
So, it’s safe to say that if the framework survives scrutiny it is a massive achievement.
The framework takes ideas from the “neutrino minimal standard model”, specifically a new SU(2) gauge field that couples only to right-handed particles and can generate neutrino masses, as well as provide a dark matter candidate via the lightest right-handed neutrino.
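In νMSM-like setups (and I'm only sketching the standard seesaw logic here, not necessarily the exact mechanism of this paper), the light neutrino masses arise schematically as
\[ m_\nu \sim \frac{m_D^2}{M_R} , \]
with m_D a Dirac mass from the Higgs coupling and M_R the heavy right-handed (Majorana) mass scale, while the lightest of the right-handed states can be long-lived enough to serve as a dark matter candidate.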
It then combines those ideas with some from Azadeh's earlier work creating the model of gauge-flation. Specifically, the fields charged under this new SU(2) gauge symmetry can be actively generated during inflation. This allows the vacuum fluctuations present during inflation, which generate the curvature perturbation, to also generate the particles that will later decay into dark matter, and to generate the asymmetry in baryon number that eventually becomes the matter asymmetry.
What's more, the model makes a number of specific predictions. The primordial gravitational waves from inflation would have some non-Gaussianity and would be chiral. And the dark matter mass would be ~GeV, so it would generate gamma rays in regions of very high dark matter density.
It will be fascinating to see how this framework develops, and whether numerical reheating studies can shed light on the various particle production processes that generate the matter and dark matter during and after inflation.
Azadeh: https://theory.cern/roster/maleknejad-azadeh
The paper: https://arxiv.org/abs/2012.11516