The Status of the Chicago-Carnegie Hubble Program (with JWST data) – Wendy Freedman, Barry Madore

Wendy Freedman and Barry Madore give a status update on the Chicago-Carnegie Hubble Program (CCHP). There has been much progress!

In this status update there are three independent distance ladder measurements of the expansion rate (i.e. the Hubble constant, H0), all using JWST data. There is the very well-known Cepheid method, the now also well-known Tip of the Red Giant Branch (TRGB) method, and, making its debut, the J-Region Asymptotic Giant Branch (JAGB) method.

The TRGB and JAGB methods agree strikingly well with each other and appear to be consistent with the CMB + ΛCDM value for the expansion rate. The Cepheid method also agrees to within 5%, which is still striking, but there is some deviation.

The deviation pushes the Cepheid value for H0 up, making it consistent with the SH0ES value, although the error bars on this CCHP result are large enough that their Cepheid value is also consistent with the lower TRGB and JAGB values. Most curiously (to this relative outsider), the JAGB value does not appear to be consistent with SH0ES, even when statistical and systematic errors are considered together. The striking consistency between TRGB and JAGB means that one’s naive guess might be that there is some unknown systematic in the Cepheid method (in both the CCHP and SH0ES measurements). However, naive guesses are naive, so time will tell. Continue reading
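For readers who want to quantify “consistent with” and “deviation”: the standard back-of-the-envelope comparison treats two H0 measurements as independent Gaussians and counts the difference in units of the combined error bar. A minimal sketch, with illustrative round numbers rather than the actual CCHP or SH0ES error budgets:

```python
import math

def tension_sigma(h0_a, err_a, h0_b, err_b):
    """Gaussian tension between two H0 measurements, in units of sigma.

    Assumes independent, symmetric, Gaussian error bars -- a simplification,
    since real error budgets are asymmetric and partly correlated.
    """
    return abs(h0_a - h0_b) / math.sqrt(err_a**2 + err_b**2)

# Illustrative round numbers only: a "TRGB/JAGB-like" value near the
# Planck prediction versus a "SH0ES-like" value.
print(tension_sigma(69.0, 1.5, 73.0, 1.0))  # ~2.2 sigma
```

The real comparison is subtler, since the measurements share rungs of the distance ladder and so are not fully independent.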

DES Supernovae – Precisely Measured Time Dilation from Universe’s Expansion (Ryan White, Tamara Davis)

Ryan White and Tamara Davis from the Dark Energy Survey tell us about how they have measured time dilation in distant supernovae. This time dilation is precisely what one would expect in an expanding universe. “Precisely” is the right word too, as they have measured this effect to 0.5% precision and they get exactly the number predicted by an expanding universe. Continue reading
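To make the test concrete: in an expanding universe an event lasting Δt in the supernova’s rest frame is observed to last (1 + z)Δt, so light-curve widths should scale as (1 + z)^b with b = 1. Here is a toy sketch of fitting b from mock light-curve widths; the actual DES analysis works with real light curves and spectra, not this simplified power-law fit:

```python
import numpy as np

# Cosmological time dilation: dt_obs = (1 + z) * dt_rest, so fitting
# w = (1 + z)^b to light-curve widths should give b = 1 if the universe expands.
rng = np.random.default_rng(0)
z = rng.uniform(0.1, 1.1, 500)                              # mock redshifts
width = (1 + z) * (1 + 0.005 * rng.standard_normal(500))    # mock widths, 0.5% scatter

# Least-squares fit of b in log space: log w = b * log(1 + z)
b = np.sum(np.log(width) * np.log(1 + z)) / np.sum(np.log(1 + z) ** 2)
print(f"best-fit b = {b:.3f}")  # close to 1, as expansion predicts
```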

DES Supernovae – Weak Lensing Magnification Detected at 6sigma! (Paul Shah, Tamara Davis)

Paul Shah and Tamara Davis tell us about how they have used this wonderful supernova catalogue from the Dark Energy Survey to detect the weak lensing magnification signal for the first time. There has been evidence of this signal in earlier catalogues, but at no more than 1.4σ significance. They’ve got it at 6σ!

They do this by correlating the scatter in the magnitude of the supernovae with the over- or under-density in galaxy catalogues along the same lines of sight as the supernovae. Where there is more matter, the light from the supernova should be magnified, and where there is less matter it should be de-magnified. And they do indeed see that along overdense lines of sight the supernovae are, on average, ever so slightly brighter, and on underdense lines of sight they are ever so slightly dimmer.
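A toy version of this correlation test can be sketched in a few lines: regress the supernova magnitude residuals against the line-of-sight overdensity and ask how significantly the slope differs from zero. All numbers below are made up for illustration; the DES analysis is far more careful about the density estimates and the error budget.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sn = 2000

delta = rng.standard_normal(n_sn)   # mock line-of-sight over/under-density
slope_true = -0.02                  # magnification: overdense -> brighter (more negative magnitude)
resid = slope_true * delta + 0.1 * rng.standard_normal(n_sn)  # residuals + intrinsic scatter

# Simple linear regression for the slope and its uncertainty
slope = np.sum(delta * resid) / np.sum(delta**2)
sigma = np.std(resid - slope * delta) / np.sqrt(np.sum(delta**2))
print(f"slope = {slope:.4f} +/- {sigma:.4f} ({abs(slope) / sigma:.1f} sigma detection)")
```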

I can’t wait to see how this observable is used in the future to constrain all sorts of bits of cosmology. Nice work everyone!

Paper: arXiv: 2406.05047

Paul: paulshah.github.io

Tamara: smp.uq.edu.au/profile/186/tamara-davis

DES Supernovae – Beyond LCDM (Ryan Camilleri, Tamara Davis)

Ryan Camilleri and Tamara Davis tell us about how they have examined models beyond ΛCDM using the Dark Energy Survey’s wonderful supernova catalogue. Tantalisingly, they find that a number of models are “moderately preferred” over ΛCDM (in model comparison speak).

They also, very admirably, check whether crucial aspects of the DES pipeline are model dependent or not. They find that, so long as the reference model is close-ish to the true model, the pipeline is accurate. “Close-ish” is very generous here too: in simulations they found that even when the data were processed with models 10σ from the truth, the subsequent parameter constraints were still within 1σ of the truth. The moral is that, even though the supernovae were processed assuming ΛCDM, this doesn’t matter so long as the true cosmology isn’t too far from ΛCDM.

This means that if you have your own model that they haven’t tested, you don’t need to simulate the entire DES analysis pipeline to analyse your model; you can do your model comparison at the level of the Hubble diagram. Nice! Continue reading
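To illustrate what “model comparison at the level of the Hubble diagram” means in practice: predict the distance modulus μ(z) from your model’s H(z), then chi-square it against the supernova distance moduli. A minimal sketch with flat ΛCDM as the example model and placeholder data points (not the actual DES Hubble diagram):

```python
import numpy as np

C = 299792.458  # speed of light, km/s

def mu_model(z, h0, omega_m):
    """Distance modulus for flat LCDM (an example model; swap in your own H(z))."""
    zp = np.linspace(0.0, z, 1001)
    e = np.sqrt(omega_m * (1 + zp) ** 3 + (1 - omega_m))   # H(z)/H0
    dz = zp[1] - zp[0]
    integral = np.sum((1 / e[:-1] + 1 / e[1:]) / 2) * dz   # trapezoidal integration
    d_l = (1 + z) * (C / h0) * integral                    # luminosity distance, Mpc
    return 5 * np.log10(d_l) + 25

# Placeholder "data": binned points generated from a fiducial model
z_data = [0.1, 0.4, 0.8]
mu_data = np.array([mu_model(z, 70.0, 0.3) for z in z_data])
mu_err = np.array([0.05, 0.05, 0.05])

def chi2(h0, omega_m):
    mu_th = np.array([mu_model(z, h0, omega_m) for z in z_data])
    return np.sum(((mu_data - mu_th) / mu_err) ** 2)

print(chi2(70.0, 0.3), chi2(70.0, 0.35))  # the fiducial model has the lower chi2
```

A real comparison would also marginalise over the absolute magnitude calibration and use the full data covariance, but the basic move is exactly this.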

DES Supernovae – H0 From the Inverse Distance Ladder Without LCDM (Ryan Camilleri, Tamara Davis)

Ryan Camilleri and Tamara Davis tell us about how they have used the Dark Energy Survey’s Year 5 supernova catalogue, anchored to the Dark Energy Spectroscopic Instrument’s Baryon Acoustic Oscillations, to create an “inverse distance ladder”.

With this they are able to determine the Hubble parameter at redshift zero with high accuracy, without needing to assume ΛCDM. The result still matches Planck, meaning that the mismatch between high-redshift and low-redshift determinations appears to persist even outside of ΛCDM.

The implications are large for any attempt to go beyond ΛCDM to solve the Hubble tension, as it appears the z = 2 to z = 0.05 window is not the right window in which to find the solution.
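As a cartoon of how the inverse distance ladder avoids assuming ΛCDM: BAO anchors the absolute distance scale at intermediate redshift, the supernovae trace the shape of the distance-redshift relation down to low z, and one extrapolates a flexible expansion history to z = 0 to read off H0. A toy sketch, with a quadratic polynomial standing in for the flexible parametrisation; everything here is illustrative, not the actual DES + DESI method:

```python
import numpy as np

def h_true(z):
    """The (unknown) true expansion history; flat LCDM is used only to mock up data."""
    return 70.0 * np.sqrt(0.3 * (1 + z) ** 3 + 0.7)

# Mock measurements of H(z) across the supernova + BAO redshift range
z_data = np.linspace(0.05, 1.0, 20)
h_data = h_true(z_data)

# Model-agnostic fit: H(z) as a quadratic in z, with no LCDM assumed
coeffs = np.polyfit(z_data, h_data, deg=2)
h0_fit = np.polyval(coeffs, 0.0)  # extrapolate to redshift zero
print(f"H0 = {h0_fit:.2f} km/s/Mpc")  # close to the input value of 70
```

The point is that the extrapolation to z = 0 only needs the expansion history to be smooth, not to be ΛCDM.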

Paper: arXiv: 2406.05049

Ryan: smp.uq.edu.au/profile/13102/ryan-camilleri

Tamara: smp.uq.edu.au/profile/186/tamara-davis

The Dark Energy Survey Supernova Program – Data and Cosmology (Davis, Vincenzi and Brout)

Tamara Davis, Maria Vincenzi and Dillon Brout tell us about the Dark Energy Survey’s (DES) new supernova catalogue. The catalogue has more than 1500 new supernovae, and will allow a vast range of new cosmology constraints. It is a factor of around five larger than the next largest high-redshift supernova catalogue.

Very curiously, DES’ supernovae see hints of evolving dark energy. This is especially curious given that a few months after DES released this data, the Dark Energy Spectroscopic Instrument (DESI) also released data with similar hints. Continue reading

The Parameter Masked Mock Data Challenge for Beyond 2-Pt Statistics – Results, Lessons & Reflections

A one-and-a-half-hour discussion between Shaun and nine members of The Beyond-2pt Collaboration: Elisabeth Krause, Marcos Pellejero-Ibanez, Andres Salcedo, Minh Nguyen, Mikhail Ivanov, Enrique Paillas, Carolina Cuesta-Lazaro, Chirag Modi, Giovanni Verza

One-sentence summary of the work by Minh Nguyen: “how cosmology and galaxy survey analyses can move beyond the canonical 2-point correlation function”

Paper: https://arxiv.org/abs/2405.02252

Fundamental Cosmology from the Lab (Fromhold and Hackermuller)

Mark Fromhold and Lucia Hackermuller tell us about how they are 3D printing atom traps that allow them to cool atoms to a few microkelvin. This is super interesting for cosmology because it would allow them, among many other things, to potentially trap dark domain walls. We learned in another recent cosmology talk about the physics behind these dark domain walls; now here is the physics behind the cold atom trap.

In principle these traps may one day measure the gravitational effects of quantum objects, ultimately testing whether space-time curvature can be in a quantum superposition or not. Continue reading

Field Level Inference – Up to 5 sigma Better than Power and Bispectrum! (Minh Nguyen and Beatriz Tucci)

Nhat-Minh Nguyen and Beatriz Tucci tell us about their recent work comparing the performance of field-level inference (FLI) and simulation-based inference (SBI). In an apples-to-apples comparison, they find that FLI comfortably outperforms SBI, even in what is essentially the “best case scenario” for SBI.

Field-level inference gives up on using “summary statistics” to construct a cosmological likelihood (e.g. the power spectrum, the bispectrum, the location of the BAO peak, voids, etc.) and instead constructs the likelihood at the level of the field itself. In other words, the likelihood step of the statistical analysis compares the measured density field, mode by mode in Fourier space, to the model’s predicted density field. This means the set of model “parameters” necessarily also includes the entire set of Fourier modes of the initial conditions. So when one talks about the “maximum likelihood” parameters in an FLI analysis, one is also talking about the maximum likelihood set of initial conditions.

One then does the rest of the statistical analysis more or less as if one were analysing a measured power spectrum: one has priors on the inferred parameters, one has the likelihood function, and one produces posterior probability distributions for all of the model parameters.

In this analysis they fixed all cosmological parameters except the overall amplitude of the initial density fluctuations, parametrised via σ8. This means they restricted the initial density fluctuations to those with a fixed spectral index, but varied over all realisations of the initial conditions consistent with that spectral index. They then evolved the initial conditions forward in time using the LEFTfield framework and performed the FLI analysis on the evolved field. Continue reading
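A one-dimensional cartoon may help fix ideas. In FLI the “parameters” are the amplitude (a σ8 analogue) plus every Fourier mode of the initial conditions; in a purely linear, Gaussian toy model the modes can be marginalised analytically, leaving a likelihood for the amplitude alone. The sketch below has none of the nonlinear forward modelling (the LEFTfield part) that makes the real analysis hard, and every number in it is made up:

```python
import numpy as np

rng = np.random.default_rng(2)
n_modes = 127
A_true = 0.8  # the sigma_8-like amplitude we want to infer

# Toy power spectrum shape and mock "data": Gaussian mode amplitudes with
# signal variance A_true^2 * P(k) plus a little observational noise
P = 1.0 / (1.0 + np.arange(1, n_modes + 1) ** 2 / 100.0)
noise_var = 0.001
d = A_true * np.sqrt(P) * rng.standard_normal(n_modes) \
    + np.sqrt(noise_var) * rng.standard_normal(n_modes)

def loglike(A):
    """Log-likelihood of A with the initial-condition modes marginalised out:
    each mode is then Gaussian with variance A^2 P(k) + noise."""
    var = A**2 * P + noise_var
    return -0.5 * np.sum(d**2 / var + np.log(var))

A_grid = np.linspace(0.4, 1.2, 801)
A_ml = A_grid[np.argmax([loglike(A) for A in A_grid])]
print(f"max-likelihood amplitude: {A_ml:.3f}")  # scatters around A_true = 0.8
```

In the real analysis the forward model is nonlinear, so the modes cannot be integrated out analytically and one samples them alongside the cosmological parameters.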

Black Holes in Ultralight Dark Matter – Slowed Down and Sped Up? (Russell Boey – ft Easther & Wang)

Russell Boey, along with his coauthors Richard Easther and Yourong Wang, tells us about his simulations of a supermassive black hole traveling through an ultralight dark matter soliton. In particular, he has studied the dynamical friction effect on the black hole within the soliton.

This is especially interesting in the context of the “final parsec” problem, where the orbits of supermassive black hole binary systems stall in their decay as they reach one parsec separation. Maybe a different background, in the form of ULDM instead of WIMP DM, could help?

An ultralight dark matter soliton is much denser than expectations from “ordinary” WIMP-like dark matter, so it is also expected that the dynamical friction in such a soliton should be large. This is indeed what Russell, Richard and Yourong found (along with coauthor Emily Kendall, who isn’t present in the video). However, curiously, they also found a secondary effect where the black hole perturbs the soliton, which in turn causes the soliton to backreact on the black hole and sometimes speed it back up. Continue reading
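For reference, the classical expectation in the collisionless (WIMP-like) case is the Chandrasekhar dynamical-friction formula; the point of the ULDM soliton case is precisely that the wave nature of the field modifies this drag, and the backreaction described above can even reverse its sign. A sketch of the classical formula with toy numbers, not values from the paper:

```python
import numpy as np

G = 4.30091e-3  # gravitational constant in pc (km/s)^2 / Msun

def chandrasekhar_drag(m_bh, rho, v, ln_lambda=10.0):
    """Magnitude of the Chandrasekhar dynamical-friction deceleration,
    |dv/dt|, in (km/s)^2 / pc, assuming (for simplicity) that all
    background particles move slower than the massive body."""
    return 4 * np.pi * G**2 * m_bh * rho * ln_lambda / v**2

# A 1e8 Msun black hole moving at 200 km/s through a dense background
# of 100 Msun/pc^3 (illustrative numbers only)
drag = chandrasekhar_drag(1e8, rho=100.0, v=200.0)
print(drag)
```

Note the 1/v² scaling: slow black holes feel much more drag, which is part of why dense soliton cores are interesting for the final parsec problem.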