Statistical challenges in modern cosmology
Markus Rau, Carnegie Mellon University
12-1pm 27th Sep 2018
Modern cosmology is entering an era of high statistical precision, where it becomes increasingly important to control sources of systematic error. I will give a broad overview of these challenges and focus on a concrete example: sources of error in cosmological distance measurements. These errors are a dominant systematic for a variety of cosmological probes and can ultimately hinder our ability to accurately study dark energy and the cosmic expansion. Machine Learning and Bayesian modeling are primary tools to improve the accuracy of these measurements while enabling the consistent parametrization of residual biases. As a concrete example, I will discuss how Machine Learning can be used to obtain accurate distance measurements from Long-Period Variable stars, ultimately calibrating local supernova samples. Connecting with complementary probes based on Weak Gravitational Lensing and Large-Scale Structure, I will discuss how inaccurate distance, or redshift, measurements for samples of galaxies can bias ongoing and future large-area photometric surveys like DES, KiDS, LSST and Euclid. I will present a Bayesian hierarchical model that self-consistently parametrizes these errors and incorporates them into measurements of cosmological parameters.
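To make the hierarchical-modeling idea concrete, the sketch below shows a toy version of the problem: galaxies share an unknown systematic shift in their photometric redshifts, and a spectroscopic calibration subsample is used to infer a posterior over that shift, which could then be marginalized over in a cosmological fit. This is an illustrative assumption on my part, not the speaker's actual model; all parameter values are invented.

```python
# Hypothetical toy sketch of a hierarchical photo-z bias model (not the
# speaker's actual method). Galaxies share one systematic redshift shift
# delta_z; we infer it from a spectroscopic calibration subsample.
import numpy as np

rng = np.random.default_rng(42)

# --- simulate a toy galaxy sample (all values invented) ----------------
n_gal = 2000
true_z = rng.uniform(0.2, 1.2, n_gal)        # true redshifts
delta_z_true = 0.03                          # hidden systematic shift
sigma_phot = 0.05                            # per-galaxy photo-z scatter
photo_z = true_z + delta_z_true + rng.normal(0.0, sigma_phot, n_gal)

# spectroscopic subsample: galaxies whose true redshift is known
n_spec = 200
spec_idx = rng.choice(n_gal, n_spec, replace=False)

# --- grid posterior for the shared shift delta_z -----------------------
grid = np.linspace(-0.1, 0.1, 2001)
resid = photo_z[spec_idx] - true_z[spec_idx]  # per-galaxy residuals
# Gaussian likelihood for each residual given delta_z, flat prior
log_post = -0.5 * ((resid[:, None] - grid[None, :]) / sigma_phot) ** 2
log_post = log_post.sum(axis=0)
post = np.exp(log_post - log_post.max())
post /= post.sum() * (grid[1] - grid[0])      # normalize the density

delta_z_mean = np.sum(grid * post) * (grid[1] - grid[0])
print(f"inferred shift: {delta_z_mean:.4f} (true: {delta_z_true})")
```

In a full analysis the posterior over such nuisance parameters would be sampled jointly with the cosmological parameters rather than on a grid, so that residual redshift biases propagate self-consistently into the cosmological constraints.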