I am a postdoc in the group of Antony Lewis at the University of Sussex.
I am interested in novel ways of exploiting cosmological data to learn about early- and late-Universe physics.
My weapons of choice are an Occam's-razor approach to physics, a broad set of programming and computational skills, and a personal fondness for statistics. Although I am currently mainly a data-driven researcher, my background is in theoretical physics.
Below you can find a more technical description of my current scientific interests, as well as my list of publications.
On the personal side, among many other things I enjoy reading and talking about psychology and politics, coding, cooking, and swing dancing.
I am working on constraints on simple extensions of the canonical slow-roll scenario, based on combinations of CMB power spectrum, CMB bispectrum, and LSS data.
Although Planck data so far favour the canonical single-field scenario, hints of features of different kinds are present in both the power spectrum and the bispectrum (arXiv:1502.02114, arXiv:1502.01592). From a model-independent point of view, those hints are not statistically significant. However, if a sufficiently simple physical model can produce those features simultaneously in both observables, those hints can turn into significant Bayesian evidence for that model.
In particular, I look for features produced by transient reductions in the speed of sound of the inflaton in an effectively-single-field slow-roll scenario (e.g. produced by soft, sudden turns along the inflationary trajectory). These reductions in \(c_\mathrm{s}\) produce correlated features in both the primordial power spectrum \(\mathcal{P}(k)\) and the bispectrum \(\mathcal{B}(k_1,k_2,k_3)\): linear-in-\(k\) oscillations with a localised envelope (arXiv:1211.5619). The results of our search so far can be seen in arXiv:1311.2552, arXiv:1404.7522 and arXiv:1410.4804.
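To give a feel for what such a feature looks like, here is a toy template for a localised, linear-in-\(k\) oscillatory correction to the power spectrum. The Gaussian envelope and all parameter values are made up for illustration; they are not the actual templates or best-fit values from the papers above.

```python
import numpy as np

def feature_template(k, amplitude=0.03, omega=100.0, k_c=0.15, width=0.05):
    """Toy fractional feature in the primordial power spectrum:
    Delta P / P ~ envelope(k) * sin(omega * k), i.e. a linear-in-k
    oscillation modulated by a localised (here Gaussian) envelope.
    All parameter values are illustrative only."""
    envelope = np.exp(-0.5 * ((k - k_c) / width) ** 2)
    return amplitude * envelope * np.sin(omega * k)

# Fractional correction to a featureless spectrum on a grid of modes:
k = np.linspace(1e-4, 0.3, 1000)   # Mpc^-1, illustrative range
delta = feature_template(k)
```

The key qualitative ingredients are the two scales: the oscillation frequency \(\omega\) (set by when the reduction in \(c_\mathrm{s}\) happens) and the envelope width (set by how long it lasts).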
I am also working on a cosmological sampler suited to very slow likelihoods or estimators, in order to extract constraints more efficiently from current and future surveys with ever larger data outputs.
Most sampling algorithms used in cosmology today are based on Metropolis-Hastings MCMC or Nested Sampling. Both involve some form of rejection: proposed samples that fail the relevant likelihood threshold are discarded, and the fraction of accepted samples over the total number of likelihood evaluations is called the efficiency. Since evaluating the likelihood is normally the most time-consuming operation during sampling, a low efficiency can make slow likelihoods intractable.
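A minimal Metropolis-Hastings sketch makes the cost structure explicit: every proposal, accepted or not, requires one likelihood evaluation, and the efficiency is the accepted fraction. This is a textbook toy version, not the samplers actually used in the analyses above.

```python
import numpy as np

def metropolis_hastings(log_like, x0, n_steps=5000, step_size=0.5, seed=0):
    """Minimal Metropolis-Hastings sampler with a symmetric Gaussian proposal.
    Returns the chain and the efficiency: the fraction of proposals accepted.
    Note that rejected proposals still cost a likelihood evaluation."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    logl = log_like(x)
    chain, accepted = [x.copy()], 0
    for _ in range(n_steps):
        x_new = x + step_size * rng.standard_normal(x.shape)
        logl_new = log_like(x_new)          # the expensive call
        if np.log(rng.uniform()) < logl_new - logl:
            x, logl = x_new, logl_new
            accepted += 1
        chain.append(x.copy())
    return np.array(chain), accepted / n_steps

# Toy run: standard 2D Gaussian log-likelihood
chain, efficiency = metropolis_hastings(lambda x: -0.5 * x @ x, np.zeros(2))
```

If each likelihood evaluation took minutes instead of microseconds, an efficiency of, say, 20% would mean wasting 80% of the total runtime on rejected points.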
An alternative approach is to model the likelihood as a Gaussian random field, and use that model to choose the location of the next sampling point such that the maximum amount of information is gained about the underlying likelihood (NIPS:4657, NIPS:5483). This way, part of the computational cost is shifted from the likelihood evaluation to the computation of the next optimal sampling point. This may make it possible to tackle problems with slow likelihoods in parameter spaces of moderate dimensionality, though the approach is not free of trouble: the sampler is sensitive to the correlation length of the likelihood model, which must also be learnt. This is still work in progress.
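The basic loop can be sketched as follows. This is a deliberately crude version: it fits a zero-mean Gaussian process to the log-likelihood evaluations and picks the candidate point with the largest posterior variance, a simple stand-in for the proper information-gain acquisition functions of the papers cited above, and it keeps the correlation length fixed rather than learning it.

```python
import numpy as np

def rbf_kernel(a, b, length_scale=1.0):
    """Squared-exponential kernel between 1D point sets. In practice the
    correlation length must itself be learnt, which is where this approach
    gets delicate."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale**2)

def next_point(x_train, candidates, length_scale=1.0, jitter=1e-8):
    """Return the candidate with the largest GP posterior variance.
    The posterior variance depends only on where we have evaluated,
    not on the observed values, so this crude acquisition ignores y."""
    K = rbf_kernel(x_train, x_train, length_scale) + jitter * np.eye(len(x_train))
    Ks = rbf_kernel(candidates, x_train, length_scale)
    Kinv = np.linalg.inv(K)
    var = 1.0 - np.einsum('ij,jk,ik->i', Ks, Kinv, Ks)  # diag(Ks Kinv Ks^T)
    return candidates[np.argmax(var)]

# Toy 1D run: each "likelihood call" is expensive, so place points greedily.
log_like = lambda x: -0.5 * x**2
x_train = np.array([-2.0, 2.0])
y_train = log_like(x_train)
for _ in range(4):
    x_next = next_point(x_train, np.linspace(-3, 3, 61))
    x_train = np.append(x_train, x_next)
    y_train = np.append(y_train, log_like(x_next))   # one expensive call per step
```

The point of the exercise: each iteration spends effort (here a kernel inversion) deciding where to evaluate next, so that far fewer expensive likelihood calls are wasted than in a rejection-based scheme.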