As a particle phenomenologist with a focus on computation, my research sits halfway between the experimental and theoretical worlds, without fully falling into either. In layman's terms, I use theoretical models to build simulations that can then be tested by my experimentalist colleagues at experiments across the globe (most of them, actually, in Geneva, Switzerland).

Within the wide world of particle physics, my work lies mainly in the realm of Quantum Chromodynamics. My aim is to obtain more precise predictions with more faithful uncertainties, which can then be compared with experimental measurements.

On the computational side, my research focuses on Machine Learning and Monte Carlo simulations. They are all pieces of the same big puzzle: artificial intelligence is used to model the internal structure of the colliding protons, while the theory of Quantum Chromodynamics allows us to model what happens after the collision. The resulting complicated integrals are then computed using Monte Carlo methods.

On this page you can read a bit more about what I do and also find an up-to-date list of all my papers, talks and open-source published software.

Machine Learning (ML), a branch of Artificial Intelligence, uses automated learning to perform tasks without being given explicit instructions. We use ML algorithms to construct mathematical models which, based on sample data (commonly known as training data), are able to make predictions on data the model has never seen. Nowadays machine learning techniques are widespread both in physics (jet tagging, model building, PDF determination, ...) and beyond (email filtering, computer vision, ...). Currently my main research topic, as part of N3PDF, is the application of ML techniques to the determination of the internal structure of the proton.
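The train-then-predict loop just described can be sketched with a toy example (plain Python with made-up data; a real PDF fit uses deep neural networks rather than a straight line): fit a model on training samples, then evaluate it on a point it has never seen.

```python
# Toy "training": fit y = a*x + b by ordinary least squares
# on sample (training) data, then predict on unseen data.
train_x = [0.0, 1.0, 2.0, 3.0, 4.0]
train_y = [0.1, 2.1, 3.9, 6.2, 7.9]  # noisy samples of roughly y = 2x

n = len(train_x)
mean_x = sum(train_x) / n
mean_y = sum(train_y) / n
cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(train_x, train_y))
var = sum((x - mean_x) ** 2 for x in train_x)
a = cov / var            # fitted slope
b = mean_y - a * mean_x  # fitted intercept

def model(x):
    """Prediction on data the model has never seen."""
    return a * x + b

print(model(10.0))  # close to 20 for this training set
```

The same logic, with many more parameters, is what lets a fitted model generalize from training data to new inputs.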

One very important ingredient of the high-energy physics program of the Large Hadron Collider (LHC) at CERN is the determination of the parton content of the proton. The internal structure of the proton in terms of quarks and gluons is described by the Parton Distribution Functions (PDFs).
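Schematically, PDFs connect proton-level measurements to quark- and gluon-level calculations through the standard QCD factorization formula (textbook notation, not specific to any one of our papers):

```latex
\sigma_{pp \to X} = \sum_{a,b} \int_0^1 \mathrm{d}x_1 \, \mathrm{d}x_2 \,
    f_a(x_1, \mu_F^2) \, f_b(x_2, \mu_F^2) \,
    \hat{\sigma}_{ab \to X}(x_1, x_2, \mu_F^2)
```

where f_a(x, μ_F²) is the distribution of parton a carrying a fraction x of the proton momentum at factorization scale μ_F, and σ̂ is the perturbatively computable partonic cross section.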

In the NNPDF collaboration we use Machine Learning techniques to determine the aforementioned PDFs.

Some of our work on PDF determination has been featured on the cover of the European Physical Journal C.

The Higgs boson is at the center of the physics program of the LHC at CERN. Since its discovery in 2012, the effort of the phenomenological community (hep-ph) has focused on ever more precise determinations of its properties.

I have worked on some of the most precise calculations of the corrections due to Quantum Chromodynamics (QCD) effects to Higgs boson production in the gluon fusion and Vector Boson Fusion (VBF) production channels.

State-of-the-art computations in High Energy Physics (HEP) require evaluating very complex multi-dimensional integrals numerically, as the analytical result is often not known. Monte Carlo algorithms are generally the method of choice, in HEP and elsewhere, because their error does not grow with the number of dimensions.

Our aim is to improve the efficiency of widely used methods in order to enable more physics in less time. Some examples are our implementation of the Vegas algorithm in TensorFlow (VegasFlow), PDFFlow, and the distributed computing tool pyHepGrid.
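As a minimal illustration of why Monte Carlo scales so well (plain Python, not the actual VegasFlow code): a naive Monte Carlo estimate over the unit hypercube, whose statistical uncertainty shrinks as 1/sqrt(N) independently of the number of dimensions.

```python
import random

def mc_integrate(f, dim, n_samples, seed=0):
    """Naive Monte Carlo estimate of the integral of f over the
    unit hypercube [0, 1]^dim using n_samples random points."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        point = [rng.random() for _ in range(dim)]
        total += f(point)
    return total / n_samples

def integrand(point):
    """Product of coordinates: the exact integral over [0,1]^d
    is (1/2)^d in any number of dimensions d."""
    prod = 1.0
    for x in point:
        prod *= x
    return prod

estimate = mc_integrate(integrand, dim=4, n_samples=100_000)
print(estimate)  # close to (1/2)**4 = 0.0625
```

Algorithms such as Vegas improve on this by adaptively sampling more densely where the integrand is largest, reducing the constant in front of the 1/sqrt(N) scaling.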

title | conference | location | date | slides |
---|---|---|---|---|
VegasFlow: accelerating Monte Carlo simulation across devices | ICHEP 2020 | Virtually Prague | July 2020 | slides |
Optimizing the hyperoptimization | NNPDF Collaboration meeting | Amsterdam (The Netherlands) | February 2020 | |
Studying the parton content of the proton with deep learning models | Artificial Intelligence for Science, Industry and Society Symposium (AISIS 2019) | Ciudad de Mexico (Mexico) | October 2019 | slides |
Methodological improvements in PDF determination | James Stirling Memorial Conference & PDF4LHC | Durham (UK) | September 2019 | slides |
n3fit and hyperoptimization in the context of NNPDF 4.0 | NNPDF Collaboration meeting | Varenna (Italy) | August 2019 | |
Towards a new generation of PDFs with deep learning models | QCD@LHC 2019 | Buffalo, New York (USA) | July 2019 | slides |
Numerical Integration with Neural Networks | NNLOJET Collaboration meeting | Zurich (Switzerland) | May 2019 | |
N3PDF studies of new methodologies | NNPDF Collaboration meeting | Amsterdam (The Netherlands) | February 2019 | |
Recent developments within NNLOJET | NNPDF Collaboration & N3PDF Kickoff Meeting | Gargnano, Lake Garda (Italy) | September 2018 | slides |
NNLO corrections to VBF Higgs boson production | Loops and Legs in Quantum Field Theory 2018 | St. Goar (Germany) | May 2018 | |
NNLO phenomenology with Antenna Subtraction | HiggsTools Final Meeting | Durham (UK) | September 2017 | slides |
$\phi^*_\eta$ observable for Higgs production | Internal Seminar | Durham (UK) | May 2017 | |
Higgs phenomenology with antenna subtraction | Student Seminar | Durham (UK) | February 2017 | |
Higgs phenomenology with antenna subtraction | Invited Seminar | Valencia (Spain) | January 2017 | slides |
NNLO calculations for Higgs processes | HiggsTools Second Annual Meeting | Granada (Spain) | April 2016 | slides |
Renormalisation Scale Dependence as a Testing Ground for NNLO calculations | Internal Seminar | Durham (UK) | February 2016 | |
Building and Playing with NNLO Monte Carlos | Student Seminar | Durham (UK) | February 2016 | |
NNLO predictions for Higgs production at LHC | HiggsTools First Annual Meeting | Freiburg (Germany) | April 2015 | slides |

title | description | source |
---|---|---|
pyHepGrid | Distributed computing made easy | Zenodo (2019) |
evolutionary keras | An evolutionary algorithm implementation for Keras | Zenodo (2020) |
vegasflow | Accelerating Monte Carlo simulation across multiple hardware platforms | Zenodo (2020) |
pdfflow | Fast, device-agnostic PDF interpolation | Zenodo (2020) |
eko | eko | Zenodo (2020) |