# Stefano Carrazza

University of Milan | UNIMI · Department of Physics

Doctor of Philosophy

## About

110 Publications · 6,722 Reads · 8,937 Citations

Additional affiliations

- October 2018 - present
- October 2015 - September 2018
- January 2012 - September 2015

## Publications (110)

Quantum technologies are moving towards the development of novel hardware devices based on quantum bits (qubits). In parallel to the development of quantum devices, efficient simulation tools are needed in order to design and benchmark quantum algorithms and applications before deployment on quantum hardware. In this context, we present a first att...
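The core of such a simulation tool can be made concrete with a minimal statevector sketch: the n-qubit state is a 2^n complex vector, and a single-qubit gate acts by tensor contraction on one axis. This is a toy illustration, not the framework described in the abstract:

```python
import numpy as np

# Hadamard gate as a 2x2 unitary.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

def apply_gate(state, gate, qubit, n):
    """Apply a single-qubit gate to `qubit` of an n-qubit statevector."""
    # View the 2**n vector as n binary axes and contract the gate
    # with the axis corresponding to the target qubit.
    psi = state.reshape([2] * n)
    psi = np.tensordot(gate, psi, axes=([1], [qubit]))
    psi = np.moveaxis(psi, 0, qubit)
    return psi.reshape(2**n)

n = 2
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                      # start in |00>
state = apply_gate(state, H, 0, n)  # Hadamard on qubit 0
probs = np.abs(state) ** 2          # (|00> + |10>) / sqrt(2)
```

Production simulators apply the same contraction with optimized kernels and device-specific backends rather than generic NumPy calls.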

We propose and assess an alternative quantum generator architecture in the context of generative adversarial learning for Monte Carlo event generation, used to simulate particle physics processes at the Large Hadron Collider (LHC). We validate this methodology by implementing the quantum network on artificial data generated from known underlying di...

We present a new set of parton distribution functions (PDFs) based on a fully global dataset and machine learning techniques: NNPDF4.0. We expand the NNPDF3.1 determination with 44 new datasets, mostly from the LHC. We derive a novel methodology through hyperparameter optimization, leading to an efficient fitting algorithm built upon stochastic gra...
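The fitting strategy rests on gradient-based optimization of a flexible parametrization. As a toy stand-in (not the NNPDF code), one can fit the exponents of a PDF-like shape x^a (1-x)^b to pseudodata with stochastic gradient descent; working in log space makes the model linear in the parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pseudodata from a toy PDF-like shape f(x) = x^a (1-x)^b.
a_true, b_true = 0.5, 3.0
x = rng.uniform(0.05, 0.95, size=256)
y = x**a_true * (1 - x)**b_true

# Fit log f = a log x + b log(1-x) by per-sample gradient descent.
a, b = 1.0, 1.0
lr = 0.05
for epoch in range(200):
    for i in rng.permutation(len(x)):
        err = a * np.log(x[i]) + b * np.log(1 - x[i]) - np.log(y[i])
        # Gradient of 0.5 * err**2 with respect to (a, b).
        a -= lr * err * np.log(x[i])
        b -= lr * err * np.log(1 - x[i])
```

The real methodology optimizes a neural network against experimental covariance matrices; this sketch only shows the stochastic-gradient mechanics.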

We present a first attempt to perform circuit-based quantum simulation using the just-in-time (JIT) compilation technique on multiple hardware architectures and configurations based on single-node central processing units (CPUs) and graphics processing units (GPUs). One of the major challenges in scientific code development is to balance the level...

In this proceedings we describe the current development status and recent technical achievements of Qibo, an open-source framework for quantum simulation. After a concise overview of the project goal, we introduce the modular layout for backend abstraction released in version 0.1.7. We discuss the advantages of each backend choice with particular e...

Since the first determination of a structure function many decades ago, all methodologies used to determine structure functions or parton distribution functions (PDFs) have employed a common prefactor as part of the parametrization. The NNPDF collaboration pioneered the use of neural networks to overcome the inherent bias of constraining the space...

We present Qibo, a new open-source software for fast evaluation of quantum circuits and adiabatic evolution which takes full advantage of hardware accelerators. The growing interest in quantum computing and the recent developments of quantum hardware devices motivate the development of new advanced computational tools focused on performance and us...

Simulating quantum imaginary-time evolution (QITE) with high precision is a major promise of quantum computation. However, the known algorithms are either probabilistic (repeat until success) with impractically small success probabilities or coherent (quantum amplitude amplification) but with circuit depths and ancillary-qubit numbers unrealistical...
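A classical emulation makes the idea concrete: repeatedly applying a first-order imaginary-time step exp(-H Δτ) ≈ I − H Δτ and renormalizing projects any state with nonzero ground-state overlap onto the ground state. The two-qubit Hamiltonian below is a hypothetical choice for illustration:

```python
import numpy as np

# Toy two-qubit Hamiltonian (hypothetical): H = Z0 Z1 + 0.5 X0.
Z = np.diag([1.0, -1.0])
X = np.array([[0.0, 1.0], [1.0, 0.0]])
I2 = np.eye(2)
H = np.kron(Z, Z) + 0.5 * np.kron(X, I2)

# First-order imaginary-time step: exp(-H dtau) ~ I - H dtau.
dtau, steps = 0.1, 500
U = np.eye(4) - dtau * H

psi = np.ones(4) / 2.0          # uniform initial state
for _ in range(steps):
    psi = U @ psi
    psi /= np.linalg.norm(psi)  # renormalize the non-unitary step

energy = psi @ H @ psi
ground = np.linalg.eigvalsh(H).min()
```

The quantum-algorithmic difficulty the abstract refers to is precisely that this evolution is non-unitary, so it cannot be run directly as a circuit.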

We present the software framework underlying the NNPDF4.0 global determination of parton distribution functions (PDFs). The code is released under an open source licence and is accompanied by extensive documentation and examples. The code base is composed of a PDF fitting package, tools to handle experimental data and to efficiently compare it to t...

We present MadFlow, a first general multi-purpose framework for Monte Carlo (MC) event simulation of particle physics processes designed to take full advantage of hardware accelerators, in particular, graphics processing units (GPUs). The automation process of generating all the required components for MC simulation of a generic physics process and its de...

We present a compression algorithm for parton densities using synthetic replicas generated from the training of a generative adversarial network (GAN). The generated replicas are used to further enhance the statistics of a given Monte Carlo PDF set prior to compression. This results in a compression methodology that is able to provide a compressed...
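The core of such a compression can be sketched as moment-matching subset selection on synthetic replicas. The greedy toy version below is a stand-in for the full methodology, which optimizes a richer set of statistical estimators:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy "replicas": 100 Monte Carlo samples of a function on a 10-point grid.
replicas = rng.normal(loc=1.0, scale=0.2, size=(100, 10))

target_mean = replicas.mean(axis=0)
target_std = replicas.std(axis=0)

def loss(subset):
    """Distance between the subset's moments and the full-set moments."""
    sub = replicas[list(subset)]
    return (np.abs(sub.mean(axis=0) - target_mean).sum()
            + np.abs(sub.std(axis=0) - target_std).sum())

# Greedy forward selection of 20 replicas that best preserve the moments.
chosen = []
for _ in range(20):
    best = min((i for i in range(len(replicas)) if i not in chosen),
               key=lambda i: loss(chosen + [i]))
    chosen.append(best)

compressed = replicas[chosen]
```

The GAN enhancement described in the abstract enlarges the pool of candidate replicas before a selection of this kind is performed.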

In this proceedings we present MadFlow, a new framework for the automation of Monte Carlo (MC) simulation on graphics processing units (GPU) for particle physics processes. In order to automate MC simulation for a generic number of processes, we design a program which provides to the user the possibility to simulate custom processes through the Mad...

We present PDFFlow, a new software for fast evaluation of parton distribution functions (PDFs) designed for platforms with hardware accelerators. PDFs are essential for the calculation of particle physics observables through Monte Carlo simulation techniques. The evaluation of a generic set of PDFs for quarks and gluons at a given momentum fraction...
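PDF evaluation in such tools reduces to interpolating precomputed grids. A minimal sketch of bilinear interpolation in (log x, log Q²) on a toy grid follows; the real libraries use higher-order interpolation and vectorized batch lookups:

```python
import numpy as np

# Hypothetical PDF grid: values tabulated on a (log x, log Q^2) lattice.
logx_knots = np.linspace(np.log(1e-4), np.log(0.9), 30)
logq2_knots = np.linspace(np.log(2.0), np.log(1e4), 20)
# Toy grid filled from a smooth function standing in for xf(x, Q^2).
grid = np.exp(-np.add.outer(logx_knots**2 * 0.01, logq2_knots * 0.05))

def interpolate(x, q2):
    """Bilinear interpolation in (log x, log Q^2)."""
    lx, lq = np.log(x), np.log(q2)
    i = np.searchsorted(logx_knots, lx) - 1
    j = np.searchsorted(logq2_knots, lq) - 1
    tx = (lx - logx_knots[i]) / (logx_knots[i + 1] - logx_knots[i])
    tq = (lq - logq2_knots[j]) / (logq2_knots[j + 1] - logq2_knots[j])
    return ((1 - tx) * (1 - tq) * grid[i, j] + tx * (1 - tq) * grid[i + 1, j]
            + (1 - tx) * tq * grid[i, j + 1] + tx * tq * grid[i + 1, j + 1])

val = interpolate(0.01, 100.0)
```

Because every (x, Q²) query is independent, this lookup pattern maps naturally onto GPU batches, which is the point of the library.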

Purpose: To propose a patient-independent model for the well-aerated volume estimation (WAVE) using CT images of the lung.
Methods or Background: The model was applied to the histograms of lungs segmented in CT images of the chest. Two cohorts of 20 patients each with healthy lungs (cohort1, from the emergency department of our hospital) and with 4DCT...

We present a first attempt to design a quantum circuit for the determination of the parton content of the proton through the estimation of parton distribution functions (PDFs), in the context of high energy physics (HEP). The growing interest in quantum computing and the recent developments of new algorithms and quantum hardware devices motivates t...

The Prime state of $n$ qubits, $|\mathbb{P}_n\rangle$, is defined as the uniform superposition of all the computational-basis states corresponding to prime numbers smaller than $2^n$. This state encodes, quantum mechanically, arithmetic properties of the primes. We first show that the Quantum Fourier Transform of the Prime state provides direct access to Chebyshe...
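Classically, the Prime state is easy to write down, which makes the definition concrete. The NumPy sketch below enumerates it directly; the quantum interest lies in preparing the state with few gates, not in this enumeration:

```python
import numpy as np

def is_prime(k):
    """Trial-division primality test, adequate for small k."""
    if k < 2:
        return False
    return all(k % d for d in range(2, int(k**0.5) + 1))

n = 5
primes = [k for k in range(2**n) if is_prime(k)]

# Uniform superposition over computational-basis states labeled by primes.
state = np.zeros(2**n)
state[primes] = 1.0 / np.sqrt(len(primes))

# The number of nonzero amplitudes is the prime-counting function pi(2^n).
pi_2n = np.count_nonzero(state)
```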

In this work we demonstrate the usage of the VegasFlow library in multi-device situations: multi-GPU in one single node and multi-node in a cluster. VegasFlow is a new software for fast evaluation of highly parallelizable integrals based on Monte Carlo integration. It is inspired by the Vegas algorithm, very often used as the driver of cross section...

In this proceedings we demonstrate how to implement and construct the PineAPPL grids, designed for fast-interpolation of Monte Carlo simulation with electroweak and QCD corrections, using the VegasFlow framework for Monte Carlo simulation on hardware accelerators. We provide an example of synchronous and asynchronous filling operations of PineAPPL...

We discuss the determination of the parton substructure of hadrons by casting it as a peculiar form of pattern recognition problem in which the pattern is a probability distribution, and we present the way this problem has been tackled and solved. Specifically, we review the NNPDF approach to PDF determination, which is based on the combination of...

We present VegasFlow, a new software for fast evaluation of high dimensional integrals based on Monte Carlo integration techniques designed for platforms with hardware accelerators. The growing complexity of calculations and simulations in many areas of science has been accompanied by advances in the computational tools which have helped their dev...
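The underlying task is standard Monte Carlo integration, which parallelizes naturally because samples are independent. A plain NumPy sketch of the estimator follows; VegasFlow adds Vegas-style importance sampling and offloads the sampling to accelerators:

```python
import numpy as np

rng = np.random.default_rng(42)

def mc_integrate(f, dim, n_samples=1_000_000):
    """Plain Monte Carlo estimate of an integral over the unit hypercube."""
    x = rng.random((n_samples, dim))
    fx = f(x)
    # Mean of the samples estimates the integral; the standard error
    # shrinks as 1/sqrt(N) independently of the dimension.
    return fx.mean(), fx.std() / np.sqrt(n_samples)

# Example: the integral of sum_i x_i^2 over [0,1]^4 equals 4/3.
estimate, error = mc_integrate(lambda x: (x**2).sum(axis=1), dim=4)
```

The dimension-independent error scaling is why Monte Carlo dominates high-dimensional phase-space integration in particle physics.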

We introduce a novel implementation of a reinforcement learning (RL) algorithm which is designed to find an optimal jet grooming strategy, a critical tool for collider experiments. The RL agent is trained with a reward function constructed to optimize the resulting jet properties, using both signal and background samples in a simultaneous multi-lev...

The probability density function for the visible sector of a Riemann-Theta Boltzmann machine can be taken conditional on a subset of the visible units. We derive that the corresponding conditional density function is given by a reparameterization of the Riemann-Theta Boltzmann machine modelling the original probability density function. Therefore t...

Track finding and fitting are among the most complex parts of event reconstruction in high-energy physics, and usually dominate the computing time in a high luminosity environment. A central part of track reconstruction is the transport of a given track parametrisation (i.e. the parameter estimation and associated covariance matrices) through the d...

Parton Distribution Functions (PDFs) model the parton content of the proton. Among the many collaborations which focus on PDF determination, NNPDF pioneered the use of Neural Networks to model the probability of finding partons (quarks and gluons) inside the proton with a given energy and momentum. In this proceedings we make use of state of the ar...

We formulate a general approach to the inclusion of theoretical uncertainties, specifically those related to the missing higher order uncertainty (MHOU), in the determination of parton distribution functions (PDFs). We demonstrate how, under quite generic assumptions, theory uncertainties can be included as an extra contribution to the covariance m...
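The mechanics of the approach can be sketched with toy numbers: build a theory covariance matrix from scale-variation shifts and add it to the experimental one before computing the χ². The values below are illustrative, not the published prescription in detail:

```python
import numpy as np

# Toy experimental covariance for a hypothetical 3-point dataset.
c_exp = np.diag([0.04, 0.09, 0.01])

# Theory (MHOU) covariance built from scale-variation shifts delta_i:
# C_th[i, j] = average over variations of delta_i * delta_j.
deltas = np.array([[0.10, -0.05, 0.02],
                   [-0.08, 0.06, -0.01]])   # two toy scale variations
c_th = deltas.T @ deltas / len(deltas)

c_tot = c_exp + c_th                        # total covariance

data = np.array([1.02, 0.97, 1.01])
theory = np.array([1.00, 1.00, 1.00])
r = data - theory
chi2 = r @ np.linalg.solve(c_tot, r)
```

Since the theory covariance is positive semi-definite, including it can only loosen the fit: the χ² with the total covariance never exceeds the purely experimental one.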

We introduce a generative model to simulate radiation patterns within a jet using the Lund jet plane. We show that using an appropriate neural network architecture with a stochastic generation of images, it is possible to construct a generative model which retrieves the underlying two-dimensional distribution to within a few percent. We compare our...

The parton distribution functions (PDFs) which characterize the structure of the proton are currently one of the dominant sources of uncertainty in the predictions for most processes measured at the Large Hadron Collider (LHC). Here we present the first extraction of the proton PDFs that accounts for the missing higher order uncertainty (MHOU) in t...

Modern global analyses of the structure of the proton include collider measurements which probe energies well above the electroweak scale. While these provide powerful constraints on the parton distribution functions (PDFs), they are also sensitive to beyond the standard model (BSM) dynamics if these affect the fitted distributions. Here we present...

In this proceedings we describe the computational challenges associated to the determination of parton distribution functions (PDFs). We compare the performance of the convolution of the parton distributions with matrix elements using different hardware instructions. We quantify and identify the most promising data-model configurations to increase...

In these proceedings, we present a library allowing for straightforward calls in C++ to jet grooming algorithms trained with deep reinforcement learning. The RL agent is trained with a reward function constructed to optimize the groomed jet properties, using both signal and background samples in a simultaneous multi-level training. We show that the...

We present a new regression model for the determination of parton distribution functions (PDF) using techniques inspired by deep learning projects. In the context of the NNPDF methodology, we implement a new efficient computing framework based on graph generated models for PDF parametrization and gradient descent optimization. The best model conf...

We present a next-to-leading order accurate simulation of t-channel single-top plus jet production matched to parton showers via the POWHEG method. The calculation underlying the simulation is enhanced with a process-specific implementation of the multi-scale improved NLO (MINLO) method, such that it gives physical predictions all through phase spa...

Machine learning is an important applied research area in particle physics, beginning with applications to high-level physics analysis in the 1990s and 2000s, followed by an explosion of applications in particle and event identification and reconstruction in the 2010s. In this document we discuss promising future research and development areas in m...

We present a determination of the strong coupling constant $\alpha_s(m_Z)$ based on the NNPDF3.1 determination of parton distributions, which for the first time includes constraints from jet production, top-quark pair differential distributions, and the $Z$ $p_T$ distributions using exact NNLO theory. Our result is based on a novel extension of the...

We show that the visible sector probability density function of the Riemann-Theta Boltzmann machine corresponds to a gaussian mixture model consisting of an infinite number of component multi-variate gaussians. The weights of the mixture are given by a discrete multi-variate gaussian over the hidden state space. This allows us to sample the visible...
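This structure yields a direct sampling scheme: draw the hidden state from its discrete distribution, then sample the visible units from the selected Gaussian component. A truncated one-dimensional sketch with toy parameters (not an actual Riemann-Theta Boltzmann machine) looks like:

```python
import numpy as np

rng = np.random.default_rng(7)

# Truncated stand-in for the mixture: integer hidden states |h| <= 10,
# weights from a discrete Gaussian, components N(mu(h), sigma).
hs = np.arange(-10, 11)
w = np.exp(-0.5 * hs**2 / 4.0)
w /= w.sum()

sigma = 1.0

def sample(n):
    """Sample visible units: draw h from the discrete weights, then x | h."""
    h = rng.choice(hs, size=n, p=w)
    return rng.normal(0.5 * h, sigma)  # component mean mu(h) = 0.5 h

xs = sample(200_000)
```

The sample mean and variance match the mixture law: mean 0 by symmetry, and variance sigma² plus the variance of mu(h), about 2 for these toy parameters.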

A general Boltzmann machine with continuous visible and discrete integer valued hidden states is introduced. Under mild assumptions about the connection matrices, the probability density function of the visible units can be solved for analytically, yielding a novel parametric density function involving a ratio of Riemann-Theta functions. The condit...

Precision phenomenology at the LHC requires accounting for both higher-order QCD and electroweak corrections as well as for photon-initiated subprocesses. Building upon the recent NNPDF3.1 fit, in this work the photon content of the proton is determined within a global analysis supplemented by the LUXqed constraint relating the photon PDF to lepton...

In this proceedings we perform a brief review of machine learning (ML) applications in theoretical High Energy Physics (HEP-TH). We start the discussion by defining and then classifying machine learning tasks in theoretical HEP. We then discuss some of the most popular and recent published approaches with focus on a relevant case study topic: the d...

We discuss the current minimisation strategies adopted by research projects involving the determination of parton distribution functions (PDFs) and fragmentation functions (FFs) through the training of neural networks. We present a short overview of a proton PDF determination obtained using the covariance matrix adaptation evolution strategy (CMA-E...

We present NNFF1.0, a new determination of the fragmentation functions (FFs) of charged pions, charged kaons, and protons/antiprotons from an analysis of single-inclusive hadron production data in electron-positron annihilation. This determination, performed at leading, next-to-leading, and next-to-next-to-leading order in perturbative QCD, is base...

We present a new set of parton distributions, NNPDF3.1, which updates NNPDF3.0, the first global set of PDFs determined using a methodology validated by a closure test. The update is motivated by recent progress in methodology and available data, and involves both. On the methodological side, we now parametrize and determine the charm PDF alongside...

We present a preliminary strategy for modeling multidimensional distributions through neural networks. We study the efficiency of the proposed strategy by considering as input data the two-dimensional next-to-next-to-leading order (NNLO) jet k-factors distribution for the ATLAS 7 TeV 2011 data. We then validate the neural network model in terms of int...

We present an unbiased determination of the charm content of the proton, in which the charm parton distribution function (PDF) is parametrized on the same footing as the light quarks and the gluon in a global PDF analysis. This determination relies on the NLO calculation of deep-inelastic structure functions in the FONLL scheme, generalized to acco...

This Report summarizes the results of the activities of the LHC Higgs Cross Section Working Group in the period 2014-2016. The main goal of the working group was to present the state-of-the-art of Higgs physics at the LHC, integrating all new results that have appeared in the last few years. The first part compiles the most up-to-date predictions o...