Xiangyang Ju’s research while affiliated with Lawrence Berkeley National Laboratory and other places

Publications (6)


Table preview: the hyperparameters optimized in the study, with the lower and upper bounds and scaling factors for both the broad search (identical search ranges for both case studies) and the focus search (ranges chosen based on the broad-search results).
Hyperparameter Optimization of Generative Adversarial Network Models for High-Energy Physics Simulations
  • Preprint
  • File available

October 2022 · 48 Reads · 3 Citations

Vincent Dumont · Xiangyang Ju · Juliane Mueller

The Generative Adversarial Network (GAN) is a powerful and flexible tool that can learn to generate high-fidelity synthetic data. It has seen many applications in simulating events in High Energy Physics (HEP), including simulating detector responses and physics events. However, training GANs is notoriously hard, and optimizing their hyperparameters even more so. It normally takes many trial-and-error training attempts to achieve stable training and reach reasonable fidelity, and significant tuning work is needed to reach the accuracy required by physics analyses. This work uses the physics-agnostic and high-performance-computing-friendly hyperparameter optimization tool HYPPO to optimize and examine the sensitivities of the hyperparameters of a GAN for two independent HEP datasets. This work provides the first insights into efficiently tuning GANs for Large Hadron Collider data. We show that, given proper hyperparameter tuning, we can find GANs that provide high-quality approximations of the desired quantities. We also provide guidelines for tuning GAN architectures using the analysis tools in HYPPO.
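
As a minimal, hedged sketch of the broad-search stage described above: a random search over a few GAN hyperparameters, where train_and_score() is a placeholder standing in for training the GAN and returning a fidelity metric (lower is better). This is illustrative only and does not use HYPPO's actual API; the search space and trial budget are made-up examples.

```python
# Broad random hyperparameter search for a GAN (illustrative sketch, not HYPPO's API).
import math
import random

SEARCH_SPACE = {
    "learning_rate": (1e-5, 1e-3),  # sampled log-uniformly
    "latent_dim": (16, 256),        # sampled uniformly as an integer
    "batch_size": (64, 1024),       # sampled uniformly as an integer
}

def sample_config(rng):
    lo, hi = SEARCH_SPACE["learning_rate"]
    return {
        "learning_rate": 10 ** rng.uniform(math.log10(lo), math.log10(hi)),
        "latent_dim": rng.randint(*SEARCH_SPACE["latent_dim"]),
        "batch_size": rng.randint(*SEARCH_SPACE["batch_size"]),
    }

def train_and_score(config):
    # Placeholder: a real study would train the GAN with `config` and return a
    # fidelity metric (e.g. a distance between generated and reference distributions).
    return random.random()

rng = random.Random(0)
trials = [sample_config(rng) for _ in range(50)]   # broad-search trial budget
best_score, best_config = min(
    ((train_and_score(c), c) for c in trials), key=lambda t: t[0]
)
print(best_score, best_config)
```

A subsequent focus search would repeat the loop with ranges narrowed around best_config, mirroring the broad/focus split shown in the table preview above.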


Hyperparameter Optimization of Generative Adversarial Network Models for High-Energy Physics Simulations

August 2022 · 60 Reads

The Generative Adversarial Network (GAN) is a powerful and flexible tool that can learn to generate high-fidelity synthetic data. It has seen many applications in simulating events in High Energy Physics (HEP), including simulating detector responses and physics events. However, training GANs is notoriously hard, and optimizing their hyperparameters even more so. It normally takes many trial-and-error training attempts to achieve stable training and reach reasonable fidelity, and significant tuning work is needed to reach the accuracy required by physics analyses. This work uses the physics-agnostic and high-performance-computing-friendly hyperparameter optimization tool HYPPO to optimize and examine the sensitivities of the hyperparameters of a GAN for two independent HEP datasets. This work provides the first insights into efficiently tuning GANs for Large Hadron Collider data. We show that, given proper hyperparameter tuning, we can find GANs that provide high-quality approximations of the desired quantities. We also provide guidelines for tuning GAN architectures using the analysis tools in HYPPO.


BROOD: Bilevel and Robust Optimization and Outlier Detection for Efficient Tuning of High-Energy Physics Event Generators

January 2022 · 60 Reads · 6 Citations

SciPost Physics Core

Wenjing Wang · Mohan Krishnamoorthy · Juliane Muller · [...] · Zachary Marshall

The parameters in Monte Carlo (MC) event generators are tuned on experimental measurements by evaluating the goodness of fit between the data and the MC predictions. The relative importance of each measurement is adjusted manually in an often time-consuming, iterative process to meet different experimental needs. In this work, we introduce several optimization formulations and algorithms with new decision criteria for streamlining and automating this process. These algorithms are designed for two formulations: bilevel optimization and robust optimization. Both formulations are applied to the datasets used in the ATLAS A14 tune and to dedicated hadronization datasets generated with the SHERPA generator. The corresponding tuned generator parameters are compared using three metrics. We compare the quality of our automatic tunes to the published ATLAS A14 tune. Moreover, we analyze the impact of a pre-processing step that excludes data that cannot be described by the physics models used in the MC event generators.
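
As a rough illustration of the objective being automated (a hedged sketch, not BROOD's implementation): each observable bin b contributes a weighted chi-square term built from a cheap surrogate f_b(p) of the MC prediction, the measured value R_b, and their uncertainties, and the inner problem fits the generator parameters p for a fixed set of observable weights. The surrogate functions, toy bin values, and single-parameter setup below are hypothetical.

```python
# Schematic of the weighted goodness-of-fit minimized in generator tuning
# (illustrative sketch under toy assumptions, not BROOD's code).
from scipy.optimize import minimize

def chi2(p, weights, surrogates, data, data_err):
    # sum_b  w_b * (f_b(p) - R_b)^2 / (Delta f_b(p)^2 + Delta R_b^2)
    total = 0.0
    for w, f, r, dr in zip(weights, surrogates, data, data_err):
        pred, dpred = f(p)            # surrogate value and its uncertainty at p
        total += w * (pred - r) ** 2 / (dpred ** 2 + dr ** 2)
    return total

def inner_tune(weights, surrogates, data, data_err, p0):
    # Inner problem: best-fit generator parameters for fixed observable weights.
    res = minimize(chi2, p0, args=(weights, surrogates, data, data_err))
    return res.x, res.fun

# Toy usage: two bins with linear surrogates in a single generator parameter p[0].
surrogates = [lambda p: (1.0 + 0.5 * p[0], 0.02),
              lambda p: (2.0 - 0.3 * p[0], 0.02)]
p_best, chi2_min = inner_tune([1.0, 1.0], surrogates, [1.4, 1.8], [0.05, 0.05], p0=[0.0])
print(p_best, chi2_min)
```

Roughly, the bilevel formulation wraps an outer optimizer around inner_tune to choose the weights according to a decision criterion, while the robust formulation instead protects the fit against unfavourable weightings.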


Apprentice for Event Generator Tuning

March 2021 · 21 Reads

Apprentice is a tool developed for event generator tuning. It contains a range of conceptual improvements and extensions over the tuning tool Professor. Its core functionality remains the construction of a multivariate analytic surrogate model of computationally expensive Monte Carlo event-generator predictions. The surrogate model is used for numerical optimization in chi-square minimization and likelihood evaluation. Apprentice also introduces algorithms to automate the selection of observable weights to minimize the effect of mis-modeling in the event generators. We illustrate our improvements for the task of MC-generator tuning and limit setting.
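
To make the surrogate idea concrete, here is a minimal one-dimensional sketch under toy assumptions: a polynomial is fit to a handful of (expensive) generator predictions for a single bin as a function of one generator parameter, and the chi-square is then minimized on the cheap surrogate instead of the generator itself. Apprentice builds multivariate surrogates over many bins and parameters; the toy numbers and single-parameter setup below are purely illustrative, not its API.

```python
# One-dimensional toy sketch of surrogate-based tuning (illustrative, not Apprentice's API).
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

# Stand-in for expensive MC runs: one bin's prediction at 9 sampled parameter values.
p_samples = np.linspace(0.0, 2.0, 9)
mc_values = 1.0 + 0.8 * p_samples - 0.2 * p_samples**2 + rng.normal(0.0, 0.02, p_samples.size)

# Cheap analytic surrogate for this bin: a cubic polynomial fit to the MC runs.
surrogate = np.poly1d(np.polyfit(p_samples, mc_values, deg=3))

# Chi-square against a made-up measured bin value, evaluated on the surrogate.
R, dR = 1.7, 0.1
chi2 = lambda p: (surrogate(p) - R) ** 2 / dR**2

best = minimize_scalar(chi2, bounds=(0.0, 2.0), method="bounded")
print("tuned parameter:", best.x, "chi2:", best.fun)
```

In a real tune, one surrogate per observable bin enters a weighted sum of such terms, and the minimization runs over all generator parameters simultaneously.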


BROOD: Bilevel and Robust Optimization and Outlier Detection for Efficient Tuning of High-Energy Physics Event Generators

March 2021 · 16 Reads

The parameters in Monte Carlo (MC) event generators are tuned on experimental measurements by evaluating the goodness of fit between the data and the MC predictions. The relative importance of each measurement is adjusted manually in an often time-consuming, iterative process to meet different experimental needs. In this work, we introduce several optimization formulations and algorithms with new decision criteria for streamlining and automating this process. These algorithms are designed for two formulations: bilevel optimization and robust optimization. Both formulations are applied to the datasets used in the ATLAS A14 tune and to dedicated hadronization datasets generated with the SHERPA generator. The corresponding tuned generator parameters are compared using three metrics. We compare the quality of our automatic tunes to the published ATLAS A14 tune. Moreover, we analyze the impact of a pre-processing step that excludes data that cannot be described by the physics models used in the MC event generators.


Figure preview (Figure 2): cumulative distribution of bins (y axis) in each category of the A14 dataset at different bands of variance levels (x axis), given by $r_b(p) = \frac{(f_b(p) - R_b)^2}{\Delta f_b(p)^2 + \Delta R_b^2}$.
Apprentice for Event Generator Tuning

January 2021 · 42 Reads · 26 Citations

The European Physical Journal Conferences

APPRENTICE is a tool developed for event generator tuning. It contains a range of conceptual improvements and extensions over the tuning tool Professor. Its core functionality remains the construction of a multivariate analytic surrogate model of computationally expensive Monte Carlo event-generator predictions. The surrogate model is used for numerical optimization in chi-square minimization and likelihood evaluation. Apprentice also introduces algorithms to automate the selection of observable weights to minimize the effect of mis-modeling in the event generators. We illustrate our improvements for the task of MC-generator tuning and limit setting.

Citations (3)


... Spectral normalization is better in both cases outperforming the baseline. Hyperparameter optimisation for deep models is an important part for good training results as shown in further research studies [19], [20], [21], [22]. ...

Reference:

Pix2Pix Hyperparameter Optimisation Towards Ideal Universal Image Quality Index Score
Hyperparameter Optimization of Generative Adversarial Network Models for High-Energy Physics Simulations

... These tools make it possible to validate new and alternative MC models in a homogeneous and standardised way. They also play an important role in the context of the growing field of MC tuning; see, e.g., [7,12,[15][16][17][18][19][20][21][22][23][24][25][26][27][28]. ...

BROOD: Bilevel and Robust Optimization and Outlier Detection for Efficient Tuning of High-Energy Physics Event Generators

SciPost Physics Core

... A practical use-case for our studies is to consider the fitting or tuning of parameters inside of Monte Carlo event generators [22][23][24]. Tuning consists of simulating a large number of events with a 'base' parameterization θ, comparing the output distributions to experimental data, and updating the base parameterization to a new parameterization θ ′ following a prescribed procedure [25][26][27][28][29]. The first and third steps present as the primary bottlenecks encountered when performing full event-generator tunes. ...

Apprentice for Event Generator Tuning

The European Physical Journal Conferences