Journal of Computational Physics, November 2020
This paper presents mathematical formulations and methods to predict the effect of forced-flight variance reduction on Monte Carlo tally variance and calculation time. Biasing operators are derived and used to construct a history-score probability density function (HSPDF), which represents all possible Monte Carlo random walks and gives the probability of a history scoring in a tally from a particular phase-space position. The history-score moment equations (HSMEs), the statistical moments of the HSPDF, are then derived to describe the statistical behavior of the Monte Carlo tally when forced-flight variance reduction is applied. In addition, the future-time equation (FTE) is derived to predict the Monte Carlo computational time that results from applying forced-flight variance reduction. Together, the solutions of the HSMEs and the FTE can be used to predict Monte Carlo computational cost. This work also describes a discrete ordinates method for solving the forced-flight HSMEs and FTE. Several 1-D and 2-D test problems verify that the derivations and their implementation are correct.
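As a rough illustration of how such predictions combine into a single cost estimate, the conventional Monte Carlo figure of merit relates the tally relative error (obtained from the first two history-score moments) to the total computational time (obtained from the per-history time). The sketch below is a minimal example of that relation under stated assumptions, not the paper's solver; the function name, the moment values, and the per-history time are hypothetical placeholders standing in for quantities that would come from deterministic solutions of the HSMEs and FTE.

```python
# Minimal sketch (not the paper's code): combining the first two history-score
# moments with a predicted per-history time to estimate Monte Carlo cost.
# M1, M2, and t_history are assumed to come from solutions of the HSMEs and FTE;
# the interface and numbers are illustrative only.

import math

def predicted_cost(M1: float, M2: float, t_history: float, n_histories: int):
    """Estimate tally statistics and figure of merit for a planned Monte Carlo run.

    M1, M2      : first and second moments of the history-score PDF (from the HSMEs)
    t_history   : expected computational time per history (from the FTE)
    n_histories : number of independent histories in the planned run
    """
    variance = M2 - M1**2                                # per-history score variance
    rel_error = math.sqrt(variance / n_histories) / M1   # relative error of the tally mean
    total_time = n_histories * t_history                 # predicted total run time
    fom = 1.0 / (rel_error**2 * total_time)              # figure of merit; larger is cheaper
    return rel_error, total_time, fom

# Example: an analog run versus a forced-flight-biased run that lowers the
# score variance at the price of a longer time per history (hypothetical values).
print(predicted_cost(M1=1.0e-6, M2=5.0e-12, t_history=2.0e-5, n_histories=10**6))
print(predicted_cost(M1=1.0e-6, M2=1.5e-12, t_history=6.0e-5, n_histories=10**6))
```

In this toy comparison the biased run triples the time per history yet still raises the figure of merit, which is the kind of trade-off the HSME and FTE solutions are meant to predict before any Monte Carlo histories are run.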