Coupled Monte Carlo Transport and Conjugate Heat Transfer for
Wire-Wrapped Bundles Within the MOOSE Framework
A. J. Novak,a* P. Shriwise, P. K. Romano,a R. Rahaman,b E. Merzari,c and D. Gastond

aArgonne National Laboratory, Lemont, Illinois
bGeorgia Institute of Technology, Atlanta, Georgia
cPennsylvania State University, State College, Pennsylvania
dIdaho National Laboratory, Idaho Falls, Idaho

*E-mail: anovak@anl.gov
Received August 19, 2022
Accepted for Publication December 12, 2022
Abstract — Cardinal is an open-source application that couples OpenMC Monte Carlo transport and
NekRS computational fluid dynamics (CFD) to the Multiphysics Object-Oriented Simulation Environment
(MOOSE), closing neutronics and thermal-fluid gaps in conducting high-resolution multiscale and multi-
physics analyses of nuclear systems. We first provide a brief introduction to Cardinal’s software design, data
mapping, and coupling strategy to highlight our approach to overcoming common challenges in high-
fidelity multiphysics simulations. We then present two Cardinal simulations for hexagonal pin bundles. The
first is a validation of Cardinal’s conjugate heat transfer coupling of NekRS’s Reynolds-Averaged Navier-Stokes model with MOOSE’s heat conduction physics for a bare seven-pin Freon-12 bundle flow experi-
ment. Predictions for pin surface temperatures under three different heating modes agree reasonably well
with experimental data and similar CFD modeling from the literature. The second simulation is
a multiphysics coupling of OpenMC, NekRS, and BISON for a reduced-scale, seven-pin wire-wrapped
version of an Advanced Burner Reactor bundle. Wire wraps are approximated using a momentum source
model, and coupled predictions are provided for velocity, temperature, and power distribution.
Keywords — Cardinal, MOOSE, NekRS, OpenMC, computational fluid dynamics, Monte Carlo.
Note — Some figures may be in color only in the electronic version.
I. INTRODUCTION
Many reactor phenomena are inherently multiscale
and multiphysics. The time and length scales characteriz-
ing reactor systems typically span many orders of mag-
nitude. At the lower time and length scales, fine-scale
effects include turbulent energy dissipation, turbulent
boundary layers, and microstructure behavior in materi-
als. At the higher time and length scales, large-scale
effects are typically described in terms of core pressure
drop and bulk energy balances with characteristic lengths
on the order of meters and with timescales that reflect the
interaction between the reactor core and balance of plant
systems. Physics phenomena over this broad range of
scales have significant implications for reactor design
and licensing, and multiscale techniques are often neces-
sary to achieve full-core simulations. In addition, the
multiphysics interactions between neutron transport, ther-
mal fluids, materials, chemistry, and structural mechanics
should be accounted for to ensure safe operation while
maximizing performance.
Historically, a common challenge to multiscale and
multiphysics modeling of nuclear systems has been the
development of tools for neutron transport, thermal
hydraulics (T/H), and solid mechanics using a wide vari-
ety of spatial discretization schemes, software architec-
tures, and solution data structures. Depending on the code
design, it may not be a simple feat to establish just the
“mechanics” of scale and physics coupling—the data
transfers, parallel communication, and iterative solution
—among diverse physics codes.
The Multiphysics Object-Oriented Simulation
Environment (MOOSE) is a finite element and finite
volume framework that allows applied math practitioners
to translate physics models into high-quality, state-of-the-
art engineering software while also providing solutions to
many of the software mechanics challenges encountered
in multiphysics/multiscale research (Ref. 1). Because all
MOOSE applications are based on the same framework,
a shared data transfer and field interpolation system can
be used to couple MOOSE applications to one another
through source terms, boundary conditions (BCs), and
virtually any other mechanism by which physics and
scales might be coupled. This common execution and
data transfer system provides an opportunity for platform
coupling, where the MOOSE framework is used as an
application programming interface (API) for coupling
codes.
Cardinal is an open-source application that wraps the
NekRS computational fluid dynamics (CFD) code (Ref. 2) and the OpenMC Monte Carlo particle transport code (Ref. 3) within the
MOOSE framework. Cardinal leverages MOOSE’s physics
and geometry-agnostic multi-application and data transfer
systems to enable high-resolution multiphysics feedback to
the MOOSE ecosystem. By adopting MOOSE’s plug-and-
play philosophy, Cardinal has been applied to very diverse systems, including multiphysics couplings of NekRS, OpenMC, and BISON for pebble bed reactors (PBRs) ranging from 1568 to 127 000 pebbles (Refs. 4 and 5); multiphysics couplings of NekRS, OpenMC, BISON, and THM for high-temperature gas reactors (HTGRs) ranging from the unit cell to microreactor scale (Ref. 6); separate and overlapping domain coupling of NekRS and SAM for systems-level analysis (Ref. 7); and coupling of NekRS and the MOOSE tensor mechanics module for pressurized thermal shock in light water reactors (LWRs) (Ref. 8).
Building off the success of this work, Cardinal has
recently expanded into fast reactor applications as part of
a multiyear project that aims to improve the understand-
ing of a phenomenon known as core radial expansion.
Core radial expansion is an important reactivity feedback
effect in fast-spectrum systems where interactions
between solid mechanics, neutronics, and T/H induce
differential thermal expansion, irradiation swelling, and
irradiation creep. Changes to duct geometries have sig-
nificant implications for reactor control (Refs. 9 and 10) and refueling operations (Ref. 11), but this physics has yet to be modeled at high fidelity (Ref. 12). This project aims to apply Cardinal to core
radial expansion modeling of fast reactors using tightly
coupled neutronics, T/H, and solid mechanics physics.
This paper is one step in the direction of a full-core radial
expansion simulator, with the following objectives: (1)
introduce the coupling methodologies used in Cardinal,
(2) validate Cardinal’s conjugate heat transfer (CHT) coupling of NekRS to BISON using a bare, seven-pin Freon-12 experiment (Ref. 13), and (3) demonstrate a tight coupling of NekRS, OpenMC, and BISON for a small-scale version of a driver fuel assembly in the Advanced Burner Reactor (ABR) (Ref. 14).
The remainder of this paper is organized as follows. Section II
introduces the single-physics computational tools used in
the present analysis, NekRS, OpenMC, and BISON, as
well as how they are coupled together via Cardinal. Next,
Sec. III presents validation of Cardinal’s NekRS-MOOSE
coupling with temperature data obtained from a seven-pin
bare bundle. Section IV then presents a fully coupled
multiphysics simulation of a seven-pin wire-wrapped
bundle by incorporating OpenMC Monte Carlo transport.
Pin-resolved predictions are provided for the velocity,
fission distribution, fluid temperature, and solid tempera-
ture. Finally, Sec. V revisits the limitations of the model-
ing and simulation and outlines future work.
II. COMPUTATIONAL TOOLS
This section introduces Cardinal and describes how
OpenMC and NekRS are coupled to MOOSE. Then,
Secs. II.B, II.C, and II.D provide additional model infor-
mation for the single-physics OpenMC, NekRS, and
BISON models in the context of the present applications.
Section II.E concludes with a description of Picard
iteration.
II.A. Cardinal
MOOSE was initially developed for solving coupled
systems of nonlinear partial differential equations (PDEs) (Ref. 1). To utilize MOOSE in this manner, applied
math practitioners can create C++ objects in an object-
oriented framework to represent the physics kernels, BCs,
material properties, executioners, and any other aspects of
the governing equations and solution strategy. MOOSE
then coordinates libMesh and PETSc to discretize space
using the finite element or finite volume method and
solve the nonlinear system. Many such “native”
MOOSE applications have been developed, spanning
domains including nuclear reactor physics (Ref. 15), nuclear fuel performance (Ref. 16), systems-level T/H (Ref. 17), porous media T/H (Ref. 18), and heat pipes (Ref. 19).
In recent years, MOOSE has added the capability for
coupled solves with external applications that are based on
entirely different solution methodologies and software stacks.
An external code can be wrapped into a MOOSE application
by overriding a few key interface functions in MOOSE’s code
base to initialize, run, and postprocess results for the external
code, while exposing the time-stepping, synchronization, and
data transfer systems in MOOSE. These MOOSE-wrapped
codes are themselves MOOSE applications with physics
engines substituted with API calls to the external code base.
Such applications are also referred to as “non-native”
MOOSE applications because all aspects of the simulation
are pulled from external libraries.
Cardinal is a “non-native” MOOSE application that
wraps OpenMC and NekRS within MOOSE, allowing the
radiation transport and CFD physics engines in OpenMC and
NekRS to interact with the MOOSE framework. Cardinal
essentially translates the NekRS and OpenMC solutions into
a MOOSE-compatible format that can then be applied as
feedback to any other MOOSE application using MOOSE’s
mesh-to-mesh mapping features. Cardinal is designed with
a general plug-and-play structure that allows OpenMC and
NekRS to be mixed and matched with other MOOSE appli-
cations or even one another. For instance, the same OpenMC
model can provide neutronics feedback to NekRS turbulence-
resolved CFD, Pronghorn subchannel/porous media models,
and SAM one-dimensional flow loop models (Refs. 2, 17, and 20). In a similar fashion, the same NekRS model can provide CFD feedback to a MOOSE tensor mechanics model, BISON fuel performance, Griffin deterministic neutronics, and OpenMC radiation transport (Refs. 3 and 16).
This flexibility is important to support
the diverse advanced reactor landscape, where there is not
a “one size fits all” multiphysics toolset.
A detailed introduction to the software design, data
transfers, and coupling strategy used in Cardinal can be
found in our previous work (Ref. 6). Extensive documentation on Cardinal’s build system, input file syntax, and model setup can also be found on the Cardinal website (Ref. 21).
Here, we briefly summarize the high-level design and
advantages in contrast with other high-fidelity multiphy-
sics works.
The Cardinal software consists of three main steps to
wrap NekRS and OpenMC within MOOSE. Representing
NekRS and OpenMC symbolically as χ, these three
steps are
1. Copy the mesh/geometry of χ into the
MooseMesh format. This “mesh mirror” is the receiving
point for all field data sent in/out of χ.
2. Establish a spatial mapping from χ’s geometry to
the mesh mirror.
3. Solve χ:
a. Read data from MooseVariable(s) defined on
the mesh mirror and send to χ.
b. Run χ for one time step.
c. Read data from χ and write into
MooseVariable(s) defined on the mesh mirror.
Figure 1 depicts the overall relationship of Cardinal,
NekRS, and OpenMC to the MOOSE framework. In
gray circles are shown a number of native MOOSE
applications that perform physics solves using the
MOOSE framework. Conversely, the models used by
NekRS (a high-order spectral element solution) and
OpenMC [a cell-uniform Constructive Solid Geometry
(CSG) solution] are shown in the upper and lower right,
respectively.
Cardinal interfaces NekRS and OpenMC with
MOOSE’s mesh-to-mesh field transfer system by copying
the external solutions to and from a mesh mirror, an
intermediate layer between NekRS/OpenMC and
MOOSE. The solid red arrows in Fig. 1 represent data
transfers facilitated by Cardinal, while all dashed black
arrows represent data transfers performed by MOOSE.
The particular fields transferred between applications
(heat flux, power, temperature, etc.) are customizable
and are dependent on the particular system being
modeled.
Cardinal’s design eliminates many limitations com-
mon to earlier T/H and Monte Carlo couplings. All data
are communicated in memory, obviating the need for code-specific input/output (I/O) programs (Refs. 22-24) and reducing file-based communication bottlenecks (Ref. 25). In addition, spatial mappings to MOOSE are constructed automatically with no requirements on node/element/cell alignment. This eliminates the need for rigid one-to-one mappings (Ref. 26) or careful attention to cell volumes (Ref. 27), but
more importantly, it allows for geometry-agnostic data
transfers. Cardinal’s diverse applications to PBRs,
sodium fast reactors (SFRs), LWRs, and HTGRs are all
conducted without the need to develop any custom source
code or file I/O manipulation scripts.
Cardinal supports distributed mesh data transfers
between NekRS, OpenMC, and MOOSE. Both MOOSE
and NekRS may distribute the mesh and solutions among
MPI ranks, reducing the memory associated with trans-
fers. OpenMC replicates the geometry across all ranks,
but may still communicate with a domain-decomposed
MOOSE solve. In addition, NekRS supports both CPU
and graphics processing unit (GPU) backends. When
GPUs are available, Cardinal facilitates data transfers
between the host (where MOOSE and OpenMC run) and
device (where NekRS runs). A GPU port is currently
under development in OpenMC, with plans for support via Cardinal (Ref. 28).
Cardinal also contains a rich postproces-
sing system to generate coarse-mesh T/H closures
directly from NekRS simulations.
Sections II.B and II.C describe Cardinal’s OpenMC and
NekRS wrappings in greater detail, with an emphasis on the
particular models used in the present work. Section II.D also
briefly covers BISON, which provides the solid heat conduc-
tion solver. An in-depth description of the Picard coupling
strategy is then presented in Sec. II.E.
II.B. OpenMC
OpenMC is an open-source continuous-energy neutron-photon Monte Carlo code with capabilities for cell and libMesh unstructured mesh tallies, k-eigenvalue and fixed-source calculations, event- and history-based parallelism, depletion, windowed multipole on-the-fly Doppler broadening, and many other features (Ref. 3). In the present work,
the OpenMC models are built using a CSG cell-based
geometry. Woodcock delta tracking and mesh-based geome-
tries are both currently under development, so temperatures
and densities are uniform over an individual cell.
The fission distribution is measured with a kappa-
fission tally, the recoverable energy release from fission,
in units of electron-volt/source. Cross-section data are
provided with the ENDF/B-VII.1 library, which has data
sets between 250 and 2500 K. For the S(α,β) thermal scattering data and unresolved resonance region probability tables, an in-memory stochastic linear-linear interpolation between the nearest two temperature data sets is performed, whereas the windowed multipole method is used for the resolved resonance range.
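To illustrate the stochastic interpolation idea, the following minimal sketch (not OpenMC's actual API; the helper name is hypothetical) samples which of two bracketing temperature data sets to use so that, on average, the result is equivalent to linear interpolation in temperature:

```python
import numpy as np

def sample_temperature_dataset(T, T_low, T_high, rng=np.random.default_rng()):
    """Pick one of two bracketing library temperatures such that the
    expectation over many samples equals linear interpolation in T.
    Sketch only; not OpenMC's actual implementation."""
    frac = (T - T_low) / (T_high - T_low)   # interpolation fraction in [0, 1]
    return T_high if rng.random() < frac else T_low

# Example: a cell at 600 K bracketed by 500 K and 900 K data sets
# selects the 900 K data set with probability 0.25.
picks = [sample_temperature_dataset(600.0, 500.0, 900.0) for _ in range(10000)]
print(np.mean([p == 900.0 for p in picks]))  # approximately 0.25
```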
Cardinal couples OpenMC to MOOSE through the
fission distribution and cross-section feedback via cell tem-
peratures and densities. During initialization, Cardinal loops
over the elements in the mesh mirror and maps each by
centroid to an OpenMC cell to obtain a mapping between
cells c and elements e. For a given cell c, the fission tally is
written as a constant monomial field to e, while tempera-
tures and densities are updated from volume averages of
these fields over e. An example CSG geometry and mesh
Fig. 1. Overall relationship of Cardinal, NekRS, and OpenMC to the MOOSE framework. All gray circles are native MOOSE
applications. Solid red arrows indicate data transfers performed by Cardinal, while dashed black arrows indicate data transfers
performed using MOOSE.
mirror is shown in the lower right of Fig. 1. The CSG
geometry is colored by the cell ID, and the inset shows the
actual boundary of an OpenMC cell as a white dashed line.
The element centroids, shown as white dots, determine the
cell-to-element mapping. There are no requirements on the
alignment of elements/cells or on preserving volumes. The
OpenMC cells and mesh mirror elements do not need to be
conformal, and any distortion is considered a discretization
error that is addressed via formal mesh refinement studies.
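The centroid-based mapping can be sketched as follows (an illustration only; `find_cell_id` stands in for the point-location call into the Monte Carlo geometry, which Cardinal performs through OpenMC's C++ routines rather than Python):

```python
import numpy as np

def build_cell_to_element_map(element_centroids, find_cell_id):
    """Map each mesh-mirror element to the OpenMC cell containing its
    centroid. No alignment or volume preservation is required between
    cells and elements."""
    cell_to_elems = {}
    for elem, xyz in enumerate(element_centroids):
        cell = find_cell_id(xyz)
        cell_to_elems.setdefault(cell, []).append(elem)
    return cell_to_elems

# Toy example: a 1D "geometry" with two cells split at x = 0.5.
centroids = np.array([[0.1, 0, 0], [0.4, 0, 0], [0.6, 0, 0], [0.9, 0, 0]])
mapping = build_cell_to_element_map(centroids, lambda p: 0 if p[0] < 0.5 else 1)
print(mapping)  # {0: [0, 1], 1: [2, 3]}
```

With such a map, a cell's tally value can be written as a constant field to all of its mapped elements, while cell temperatures and densities can be set from volume averages over those same elements.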
II.C. NekRS
NekRS is an open-source spectral element CFD code with capabilities for Reynolds-averaged Navier-Stokes (RANS), large eddy simulation, and direct numerical simulation (Ref. 2).
By using the open concurrent compute abstraction
(OCCA) interface, NekRS supports both CPU and GPU
backends. In the present work, NekRS solves for con-
stant-property mass, momentum, and energy conservation
with an incompressible RANS model:
$$\nabla \cdot \vec{u} = 0 , \quad (1)$$

$$\rho_f \left( \frac{\partial \vec{u}}{\partial t} + \vec{u} \cdot \nabla \vec{u} \right) = -\nabla P + \nabla \cdot \left[ \left( \mu_f + \mu_T \right) \nabla \vec{u} \right] + \rho_f \vec{f} , \quad (2)$$

and

$$\rho_f C_{p,f} \left( \frac{\partial T_f}{\partial t} + \vec{u} \cdot \nabla T_f \right) = \nabla \cdot \left[ \left( k_f + k_T \right) \nabla T_f \right] , \quad (3)$$

where

u⃗ = velocity
ρf = density
P = pressure
μf = laminar dynamic viscosity
μT = turbulent dynamic viscosity
Cp,f = isobaric specific heat capacity
Tf = temperature
kf = laminar thermal conductivity
kT = turbulent thermal conductivity
f⃗ = a generic momentum source.

The turbulent Prandtl number PrT relates kT and μT,

$$\mathrm{Pr}_T \equiv \frac{\mu_T C_{p,f}}{k_T} . \quad (4)$$
Variable-property RANS modeling in NekRS is
still under development, so we expect to incur some
error from the use of a constant-property RANS
model for CHT. For the bare bundle cases in
Sec. III, bulk temperature rises are on the order of
10°C such that errors from constant properties (taken
at the average fluid conditions) are expected to be
minimal. For the coupled physics cases in Sec. IV,
the bulk temperature rise of 155° C is expected to
cause changes in ρf, Cp;f, kf, and μf on the order of
4%, 2%, 10%, and 20%, respectively, between inlet
and outlet. Because these simulations are meant to be
a proof-of-concept multiphysics demonstration, small
differences due to thermophysical property variations
will be accounted for in future work.
In Sec. III, PrT = 0.9 is used for the Freon-12 simulations, while the Aoki correlation (Ref. 29) is used for the sodium simulations in Sec. IV. The k-τ RANS model is then used to evaluate μT based on two additional PDEs for the turbulent kinetic energy k and the inverse specific dissipation rate τ (Refs. 30 and 31). Wall functions in NekRS are currently under development, so all NekRS models are wall-resolved such that y+ < 1.
Both bare and wire-wrapped bundles are modeled in
this work. The long-term goals of this research project are
to conduct large-scale CFD simulations of SFRs. When
compared to bare pin bundles, explicitly resolving the
wire wraps results in roughly an order of magnitude
higher number of mesh elements. The associated increase
in computational cost would restrict most of our simula-
tions to a few fuel bundles and exclude the possibility of
the large-scale CFD modeling needed to account for core
radial expansion. Therefore, the effect of the wire wraps
on cross flow is approximated with the Hu and Fanning momentum source model (MSM) (Ref. 32). This model attempts
to capture the important wire physics with a bare-bundle
mesh by correcting for the absence of meshed wires by
adding a momentum body force at each quadrature point
inside the wire region. In other words, an additional forcing term f⃗ that spirals around the pin is added to the momentum equation. The momentum source has components tangent to the wire ft, tangential to the pin but perpendicular to the wire fn, and perpendicular to both the wire and the pin fpn. The corresponding unit vectors n̂t, n̂n, and n̂pn are shown in Fig. 2 along with two angles, ϕ and θ.
The unit vectors are as follows:

$$\hat{n}_t = \sin\phi \cos\left(\theta + \frac{\pi}{2}\right) \hat{i} + \sin\phi \sin\left(\theta + \frac{\pi}{2}\right) \hat{j} + \cos\phi \, \hat{k} , \quad (5)$$

$$\hat{n}_n = \cos\phi \cos\left(\theta - \frac{\pi}{2}\right) \hat{i} + \cos\phi \sin\left(\theta - \frac{\pi}{2}\right) \hat{j} + \sin\phi \, \hat{k} , \quad (6)$$

and

$$\hat{n}_{pn} = -\cos\theta \, \hat{i} - \sin\theta \, \hat{j} , \quad (7)$$
where we note that in the original publication (Ref. 32) there is a misprint in the k̂ component of n̂t that has been corrected here. These definitions can be verified by noting that n̂n · n̂t = 0 and n̂n × n̂t = n̂pn. The momentum source is then expressed as
$$\vec{f} = \underbrace{f_B \frac{u_t^2}{2 D_w}}_{f_t} \hat{n}_t + \underbrace{\left( u_n \frac{\partial u_n}{\partial n_n} + u_t \frac{\partial u_n}{\partial n_t} + u_{pn} \frac{\partial u_n}{\partial n_{pn}} \right)}_{f_n} \hat{n}_n + \underbrace{\left( u_n \frac{\partial u_{pn}}{\partial n_n} + u_t \frac{\partial u_{pn}}{\partial n_t} + u_{pn} \frac{\partial u_{pn}}{\partial n_{pn}} \right)}_{f_{pn}} \hat{n}_{pn} , \quad (8)$$
where ut, un, and upn are the velocity components along
the corresponding unit vectors and Dw is the wire dia-
meter. In the two wire-normal directions, the primary
effect of the wire is to block the flow, such that the
body force components are the dominant contributions
to the change in momentum. In the wire-tangent direc-
tion, the primary effect of the wire is a frictional loss,
represented with a friction factor fB. By assuming that
velocity gradients are proportional to the normal velocity
and with other details from Hu and Fanning (Ref. 32), Eq. (8) is implemented as
$$\vec{f} = f_B \frac{u_t^2}{2 D_w} \hat{n}_t + C \frac{u_n \left( u_n + u_t \cos\phi + u_{pn} \right)}{D_w} \hat{n}_n + C \frac{u_{pn} \left( u_n + u_t \cos\phi + u_{pn} \right)}{D_w} \hat{n}_{pn} , \quad (9)$$

where C = 3.0 is a constant.
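For illustration only, the sketch below evaluates the unit vectors of Eqs. (5) through (7) and the momentum source of Eq. (9) at a single quadrature point. This is not NekRS's OCCA kernel; the sign conventions follow the equations as reconstructed above, and fB, C, and Dw are taken as given inputs:

```python
import numpy as np

def wire_unit_vectors(phi, theta):
    """Eqs. (5)-(7): tangent to the wire, normal to the wire but tangent
    to the pin, and perpendicular to both."""
    n_t = np.array([np.sin(phi) * np.cos(theta + np.pi / 2),
                    np.sin(phi) * np.sin(theta + np.pi / 2),
                    np.cos(phi)])
    n_n = np.array([np.cos(phi) * np.cos(theta - np.pi / 2),
                    np.cos(phi) * np.sin(theta - np.pi / 2),
                    np.sin(phi)])
    n_pn = np.array([-np.cos(theta), -np.sin(theta), 0.0])
    return n_t, n_n, n_pn

def momentum_source(u, phi, theta, f_B, C, D_w):
    """Evaluate the momentum source of Eq. (9) at one quadrature point,
    given the local velocity vector u (a sketch, not the NekRS kernel)."""
    n_t, n_n, n_pn = wire_unit_vectors(phi, theta)
    u_t, u_n, u_pn = np.dot(u, n_t), np.dot(u, n_n), np.dot(u, n_pn)
    blockage = u_n + u_t * np.cos(phi) + u_pn
    return (f_B * u_t**2 / (2.0 * D_w) * n_t
            + C * u_n * blockage / D_w * n_n
            + C * u_pn * blockage / D_w * n_pn)

# Sanity checks on the reconstructed unit vectors:
n_t, n_n, n_pn = wire_unit_vectors(phi=0.1, theta=0.7)
print(np.dot(n_n, n_t))            # ~0 (orthogonal)
print(np.cross(n_n, n_t) - n_pn)   # ~[0, 0, 0]
```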
Several insights from previous sensitivity studies (Ref. 32) are used to inform the present implementation in NekRS. First,
Hu and Fanning compared the effect of explicit capturing of
the wire surface in their bare bundle meshes (i.e., forcing
mesh elements to conform to the wire boundary) versus
meshing without regard to the wire-fluid interface. When
the wire surface is not explicitly captured, the cross-
sectional area of the momentum source is distorted, causing
errors in the pressure drop on the order of a few percent.
Normalized cross-flow distributions were still well predicted
for interior gaps, though were slightly overpredicted for gaps
in the peripheral region. When considering that the typical
predictive accuracy of CFD models is in the range of 5% to 20% for pressure drops (Refs. 33-36), these small errors are acceptable
trade-offs to further simplify the mesh. In the present work,
quadrature points are individually assigned to be inside/out-
side of the wire region, as shown in Fig. 3.
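A minimal geometric sketch of this assignment is given below. It assumes the wire centerline spirals around the pin at radius (Dp + Dw)/2 with axial pitch Lw and a per-pin starting angle theta0; these assumptions and the helper name are illustrative only, and the actual NekRS user file performs an analogous check at each quadrature point:

```python
import numpy as np

def in_wire_region(point, pin_center, D_p, D_w, L_w, theta0=0.0):
    """Return True if a quadrature point lies inside the helical wire
    region around a pin. Sketch: the wire is a circle of diameter D_w
    whose center follows a helix of radius (D_p + D_w)/2 and pitch L_w."""
    x, y, z = point[0] - pin_center[0], point[1] - pin_center[1], point[2]
    theta_wire = theta0 + 2.0 * np.pi * z / L_w      # wire angle at this elevation
    r_helix = 0.5 * (D_p + D_w)                      # radius of the wire centerline
    wire_center = r_helix * np.array([np.cos(theta_wire), np.sin(theta_wire)])
    return np.hypot(x - wire_center[0], y - wire_center[1]) < 0.5 * D_w

# Example using the ABR-like dimensions of Table IV (in meters):
print(in_wire_region(point=(0.00434, 0.0, 0.0), pin_center=(0.0, 0.0),
                     D_p=7.646e-3, D_w=1.03e-3, L_w=0.2032))
```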
Next, Hu and Fanning explored the sensitivity of the fB model. When the tangential friction was removed (fB = 0), the pressure drop was underpredicted by only a few percent, while the normalized cross-flow velocities were only slightly affected since the main driver of cross flow is the blocking
effect. Therefore, we also adopt the Blasius correlation for
tube flow in this work, despite the obvious differences
between tubes and helical wire wraps. Additional sensitivity
studies on other model coefficients, such as C in Eq. (9), will
be conducted as future work specifically aiming to validate
the wire-wrap MSM.
Finally, it should be noted that when using a MSM in
place of a wire-resolved mesh, local flow velocities, pres-
sures, and temperatures very close to the wire will not be
perfectly represented. Local temperature peaking on the
backside of wires (Refs. 34 and 37) may be underpredicted because the MSM does not strictly force the velocities to zero within the wire region. However, past comparisons between the MSM and wire-resolved CFD and heated experiments (Refs. 32 and 35) show acceptable accuracy in capturing cross-bundle tempera-
ture gradients and local pin surface temperatures, although
limited data are available for points very close to wires.
Further comparisons with wire-resolved CFD can help to
identify such limitations or bound analyses using the MSM.
Fig. 2. Unit vector and angle definitions for the wire-
wrap MSM.
In Sec. IV, cross-flow velocity predictions with a wire-wrap
model are presented in a manner that facilitates comparison
to seven-pin wire-resolved CFD simulations from the
literature.
This concludes the discussion of the MSM. Before clos-
ing this section, a brief discussion on the coupling of NekRS
to MOOSE is given. Cardinal contains two modes for cou-
pling NekRS to MOOSE: (1) boundary CHT coupling via
temperature and heat flux wall BCs, and (2) volume coupling
via heat sources, temperatures, and densities. In this work, we
combine both modes together, such that NekRS communi-
cates via CHT with BISON, but via densities and tempera-
tures with OpenMC.
During initialization, Cardinal automatically loops over
the elements in the high-order CFD mesh and reconstructs
the mesh as a lower-order mesh mirror. For CHT coupling,
only the boundaries through which NekRS is coupled to
MOOSE are rebuilt. Conversely, the entire NekRS mesh is
reconstructed for volumetric coupling. An example of
a spectral element CFD mesh and a volumetric mesh mirror
are shown in the upper right of Fig. 1. The spectral element
solution is then interpolated to/from a lower-order Lagrange basis on the mesh mirror using Vandermonde matrices (Ref. 6).
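The Vandermonde-based interpolation can be illustrated in one dimension with a simplified sketch (not Cardinal's or NekRS's implementation): build a Legendre Vandermonde matrix at the Gauss-Lobatto-Legendre (GLL) nodes, invert it to obtain modal coefficients, and evaluate those coefficients at the target (e.g., equispaced Lagrange) nodes.

```python
import numpy as np
from numpy.polynomial import legendre

def gll_nodes(N):
    """Gauss-Lobatto-Legendre nodes on [-1, 1] for polynomial order N:
    the endpoints plus the roots of P_N'(x)."""
    interior = legendre.Legendre.basis(N).deriv().roots()
    return np.concatenate(([-1.0], np.sort(np.real(interior)), [1.0]))

def interpolation_matrix(x_from, x_to, N):
    """Matrix mapping nodal values at x_from to values at x_to by passing
    through Legendre modal coefficients (Vandermonde matrices)."""
    V_from = legendre.legvander(x_from, N)   # nodal -> modal (after inversion)
    V_to = legendre.legvander(x_to, N)       # modal -> nodal at the targets
    return V_to @ np.linalg.inv(V_from)

# Interpolate a smooth function from 8th-order GLL nodes (a NekRS-like
# basis) to 5 equispaced points (a lower-order mesh-mirror basis).
N = 8
x_gll = gll_nodes(N)
x_target = np.linspace(-1.0, 1.0, 5)
J = interpolation_matrix(x_gll, x_target, N)
print(np.max(np.abs(J @ np.sin(x_gll) - np.sin(x_target))))  # small error
```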
II.D. BISON
BISON is a MOOSE-based fuel performance code applicable to a wide variety of nuclear fuels (Ref. 16). This work
solves the steady-state heat conduction equation for solid
temperature Ts,
$$-\nabla \cdot \left( k_s \nabla T_s \right) = \dot{q}_s , \quad (10)$$

where q̇s is the heat source and ks is the solid thermal conductivity. Thermal conductivities for the various solid materials used in this work were obtained from the literature (Refs. 16 and 38-42).
In order to have a fully open-source model that can serve as a user tutorial, the actual heat conduction simulations are conducted with MOOSE’s heat conduction module, which BISON uses internally for its heat conduction physics kernels. For brevity, “BISON” is used throughout to refer to MOOSE’s heat conduction solver; the BISON executable can run the models developed for this paper, but the released open-source tutorial inputs use the MOOSE heat conduction module, which provides the same physics kernels as BISON.
II.E. Picard Coupling
This section provides additional discussion on the Picard
coupling strategy and data transfers used to couple OpenMC,
NekRS, and BISON. Figure 4 summarizes the data transfers
for each Picard iteration using the present SFR simulations as
an example. Note that these data transfers can be system and
resolution dependent; many other examples are available in
the literature.
6–8
For the CHT simulations conducted in
Sec. III that omit neutron transport, it is understood that the
execution of and data transfers to/from OpenMC are simply
omitted from Fig. 4.
Fig. 3. Assignment of individual quadrature points to inside/outside the wire region. Yellow regions will have a nonzero
momentum source, while teal regions have zero additional forcing term.
OpenMC solves for the fission distribution and sends
the pin power to BISON. For simplicity, this work neglects
power generation in the coolant, although Cardinal does
support nonlocal Monte Carlo tallies and volumetric heat
sources in Eq. (3) that can be used to capture nonlocal
power generation. Next, BISON solves for the solid tem-
perature, and sends the solid temperature to OpenMC and
the fluid-solid boundary heat flux to NekRS. Finally,
NekRS solves for the fluid flow and heat transfer and
sends the fluid temperature and density to OpenMC and
the fluid-solid wall temperature to BISON. In other words,
OpenMC is coupled to BISON and NekRS via volumetric
terms, while BISON and NekRS are coupled to one
another through BCs on the fluid-solid interfaces.
Picard iterations are achieved in time. In other
words, the simulation has a notion of time and a time-
step index, but only NekRS is actually solved with
nonzero time derivatives. The concept of time stepping
is then used to customize how frequently (i.e., in units of
time steps) data are exchanged among applications.
Each application uses a unique time step size, which
plays a significant role in reducing the number of high-
cost physics solves.
To help explain the strategy, represent the time step sizes in NekRS, BISON, and OpenMC as Δt_nek, Δt_bison = M Δt_nek, and Δt_openmc = N M Δt_nek, respectively. Selecting N > 1 and/or M > 1 is referred to as “subcycling.” In other words, NekRS runs M times for each BISON solve, while BISON runs N times for each OpenMC solve, effectively reducing the total number of BISON solves by a factor of M and the total number of OpenMC solves by a factor of NM compared to the naive approach of exchanging data based on the smallest time step across the coupled codes. Therefore, each Picard iteration consists of
1. Run an OpenMC k-eigenvalue calculation. Transfer q̇s to BISON.

2. Repeat N times:

   a. Run a steady-state BISON calculation. Transfer the wall heat flux q″ to NekRS.

   b. Run a transient NekRS calculation for M time steps. Transfer Twall to BISON.

3. Transfer Ts, Tf, and ρf to OpenMC.
Figure 5 shows the procedure for an example selection of N = 3, meaning that the BISON-NekRS subsolve occurs three times for every OpenMC solve. For the CHT simulations in Sec. III, it is again understood that the Run OpenMC step is omitted.
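The control flow of one Picard iteration with subcycling can be sketched as a nested loop. The solver objects and method names below are hypothetical stand-ins for the wrapped applications and the MOOSE MultiApp execution; only the loop structure is meant to be illustrative:

```python
def picard_iteration(openmc, bison, nekrs, N, M):
    """One Picard iteration with subcycling: OpenMC runs once, BISON runs
    N times, and NekRS advances N*M time steps. All objects/methods are
    hypothetical stand-ins for the wrapped codes."""
    q_fission = openmc.solve_eigenvalue()        # fission power -> BISON
    bison.set_heat_source(q_fission)
    for _ in range(N):
        q_wall = bison.solve_steady_state()      # wall heat flux -> NekRS
        nekrs.set_wall_heat_flux(q_wall)
        for _ in range(M):
            nekrs.advance_time_step()
        bison.set_wall_temperature(nekrs.wall_temperature())
    openmc.set_temperatures_and_densities(bison.solid_temperature(),
                                          nekrs.fluid_temperature(),
                                          nekrs.fluid_density())
```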
Due to Courant-Friedrichs-Lewy considerations, the
time steps used in NekRS are typically on the order of
milliseconds (though each individual NekRS CFD solve
is extremely fast). In the present work, M = 2 was required to obtain a stable solution for the CHT modeling in Sec. III, although we note that the vast majority of our past CHT simulations could use data transfer intervals one to three orders of magnitude less frequent. The multiphysics modeling in Sec. IV allowed more forgiving data exchanges, and M = 1 and N = 5000 were selected based on preliminary scoping studies. Selecting N = 5000 for this latter case transfers data approximately
Fig. 4. Summary of data transfers that occur within each Picard iteration for a SFR multiphysics application.
every 0.5 flow-through-times in the NekRS domain.
A rigorous search for the optimal choices is deferred to
future work.
III. BARE BUNDLE VALIDATION
Validating Cardinal for multiphysics neutronics-
T/H-fuel modeling of SFRs is a considerable task requir-
ing high-quality data for temperatures, velocities, pres-
sures, and fission distributions under a wide variety of
operating conditions spanning different flow regimes,
burnups, fuel loadings, and more. There are several T/H
experimental facilities that provide data for temperatures,
pressures, and velocities that can be used to validate Cardinal’s NekRS-MOOSE CHT coupling (Refs. 33 and 43-53). Operating reactor data from facilities such as the Fast Flux Test Facility can be used to validate Cardinal’s OpenMC-NekRS-MOOSE coupling (Ref. 54).
The validation of Cardinal is currently underway for
several of these facilities. Our strategy is to progressively
increase model complexity in tandem with experimental
validation in order to isolate the effects of different model
parameters. In the present study, we aim to isolate the
effect of the wire-wrap MSM by comparing Cardinal
CHT predictions against bare Freon-12 pin bundle experi-
ments conducted at the Research Center Karlsruhe in the
1990s (Refs. 13 and 55). This small, seven-pin experi-
ment provides an ideal starting point for a comprehensive
validation test matrix.
Bare Freon-12 bundle data are relevant to the over-
arching objective of validating Cardinal for multiphysics
modeling of wire-wrapped, sodium-cooled bundles.
Because NekRS solves the incompressible Navier
Stokes equations in nondimensional form, the data
collected for a Freon-12 coolant simply represent flow
data at particular Reynolds and Prandtl numbers. As will
be discussed later with Table II, the Reynolds numbers
for these experiments are representative of typical SFR
conditions, although the Prandtl number is several orders
of magnitude higher. Nevertheless, validation against
Freon-12 data will provide additional support for the
data transfers and coupling methodology in Cardinal
that is independent of the particular flow conditions.
This section describes the results of this CHT vali-
dation exercise. After describing the experimental facil-
ity, Sec. III.A details the mesh refinement study and
Sec. III.B then compares Cardinal temperature predic-
tions with experimental data.
The bare bundle experiments from Research Center
Karlsruhe were originally conducted to obtain critical
heat flux data for tight-lattice pressurized water reactors
and employed a Freon-12 working fluid due to its well-
known properties and low latent heat (Ref. 56). Two seven-pin
bundles were tested using different spacer designs, either
a grid spacer or a wire wrap. In this section, only the
spacer grid bundles are considered; their geometric spe-
cifications are summarized in Table I.
The pins were heated electrically with a uniform axial
distribution. The composition of each pin is shown in Fig. 6.
In each pin, eight thermocouples were embedded within the
cladding, 15 mm from the exit of the heated section, in order
to measure pin surface temperatures. The thermocouple posi-
tions are shown in Fig. 7. Later plots of the experimental data
on the pin surfaces are depicted in terms of angles θ defined
relative to the axes and directional arrows shown in white in
Fig. 7.
Three different single-phase experiments were con-
ducted, each with different heating configurations. These
three experiments are designated as cases A, B, and C,
Fig. 5. Coupling procedure with subcycling for a calculation with OpenMC, BISON, and NekRS for N = 3.
and are shown in Fig. 8. Red pins are heated, with uni-
form power distribution among the heated pins.
Experimental pin surface temperatures were only
reported for the unheated rods. The flow conditions for
each case are summarized in Table II. Nondimensional
numbers were evaluated based on average μ, Cp, and k. It
should be noted that the precise inlet temperatures and
dimensionless numbers in Table II were either extracted
from plots or back-calculated from other information in
Ref. 13, and therefore may not be in perfect agreement
with the experimental facility, although errors are
expected to be small.
The NekRS model was constructed using a spectral ele-
ment discretization of the k-τ RANS equations. Because the
NekRS flow solution was decoupled from the energy equa-
tion by the incompressible, constant-property RANS model,
isothermal NekRS simulations were first performed with
periodic inlet/outlet BCs to converge the flow. No-slip BCs
were applied on all walls. Next, the coupled CHT simulations
were performed by transporting a temperature passive scalar
on this “frozen” flow field with boundary coupling to BISON
on the fluid-solid interfaces, again with all walls set to no-slip
BCs. This two-stage approach was a convenient technique to
eliminate the complexity of setting turbulent flow inlet con-
ditions by instead approximating the inlet flow as fully
developed.
III.A. Mesh Refinement Study
For each case, a refinement study was performed to
ensure that the coupled NekRS-BISON simulation was
mesh independent. Two parameters were varied: (1) the
NekRS polynomial order N (p-refinement) and (2) the
BISON uniform mesh refinement level r (h-refinement).
For brevity, the refinement study is presented only for case
A, but identical rigor was pursued for cases B and C. Only
a subset of the refinement study results is shown, as many
different quantities were monitored for convergence.
Because the CHT simulation was conducted in two
stages (isothermal periodic flow, followed by temperature
transport), the mesh refinement study was for convenience
also conducted in two stages that mimicked the overall simu-
lation strategy. In stage I, isothermal flow was converged to
a steady state, and N was selected to sufficiently resolve
velocity, k, and τ. In stage II, the coupled CHT was converged
to a steady state using the converged velocity and μT from
TABLE I
Geometric Parameters for the Seven-Pin Bundle Experiments*
Parameter Value
Test section length 1.24 m
Heated length 0.60 m
Hydraulic diameter, Dh 4.36 mm
Pin diameter, Dp 9.50 mm
Pin pitch, P 10.90 mm
Pin-to-wall clearance 1.40 mm
*Reference 13.
Fig. 6. Material structure in each pin.
Fig. 7. Thermocouple locations and angle definitions for the
pins. An example of θ = 60 deg is shown on the center pin.
stage I. Because Pr > 1, thermal simulations were conducted
for the N selected in stage I as well as N + 2 in order to
provide additional refinement in the temperature transport.
After selecting N, the BISON mesh was then uniformly
coarsened and temperatures compared between r values to
confirm that the solid mesh was also sufficiently refined. Brief
details on these refinement studies are now presented.
III.A.1. Stage I: Flow Refinement
First, several metrics were monitored in time to ensure
steady state. Temporal convergence was assessed based on the relative difference in ΔP/ΔL and the maximum Vz, k, and τ between two successive checkpoint steps. As an example, Fig. 9 shows the relative difference in these four quantities as a function of time for polynomial order N = 7. Temporal convergence was defined to occur once the relative change from the previous checkpoint was smaller than 5×10⁻³.
Steady state was obtained after approximately 2.5 flow-
through times, or the time for the flow to completely traverse
the axial extent of the bundle 2.5 times.
After ensuring steady state, spatial convergence was then assessed based on the relative L2 norm in Vz, k, and τ along the line y = 0 (shown in red in Fig. 7) between two successive polynomial orders. For each choice of N, spatial convergence was defined to occur once the relative L2 norms were less than 5×10⁻³. As an example, Fig. 10 shows these fields for each choice of N. Spatial convergence was obtained for both N = 5 and N = 7.
III.A.2. Stage II: Thermal Refinement
The CHT simulations were then conducted using the converged flow simulations from stage I for N = 5, 7, and 9 (an additional higher polynomial order to ensure that the Pr > 1 flow had sufficiently resolved thermal boundary layers). Temporal convergence was first assessed using MOOSE’s automatic steady-state detection features. Temporal convergence was defined as the point at which the relative L2 norm of the entire temperature solution T (solid temperature from BISON, fluid temperature from NekRS) was smaller than a user-specified tolerance. Conceptually, this can be represented as
Fig. 8. Pin heating configurations for the three single-phase experiments. Red pins are heated, with a uniform power distribution
among the heated pins.
TABLE II
Summary of Flow Conditions for the Single-Phase
Experiments*
Parameter: Case A, Case B, Case C
Inlet temperature (°C): −7.5, 7.5, 0.0
Heater power (kW): 18.3, 8.15, 2.85
Reynolds number: 58 928, 62 937, 44 760
Peclet number: 214 873, 217 829, 171 800
*Reference 13.
Fig. 9. Time evolution of the flow for N = 7. The relative difference at time t_n is computed relative to the solution at time t_{n-1}. The y-axis is cut off at 10⁻⁵.
$$\mathbf{T} \equiv \begin{bmatrix} T_s \\ T_f \end{bmatrix} , \quad (11)$$

where Ts and Tf are themselves vectors with lengths depending on the number of degrees of freedom (DOFs) in each application. Temporal convergence therefore occurs when the relative norm in T is less than 5×10⁻³, and the simulation automatically terminates at this point.
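A minimal sketch of this stopping criterion (not MOOSE's actual steady-state detection code) concatenates the solid and fluid temperature degrees of freedom and compares successive iterates:

```python
import numpy as np

def is_converged(T_solid_new, T_fluid_new, T_solid_old, T_fluid_old, tol=5e-3):
    """Relative L2 norm of the change in the combined temperature vector
    T = [T_solid; T_fluid] between two successive data-transfer steps."""
    T_new = np.concatenate([T_solid_new, T_fluid_new])
    T_old = np.concatenate([T_solid_old, T_fluid_old])
    return np.linalg.norm(T_new - T_old) / np.linalg.norm(T_new) < tol
```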
After ensuring steady state, spatial refinement was
then assessed to determine an appropriate NekRS poly-
nomial order N and BISON mesh refinement level r.
Spatial convergence in N was assessed based on the relative L2 norm in Tf along the line y = 0 at the axial elevation of the thermocouples between two successive polynomial orders and when using a very fine BISON mesh. This criterion is analogous to those shown previously in Fig. 10. The outcome of this investigation showed that the flow was thermally refined for N = 7.
Next, the BISON mesh was progressively coarsened by
halving the number of radial, azimuthal, and axial divi-
sions in a cylindrical structured mesh to show that the
finest mesh was thermally converged.
Figure 11 shows the converged solid mesh (multiple
colors) and the converged NekRS mesh (gray region).
Note that each NekRS element has 8³ = 512 DOFs, and
therefore actually has a much finer solution representa-
tion than the element boundaries shown in Fig. 11. The
fluid and solid meshes were then extruded in the axial
direction into 120 layers.
III.B. Results
This section compares the Cardinal CFD simulations
with the experimental temperature data. To provide addi-
tional comparisons, Cardinal is also compared against
ANSYS CFD simulations conducted by Cheng and Yu (Ref. 13). Figure 12 shows the (nondimensional) NekRS
axial velocity Vz, k, and τ for case B, the highest Re
case. These fields are qualitatively very similar for
cases A and C, and for brevity are not shown. These
fields all display the expected behavior for pin bundles.
For a pitch/diameter (P/D) of 1.14, the hydraulic diameters of the channels increase as Dh,corner < Dh,interior < Dh,edge, which is reflected in the axial velocity distribution and qualitatively matches other CFD simulations for hexagonal pin bundles (Ref. 43).
Figure 13 shows the BISON solid temperature for each
case on a shared color scale. Recall that the inlet temperature
was different for each case, which influences the shared color-
scale representation shown in Fig. 13. Contours are shown in
the unheated pins to illustrate the effect of CHT. Sharp jumps
Fig. 10. (a) Vz, (b) k, and (c) τ along the line y = 0 as a function of NekRS polynomial order N. All axes are shown in
nondimensional units.
Fig. 11. Converged solid and fluid meshes for the seven-
pin bare bundle experiment simulations.
occur in the contours at the boundaries between the different
pin regions where thermal conductivities change.
Figure 14 shows the NekRS fluid temperature at the thermocouple measurement plane. For each case, the temperature is normalized to the range [0, 1] with the transformation

$$T^* = \frac{T - T_{\min}}{T_{\max} - T_{\min}} , \quad (12)$$

where Tmin and Tmax are the minimum and maximum temperatures on the plane, respectively. Fluid temperatures are highest at the narrow pin-pin gaps due to the lower velocity in these
Fig. 13. BISON solid temperature for cases A, B, and C.
Fig. 12. NekRS predictions for (a) Vz, (b) k, and (c) τ for case B, all in nondimensional units.
Fig. 14. Normalized NekRS fluid temperature T* defined in Eq. (12) for cases A, B, and C.
regions. The Pr > 1 condition results in thin thermal boundary layers, which are not visible without significant zoom in Fig. 14.
Finally, Figs. 15, 16, and 17 compare the surface
temperature of the unheated pins with the experimental
data, as well as the CFD simulations conducted by
Cheng and Yu using ANSYS and the Speziale Reynolds Stress Model (Ref. 13). The x-axes are shown in
terms of the pin angles in Fig. 7. Error bars are shown
on the experimental data to denote the as-stated thermo-
couple accuracy of <1.2° C.
Cardinal predicts the experimental data fairly well.
Across all thermocouple measurements, Cardinal predicts 73.4% of points within the experimental accuracy, and 90.6% of points within 2× the experimental accuracy.
For data collected on pins 4 and 7 (only applicable to
Case B in Fig. 16), Cardinal gives an excellent prediction
of the experimental data, with all predictions lying within
experimental accuracy. For data collected on pins 2 and 3,
the agreement between Cardinal and the experimental
data is less favorable. Both Cardinal and Cheng and Yu
tend to underpredict temperatures at the narrow pin-pin
gap at θ = 0 deg, but show reasonable accuracy for thermo-
couples facing the duct. Cardinal tends to predict lower
minimum/maximum temperatures as compared to Cheng
and Yu, giving better predictions for thermocouples
facing the duct but underpredicting peak clad
temperatures.
The predictions of Cheng and Yu could loosely be
considered a better predictor of metrics related to peak
clad temperatures, while Cardinal could loosely be
considered a better predictor of duct temperatures that
drive the core bowing phenomenon. Rather than
attempting to rank these two simulations based on
these different effects, we simply report the normalized
root-mean-square (RMS) error (err) in Table III,
defined as
$$\mathrm{err} = \frac{\dot{m} C_p}{q} \sqrt{\frac{\sum_{i=1}^{n_t} \left[ T_{\mathrm{cfd}}(x_i) - T_{\mathrm{exp}}(i) \right]^2}{n_t}} , \quad (13)$$

where nt is the number of thermocouples per pin, Tcfd(xi) is the CFD solution at the i’th thermocouple, and Texp(i) is the experimental measurement at the i’th thermocouple. For normalization, q/(ṁCp) is the nominal bulk temperature rise. In Table III, a “—” indicates that experimental data were not available or the ANSYS data were not reported.
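The normalized RMS error of Eq. (13) is straightforward to evaluate once the CFD solution has been sampled at the thermocouple locations; the short sketch below assumes those sampled values are already available as arrays:

```python
import numpy as np

def normalized_rms_error(T_cfd, T_exp, q, mdot, cp):
    """Eq. (13): RMS of the CFD-vs-experiment temperature differences at
    the thermocouples, normalized by the nominal bulk temperature rise
    q / (mdot * cp)."""
    T_cfd, T_exp = np.asarray(T_cfd), np.asarray(T_exp)
    rms = np.sqrt(np.mean((T_cfd - T_exp) ** 2))
    return rms / (q / (mdot * cp))
```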
Fig. 16. Pin surface temperature predicted by Cardinal and Cheng and Yu (Ref. 13) for (a) pins 2 and 3 and (b) pins 4 and 7 in case B.
Fig. 15. Pin surface temperature predicted by Cardinal and Cheng and Yu (Ref. 13) for pins 2 and 3 in case A.
Both Cardinal and Cheng and Yu’s ANSYS simu-
lations show very similar RMS errors for pin surface
temperatures. The normalized RMS errors for case C
are higher than for the other cases only because the
nominal temperature rise is only 3.5°C, such that
errors on the order of a few degrees are high once
normalized.
A more thorough verification and validation exer-
cise should incorporate uncertainty quantification to
provide confidence intervals for the Cardinal simula-
tions and assess the combined impact of multiple
input uncertainties on solid thermophysical properties,
geometric dimensions, and other parameters. In future
work, we plan to integrate Cardinal with MOOSE’s
stochastic tools module to streamline such calculations (Ref. 57).
In addition to uncertainties in various model para-
meters, there were a number of additional sources of
error. While pins 2 and 3 should have symmetric
temperature distributions, at several angles (most
notably at θ = 160° for case A) thermocouple mea-
surements differed significantly. These apparent asym-
metries could suggest perturbations in the
thermocouple positions relative to the as-designed
experiment or inaccurate readings. In addition, the
NekRS model used a constant-property RANS
model, which neglected any thermophysical property
variations with temperature and pressure. Nominal
temperature rises for the three experiments ranged
from about 3.5° C to 14.5° C, and such property-
related errors are expected to be small. In addition,
the insulated thermal BCs assumed on the outer
boundary do not account for any heat gain from the
surroundings.
Given that uncertainty quantification is deferred to
future work, these results suggest that Cardinal is able to predict pin surface temperatures reasonably well in bare
hexagonal bundles. This validation exercise provides rea-
sonable confidence in the implementation of data trans-
fers and other software details needed to conduct
multiphysics modeling of a more complex wire-wrapped
SFR bundle in Sec. IV.
IV. MULTIPHYSICS MODELING OF A SEVEN-PIN ABR
BUNDLE
Cardinal is now applied to multiphysics modeling of
a reduced-scale seven-pin ABR bundle. The primary goal
is to demonstrate Cardinal’s readiness for steady-state
fast reactor applications with proof-of-concept simula-
tions of SFR geometries. The purpose at this stage is to
show that the coupling data transfers are sufficiently
sophisticated to capture all multiphysics interactions
between neutron transport, T/H, and solid heat
Fig. 17. Pin surface temperature predicted by Cardinal and Cheng and Yu (Ref. 13) for pins 2 and 3 in case C.
TABLE III
RMS Error Norms for the Cardinal and Cheng and Yu ANSYS Simulations*
Pin | Case A (Cardinal, ANSYS) | Case B (Cardinal, ANSYS) | Case C (Cardinal, ANSYS)
2 | 0.121, 0.125 | 0.136, 0.125 | 0.284, 0.296
3 | 0.147, 0.150 | 0.162, 0.162 | 0.450, 0.464
4 | —, — | 0.095, — | —, —
7 | —, — | 0.061, — | —, —
*Reference 13; “—” indicates that experimental data were not available or the ANSYS data were not reported.
conduction. The results shown in this section should
therefore only be taken as indicative of Cardinal’s cap-
abilities for SFR analysis, with all validation left to future
work. The codes coupled in this section include
OpenMC, NekRS, and BISON. After describing the com-
putational model, Sec. IV.A discusses the mesh refine-
ment study and Sec. IV.B presents Cardinal’s
multiphysics model predictions.
The ABR is a large SFR concept with a power of
1000 MW(thermal) (Ref. 14). The core design consists of 180
driver fuel assemblies, 114 reflector assemblies, 66
shield assemblies, and 19 control assemblies. The
driver fuel assemblies contain 271 wire-wrapped pins
within a hexagonal duct, but the present work models
a seven-pin version of the same geometry. The geo-
metric parameters are summarized in Table IV. The
pin diameter, pin pitch, clad thickness, wire diameter,
and wire axial pitch all exactly match the nominal
ABR design, while simplifications were made for the
other specifications. The active height of the ABR
core is 4.22 Lw, which was shortened to 4.0 Lw in
order to facilitate periodic flow BCs in NekRS’s
RANS model (to be discussed in detail shortly).
Other adjustments to the axial regions in the assembly
included neglecting the gas/sodium plenum and using
equal-height axial reflectors on the bottom and top of
the assembly.
Material compositions were obtained from
Ref. 58, which provides metallic fuel compositions
with five axial enrichment zones. Because the present
simulations were purely of a demonstration nature, the
fuel composition at all axial positions in the OpenMC
model was simply set to the composition of
the second layer. Due to the small bundle size, volume
fractions of fuel, coolant, and structural materials
were distorted from typical SFR values. To compen-
sate, the duct thickness was halved to better reflect
material volume fractions.
Several simplifications were also required to
obtain representative power and flow rate levels. To
a first-order approximation, it was assumed that the
entire core power was produced by the 180 driver fuel
assemblies, neglecting the contributions by other bun-
dle types. The bundle power level q was then approximated as

$$q = \frac{1000\ \mathrm{MW(thermal)}}{180\ \mathrm{bundles}} \times \frac{7}{271} . \quad (14)$$

The flow rate ṁ was then selected to match the nominal core temperature rise of ΔT = 155°C with

$$q = \dot{m} C_p \Delta T . \quad (15)$$
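Evaluating Eqs. (14) and (15) reproduces the power in Table V; the flow rate calculation below assumes a representative sodium specific heat of roughly 1270 J/(kg·K) purely for illustration, since the exact property value used in the model is not restated here:

```python
# Eq. (14): scale the full-core power to a 7-pin bundle.
q = 1000e6 / 180 * (7 / 271)          # ~143.5 kW, matching Table V

# Eq. (15): choose the mass flow rate giving a 155 C temperature rise.
cp_sodium = 1270.0                    # J/(kg K), assumed representative value
dT = 155.0                            # C
mdot = q / (cp_sodium * dT)           # ~0.73 kg/s
print(q, mdot)
```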
Table V summarizes the operating conditions selected for
the reduced seven-pin ABR bundle. The Reynolds and
Peclet numbers are both based on Dh.
The NekRS computational model was constructed
using a spectral element discretization of the k-τ RANS
equations. Because the NekRS flow solution was
decoupled from the energy equation by the incompressi-
ble, constant-property k-τ RANS model, isothermal
NekRS simulations were first performed with periodic
inlet/outlet BCs to converge the flow. No-slip BCs are
applied on all walls. Next, the multiphysics simulations
were performed by transporting a temperature passive
scalar on this frozen flow field with boundary coupling
to BISON on the fluid-solid interfaces and volumetric
coupling between OpenMC and NekRS-BISON. This
two-stage approach was a convenient technique to
TABLE IV
Geometric Parameters for the Reduced Seven-Pin ABR Bundle
Modeled in the Present Work*
Parameter Value
Active height 0.8128 m
Rod plenum height 0.0 m
Top reflector height 0.50 m
Bottom reflector height 0.50 m
Pin diameter, Dp 7.646 mm
Pin pitch, P 8.966 mm
Clad thickness 0.587 mm
Wire diameter, Dw 1.03 mm
Wire axial pitch, Lw 0.2032 m
Duct inner flat to flat 0.02584 m
Duct thickness 1.983 mm
*Reference 14.
TABLE V
Operating Conditions Assumed for the Reduced Seven-Pin
ABR Bundle*
Parameter Value
Power 143.5 kW(thermal)
Inlet temperature 355° C
Core temperature rise 155° C
Reynolds number 37 202
Peclet number 192
*Reference 14.
eliminate the complexity of setting turbulent flow inlet
conditions by instead approximating the inlet flow as
fully developed.
In the OpenMC and BISON models, the wire wrap
was homogenized into the cladding in order to obtain
a consistent geometric representation with NekRS’s
wire-wrap MSM (which does not explicitly mesh
wires). For the OpenMC model, the density of the clad-
ding was increased to preserve the overall cladding
mass.
IV.A. Mesh Refinement Study
A mesh refinement study was performed to ensure
that the coupled OpenMC-NekRS-BISON simulation
was mesh independent. Three parameters were varied:
(1) the NekRS polynomial order N, (2) the BISON
uniform mesh refinement level r, and (3) the number
of axial layers in the OpenMC cell model. Our previous work (Ref. 6) details our extensive methodology for
assessing convergence for multiphysics simulations,
and for brevity we refer the reader to this work for
more information. Here, we simply note a few impor-
tant points.
Temporal convergence was assessed using MOOSE’s
automatic steady-state detection features. Temporal con-
vergence was defined as the point at which the relative L2 norm of the entire coupled physics solution x (solid
temperature from BISON, fluid temperature from
NekRS, fluid density from NekRS, and OpenMC
power) was smaller than a user-specified tolerance.
Conceptually, this can be represented similar to
Eq. (11) as
$$\mathbf{x} \equiv \begin{bmatrix} T_s \\ T_f \\ \rho_f \\ \dot{q}_s \end{bmatrix} , \quad (16)$$

where each component in the total solution x is itself a vector with the length depending on the number of DOFs in each application. Temporal convergence therefore occurred when the relative norm in x was less than 5×10⁻³ and the simulation automatically terminated at this point.
Single-physics convergence of the OpenMC Monte Carlo transport was assessed using a number of additional criteria. The number of particles per batch was fixed at 10 000 for this small model, and the Shannon entropy was then used to select the number of inactive batches (Ref. 59). OpenMC’s tally trigger system was then used to terminate each Picard iteration (i.e., stop running active batches) once the maximum relative uncertainty (based on 1σ standard deviations) in the fission distribution was less than 5 × 10^-3. No instabilities were observed during early scoping studies, but Robbins-Monro relaxation (Ref. 60) was employed based on best practices in multiphysics algorithms (Ref. 61).
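The Robbins-Monro relaxation mentioned above amounts to averaging the freshly tallied power against the running estimate with a diminishing weight. A minimal sketch of the usual 1/n form (following the stochastic approximation of Ref. 60) is given below; the function and variable names are our own illustrative choices, not Cardinal or OpenMC API.

```python
def robbins_monro_update(q_prev, q_tally, n):
    """Relax the power distribution after Picard iteration n by averaging the
    new Monte Carlo tally q_tally (a NumPy array) against the running estimate
    q_prev with the diminishing factor alpha_n = 1/n."""
    alpha = 1.0 / n
    return (1.0 - alpha) * q_prev + alpha * q_tally

# Usage over Picard iterations (q starts as the first tallied distribution):
#   q = tallies[0]
#   for n, q_tally in enumerate(tallies[1:], start=2):
#       q = robbins_monro_update(q, q_tally, n)
```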
Figure 18 shows the converged solid mesh (orange
regions) and the converged NekRS mesh (blue regions).
Note that each NekRS element has 8³ DOFs and therefore actually has a much finer solution representation
than the element boundaries shown in Fig. 18. The fluid
and solid meshes were then extruded in the axial direc-
tion into 80 layers. The OpenMC mesh mirror was for
simplicity identical to the combined mesh shown in
Fig. 18.
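For reference, the per-element resolution quoted above follows from the spectral element discretization: an element of polynomial order N carries (N + 1)³ Gauss-Lobatto-Legendre points per scalar field. The one-line sketch below evaluates this count; reading the 8³ figure as corresponding to N = 7 is our inference, not a value stated in this paragraph.

```python
def dofs_per_element(N):
    """Points per 3-D spectral element of polynomial order N
    (one scalar unknown per Gauss-Lobatto-Legendre point)."""
    return (N + 1) ** 3

print(dofs_per_element(7))  # 512 = 8**3 points per element
```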
IV.B. Results
This section presents proof-of-concept simulations
of tightly coupled multiphysics for a seven-pin ver-
sion of an ABR bundle using Cardinal. First, we
present the flow solution obtained with the wire-
wrap momentum source model. While no experiments or equivalent wire-resolved simulations were conducted as part of this work, we compare qualitatively with other simulations in the literature (albeit with different P/D_p, L_w/D_p, and D_w/D_p). Figure 19a shows the axial and Fig. 19b the transverse velocity at z = L_w/2. In these images, the wire on each pin is at θ = 180 deg. These predictions agree qualitatively with other predictions from the literature (Ref. 32).

Fig. 18. Converged solid and fluid meshes for the seven-pin ABR bundle simulations.
To further explore the transverse velocity, define the transverse gap velocity v_gap as

$$
v_\mathrm{gap} = \frac{1}{U_0}\,
\frac{\int_{\Gamma_s} \vec{V} \cdot \hat{n}\, d\Gamma}{\int_{\Gamma_s} d\Gamma} , \qquad (17)
$$

where U_0 is the uniform inlet velocity, Γ_s is the gap plane, and n̂ is the gap unit normal. Figure 21 shows the transverse gap velocity on the planes defined in Fig. 20 over a single wire pitch; various 60-deg phase shifts were applied to match the data presentation in Ref. 32. These transverse velocities exhibit excellent agreement with the distributions predicted by Hu and Fanning (Ref. 32) for a wire-resolved model.
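A discrete analogue of Eq. (17) is sketched below: given velocity vectors sampled on a gap plane together with the area associated with each sample, the transverse gap velocity is the area-weighted average of V·n normalized by the inlet velocity U_0. The sampling arrays and names are illustrative assumptions, not Cardinal postprocessing output.

```python
import numpy as np

def transverse_gap_velocity(V, n_hat, dA, U0):
    """Discrete form of Eq. (17).

    V     : (M, 3) array of velocity vectors sampled on the gap plane
    n_hat : (3,) unit normal of the gap plane
    dA    : (M,) area associated with each sample point
    U0    : uniform inlet velocity used for normalization
    """
    flux = np.sum((V @ n_hat) * dA)   # integral of V·n over the gap plane
    area = np.sum(dA)                 # integral of dA over the gap plane
    return flux / (U0 * area)

# Usage with fabricated sample data (illustration only):
#   V = np.random.rand(100, 3); n_hat = np.array([1.0, 0.0, 0.0])
#   dA = np.full(100, 1.0e-6)
#   print(transverse_gap_velocity(V, n_hat, dA, 1.0))
```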
Next we discuss the coupled physics results.
Convergence was obtained in five Picard iterations.
The advantage of the rigorous convergence criterion in Eq. (16) is that the entire coupled solution over the whole domain is considered automatically, as opposed to cherry-picking a subset of the coupled solution. For
example, Fig. 22 shows the k-eigenvalue as
a function of Picard iteration. As can be seen, k
during iteration 2 is within the uncertainty bounds
of the previous iteration, which would suggest con-
vergence earlier than that imposed by the coupled
temperature, density, and heat source solution that is
captured in Eq. (16).
Fig. 19. (a) Axial and (b) transverse velocity on the plane z = L_w/2. The wire is at θ = 180 deg on each pin. All quantities are shown in nondimensional units.

Fig. 20. Four gap planes defined in the bundle, used later in Fig. 21.

Figure 23 shows the OpenMC power in the fuel pins, and Fig. 24 shows the power along the centerline of the center pin. Error bars on the power distribution are too small to be visible in Fig. 24. The axial reflectors above and below the bundle reflect neutrons back to the fuel, inducing large power peaks in these regions. The power is nearly symmetric about the midplane due to the small variation of sodium density with temperature (Ref. 62) and the long mean free path, which causes thermal feedback to act in a more global sense than in thermal spectrum systems.
Figure 25 shows the BISON duct temperature and the
NekRS fluid temperature on several z-slices. Two fuel
pins are also shown in gray. The CHT between the fluid
and duct is evident in the continuous temperature field at
the walls.
Figure 26 shows the BISON temperature in the fuel and
cladding. The cross flow induced by the wire mixes the fluid,
lowering peak temperatures relative to bare pin bundles.
Figure 27 shows the fluid and solid temperatures on the plane z = 0.805 m to more clearly illustrate the effects of CHT.
This concludes the application of Cardinal to
a seven-pin version of the ABR driver assembly. The
seven-pin ABR simulations were conducted on Summit
and required approximately 60 node hours for the iso-
thermal flow solve and an additional 100 node hours for
the coupled physics energy solve (on a frozen velocity
field). Cardinal predicts realistic temperature and power
distributions for SFR geometries, but additional valida-
tion is required for the wire-wrap MSM. Comparisons of Cardinal CHT simulations using the wire-wrap MSM against a 61-pin partially heated wire-wrap experiment (Ref. 33) are underway. Pending acceptable accuracy, the MSM will provide a pathway toward full-core RANS modeling of SFRs.
Fig. 21. Transverse gap velocity on the planes defined in Fig. 20. Different y-axis scales were used to best depict the velocity, and various multiples of 60-deg shifts were applied in order to match the data presentation in Ref. 32.

Fig. 22. OpenMC k-eigenvalue as a function of iteration.

V. CONCLUSIONS

Cardinal is an open-source MOOSE application^a that wraps NekRS spectral element CFD and OpenMC Monte Carlo radiation transport within the MOOSE framework, delivering high-resolution multiphysics feedback to diverse applications in nuclear engineering. As part of an Argonne National Laboratory (ANL) laboratory-directed research and development (LDRD) project developing high-fidelity multiphysics tools for predicting core radial expansion, this paper described two applications of Cardinal to hexagonal pin bundles.
Cardinal’s CHT coupling of NekRS and BISON was compared against temperature data collected in a bare seven-pin experiment from the Research Center Karlsruhe (Ref. 13). Three different pin heating modes were modeled, and Cardinal predicted the data with acceptable accuracy: 73.4% of the thermocouple readings were predicted within the experimental accuracy and 90.6% within 2× the experimental accuracy. Nearly identical RMS error norms relative to comparable ANSYS simulations from the literature (Ref. 13) suggest accuracy similar to that of other CHT simulation software. A more thorough verification and validation exercise should incorporate uncertainty quantification to assess the combined impact of uncertainties in thermophysical properties, geometric dimensions, and other parameters. In future work, we plan to integrate Cardinal with MOOSE’s stochastic tools module (Ref. 57) to streamline such calculations.
Next, a demonstration application of Cardinal to
a reduced seven-pin version of an ABR driver fuel assem-
bly was performed using a tight coupling of OpenMC,
NekRS, and BISON. A wire-wrap MSM was used to
approximate the effect of wires on the flow. Predictions
were made for velocity, fluid temperature, solid tempera-
ture, and fission distribution. Additional validation using
wire-wrap data will be used to assess the relevance of the
MSM for full-core SFR RANS modeling.
Fig. 23. OpenMC pin power.
Fig. 24. OpenMC pin power along the centerline.
^a See https://github.com/neams-th-coe/cardinal.
Fig. 26. BISON pin temperature.
Fig. 25. BISON duct temperature and NekRS fluid temperature on several z-slices.
Acknowledgments
This material is based on work supported by LDRD
funding from ANL provided by the director, Office of
Science, of the U.S. Department of Energy (DOE) under
contract no. DE-AC02-06CH11357.
We gratefully acknowledge the computing resources
provided on Bebop, a high-performance computing cluster
operated by the Laboratory Computing Resource Center at
ANL.
An award of computer time was provided by the
INCITE program. This research also used resources of the
Oak Ridge Leadership Computing Facility, which is a DOE
Office of Science User Facility supported under contract
no. DE-AC05-00OR22725.
Disclosure Statement
No potential conflict of interest was reported by the authors.
ORCID
A. J. Novak http://orcid.org/0000-0002-0048-1452
References
1. C. J. PERMANN et al., “MOOSE: Enabling Massively
Parallel Multiphysics Simulation,” SoftwareX, 11, 100430
(2020); http://doi.org/10.1016/j.softx.2020.100430.
2. P. FISCHER et al., “NekRS, a GPU-Accelerated Spectral
Element Navier-Stokes Solver,” arXiv:2104.05829 (Apr.
2021); https://doi.org/10.48550/arXiv.2104.05829
3. P. K. ROMANO et al., “OpenMC: A State-of-the-Art
Monte Carlo Code for Research and Development,” Ann.
Nucl. Energy, 82, 90 (2015); https://doi.org/10.1016/j.anu
cene.2014.07.048
4. E. MERZARI et al., “Cardinal: A Lower Length-Scale
Multiphysics Simulator for Pebble-Bed Reactors,” Nucl.
Technol., 207, 7, 1118 (2021); https://doi.org/10.1080/
00295450.2020.1824471
5. P. FISCHER et al., “Highly Optimized Full-Core Reactor
Simulations on Summit,” arXiv:2110.01716 (Oct. 2021).
6. A. J. NOVAK et al., “Coupled Monte Carlo and
Thermal-Fluid Modeling of High Temperature Gas
Reactors Using Cardinal,” Ann. Nucl. Energy, 177,
109310 (2022); https://doi.org/10.1016/j.anucene.2022.
109310.
7. A. HUXFORD et al., “Development of Innovative
Overlapping-Domain Coupling Between SAM and
NekRS,” Proc. (2022).
8. Y. YU et al., “Coupled Simulation of Reactor Pressure
Vessel (RPV) Subjected to Pressurized Thermal Shock
(PTS) Using Cardinal,” Proc. ATH (2022).
9. R. HAROLDSEN, The Story of the BORAX Nuclear
Reactor and the EBR-I Meltdown (2008).
10. B. FONTAINE et al., “Description and Preliminary Results of
PHENIX Core Flowering Test,” Nucl. Eng. Des., 241, 10, 4143
(2011); https://doi.org/10.1016/j.nucengdes.2011.08.041.
11. J. A. SHIELDS JR., “Bowing in Experimental Breeder
Reactor II Reflector Subassemblies,” Nucl. Technol., 52,
2, 214 (1981); https://doi.org/10.13182/NT81-A32666.
12. N. WOZNIAK, E. R. SHEMON, and J. J. GRUDZINSKI,
“Review of Tools for Modeling Core Radial Expansion in
Liquid Metal-Cooled Fast Reactors,” Technical Report
ANL/NSE-20/41, Nuclear Science and Engineering
Division, Argonne National Laboratory (2020).
13. X. CHENG and Y. Q. YU, “Local Thermal-Hydraulic
Behaviour in Tight 7-Rod Bundles,” Nucl. Eng. Des.,
239, 10, 1944 (2009); https://doi.org/10.1016/j.
nucengdes.2009.04.010.
14. J. CAHALAN et al., “Advanced Burner Reactor 1000 MWth
Reference Concept,” Technical Report ANL-AFCI-202,
Argonne National Laboratory (2007).
15. Y. WANG, S. SCHUNERT, and V. LABOURÉ,
“RATTLESNAKE Theory Manual,” Technical Report
INL/EXT-17-42103, Idaho National Laboratory (2019).
16. J. D. HALES et al., “BISON Theory Manual,” Technical
Report INL/EXT-13-29930 Rev. 3, Idaho National
Laboratory (2016).
Fig. 27. NekRS fluid temperature and BISON solid temperature on the plane z = 0.805 m. Light blue lines indicate the peripheries of the duct, cladding, and fuel pellets.
17. R. HU, “SAM Theory Manual,” Technical Report ANL/
NE–17/4, Argonne National Laboratory (2017).
18. A. J. NOVAK et al., “Pronghorn: A Multidimensional
Coarse-Mesh Application for Advanced Reactor Thermal
Hydraulics,” Nucl. Technol., 207, 7, 1015 (2021); https://
doi.org/10.1080/00295450.2020.1825307.
19. J. E. HANSEL et al., “Sockeye Theory Manual,” Technical
Report INL/EXT-19-54395, Idaho National Laboratory
(2020).
20. A. J. NOVAK et al., “Pronghorn Theory Manual,”
Technical Report INL/EXT-18-44453-Rev001, Idaho
National Laboratory (2020).
21. “Cardinal: An Open-Source Coupling of NekRS and
OpenMC to MOOSE,” Argonne National Laboratory
(2022); https://cardinal.cels.anl.gov.
22. A. IVANOV et al., “High Fidelity Simulation of
Conventional and Innovative LWR with the Coupled
Monte-Carlo Thermal-Hydraulic System
MCNP5-SUBCHANFLOW,” Nucl. Eng. Des., 262, 264
(2013); https://doi.org/10.1016/j.nucengdes.2013.05.008.
23. Q. ZHANG et al., “An Efficient Scheme for Coupling
OpenMC and FLUENT with Adaptive Load Balancing,”
Sci. Technol. Nucl. Ins., 2021, 5549602 (2021); https://doi.
org/10.1155/2021/5549602.
24. A. G. MYLONAKIS, M. VARVAYANNI, and
N. CATSAROS, “A Newton-Based Jacobian-Free
Approach for Neutronic-Monte Carlo/Thermal-Hydraulic
Static Coupled Analysis,” Ann. Nucl. Energy, 110, 709
(2017); https://doi.org/10.1016/j.anucene.2017.07.014.
25. J. GUO et al., “A Versatile Method of Coupled Neutronics/
Thermal-Hydraulics Based on HDF5,” Proc. M&C (2017).
26. W. GURECKY and E. SCHNEIDER, “Development of an
MCNP6-ANSYS FLUENT Multiphysics Coupling
Capability,” Proc. ICONE (2016).
27. P. ROMANO et al., “Design of a Code-Agnostic Driver
Application for High-Fidelity Neutronic and
Thermal-Hydraulic Simulations,” Proc. PHYSOR (2020).
28. J. R. TRAMM et al., “Toward Portable GPU Acceleration
of the OpenMC Monte Carlo Particle Transport Code,”
Proc. PHYSOR (2022).
29. D. TALER, “Heat Transfer in Turbulent Tube Flow of
Liquid Metals,” Procedia Eng., 157, 148 (2016); https://
doi.org/10.1016/j.proeng.2016.08.350.
30. J. C. KOK and S. P. SPEKREIJSE, “Efficient and Accurate
Implementation of the k-ω Turbulence Model in the NLR
Multi-Block Navier-Stokes System,” European Congress
on Computational Methods in Applied Sciences and
Engineering (2000).
31. S. THANGAM, R. ABID, and C. G. SPEZIALE,
“Application of a New k-τ Model to Near Wall Turbulent
Flows,” Technical Report AD-A232 844 No. 91-16,
National Aeronautics and Space Administration Langley
Research Center (1991).
32. R. HU and T. H. FANNING, “A Momentum Source Model
for Wire-Wrapped Rod Bundles: Concept, Validation, and
Application,” Nucl. Eng. Des., 262, 371 (2013); https://doi.
org/10.1016/j.nucengdes.2013.04.026.
33. “Thermal Hydraulic Computational Fluid Dynamics
Simulations and Experimental Investigation of Deformed Fuel
Assemblies,” Technical Report DOE-AFS-9998321-1, Areva
(2017).
34. K. D. HAMMAN and R. A. BERRY, “A CFD Simulation
Process for Fast Reactor Fuel Assemblies,” Nucl. Eng.
Des., 240, 9, 2304 (2010); https://doi.org/10.1016/j.
nucengdes.2009.11.007.
35. M. MARTIN et al., “CFD Verification and Validation of
Wire-Wrapped Pin Assemblies,” Nucl. Technol., 206, 9,
1325 (2020); https://doi.org/10.1080/00295450.2020.
1727263.
36. L. M. BROCKMEYER et al., “CFD Investigation of Wire-
Wrapped Fuel Rod Bundles and Flow Sensitivity to Bundle
Size,” Proc. NURETH-16 (2015).
37. I. AHMAD and K. KIM, “Three-Dimensional Analysis of
Flow and Heat Transfer in a Wire-Wrapped Fuel
Assembly,” Proc. ICAPP (2005).
38. H. W. GODBEE and W. T. ZIEGLER, “Thermal Conductivities of MgO, Al2O3, and ZrO2 Powders to 850°C. I. Experimental,” J. Appl. Phys., 37, 1, 40 (1966); https://doi.org/10.1063/1.1707849.
39. R. W. POWELL, R. P. TYE, and M. J. HICKMAN, “The
Thermal Conductivity of Nickel,” Int. J. Heat Mass
Transfer, 8, 5, 679 (1965); https://doi.org/10.1016/0017-
9310(65)90017-7.
40. Boron Nitride Powder (2022).
41. H. GERWIN et al., “TINTE—Nuclear Calculation Theory
Description Report,” Technical Report JUL-4317, Institute
for Energy Research (2010).
42. L. LEIBOWITZ and R. A. BLOMQUIST, “Thermal Conductivity and Thermal Expansion of Stainless Steels D9 and HT9,” Int. J. Thermophys., 9, 5, 873 (1988); https://doi.org/10.1007/BF00503252.
43. S. CHANG et al., “Experimental Study of the Flow
Characteristics in an SFR Type 61-Pin Rod Bundle
Using Iso-Kinetic Sampling Method,” Ann. Nucl.
Energy, 106, 160 (2017); https://doi.org/10.1016/j.anu
cene.2017.03.024.
44. S. K. CHOI et al., “Measurement of Pressure Drop in a
Full-Scale Fuel Assembly of a Liquid Metal Reactor,”
J. Pressure Vessel Technol., 125, 2, 233 (2003); https://
doi.org/10.1115/1.1565076.
45. N. GOTH et al., “PTV/PIV Measurements of Turbulent
Flows in Interior Subchannels of a 61-Pin Wire-Wrapped
Hexagonal Fuel Bundle,” Int. J. Heat Fluid Flow, 71, 295
(2018); https://doi.org/10.1016/j.ijheatfluidflow.2018.03.
021.
46. M. S. SONG, J. JEONG, and E. S. KIM, “Flow
Visualization on SFR Wire-Wrapped 19-Pin Bundle
Geometry Using MIR-PIV-PLIF and Comparisons with
RANS-Based CFD Analysis,” Ann. Nucl. Energy, 147,
107653 (2020); https://doi.org/10.1016/j.anucene.2020.
107653.
47. M. CHUN and K. SEO, “An Experimental Study and
Assessment of Existing Friction Factor Correlations for
Wire-Wrapped Fuel Assemblies,” Ann. Nucl. Energy, 28, 17,
1683 (2001); https://doi.org/10.1016/S0306-4549(01)00023-8.
48. K. REHME, “Pressure Drop Correlations for Fuel Element
Spacers,” Nucl. Technol., 17, 1, 15 (1973); https://doi.org/
10.13182/NT73-A31250.
49. R. M. ROIDT, M. D. CARELLI, and R. A. MARKLEY,
“Experimental Investigations of the Hydraulic Field in
Wire-Wrapped LMFBR Core Assemblies,” Nucl. Eng.
Des., 62, 1–3, 295 (1980); https://doi.org/10.1016/0029-
5493(80)90035-7.
50. F. C. ENGEL, R. A. MARKLEY, and A. A. BISHOP,
“Laminar, Transition, and Turbulent Parallel Flow
Pressure Drop Across Wire-Wrap-Spaced Rod Bundles,”
Nucl. Sci. Eng., 69, 2, 290 (1979); https://doi.org/10.
13182/NSE79-A20618.
51. F. C. ENGEL et al., “Characterization of Heat Transfer and
Temperature Distributions in an Electrically Heated Model
of an LMFBR Blanket Assembly,” Nucl. Eng. Des., 62, 1–
3, 335 (1980); https://doi.org/10.1016/0029-5493(80)
90037-0.
52. M. H. FONTANA et al., “Temperature Distribution in the
Duct Wall and at the Exit of a 19-Rod Simulated LMFBR
Fuel Assembly (FFM Bundle 2A),” Nucl. Technol., 24, 2,
176 (1974); https://doi.org/10.13182/NT74-A31474.
53. D. J. KRISHNA et al., “Natural Convection in a Partially
Heat Generating Rod Bundle Inside an Enclosure,” J. Heat
Transfer, 132, 10, 102510 (2010); https://doi.org/10.1115/1.
4001610.
54. J. D. BESS et al., “Evaluation of the Initial Isothermal Physics
Measurements at the Fast Flux Test Facility, a Prototypic Liquid
Metal Fast Breeder Reactor,” Technical Report INL/EXT-09-
16524, Idaho National Laboratory (2010).
55. X. CHENG and U. MULLER, “Critical Heat Flux and
Turbulent Mixing in Hexagonal Tight Rod Bundles,” Int.
J. Multiphase Flow, 24, 8, 1245 (1998); https://doi.org/10.
1016/S0301-9322(98)00027-5.
56. “Thermodynamic Properties of DuPont Freon 12 (R-12)
Refrigerant,” Technical Report T-12 SI, DuPont.
57. A. E. SLAUGHTER et al., “MOOSE Stochastic Tools:
A Module for Performing Parallel, Memory-Efficient
in situ Stochastic Simulations,” SoftwareX preprint
(2022); https://papers.ssrn.com/sol3/papers.cfm?
abstract_id=4049487.
58. D. BLANCHET, L. BUIRON, and N. STAUFF, “Sodium
Fast Reactor Core Definitions,” Technical Report Version
1.2, Working Party of Scientific Issues.
59. F. B. BROWN, “On the Use of Shannon Entropy of the
Fission Distribution for Assessing Convergence of Monte
Carlo Criticality Calculations,” Proc. PHYSOR (2006).
60. J. DUFEK and W. GUDOWSKI, “Stochastic Approximation
for Monte Carlo Calculation of Steady-State Conditions in
Thermal Reactors,” Nucl. Sci. Eng., 152, 3, 274 (2006);
https://doi.org/10.13182/NSE06-2.
61. K. E. REMLEY and D. P. GRIESHEIMER, “A Fully
Analytic Coupled Thermal-Neutronics Benchmark and Its
Application to Monte Carlo Simulation,” Proc. M&C
(2019).
62. Y. MA et al., “Neutronic and Thermal-Mechanical
Coupling Analyses in a Solid-State Reactor Using Monte
Carlo and Finite Element Methods,” Ann. Nucl. Energy,
151, 107923 (2021); https://doi.org/10.1016/j.anucene.
2020.107923.