
DUAL NAVIER-STOKES / LATTICE-BOLTZMANN METHOD FOR URBAN WIND FLOW

Marta Camps Santasmasas1, Alistair J. Revell1, Ben Parslew1, Adrian R. G. Harwood1 and William Crowther1

1 School of Mechanical, Aerospace and Civil Engineering, The University of Manchester, Manchester M13 9PL, United Kingdom

marta.campssantasmasas@postgrad.manchester.ac.uk

Abstract

Scale-resolving turbulent flow simulation for urban environments is computationally expensive and not widely used in industrial practice, despite the inherent complexities arising from geometry and high Reynolds number effects. One way to reduce the computational demand of the simulation is to focus a turbulence-resolving solver on the region of interest and employ a cheaper, less accurate solver elsewhere. We present developments towards a dual Navier-Stokes / lattice-Boltzmann (NS/LB) solver for three-dimensional unsteady urban wind flow. The simulation domain is divided into an NS sub-domain and an LB sub-domain, with calculations performed on CPU and GPU respectively. Turbulence is modelled via a large eddy simulation (LES) sub-grid model in the LB sub-domain and an unsteady Reynolds-averaged Navier-Stokes (URANS) model in the NS sub-domain. Both sub-domains are bi-directionally coupled on their overlapping boundaries; a synthetic eddy method (SEM) is used to generate instantaneous flow data for the LB sub-domain from the URANS mean flow. Preliminary results of the dual NS/LB model on laminar flow show that the two solvers are able to communicate and simulate continuous flow across the interface. The implementation of the SEM inlet boundary condition successfully demonstrates the convection of turbulence throughout the LB sub-domain. Further work to fully embed the LES region within the URANS domain is ongoing. While motivated by a focus on urban wind flow simulations, this work is expected to have broader relevance where computational domains include flow around a range of geometric scales.

1 Introduction

Wind modelling in urban environments has many applications, from pedestrian wind comfort studies to prediction of contaminant dispersion in toxic gas release accidents. The most widely-used computational models for urban wind flow numerically solve the Reynolds-averaged Navier-Stokes (RANS) equations. However, scale-resolving methods such as large eddy simulation (LES), which can accurately simulate the turbulent flow structures, require large, fine grids and result in high computational costs that limit the application of more accurate methods in this sector.

So-called 'hybrid' turbulence models combine a turbulence-resolving approach such as LES in the region of interest with the less computationally expensive RANS methods employed elsewhere (see Fröhlich and von Terzi (2008)). The majority of such methods solve the Navier-Stokes equations in both regions. On the other hand, the lattice-Boltzmann (LB) method is also able to model turbulent flow and may present advantages over Navier-Stokes (NS) solvers in some cases. For example, its local and simple algorithm is well suited to resolving flow around complex geometries using graphics processing units (GPUs). However, LB's conditional stability and uniform domain discretisation reduce the efficiency of the LB method as the domain size increases. Hybrid RANS-LES methods have gained popularity in recent years following sustained efforts to develop and validate their use for industrial CFD applications (e.g. see Haase et al. 2009) and, in particular, the use of embedded LES is emerging as a reliable and efficient way to combine cost-saving RANS with high-accuracy LES (see e.g. Holgate et al. 2018).

A pragmatic approach to reducing the computational cost while still resolving turbulence in the regions of interest is to combine two solvers, each addressing part of the domain with a different numerical method and a different turbulence strategy. Tong and He (2015) and Atanasov et al. (2016) propose different 'dual' NS/LB solvers for both two-dimensional and three-dimensional, steady and unsteady flow. These dual solvers demonstrate the advantages of dividing the problem into sub-domains to be solved with NS and LB, but thus far both solvers have been implemented on the same computational framework; i.e. neither method makes use of heterogeneous architecture. It is well established that the computational cost of the LB method can be reduced by up to two orders of magnitude when executed on GPUs (see Mawson, 2014).

In the present work we investigate the coupling of an LB flow solver, implemented on GPU, with an NS solver implemented on CPU. The solvers are combined in the framework of an embedded large eddy simulation approach wherein the LB solver models only part of the domain, overcoming mesh-related drawbacks of standard LB methods. The NS solver employs a RANS turbulence model within a transient solver, and is coupled to the embedded LB solver via fluctuations generated by a synthetic eddy generator at the interface. In this way, the NS sub-domain models the mean flow in the outermost region and the LB sub-domain resolves small-scale flow features around the urban geometry. The proposed dual solver approach combines the mesh flexibility, stability and low memory requirements of the NS solver with the speed of the GPU implementation of the LB solver. Preliminary results presented herein illustrate two elements of the aforementioned dual solver: 1) coupling between LB and NS sub-domains for laminar flow and 2) the implementation of an SEM inlet boundary for an LB-LES simulation.

2 Methodology

The dual NS/LB model presented in this paper divides the simulated domain into two sub-domains, as shown in Figure 1. The LB-LES sub-domain is run by an LB solver using an LES Smagorinsky turbulence model and is executed on GPU. The objective of this sub-domain is to simulate turbulence in the region of interest. The rest of the domain is covered by the URANS sub-domain, which is modelled with an unsteady NS solver with a RANS turbulence model executed on CPU. The aim of the URANS sub-domain is to provide mean flow statistics that take into account possible obstacles outside of the region of interest. The external boundaries (boundaries 1, 3 and 5 in Fig. 1) incorporate a synthetic turbulence model (SEM) to generate the instantaneous velocity and pressure needed by the LB-LES sub-domain from the mean flow data at the corresponding position in the URANS sub-domain. Given the previously established dependence of synthetic turbulence generation on the quality of the Reynolds stress anisotropy, it is envisaged that the elliptic blending Reynolds stress model of Manceau and Hanjalić (2002) will be employed.

The current section describes the implementation of the dual NS/LB model. Subsequent sections present and discuss the preliminary results of two test cases, each of which addresses one of the required interface conditions:

1. Laminar flow around a wall-mounted cube using a laminar flow version of the dual NS/LB model that includes boundaries 5 and 6 in Fig. 1.

2. URANS to LB-LES upstream boundary, number 1 in Fig. 1.

Navier-Stokes

The flow in the URANS sub-domain is modelled using the pisoFoam solver from the open source CFD package OpenFOAM (Weller and Tabor, 1998).

Figure 1: Dual solver concept applied to model wind around a group of buildings. View from the top (top figure), view from the side (bottom figure). The user is particularly interested in the wind around the central building. The numbers mark the boundaries where information is transferred from URANS to LB-LES (odd numbers) and from LB-LES to URANS (even numbers).

pisoFoam solves the three-dimensional incompressible Navier-Stokes equations,

$$\frac{\partial u_i}{\partial t} + u_j\frac{\partial u_i}{\partial x_j} = -\frac{1}{\rho}\frac{\partial p}{\partial x_i} + \nu\frac{\partial^2 u_i}{\partial x_j^2} \quad (1)$$

$$\frac{\partial u_i}{\partial x_i} = 0, \qquad i, j = 1, 2, 3 \quad (2)$$

where $u_i$ is the flow velocity, $p$ the pressure and $\nu$ the kinematic viscosity. The solver discretises the NS equations employing the finite volume method. The configuration used in this paper includes linear interpolation schemes and an implicit Euler scheme for time integration. The solved flow data is stored at the centre of each cell, while the boundary conditions are stored at the faces of the boundary cells.

Lattice Boltzmann

The LB-LES sub-domain solver is LUMIS, a GPU-accelerated version of the LB code developed at The University of Manchester by Harwood et al. (2018). The LB-LES sub-domain numerically solves the incompressible lattice-Boltzmann equations formulated by Guo et al. (2000) using a stream-collide algorithm,

$$f_\alpha(\mathbf{x} + \mathbf{e}_\alpha\,dt,\; t + dt) = f_\alpha(\mathbf{x}, t) - \frac{1}{\tau}\left(f_\alpha(\mathbf{x}, t) - f_\alpha^{eq}(\mathbf{x}, t)\right) \quad (3)$$

where $f_\alpha(\mathbf{x}, t)$ is the particle distribution at the cell $\mathbf{x}$ at time $t$, $\mathbf{e}_\alpha$ is the discretised particle velocity in the $\alpha$ direction and $\tau$ is the relaxation time, which is related to the kinematic viscosity of the fluid via $\nu = c_s^2(\tau - 0.5)$. Here $c_s = 1/\sqrt{3}$ is the lattice sound speed, and the lattice weights are $w_0 = 1/3$, $w_\alpha = 1/18$ for $\alpha = 1$ to $6$ and $w_\alpha = 1/36$ for $\alpha = 7$ to $18$. $f_\alpha^{eq}(\mathbf{x}, t)$ are the equilibrium functions for the particle distribution functions:

$$f_\alpha^{eq} = \begin{cases} \rho_0 - (1 - w_0)\dfrac{p}{c_s^2} + s_\alpha(\mathbf{u}), & \alpha = 0,\\[4pt] w_\alpha\dfrac{p}{c_s^2} + s_\alpha(\mathbf{u}), & \alpha = 1 \text{ to } 18, \end{cases} \quad (4)$$

$$s_\alpha(\mathbf{u}) = w_\alpha\left[\frac{\mathbf{u}\cdot\mathbf{e}_\alpha}{c_s^2} + \frac{(\mathbf{u}\cdot\mathbf{e}_\alpha)^2}{2c_s^4} - \frac{\mathbf{u}\cdot\mathbf{u}}{2c_s^2}\right] \quad (5)$$

where $\rho_0 = \sum_{\alpha=0}^{18} f_\alpha$ is the density of the fluid in LB units, and $\mathbf{u}$ and $p$ are the macroscopic velocity and pressure of the fluid in LB units. They can be obtained from $f_\alpha$ as

$$\mathbf{u} = \sum_{\alpha=0}^{18} \mathbf{e}_\alpha f_\alpha \quad (6)$$

$$p = \frac{c_s^2}{1 - w_0}\left(\sum_{\alpha=1}^{18} f_\alpha + s_0(\mathbf{u})\right) \quad (7)$$

The discretised particle velocity $\mathbf{e}_\alpha$ follows a D3Q19 model. The spatial domain is discretised with a uniform 3D grid where cells are cubes of unit length, and the fluid data is stored at the centre of each 'cell'. Similarly, time is discretised in time steps, each of unit duration. The incompressible Navier-Stokes equations can be obtained via a Chapman-Enskog expansion of equation 3 with $\rho_0$ set to a constant value (Guo et al., 2000). LUMIS also incorporates the LES Smagorinsky turbulence model as described by Koda and Lien (2015).
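To make the preceding relations concrete, the short listing below evaluates equations (4)-(7) for a single D3Q19 cell. It is a minimal sketch in plain C++, not code from LUMIS; the velocity-set ordering, array layout and function names are assumptions made for readability.

```cpp
#include <array>
#include <cstdio>

// D3Q19 lattice: 1 rest + 6 face + 12 edge directions (ordering assumed).
constexpr int Q = 19;
constexpr double cs2 = 1.0 / 3.0;            // lattice sound speed squared
const int e[Q][3] = {
    { 0, 0, 0},
    { 1, 0, 0}, {-1, 0, 0}, { 0, 1, 0}, { 0,-1, 0}, { 0, 0, 1}, { 0, 0,-1},
    { 1, 1, 0}, {-1,-1, 0}, { 1,-1, 0}, {-1, 1, 0},
    { 1, 0, 1}, {-1, 0,-1}, { 1, 0,-1}, {-1, 0, 1},
    { 0, 1, 1}, { 0,-1,-1}, { 0, 1,-1}, { 0,-1, 1}};
const double w[Q] = {
    1.0/3.0,
    1.0/18.0, 1.0/18.0, 1.0/18.0, 1.0/18.0, 1.0/18.0, 1.0/18.0,
    1.0/36.0, 1.0/36.0, 1.0/36.0, 1.0/36.0, 1.0/36.0, 1.0/36.0,
    1.0/36.0, 1.0/36.0, 1.0/36.0, 1.0/36.0, 1.0/36.0, 1.0/36.0};

// s_alpha(u) from eq. (5).
double s_alpha(int a, const double u[3]) {
    double eu = e[a][0]*u[0] + e[a][1]*u[1] + e[a][2]*u[2];
    double uu = u[0]*u[0] + u[1]*u[1] + u[2]*u[2];
    return w[a] * (eu / cs2 + eu*eu / (2.0*cs2*cs2) - uu / (2.0*cs2));
}

// Equilibrium distributions of the incompressible model, eq. (4).
std::array<double, Q> equilibrium(double rho0, double p, const double u[3]) {
    std::array<double, Q> feq{};
    feq[0] = rho0 - (1.0 - w[0]) * p / cs2 + s_alpha(0, u);
    for (int a = 1; a < Q; ++a) feq[a] = w[a] * p / cs2 + s_alpha(a, u);
    return feq;
}

// Macroscopic velocity and pressure from the distributions, eqs. (6)-(7).
void moments(const std::array<double, Q>& f, double u[3], double& p) {
    u[0] = u[1] = u[2] = 0.0;
    double sum = 0.0;
    for (int a = 0; a < Q; ++a)
        for (int d = 0; d < 3; ++d) u[d] += e[a][d] * f[a];
    for (int a = 1; a < Q; ++a) sum += f[a];
    p = cs2 / (1.0 - w[0]) * (sum + s_alpha(0, u));
}

int main() {
    double u[3] = {0.05, 0.0, 0.0};         // small LB velocity
    auto f = equilibrium(1.0, 0.01, u);      // rho0 = 1, p = 0.01 (LB units)
    double u2[3], p2;
    moments(f, u2, p2);                      // recovers u and p at equilibrium
    std::printf("u = %.4f, p = %.4f\n", u2[0], p2);
    return 0;
}
```

At equilibrium the moments routine recovers exactly the velocity and pressure used to build $f_\alpha$, which provides a simple consistency check of equations (4)-(7).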

Synthetic eddy method (SEM)

The instantaneous macroscopic velocity and pressure data needed at the LB-LES sub-domain boundary are generated using the synthetic eddy method developed by Skillen et al. (2016) from the mean velocity, turbulence and pressure received from the URANS sub-domain.

The first step in incorporating SEM into the dual NS/LB model is to test the suitability of SEM as a boundary condition for the LB-LES sub-domain. To do so, we use an OpenFOAM implementation of the Skillen et al. (2016) SEM to generate the inlet boundary conditions for each time step of the LB-LES simulation. We then perform a time step of the LB-LES simulation, which reads its inlet boundary condition from the OpenFOAM data. See Section 3, Test case 2 for more information and results of the test.
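As a rough illustration of the kind of fluctuation generation involved, the sketch below constructs a synthetic streamwise velocity at one inlet point by superposing randomly placed compact eddies and scaling the result towards a target Reynolds stress. It follows the generic Jarrin-type SEM idea rather than the specific Skillen et al. (2016) formulation; the eddy count, length scale, box size, shape function and normalisation are illustrative assumptions.

```cpp
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

// One synthetic eddy: a position inside the virtual box and a random sign.
struct Eddy { double x, y, z, sign; };

// Tent-shaped compact support function used by classic SEM formulations.
double shape(double r, double sigma) {
    double xi = std::fabs(r) / sigma;
    return xi < 1.0 ? std::sqrt(1.5) * (1.0 - xi) : 0.0;
}

int main() {
    const int    N     = 1000;   // number of eddies (assumed)
    const double sigma = 0.25;   // eddy length scale (assumed)
    const double Lx = 2.0, Ly = 2.0, Lz = 2.0;   // virtual eddy box (assumed)
    const double Umean = 1.0;    // mean velocity from the URANS solution
    const double R11   = 0.04;   // target <u'u'> from the URANS solution

    std::mt19937 gen(42);
    std::uniform_real_distribution<double> pos(-1.0, 1.0), coin(0.0, 1.0);

    // Seed the eddies uniformly inside the box with random signs.
    std::vector<Eddy> eddies(N);
    for (auto& e : eddies)
        e = {pos(gen) * Lx / 2, pos(gen) * Ly / 2, pos(gen) * Lz / 2,
             coin(gen) < 0.5 ? -1.0 : 1.0};

    // Fluctuation at a boundary point: superposition of all eddy contributions.
    auto fluctuation = [&](double x, double y, double z) {
        const double volume = Lx * Ly * Lz;
        double sum = 0.0;
        for (const auto& e : eddies)
            sum += e.sign * shape(x - e.x, sigma) * shape(y - e.y, sigma)
                          * shape(z - e.z, sigma);
        // 1/sqrt(N) normalisation with a volume/length-scale factor (illustrative).
        return std::sqrt(volume / (N * sigma * sigma * sigma)) * sum;
    };

    // Instantaneous inlet velocity: mean plus fluctuation scaled by sqrt(R11),
    // the one-component analogue of the Cholesky (Lund) transform.
    double u = Umean + std::sqrt(R11) * fluctuation(0.0, 0.0, 0.0);
    std::printf("synthetic inlet velocity: %.4f\n", u);
    return 0;
}
```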

Coupling methodology

Spatial coupling.

The simulation is divided into an URANS sub-domain and an LB-LES sub-domain as shown in Fig. 3, where the overlap region is the volume solved by both the NS (URANS) and the LB (LB-LES) solvers. The overlapping region acts as a buffer, so that the values transferred by one solver are not overwritten by the values transferred from the other solver.

Figure 2: Data interpolated from the Navier-Stokes sub-domain to the lattice-Boltzmann sub-domain (left) and vice versa (right) in a line of cells.

The two sub-domains only exchange information at the boundaries of the overlapping region (see Fig. 2). The URANS boundary receives data from the equivalent cells in the LB-LES sub-domain, while the LB-LES boundary receives data from the equivalent cells in the URANS sub-domain. The LB-LES sub-domain first linearly interpolates the instantaneous velocity data from the LB cell centres to the position of the URANS boundary. This data is then transformed to the data needed by the URANS sub-domain by the process shown in Fig. 3. The LB-LES sub-domain stores its boundary data at the cell centres and both sub-domains have the same cell size in the overlapping region, so no interpolation is needed when transferring the data from URANS to LB-LES; the URANS mean flow data is transformed to the instantaneous values needed by LB-LES following the procedure in Fig. 3.
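The interpolation step from LB cell centres to the URANS boundary position can be as simple as the one-dimensional linear interpolation sketched below; the function name and the assumption that the boundary face lies between two neighbouring cell centres are illustrative.

```cpp
#include <cstdio>

// Linearly interpolate an LB cell-centre quantity to the URANS boundary
// position, assumed to lie between two neighbouring cell centres
// (xCentre0 <= xFace <= xCentre1); names are illustrative.
double interpolateToBoundary(double value0, double xCentre0,
                             double value1, double xCentre1, double xFace) {
    const double weight = (xFace - xCentre0) / (xCentre1 - xCentre0);
    return (1.0 - weight) * value0 + weight * value1;
}

int main() {
    // Boundary face halfway between two cell centres 0.1 units apart.
    double uFace = interpolateToBoundary(0.80, 0.0, 0.90, 0.1, 0.05);
    std::printf("interpolated boundary velocity: %.3f\n", uFace);
    return 0;
}
```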

Figure 3: Exchange of information between the RANS and the LB-LES sub-domains.

The LB and NS methods model different quantities and utilise different units, and so the NS velocity and pressure need to be converted before being transferred to the LB solver and vice versa.

The URANS boundary in the overlapping region implements a Dirichlet boundary condition for velocity, which yields a uniform Neumann boundary condition for pressure. Thus the NS solver needs to obtain the dimensionless macroscopic velocity $u_{ns}$ from the LB macroscopic velocity $u_{lb}$ via

$$u_{lb} = u_{ns}\,\frac{dt_{lb}}{dx_{lb}}\,, \qquad p_{lb} = p_{ns}\,\frac{dt_{lb}^2}{dx_{lb}^2} \quad (8)$$

where $dt_{lb}$ is the time step and $dx_{lb}$ is the cell size of the LB solver.

The LB-LES boundary in the overlapping region implements a Dirichlet boundary condition for the particle distribution functions $f_\alpha$. However, the URANS sub-domain provides dimensionless pressure and velocity. Once the NS pressure and velocity are converted to LB units using eq. 8, the dual NS/LB solver uses the equilibrium distribution function (eq. 4) to obtain $f_\alpha$. This method assumes that the LB-LES flow is in equilibrium, so the non-equilibrium information carried by the macroscopic velocity and pressure is lost.
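A compact sketch of this transfer step is given below. Only the relations of eq. (8), and the subsequent use of eq. (4), are taken from the text; the function name is illustrative, while the sample values are borrowed from Test case 1.

```cpp
#include <cstdio>

// Convert dimensionless NS boundary data to LB units following eq. (8).
// dt_lb and dx_lb are the LB time step and cell size; names are illustrative.
void nsToLbUnits(const double u_ns[3], double p_ns, double dt_lb, double dx_lb,
                 double u_lb[3], double& p_lb) {
    const double cu = dt_lb / dx_lb;   // velocity conversion factor
    const double cp = cu * cu;         // pressure conversion factor
    for (int d = 0; d < 3; ++d) u_lb[d] = u_ns[d] * cu;
    p_lb = p_ns * cp;
}

int main() {
    const double u_ns[3] = {1.0, 0.0, 0.0};   // URANS boundary velocity
    double u_lb[3], p_lb;
    nsToLbUnits(u_ns, -0.15, 0.006, 0.1, u_lb, p_lb);
    // The converted values would then be passed to the equilibrium function of
    // eq. (4) (see the earlier D3Q19 sketch) to reconstruct the boundary f_alpha.
    std::printf("u_lb = %.3f, p_lb = %.4f\n", u_lb[0], p_lb);
    return 0;
}
```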

Temporal coupling.

The URANS and LB-LES sub-domains are solved simultaneously and coupled following the parallel explicit approach described in Algorithm 1, where both solvers share data at the end of each time step. The coupling algorithm is implemented using the preCICE library (Bungartz et al., 2016) and the OpenFOAM preCICE adapter (Chourdakis, 2017), which we have modified to transfer and receive pressure and velocity data from both the domain boundary and internal cells.

Algorithm 1: Parallel explicit coupling scheme for the LB solver. The NS solver is executed simultaneously following the same algorithm.

1: while (time < end time)
2:     Receive data from the NS solver;
3:     Set received data as LB boundary condition;
4:     Perform LB time step;
5:     Send data to NS sub-domain;
6:     Update time;
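For illustration, the LB side of Algorithm 1 could be organised as in the sketch below. The Coupler and LbSolver types are placeholders standing in for the preCICE data exchange and the LUMIS time step respectively; the real preCICE API is deliberately not reproduced here.

```cpp
#include <cstdio>
#include <vector>

// Placeholder stand-ins for the preCICE exchange layer and the GPU LB solver.
// Both are illustrative assumptions; the real library interfaces differ.
struct Coupler {
    int stepsLeft = 5;
    std::vector<double> receive() { return std::vector<double>(3, 1.0); }
    void send(const std::vector<double>&) {}
    bool couplingOngoing() { return stepsLeft-- > 0; }
};
struct LbSolver {
    void setBoundaryFromNs(const std::vector<double>&) {}   // eq. (8) + eq. (4)
    void step() { std::puts("LB stream-collide step"); }    // one LB time step
    std::vector<double> boundaryDataForNs() { return {}; }  // interpolated data
};

// Parallel explicit coupling loop for the LB sub-domain (Algorithm 1);
// the NS solver runs the mirror-image loop at the same time.
void runLbSide(Coupler& coupler, LbSolver& lb, double dt, double endTime) {
    double time = 0.0;
    while (time < endTime && coupler.couplingOngoing()) {
        lb.setBoundaryFromNs(coupler.receive());  // 2-3: receive and apply BC
        lb.step();                                // 4: perform LB time step
        coupler.send(lb.boundaryDataForNs());     // 5: send data to NS solver
        time += dt;                               // 6: update time
    }
}

int main() {
    Coupler coupler;
    LbSolver lb;
    runLbSide(coupler, lb, 0.006, 0.03);
    return 0;
}
```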

3 Results

Test case 1: dual NS/LB model

The coherence of the LB-NS interface is first assessed by simulating a three-dimensional wall-mounted cube with a uniform inlet velocity profile and a Reynolds number of 100 based on the inlet velocity and the height of the cube. The aim of this simulation is to test the exchange of information between the sub-domains of the dual NS/LB solver, so the configuration of the case is not aimed at reproducing previous experimental or numerical studies. Fig. 4 shows a schematic of the domain. The boundary conditions are as follows: the inlet and top boundary conditions for the LB sub-domain are Dirichlet boundary conditions for the particle distribution functions with u = (1, 0, 0) and p = 0, while the top boundary of the NS sub-domain is a Dirichlet boundary for velocity and a zero-gradient boundary for pressure; the bottom boundary condition is a symmetry plane for the first 5 units so that the constant inlet velocity profile can adapt to the no-slip boundary condition on the remainder of the bottom boundary; the surfaces of the cube are also set to no-slip; the side boundaries of the domain are periodic; the outlet of the domain is set to a zero-gradient boundary for velocity and a fixed pressure value estimated from an a priori calculation of the domain without the cube (in the present case this is set to -0.15 dimensionless units). At present this is necessary since the LB solver sets gauge pressure to zero at the inlet of the domain, so that pressure is fixed on both the inlet and outlet of the complete domain. The need for and configuration of this is expected to be further addressed in subsequent work.

Figure 4: Sketch of Test case 1, not to scale. Distances are in dimensionless units. The spanwise extent of the domain is 6 units.

The NS and the LB sub-domains are discretised using a uniform Cartesian grid with cubic cells of dimensionless length 0.1 and a uniform time step of 0.006.

Fig. 5 shows velocity streamlines across the NS and LB sub-domains and the values of the pressure for both solvers at the overlapping region. The LB streamlines match the NS streamlines at the inlet of the NS sub-domain. Inside the overlapping region, the two sets of streamlines deviate as the distance from the inlet of the NS sub-domain increases. This effect is strongest in the recirculation region. The NS pressure at the overlap region (Fig. 5, right) follows the same pattern as the LB pressure; however, their values differ except at the LB sub-domain outlet, where the boundary condition sets the LB pressure to the NS pressure.

Figure 5: Preliminary results for flow around a wall-mounted cube across the LB-NS interface. Top: XZ plane at Y = 0.5H; bottom: XY plane at Z = 0. Left: streamlines on the LB (grey) and NS (black) sub-domains. Right: LB pressure (filled contours) and NS pressure (lines) on the overlapping region.

Test case 2: lattice-Boltzmann SEM

Prior to implementing SEM in the dual NS/LB model, we tested the implementation of an SEM inlet boundary within a traditional, weakly compressible LB-LES solver. The test case is a channel flow with $Re_\tau = 180$ (see Fig. 6). The inlet velocity is generated for each time step using the Skillen et al. (2016) SEM model with the mean flow statistics from the direct numerical simulation (DNS) of Moser et al. (1999). The SEM limiter on the turbulent length scale $\overline{u_i u_j}^{3/2}/\varepsilon$ is set to $\sigma_{max} = 0.125$. The pressure is set to a constant value over all of the inlet boundary. The LB particle distribution functions are assumed to be equal to the equilibrium distribution functions (see eq. 4) obtained from the velocity and pressure data. The top and bottom walls are no-slip boundaries and the side boundaries are periodic. The outlet also imposes a Dirichlet boundary condition for the particle distribution functions, which are set to yield constant pressure and macroscopic velocity across the boundary.

Figure 6: Sketch of Test case 2, not to scale: lattice-Boltzmann SEM. The spanwise extent is π.

The domain is discretised using a uniform Cartesian grid with 283 × 92 × 142 cells in the streamwise, vertical and spanwise directions respectively, which yields $y^+ = 2$ for the cells next to the walls and $y^+ = 4$ for the remaining cells. The time step is uniform and equal to $3.6885 \times 10^{-5}$.

Figure 7 displays the streamwise and spanwise components of the instantaneous dimensionless velocity, $u_x/u_\tau$ and $u_y/u_\tau$. The turbulent velocity patterns introduced at the inlet are transmitted along the domain. However, small turbulent structures dissipate as the distance from the inlet increases. There is also a decrease in the velocity magnitude at the YZ plane of cells neighbouring the inlet.

Figure 7: Spanwise and streamwise instantaneous velocity on a vertical section of a channel flow where the inlet boundary condition has been generated using SEM. Simulation run using the LB solver LUMIS.

4 Discussion

The results of the dual NS/LB solver indicate that the NS and LB sub-domains exchange information at their boundaries and are able to produce continuous velocity patterns across both domains. It is noted that the choice to locate the dual solver interface in the middle of the recirculation region is intended to be a particularly challenging test of the methodology; the two-way nature of the interface is necessary. Comparison with NS results for the full domain demonstrates an over-prediction of the recirculation region by approximately 30%, and thus the solver requires further investigation to identify possible causes for this imbalance. At present there are a number of imposed simplifications which limit the generality of the solver and its accuracy, as discussed in the following:

Compressible-incompressible boundary: the challenge of coupling a compressible method with an incompressible method is highlighted, since at present the pressure coupling is limited to one way only; the LB sub-domain does not transmit pressure information to the NS sub-domain. Furthermore, the pressure is set at both the inlet of the LB sub-domain and the outlet of the NS sub-domain. The value of the pressure at the NS outlet is likely to affect the results.

Resolution: at present we have chosen to limit the resolution used in the LB solver, which is currently set to 10 points per cube height, on account of computational limitations of the development platform. Tests at higher, more appropriate resolutions are currently underway and will be reported at the session.

Reconstruction of the particle distribution functions: another important limitation is identified in the algorithm used to obtain the LB particle distributions from the NS velocity and pressure; at present we have employed a simplified reconstruction which disregards the non-equilibrium part of the particle distribution functions, but a more complex approach will be implemented shortly.

Size and position of the overlap region: the boundaries of the overlapping region have to be separated by at least one cell to allow a buffer where both solvers can assimilate the information received from the other. Further investigation is required to better understand the limitations on the size and position of the overlapping region.

In addition to the above identified ongoing areas of development, the coupling model currently employs a parallel, explicit time coupling scheme, but other time coupling schemes will also be tested.

Regarding Test case 2, the weakly compressible LB-LES model convects the SEM-generated turbulence across the domain. Downstream of the inlet the intensity of turbulent fluctuations is reduced beyond expected levels, and further investigation is ongoing. The artificial dissipation of turbulence observed along the channel could be due to a number of causes, including resolution and excessive subgrid-scale damping. The decrease in the velocity at the cells adjacent to the inlet might be influenced by the missing pressure information from the SEM model.

5 Conclusions and outlook

We have presented a fully three-dimensional hybrid method for transient turbulent flow simulations. The method couples a sub-domain solved by an incompressible lattice-Boltzmann solver with another sub-domain solved by an incompressible Navier-Stokes solver. The NS/LB model is implemented on a heterogeneous architecture, with the NS sub-domain executed on CPU and the LB sub-domain executed on GPU.

We have assessed the coupling of the two models for laminar flow using a 3D wall-mounted cube at a Reynolds number of 100. The preliminary results show that the two models are able to communicate and create a continuous flow across them, showing small discrepancies in the results in the overlapping region; further study is required on the effect of resolution and boundary conditions, the coupling methodology, and the size and position of the overlapping region. We have also implemented a synthetic eddy method inlet boundary condition in an LB-LES solver as a first step towards introducing turbulence into the dual NS/LB model. The inlet turbulence is convected through the domain but it is affected by artificial dissipation. This may be due to the inlet and outlet boundary conditions and also the Smagorinsky constant of the LES model; both potential causes need further study.

We are currently investigating the effects of increasing the resolution and improving the pressure coupling on the results of the dual NS/LB solver. Next steps will include final validation of the NS/LB solver for laminar flows, followed by the incorporation of the SEM boundaries and the modelling of higher Reynolds number flows. Finally, we will look to optimise the coupling algorithm by testing the performance increase compared to using a single Navier-Stokes or lattice-Boltzmann model for the complete domain.

Acknowledgment: financial support awarded by Samsung's Global Outreach Programme is gratefully acknowledged.

References

Atanasov, A., Uekermann, B. and Neumann, P. (2016), Anderson Accelerated Coupling of Lattice Boltzmann and Navier-Stokes Solvers for Parallel Applications, Computation, 4(4), 38-57.

Bungartz, H. J., Lindner, F., Gatzhammer, B., Mehl, M., Scheufele, K., Shukaev, A. and Uekermann, B. (2016), preCICE - A fully parallel library for multi-physics surface coupling, Computers and Fluids, 141, 250-258.

Chourdakis, G. (2017), A general OpenFOAM adapter for the coupling library preCICE, Master's dissertation.

Fröhlich, J. and von Terzi, D. (2008), Hybrid LES/RANS methods for the simulation of turbulent flows, Progress in Aerospace Sciences, 44(5), 349-377.

Guo, Z., Shi, B. and Wang, N. (2000), Lattice BGK Model for Incompressible Navier-Stokes Equation, Journal of Computational Physics, 165(1), 288-306.

Haase, W., Braza, M. and Revell, A. (Eds.) (2009), DESider - A European Effort on Hybrid RANS-LES Modelling: Results of the European-Union Funded Project, 2004-2007 (Vol. 103), Springer Science & Business Media.

Harwood, A. R. G., O'Connor, J., Sánchez Muñoz, J., Camps Santasmasas, M. and Revell, A. J. (2018), LUMA: A many-core, Fluid-Structure Interaction solver based on the Lattice-Boltzmann Method, SoftwareX, 7, 88-94.

Holgate, J., Skillen, A., Craft, T. and Revell, A. (2018), A Review of Embedded Large Eddy Simulation for Internal Flows, Archives of Computational Methods in Engineering, 1-18.

Koda, Y. and Lien, F. S. (2015), The lattice-Boltzmann method implemented on the GPU to simulate the turbulent flow over a square cylinder confined in a channel, Flow, Turbulence and Combustion, 94(3), 495-512.

Manceau, R. and Hanjalić, K. (2002), Elliptic blending model: A new near-wall Reynolds-stress turbulence closure, Physics of Fluids, 14(2), 744-754.

Mawson, M. J. (2014), Interactive Fluid-Structure Interaction with Many-Core Accelerators, Doctoral dissertation.

Moser, R., Kim, J. and Mansour, N. (1999), Direct numerical simulation of turbulent channel flow up to $Re_\tau = 590$, Physics of Fluids, 11(4), 943-945.

Skillen, A., Revell, A. and Craft, T. (2016), Accuracy and efficiency improvements in synthetic eddy methods, International Journal of Heat and Fluid Flow, 62, 386-394.

Tong, Z. X. and He, Y. L. (2015), A unified coupling scheme between lattice-Boltzmann method and finite volume method for unsteady fluid flow and heat transfer, International Journal of Heat and Mass Transfer, 80, 812-824.

Weller, H. G. and Tabor, G. (1998), A tensorial approach to computational continuum mechanics using object-oriented techniques, Computers in Physics, 12(6), 620-631.