Mixing time for a noisy SIS model on graphs

Wasiur R. KhudaBukhsh and Yangrui Xiang

School of Mathematical Sciences, University of Nottingham, University Park, Nottingham, NG7 2RD, United Kingdom.

*Corresponding author(s). E-mail(s): Wasiur.KhudaBukhsh@nottingham.ac.uk; Yangrui.Xiang@nottingham.ac.uk
Abstract

We study the mixing time of the noisy SIS (Susceptible-Infected-Susceptible) model on graphs. The noisy SIS model is a variant of the standard SIS model, which allows individuals to become infected not just through contacts with infected individuals but also through external noise. We show that, under strong external noise, the mixing time is of order $O(n \log n)$. Additionally, we demonstrate that the mixing time on random graphs, namely Erdős–Rényi graphs, regular multigraphs, and Galton–Watson trees, is also of order $O(n \log n)$ with high probability.
Keywords: Mixing times, Markov chains, Epidemics, Random graphs

MSC Classification: 60J10, 60J20, 92D25
1 Introduction

The noisy SIS (Susceptible-Infected-Susceptible) model is an irreducible discrete-time Markov chain Norris (1997). It is a variant of the standard stochastic SIS model Andersson and Britton (2000). In the standard SIS model, individuals are segregated into two compartments, namely, S (the susceptible individuals) and I (the infected individuals). Susceptible individuals get infected when contacted by infected individuals. Once infected, they are able to infect others before recovering and becoming susceptible again (i.e., moving back to the S compartment). While this model is used to describe several infectious diseases by mathematical epidemiologists, it is rather simplistic and ignores external sources of infection, which play a crucial role in the spread of a disease Sharker et al. (2024). The noisy SIS model allows for external sources of infection, so that susceptible individuals can get infected not only by an infectious contact with an infected individual within the closed population but also by an external source of infection. The other crucial drawback of the standard SIS model is that it ignores individual heterogeneity and the underlying contact network structure. We, therefore, consider epidemic dynamics on a variety of graphs (see Durrett (2007); KhudaBukhsh et al. (2019); Kiss et al. (2017); KhudaBukhsh et al. (2022); Decreusefond et al. (2012); van der Hofstad (2017); Cui et al. (2022); Bandyopadhyay and Sajadi (2015); Ball (2021) and references therein for a glimpse of recent literature on this topic), and study its mixing properties using $n$, the number of vertices (population size), as a scaling parameter.
1.1 Mixing properties

It is well known that when the size of the state space of an irreducible Markov chain is a finite and fixed integer, the chain converges to its equilibrium measure exponentially fast (Levin and Peres, 2017, Chapter 4). We, however, adopt the approach introduced by Aldous and Diaconis (1986). This approach revolves around a parameterized family of Markov chains, typically parameterized by the size of the state space as a scaling factor. The key question is to determine how long it takes for the chain to reach a close approximation of its stationary state. The mixing time, roughly speaking, is the amount of time required for the chain to approach stationarity in total variation distance, so that this distance is less than a given value (typically chosen to be 1/4). We will employ the path coupling method (see Wilson (2004)) to derive the upper and the lower bounds on the mixing time.
Related to the concept of mixing times is the cutoff phenomenon, an abrupt change in the total variation distance as the Markov chain converges to its equilibrium. For random walks on dynamical Erdős–Rényi random graphs, the cutoff phenomenon was proven in Sousi and Thomas (2020). The universality of the cutoff for quasi-random graphs, particularly those with an added random matching, is characterized in Hermon et al. (2022).

For mean-field models on complete graphs, the noisy voter model shares similarities with the noisy SIS model. In Cox et al. (2016), it is shown that the noisy voter model, a continuous-time Markov chain, has a mixing time of $\frac{1}{2}\log n$ and exhibits a cutoff phenomenon. Furthermore, in Aljovin et al. (2024), the convergence profile of the noisy voter model in Kantorovich distance is characterized. This model does not exhibit a cutoff under natural noise intensity conditions; however, the time required for the model to forget the initial state of the particles (a process known as thermalization) does exhibit a cutoff profile. The most recent work on the noisy SIS model on complete graphs, He et al. (2024), proves that, in the continuous-time setting, the noisy SIS model has a sharp mixing time of order $O(\log n)$ and also exhibits a cutoff phenomenon.
1.2 Mathematical contributions

We show that the upper and the lower bounds on the mixing time with respect to the scaling parameter $n$ are both of order $O(n \log n)$. That is, even if the Markov chain starts from the "worst possible" initial condition, the total variation distance between the distribution of the chain at time $t = O(n \log n)$ and its stationary state is small. This argument stays valid for all general graphs under strong noise intensity. Furthermore, we establish the mixing properties for random graphs, namely Erdős–Rényi graphs, regular multigraphs, and Galton–Watson trees. More specifically, given an instantiation of a random graph, we run the Markov chain with the noisy SIS dynamics (*) described in Section 2.1, and show that the mixing time is of order $O(n \log n)$ with high probability.

The rest of the paper is structured as follows: We describe the noisy SIS model on graphs in Section 2. The upper and the lower bounds on the mixing time are established in Sections 3 and 4, respectively. Finally, we study the mixing properties of the noisy SIS model on random graphs in Section 5.
2 The noisy SIS model on graphs

2.1 Definition and dynamics of the model

Let $n \in \mathbb{N}$ be the scaling parameter. Consider the sequence of graphs $G_n := (V_n, E_n)$ with $V_n$ denoting the set of vertices and $E_n \subseteq V_n \times V_n$, the set of edges. We fix $|V_n| = n$. We define the maximal degree $\Delta^{(n)}_{\max}$ of the graph $G_n$ to be
\[
\Delta^{(n)}_{\max} := \max_{x \in V_n} \deg(x),
\]
where $\deg(x)$ denotes the degree of the vertex $x$. We define $Q := \{0, 1\}$ and the state space $\Omega_n := Q^{V_n}$. Each vertex $x \in V_n$ represents an individual at site $x$ within a population of size $n$. From the perspective of an interacting particle system (IPS) (see Kipnis and Landim (1999); Liggett (1985)), we say that a configuration $\sigma \in \Omega_n$ has a particle at site $x \in V_n$ if $\sigma_x = 1$. If $\sigma_x = 0$, we say that the site $x$ is empty.

In our context, we interpret a configuration $\sigma \in \Omega_n$ as having an infected individual at site $x \in V_n$ if $\sigma_x = 1$. If $\sigma_x = 0$, the individual at site $x$ is considered susceptible. We study a discrete-time Markov chain $(\sigma^n(t);\, t = 0, 1, 2, \ldots)$ on $\Omega_n$ Norris (1997); Levin and Peres (2017). We define $n_{I,\sigma}(x)$ as the number of infected neighbours of $x$, i.e.,
\[
n_{I,\sigma}(x) := |\{ y \in V_n : x \sim y \text{ and } \sigma_y = 1 \}|,
\]
where $|A|$ denotes the cardinality of the set $A$.
The dynamics of the Markov chain are described as follows: Suppose an initial configuration is given. We choose a vertex $x$ to update uniformly at random, i.e., each vertex is chosen with probability $\frac{1}{n}$. We consider three parameters $a, \lambda, \kappa > 0$. Here, $\lambda$ represents the infection probability for each infected individual, meaning each infected individual can infect its susceptible neighbours with probability $\lambda$. The parameter $\kappa$ is the recovery probability, i.e., each infected individual recovers with this uniform probability. Finally, the parameter $a$ denotes the external infection probability. Each susceptible individual can become infected independently by an external source of infection with probability $a$.

We define the rates
\[
p(\sigma, x) := a + \lambda \cdot n_{I,\sigma}(x),
\]
and
\[
p^{(n)}(a, \lambda) := a + \lambda \cdot \Delta^{(n)}_{\max}.
\]
We choose the parameter $\lambda$ such that $p^{(n)}(a, \lambda) < 1$. Then, the transition probabilities of the Markov chain $(\sigma^n(t);\, t = 0, 1, 2, \ldots)$ are described as follows:
\[
p(\sigma_x = 0, \sigma'_x = 1) = \frac{1}{n} \cdot p(\sigma, x) \quad \text{and} \quad p(\sigma_x = 0, \sigma'_x = 0) = \frac{1}{n} \cdot \big(1 - p(\sigma, x)\big),
\]
\[
p(\sigma_x = 1, \sigma'_x = 0) = \frac{1}{n} \cdot \kappa \quad \text{and} \quad p(\sigma_x = 1, \sigma'_x = 1) = \frac{1}{n} \cdot (1 - \kappa). \tag{*}
\]
Note that only one vertex can change its local state in each timestep. The transition probabilities in Equation (*) describe the transition probabilities associated with a vertex $x \in V_n$.
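For concreteness, here is a minimal Python sketch of the single-site update rule (*). It is our own illustration, not code from the paper: the helper names (`step`, `n_infected_neighbours`) and the sample parameters are ours, chosen so that $p^{(n)}(a, \lambda) < 1$ and the strong external infection regime (**) of Section 3 hold.

```python
import random

def n_infected_neighbours(sigma, adj, x):
    """n_{I,sigma}(x): number of infected neighbours of vertex x."""
    return sum(sigma[y] for y in adj[x])

def step(sigma, adj, a, lam, kappa):
    """One update of the noisy SIS chain (*): a uniformly chosen vertex flips."""
    n = len(sigma)
    x = random.randrange(n)                 # each vertex chosen w.p. 1/n
    if sigma[x] == 0:                       # susceptible site
        p = a + lam * n_infected_neighbours(sigma, adj, x)   # p(sigma, x)
        if random.random() < p:             # infectious contact or external noise
            sigma[x] = 1
    elif random.random() < kappa:           # infected site recovers w.p. kappa
        sigma[x] = 0

# Example: a 10-cycle under the strong external infection regime (**)
n = 10
adj = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
sigma = [0] * n
for _ in range(1000):
    step(sigma, adj, a=0.99, lam=0.004, kappa=0.02)
print(sum(sigma), "infected out of", n)
```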
We call the discrete-time Markov chain $(\sigma^n(t);\, t \ge 0)$ under this dynamics on the graph $G_n$ the noisy susceptible-infected-susceptible (SIS) model on the graph. The superscript $n$ on the Markov chain $\sigma^n(t)$ emphasizes that the law of the Markov chain $\sigma^n(t)$ depends on the parameter $n$. When there is no confusion, we omit the superscript. We denote the law of the Markov chain at time $t$ as $(\mu^n_t;\, t \ge 0)$. If we specify the initial condition, we denote the law at time $t$ of the Markov chain starting at state $\eta_0$ as $(\mu^n_{\eta_0, t};\, t \ge 0)$. Due to the external infection probability $a > 0$, this Markov chain is irreducible, as the absorbing state (the disease-free state) is eliminated. For each $n$, since it is a Markov chain on a finite state space, there exists a unique invariant measure $\mu^n_{a,\lambda,\kappa}$ with full support on $\Omega_n$.

This model is useful for infections that do not confer permanent immunity, such as the flu. Our model does not capture the nuances of the modern theory of infectious disease epidemiology. However, the spread of an infection in a middle-sized city near a metropolis could be realistically modelled by the Markov chain $\sigma^n(t)$. The metropolis serves as a constant source of infection, facilitated by professional and commercial activities, with an impact comparable to local contacts with infected individuals.
2.2 Mixing time

Given two probability measures $\mu$ and $\nu$ on $\Omega_n$, the total variation distance between $\mu$ and $\nu$ is defined as
\[
d_{\mathrm{TV}}(\mu, \nu) := \frac{1}{2} \sum_{\sigma \in \Omega_n} |\mu(\sigma) - \nu(\sigma)|.
\]
Also, for $t \ge 0$ and given $\eta_0 \in \Omega_n$, we define the distance to the stationary state at time $t$ starting from $\eta_0$ as
\[
d^{(n)}(t; \eta_0) := d_{\mathrm{TV}}(\mu^n_{\eta_0, t}, \mu^n_{a,\lambda,\kappa}).
\]
For $\eta_0 \in \Omega_n$, we define
\[
d^{(n)}(t) := \sup_{\eta_0 \in \Omega_n} d_{\mathrm{TV}}(\mu^n_{\eta_0, t}, \mu^n_{a,\lambda,\kappa}),
\]
and finally,
\[
t^{(n)}_{\mathrm{mix}}(\epsilon) := \inf\{ t \ge 0 : d^{(n)}(t) \le \epsilon \}.
\]
The time $t^{(n)}_{\mathrm{mix}}(\epsilon)$ is called the $\epsilon$-mixing time of the $n$-th Markov chain $(\sigma^n(t);\, t \ge 0)$, which can be interpreted as follows: even if it starts from the "worst" initial condition, after $t^{(n)}_{\mathrm{mix}}(\epsilon)$ units of time the distance to stationarity will nevertheless be smaller than $\epsilon$. For $\sigma_0, \eta_0 \in \Omega_n$, we furthermore define
\[
\bar{d}^{(n)}(t) := \sup_{\sigma_0, \eta_0 \in \Omega_n} d_{\mathrm{TV}}(\mu^n_{\sigma_0, t}, \mu^n_{\eta_0, t}).
\]
This represents the total variation distance between two laws at time $t$, starting from the two "farthest" initial states $\sigma_0$ and $\eta_0$. It follows directly from the definitions that
\[
d^{(n)}(t) \le \bar{d}^{(n)}(t) \le 2\, d^{(n)}(t).
\]
The inequality is standard. See Levin and Peres (2017) for a proof. To simplify notation, the superscript will be omitted whenever the dependence on the parameter $n$ is clear from the context.
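Since $|\Omega_n| = 2^n$, these quantities can be computed exactly only for very small $n$. The following brute-force Python sketch (our own construction, not from the paper) builds the full transition matrix of (*) on a triangle and tracks the worst-case distance $d^{(n)}(t)$:

```python
import itertools
import numpy as np

def transition_matrix(adj, a, lam, kappa):
    """Exact transition matrix of (*) on the full state space {0,1}^n (tiny n only)."""
    n = len(adj)
    states = list(itertools.product([0, 1], repeat=n))
    index = {s: i for i, s in enumerate(states)}
    P = np.zeros((2 ** n, 2 ** n))
    for s in states:
        stay = 1.0
        for x in range(n):
            # probability that the chosen vertex x flips its state
            q = (a + lam * sum(s[y] for y in adj[x])) if s[x] == 0 else kappa
            flip = tuple(1 - s[z] if z == x else s[z] for z in range(n))
            P[index[s], index[flip]] += q / n
            stay -= q / n
        P[index[s], index[s]] += stay
    return P

adj = [[1, 2], [0, 2], [0, 1]]                   # triangle, n = 3
P = transition_matrix(adj, a=0.96, lam=0.01, kappa=0.05)
w, v = np.linalg.eig(P.T)                        # stationary law: left eigenvector
pi = np.real(v[:, np.argmax(np.real(w))]); pi /= pi.sum()
mu = np.eye(P.shape[0])                          # one row per initial state
for t in range(1, 61):
    mu = mu @ P
    d = 0.5 * np.abs(mu - pi).sum(axis=1).max()  # d^{(n)}(t), worst initial state
print("d(60) =", d)
```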
3 Upper bound on the mixing time

In this section, we prove an upper bound on the mixing time of the noisy SIS model on graphs. We consider the strong external infection regime:
\[
0 < \kappa < \frac{1}{4(n-1)} \quad \text{and} \quad a > 1 - \kappa > \frac{1}{2}. \tag{**}
\]

In order to prove an upper bound, we need to introduce some notation and concepts. First, we define a natural partial order on the state space $\Omega_n$. For two configurations $\sigma, \eta \in \Omega_n$, we say $\sigma \le \eta$ if and only if
\[
\sigma_x \le \eta_x, \quad \forall x \in V_n.
\]
We consider the Hamming distance between two configurations $\sigma, \eta \in \Omega_n$ defined as follows:
\[
\rho(\sigma, \eta) := \sum_{x \in V_n} \mathbb{1}\{\sigma_x \neq \eta_x\}.
\]
The nonnegative integer $\rho(\sigma, \eta)$ is the number of vertices where $\sigma$ and $\eta$ disagree.

We use coupling techniques to find upper bounds on the mixing times. In a pairwise coupling, two copies of the Markov chain, denoted $(\sigma^n(t);\, t \ge 0)$ and $(\eta^n(t);\, t \ge 0)$, are run simultaneously. Marginally, the $(\sigma^n(t);\, t \ge 0)$ sequence adheres to the Markov chain's transition rules specified in (*), as does the $(\eta^n(t);\, t \ge 0)$ sequence. However, the joint distribution of $(\sigma^n(t+1), \eta^n(t+1))$ conditioned on $(\sigma^n(t), \eta^n(t))$ is designed in such a way that the two copies of the Markov chain coalesce in finite time, after which the two chains move together. We consider $(\sigma^n(t);\, t \ge 0)$ and $(\eta^n(t);\, t \ge 0)$ to be coupled Markov chains with initial conditions $\sigma_0$ and $\eta_0$. We denote by $\mathbb{P}$ the probability measure on the coupled state space where both $(\sigma^n(t);\, t \ge 0)$ and $(\eta^n(t);\, t \ge 0)$ are defined, and by $\mathbb{P}_{\sigma_0, \eta_0}$ the probability measure under the initial conditions $\sigma_0$ and $\eta_0$. The corresponding expectations are denoted by $\mathbb{E}$ and $\mathbb{E}_{\sigma_0, \eta_0}$, respectively. We define the coalescence time of the coupling $\tau^{(n)}_{\mathrm{couple}}$ to be
\[
\tau^{(n)}_{\mathrm{couple}} := \min\{ t \ge 0 : \sigma_s = \eta_s \text{ for all } s \ge t \}.
\]
By (Levin and Peres, 2017, Theorem 5.4 and Corollary 5.5), the coalescence time provides an upper bound on the mixing time via the following inequality:
\[
d^{(n)}(t) \le \sup_{\sigma_0, \eta_0 \in \Omega_n} \mathbb{P}_{\sigma_0, \eta_0}\big(\tau^{(n)}_{\mathrm{couple}} > t\big). \tag{3.1}
\]
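The argument below uses one concrete coupling of this kind: both chains always update the same uniformly chosen vertex; vertices on which the chains agree reuse the same uniform random number, while vertices on which they disagree draw independently. A Monte Carlo sketch of this coupling (our own illustration; the helper names and parameters are not from the paper) estimates the coalescence time on a cycle:

```python
import random

def coupled_step(sigma, eta, adj, a, lam, kappa):
    """One step of the pairwise coupling: both chains update the same vertex.
    Agreeing vertices share a uniform; disagreeing vertices draw independently."""
    n = len(sigma)
    x = random.randrange(n)
    u1 = random.random()
    u2 = u1 if sigma[x] == eta[x] else random.random()
    for chain, u in ((sigma, u1), (eta, u2)):
        if chain[x] == 0:
            chain[x] = 1 if u < a + lam * sum(chain[y] for y in adj[x]) else 0
        else:
            chain[x] = 0 if u < kappa else 1

def coalescence_time(adj, a, lam, kappa, t_max=10**6):
    """Steps until the two extreme initial conditions meet under the coupling."""
    n = len(adj)
    sigma, eta = [0] * n, [1] * n
    for t in range(1, t_max + 1):
        coupled_step(sigma, eta, adj, a, lam, kappa)
        if sigma == eta:
            return t
    return t_max

# 50-cycle under regime (**); compare the mean with (n / gamma) * log(n)
n = 50
adj = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
times = [coalescence_time(adj, a=0.997, lam=0.001, kappa=0.004) for _ in range(100)]
print(sum(times) / len(times))
```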
To begin with, fix a vertex $x$. Then, we consider two initial configurations $\sigma(0) = \sigma_0$ and $\eta(0) = \eta_0$ with Hamming distance $\rho(\sigma(0), \eta(0)) = 1$, and they disagree at the vertex $x$. Without loss of generality, we assume that $\sigma_0 \le \eta_0$, meaning $(\sigma_x(0), \eta_x(0)) = (0, 1)$. We denote the Markov chains starting from $\sigma_0$ and $\eta_0$ by $(\sigma^{\sigma_0}(t);\, t \ge 0)$ and $(\eta^{\eta_0}(t);\, t \ge 0)$. We first prove a contraction property.

Lemma 3.1 (Contraction property for the strong external infection regime). For the coupled Markov chains $(\sigma^{\sigma_0}(t);\, t \ge 0)$ and $(\eta^{\eta_0}(t);\, t \ge 0)$, after one step of the update, we have that
\[
\mathbb{E}_{\sigma_0, \eta_0}\big[\rho\big(\sigma(1), \eta(1)\big)\big] \le 1 - \frac{\gamma}{n} < 1, \tag{3.2}
\]
with $\gamma := (1 - \kappa)a + (1 - a)\kappa - 2(n-1)\kappa(1 - \kappa)$ and $\gamma > 0$.
Proof of Lemma 3.1. Note that
\[
\begin{aligned}
\mathbb{E}_{\sigma_0, \eta_0}\big[\rho\big(\sigma(1), \eta(1)\big)\big]
&= 1 \cdot \mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(1), \eta(1)) = 1\big) + 2 \cdot \mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(1), \eta(1)) = 2\big) \\
&= 1 \cdot \Big( 1 - \mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(1), \eta(1)) = 0\big) - \mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(1), \eta(1)) = 2\big) \Big) \\
&\quad + 2 \cdot \mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(1), \eta(1)) = 2\big) \\
&= 1 - \mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(1), \eta(1)) = 0\big) + \mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(1), \eta(1)) = 2\big).
\end{aligned}
\]
The only way to reduce the Hamming distance after one update is to update at the vertex where $\sigma_0$ and $\eta_0$ disagree, setting them to the same state. There are two possibilities:
\[
\mathbb{P}_{\sigma_0, \eta_0}\big(\sigma_x(1) = 1, \eta_x(1) = 1\big) = \frac{1}{n} \cdot (1 - \kappa) \cdot p(\sigma_0, x),
\]
and
\[
\mathbb{P}_{\sigma_0, \eta_0}\big(\sigma_x(1) = 0, \eta_x(1) = 0\big) = \frac{1}{n} \cdot \big(1 - p(\sigma_0, x)\big) \cdot \kappa.
\]
It follows that
\[
\mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(1), \eta(1)) = 0\big) = \frac{1}{n} \cdot (1 - \kappa) \cdot p(\sigma_0, x) + \frac{1}{n} \cdot \big(1 - p(\sigma_0, x)\big) \cdot \kappa.
\]
In order to increase the Hamming distance, an update should occur at a vertex other than $x$, and the update at such a vertex $w$ should result in disagreement. There are two cases: when $\sigma_w(0) = \eta_w(0) = 1$ or when $\sigma_w(0) = \eta_w(0) = 0$.

For the case when $\sigma_w(0) = \eta_w(0) = 1$, we have the estimate
\[
\mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(1), \eta(1)) = 2\big) \le \frac{n-1}{n} \cdot 2\kappa(1 - \kappa).
\]
For the case when $\sigma_w(0) = \eta_w(0) = 0$, we have the estimate
\[
\mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(1), \eta(1)) = 2\big) \le \frac{n-1}{n} \cdot \Big( p(\sigma, w)\big(1 - p(\eta, w)\big) + p(\eta, w)\big(1 - p(\sigma, w)\big) \Big).
\]
Therefore, we are able to get the following estimate:
\[
\begin{aligned}
\mathbb{E}_{\sigma_0, \eta_0}\big[\rho\big(\sigma(1), \eta(1)\big)\big]
&\le 1 - \frac{1}{n} \Big( (1 - \kappa) p(\sigma_0, x) + \big(1 - p(\sigma_0, x)\big) \kappa \Big) \\
&\quad + \frac{n-1}{n} \max\Big\{ 2\kappa(1 - \kappa),\; p(\sigma, w)\big(1 - p(\eta, w)\big) + p(\eta, w)\big(1 - p(\sigma, w)\big) \Big\}. 
\end{aligned} \tag{3.3}
\]
Under the strong external infection regime (**), we have
\[
\max\Big\{ 2\kappa(1 - \kappa),\; p(\sigma, w)\big(1 - p(\eta, w)\big) + p(\eta, w)\big(1 - p(\sigma, w)\big) \Big\} = 2\kappa(1 - \kappa).
\]
On the other hand, recall that the two initial configurations $\sigma_0$ and $\eta_0$ have Hamming distance $\rho(\sigma(0), \eta(0)) = 1$ and differ at the vertex $x$. Consequently, we have
\[
\begin{aligned}
(1 - \kappa) p(\sigma_0, x) + \big(1 - p(\sigma_0, x)\big) \kappa
&= \kappa + (1 - 2\kappa) p(\sigma_0, x) \\
&\ge \min_{\sigma_0 \in \Omega_n,\, x \in V_n} \Big( \kappa + (1 - 2\kappa) p(\sigma_0, x) \Big) \\
&\ge \kappa + (1 - 2\kappa) a \\
&= (1 - \kappa) a + (1 - a) \kappa. 
\end{aligned} \tag{3.4}
\]
This immediately implies
\[
\mathbb{E}_{\sigma_0, \eta_0}\big[\rho\big(\sigma(1), \eta(1)\big)\big] \le 1 - \frac{\gamma}{n} < 1,
\]
with $\gamma$ given in the statement of the Lemma. □
We now consider two initial configurations with Hamming distance greater than 1. Suppose that $\sigma(0) = \sigma_0$ and $\eta(0) = \eta_0$ with Hamming distance strictly greater than one, i.e., $\rho(\sigma(0), \eta(0)) = \ell$ with $\ell \ge 2$. Without loss of generality, we assume $\sigma_0 \le \eta_0$. Additionally, there exist configurations $\omega_0 = \sigma_0, \omega_1, \omega_2, \ldots, \omega_\ell = \eta_0$ such that $\rho(\omega_{k-1}, \omega_k) = 1$ for all $k$. Since $\rho$ is a distance, we get the following estimate:
\[
\mathbb{E}_{\sigma_0, \eta_0}\big[\rho\big(\sigma(1), \eta(1)\big)\big] = \mathbb{E}\big[\rho\big(\sigma^{\sigma_0}(1), \eta^{\eta_0}(1)\big)\big] \le \sum_{k=1}^{\ell} \mathbb{E}\big[\rho\big(\sigma^{\omega_{k-1}}(1), \eta^{\omega_k}(1)\big)\big]
= \sum_{k=1}^{\ell} \mathbb{E}_{\omega_{k-1}, \omega_k}\big[\rho\big(\sigma(1), \eta(1)\big)\big]
\le \ell \cdot \Big(1 - \frac{\gamma}{n}\Big) = \rho\big(\sigma(0), \eta(0)\big) \cdot \Big(1 - \frac{\gamma}{n}\Big).
\]
The last inequality follows from Lemma 3.1. Now, after $t$ steps, we can obtain
\[
\mathbb{E}\Big[\rho\big(\sigma^{\sigma_0}(t), \eta^{\eta_0}(t)\big) \,\Big|\, \big(\sigma^{\sigma_0}(t-1), \eta^{\eta_0}(t-1)\big) = (\sigma_{t-1}, \eta_{t-1})\Big] = \mathbb{E}\big[\rho\big(\sigma^{\sigma_{t-1}}(1), \eta^{\eta_{t-1}}(1)\big)\big] \le \rho(\sigma_{t-1}, \eta_{t-1}) \cdot \Big(1 - \frac{\gamma}{n}\Big).
\]
By taking an expectation over the event $\big(\sigma^{\sigma_0}(t-1), \eta^{\eta_0}(t-1)\big)$, we get that
\[
\mathbb{E}\Big[\rho\big(\sigma^{\sigma_0}(t), \eta^{\eta_0}(t)\big)\Big] \le \mathbb{E}\Big[\rho\big(\sigma^{\sigma_0}(t-1), \eta^{\eta_0}(t-1)\big)\Big] \cdot \Big(1 - \frac{\gamma}{n}\Big). \tag{3.5}
\]
Repeating the inequality $t-1$ times, we obtain
\[
\mathbb{E}_{\sigma_0, \eta_0}\big[\rho\big(\sigma(t), \eta(t)\big)\big] = \mathbb{E}\Big[\rho\big(\sigma^{\sigma_0}(t), \eta^{\eta_0}(t)\big)\Big] \le \rho(\sigma_0, \eta_0) \cdot \Big(1 - \frac{\gamma}{n}\Big)^t. \tag{3.6}
\]
Now we are ready to state and prove Theorem 3.1, which is the main result in this section. The idea is to use the inequality in Equation (3.1).

Theorem 3.1. Under the strong external infection regime (**), we have the following upper estimate for the mixing time:
\[
t^{(n)}_{\mathrm{mix}}(\epsilon) \le \frac{n}{\gamma} \Big( \log n + \log\big(\tfrac{1}{\epsilon}\big) \Big), \tag{3.7}
\]
where $\gamma = (1 - \kappa)a + (1 - a)\kappa - 2(n-1)\kappa(1 - \kappa)$.

Proof of Theorem 3.1. For the coupled Markov chains $(\sigma^{\sigma_0}(t);\, t \ge 0)$ and $(\eta^{\eta_0}(t);\, t \ge 0)$, we can write that
\[
\sup_{\sigma_0, \eta_0 \in \Omega_n} \mathbb{P}_{\sigma_0, \eta_0}\big(\tau^{(n)}_{\mathrm{couple}} > t\big) = \sup_{\sigma_0, \eta_0 \in \Omega_n} \mathbb{P}_{\sigma_0, \eta_0}\big(\sigma(t) \neq \eta(t)\big) = \sup_{\sigma_0, \eta_0 \in \Omega_n} \mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(t), \eta(t)) \ge 1\big).
\]
By Markov's inequality and the inequality in Equation (3.6), we obtain that
\[
\sup_{\sigma_0, \eta_0 \in \Omega_n} \mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(t), \eta(t)) \ge 1\big) \le \sup_{\sigma_0, \eta_0 \in \Omega_n} \mathbb{E}_{\sigma_0, \eta_0}\big[\rho\big(\sigma(t), \eta(t)\big)\big] \le \sup_{\sigma_0, \eta_0 \in \Omega_n} \rho(\sigma_0, \eta_0) \cdot \Big(1 - \frac{\gamma}{n}\Big)^t \le n \cdot \Big(1 - \frac{\gamma}{n}\Big)^t.
\]
By inequality (3.1), we get the following upper bound on the total variation distance:
\[
d^{(n)}(t) \le n \cdot \Big(1 - \frac{\gamma}{n}\Big)^t \le n \cdot e^{-\frac{t\gamma}{n}}.
\]
It follows that if $t \ge \frac{n}{\gamma}\big( \log n + \log(\tfrac{1}{\epsilon}) \big)$, the total variation distance satisfies $d^{(n)}(t) \le \epsilon$, as desired. This completes the proof. □
Remark 3.1. For $0 < \kappa < \frac{1}{4(n-1)}$ and $a > 1 - \kappa > \frac{1}{2}$, we have $\gamma = a - \frac{1}{2} + o(1)$. In this case, Theorem 3.1 yields the following upper bound:
\[
t^{(n)}_{\mathrm{mix}}(\epsilon) \le \frac{n}{a - 1/2} \cdot \Big( \log n + \log\big(\tfrac{1}{\epsilon}\big) \Big).
\]
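As a quick numerical illustration (ours, not the paper's), the bound (3.7) can be evaluated directly for concrete parameters inside the regime (**):

```python
import math

def gamma(a, kappa, n):
    """The contraction constant of Lemma 3.1 / Theorem 3.1."""
    return (1 - kappa) * a + (1 - a) * kappa - 2 * (n - 1) * kappa * (1 - kappa)

def t_mix_upper(n, a, kappa, eps=0.25):
    """Right-hand side of the upper bound (3.7)."""
    return n / gamma(a, kappa, n) * (math.log(n) + math.log(1 / eps))

# n = 1000 with kappa < 1/(4(n-1)) and a > 1 - kappa, as required by (**)
print(gamma(0.99995, 1e-4, 1000), t_mix_upper(1000, a=0.99995, kappa=1e-4))
```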
4 Lower bound on the mixing time

In this section, we prove a lower bound on the mixing time of the noisy SIS model on graphs. We consider the following strong external infection regime:
\[
0 < \kappa < \frac{1}{4(n-1)^2}, \qquad a > 1 - \frac{1}{n^\alpha} > 1 - \kappa > \frac{1}{2}, \qquad \alpha > 1. \tag{***}
\]

As before, let us consider two starting configurations $\sigma(0) = \sigma_0$ and $\eta(0) = \eta_0$ such that the Hamming distance satisfies $\rho(\sigma(0), \eta(0)) = 1$, with the configurations differing at the vertex $x$. Without loss of generality, we assume $\sigma_0 \le \eta_0$, implying $(\sigma_x(0), \eta_x(0)) = (0, 1)$. We also denote the respective Markov chains initiated at $\sigma_0$ and $\eta_0$ by $(\sigma^{\sigma_0}(t);\, t \ge 0)$ and $(\eta^{\eta_0}(t);\, t \ge 0)$. We first bound the expectation of the one-step update $\mathbb{E}_{\sigma_0, \eta_0}[\rho(\sigma(1), \eta(1))]$ from below:
\[
\begin{aligned}
\mathbb{E}_{\sigma_0, \eta_0}\big[\rho\big(\sigma(1), \eta(1)\big)\big]
&\ge 1 - \mathbb{P}_{\sigma_0, \eta_0}\big(\rho(\sigma(1), \eta(1)) = 0\big) \\
&= 1 - \frac{1}{n} \Big( (1 - \kappa) p(\sigma_0, x) + \big(1 - p(\sigma_0, x)\big) \kappa \Big) \\
&\ge 1 - \frac{1}{n} \cdot \max_{x \in V_n} \Big\{ (1 - \kappa) p(\sigma_0, x) + \big(1 - p(\sigma_0, x)\big) \kappa \Big\}.
\end{aligned}
\]
Since the Hamming distance between $\sigma_0$ and $\eta_0$ is $\rho(\sigma(0), \eta(0)) = 1$, with the configurations differing at the vertex $x$, we compute
\[
\max_{x \in V_n} \Big\{ (1 - \kappa) p(\sigma_0, x) + \big(1 - p(\sigma_0, x)\big) \kappa \Big\} = \max_{x \in V_n} \Big\{ \kappa + p(\sigma_0, x)(1 - 2\kappa) \Big\} \le \kappa + p^{(n)}(a, \lambda)(1 - 2\kappa),
\]
with $p^{(n)}(a, \lambda) := a + \lambda \cdot \Delta^{(n)}_{\max} < 1$. Next, we define $\beta := \kappa + p^{(n)}(a, \lambda)(1 - 2\kappa)$. It follows that
\[
\mathbb{E}_{\sigma_0, \eta_0}\big[\rho\big(\sigma(1), \eta(1)\big)\big] \ge 1 - \frac{\beta}{n}.
\]
Therefore, we conclude that
\[
\mathbb{E}_{\sigma_0, \eta_0}\big[\rho\big(\sigma(t), \eta(t)\big)\big] \ge \rho\big(\sigma(0), \eta(0)\big) \cdot \Big(1 - \frac{\beta}{n}\Big)^t. \tag{4.8}
\]
Having established the inequality in Equation (4.8), we make the following claim.

Claim 4.1. The expected squared Hamming distance at time $t$ has an upper bound of the form
\[
\mathbb{E}_{\sigma_0, \eta_0}\big[\rho^2\big(\sigma(t), \eta(t)\big)\big] \le \rho^2(\sigma_0, \eta_0) \cdot \Big(1 - \frac{2\gamma}{n}\Big)^t + \frac{n}{2\gamma}. \tag{4.9}
\]
Proof of Claim 4.1. To simplify notation, we denote $\rho_t := \rho\big(\sigma^{\sigma_0}(t), \eta^{\eta_0}(t)\big)$. Then, it follows in a straightforward manner that
\[
\begin{aligned}
\mathbb{E}\big[\rho_{t+1}^2 \mid \rho_t\big]
&= \rho_t^2 + 2\rho_t \Big( \mathbb{E}\big[\rho_{t+1} \mid \rho_t\big] - \rho_t \Big) + \mathbb{E}\big[(\rho_{t+1} - \rho_t)^2 \mid \rho_t\big] \\
&\le \rho_t^2 + 2\rho_t \Big( \Big(1 - \frac{\gamma}{n}\Big)\rho_t - \rho_t \Big) + 1 = \rho_t^2 \Big(1 - \frac{2\gamma}{n}\Big) + 1.
\end{aligned}
\]
The inequality follows from Equation (3.5) and the fact that $(\rho_{t+1} - \rho_t)^2 \le 1$. By induction and taking the expectation over $\rho_t$, the claim follows. □

With the estimate of the second moment of $\rho_t$, we can find an upper bound for $\mathrm{Var}(\rho_t)$. Indeed, it follows that
\[
\mathrm{Var}(\rho_t) = \mathbb{E}\big[\rho_t^2\big] - \big(\mathbb{E}[\rho_t]\big)^2 \le \rho_0^2 \Big(1 - \frac{2\gamma}{n}\Big)^t + \frac{n}{2\gamma} - \rho_0^2 \Big(1 - \frac{\beta}{n}\Big)^{2t}
= \rho_0^2 \bigg( \Big(1 - \frac{2\gamma}{n}\Big)^t - \Big(1 - \frac{\beta}{n}\Big)^{2t} \bigg) + \frac{n}{2\gamma}.
\]
The first inequality is due to Equation (4.8) and Claim 4.1. Under the strong external infection regime (***), we have that
\[
1 - \frac{2\gamma}{n} < \Big(1 - \frac{\beta}{n}\Big)^2.
\]
Consequently, it follows that
\[
\mathrm{Var}(\rho_t) \le \frac{n}{2\gamma}.
\]
Now we are ready to state and prove Theorem 4.1, which states that the mixing time of the noisy SIS model on graphs has a lower bound of order $O(n \log n)$.

Theorem 4.1. Under the strong external infection regime (***), we have that
\[
t^{(n)}_{\mathrm{mix}}(\epsilon) \ge \frac{n}{2\beta} \Big( \log n + \log\big(\tfrac{\gamma\epsilon}{4}\big) \Big), \tag{4.10}
\]
where $\beta = \kappa + p^{(n)}(a, \lambda)(1 - 2\kappa)$ and $\gamma = (1 - \kappa)a + (1 - a)\kappa - 2(n-1)\kappa(1 - \kappa)$.

Proof of Theorem 4.1. Recall that
\[
d^{(n)}(t) := \sup_{\eta_0 \in \Omega_n} d_{\mathrm{TV}}(\mu^n_{\eta_0, t}, \mu^n_{a,\lambda,\kappa}).
\]
We consider the event $\big\{ \rho_t \le \sqrt{\tfrac{n}{\gamma\epsilon}} \big\}$. Then, note that
\[
\begin{aligned}
d^{(n)}(t) &\ge d_{\mathrm{TV}}(\mu^n_{\eta_0, t}, \mu^n_{a,\lambda,\kappa}) \\
&\ge \mu^n_{a,\lambda,\kappa}\Big( \rho_t \le \sqrt{\tfrac{n}{\gamma\epsilon}} \Big) - \mu^n_{\eta_0, t}\Big( \rho_t \le \sqrt{\tfrac{n}{\gamma\epsilon}} \Big) \\
&= 1 - \mu^n_{a,\lambda,\kappa}\Big( \rho_t \ge \sqrt{\tfrac{n}{\gamma\epsilon}} \Big) - \mu^n_{\eta_0, t}\Big( \rho_t \le \sqrt{\tfrac{n}{\gamma\epsilon}} \Big).
\end{aligned}
\]
To obtain the lower bound on the mixing time, we need to bound the two terms $\mu^n_{a,\lambda,\kappa}\big( \rho_t \ge \sqrt{n/(\gamma\epsilon)} \big)$ and $\mu^n_{\eta_0, t}\big( \rho_t \le \sqrt{n/(\gamma\epsilon)} \big)$. We do that separately.

For the term $\mu^n_{a,\lambda,\kappa}\big( \rho_t \ge \sqrt{n/(\gamma\epsilon)} \big)$, given that $\mu^n_{a,\lambda,\kappa}$ is the invariant measure, we have that $\mathbb{E}_{\mu^n_{a,\lambda,\kappa}}[\rho_t] = 0$. By Chebyshev's inequality, we obtain that
\[
\mu^n_{a,\lambda,\kappa}\Big( \rho_t \ge \sqrt{\tfrac{n}{\gamma\epsilon}} \Big) = \mu^n_{a,\lambda,\kappa}\Big( \rho_t - \mathbb{E}_{\mu^n_{a,\lambda,\kappa}}[\rho_t] \ge \sqrt{\tfrac{n}{\gamma\epsilon}} \Big) \le \epsilon.
\]
For the term $\mu^n_{\eta_0, t}\big( \rho_t \le \sqrt{n/(\gamma\epsilon)} \big)$, let us assume for now that
\[
\mathbb{E}_{\mu^n_{\eta_0, t}}[\rho_t] \ge \sqrt{\frac{4n}{\gamma\epsilon}}. \tag{4.11}
\]
Under this condition, we have that
\[
\begin{aligned}
\mu^n_{\eta_0, t}\Big( \rho_t \le \sqrt{\tfrac{n}{\gamma\epsilon}} \Big)
&\le \mu^n_{\eta_0, t}\Big( \rho_t \le \sqrt{\tfrac{4n}{\gamma\epsilon}} - \sqrt{\tfrac{n}{2\gamma\epsilon}} \Big) \\
&\le \mu^n_{\eta_0, t}\Big( \rho_t - \mathbb{E}_{\mu^n_{\eta_0, t}}[\rho_t] \le -\sqrt{\tfrac{n}{2\gamma\epsilon}} \Big) \\
&\le \mu^n_{\eta_0, t}\Big( \big|\rho_t - \mathbb{E}_{\mu^n_{\eta_0, t}}[\rho_t]\big| \ge \sqrt{\tfrac{n}{2\gamma\epsilon}} \Big) \\
&\le \epsilon.
\end{aligned}
\]
The last inequality is again by Chebyshev's inequality. Consequently, we conclude that
\[
d^{(n)}(t) \ge 1 - 2\epsilon. \tag{4.12}
\]
Now, let us estimate the timescale at which the condition in Equation (4.11) holds. To that end, we maximize the Hamming distance of the initial condition. Then, it follows that the inequalities
\[
\mathbb{E}_{\mu^n_{\eta_0, t}}[\rho_t] \ge n \Big(1 - \frac{\beta}{n}\Big)^t \ge \sqrt{\frac{4n}{\gamma\epsilon}}
\]
hold whenever
\[
t \le \frac{\log\Big( n \sqrt{\tfrac{\gamma\epsilon}{4n}} \Big)}{-\log\Big(1 - \tfrac{\beta}{n}\Big)}.
\]
Now, when $n$ is large, we obtain that
\[
\frac{\log\Big( n \sqrt{\tfrac{\gamma\epsilon}{4n}} \Big)}{-\log\Big(1 - \tfrac{\beta}{n}\Big)} \approx \frac{n}{\beta} \log\sqrt{\frac{n\gamma\epsilon}{4}} = \frac{n}{2\beta} \log\Big( \frac{n\gamma\epsilon}{4} \Big) = \frac{n}{2\beta} \Big( \log n + \log\big(\tfrac{\gamma\epsilon}{4}\big) \Big).
\]
This implies that, starting from the worst possible initial condition, up to time $\frac{n}{2\beta}\big( \log n + \log(\tfrac{\gamma\epsilon}{4}) \big)$, the total variation distance between $\mu^n_{\eta_0, t}$, the law of the Markov chain at time $t$, and $\mu^n_{a,\lambda,\kappa}$, the invariant measure, is still at least $1 - 2\epsilon$. This proves Theorem 4.1. □
Remark 4.1. Under the strong external infection regime (***), we have $\gamma = 1 - \kappa + o(1)$ and $\beta = 1 - \kappa + o(1)$. Then, Theorem 4.1 yields the following lower estimate:
\[
t^{(n)}_{\mathrm{mix}}(\epsilon) \ge \frac{n}{2(1 - \kappa)} \cdot \Big( \log n + \log\big(\tfrac{(1 - \kappa)\epsilon}{4}\big) \Big) \ge \frac{n}{2} \big(1 - o(1)\big) \log n.
\]
Theorems 3.1 and 4.1 tell us that
\[
\frac{n}{2} \log n \,\big(1 - o(1)\big) \le t^{(n)}_{\mathrm{mix}}(\epsilon) \le 2n \log n \,\big(1 + o(1)\big). \tag{4.13}
\]
This implies that the $\epsilon$-mixing time of the noisy SIS model on graphs, under the strong external infection regime, is of order $O(n \log n)$. In the next section, we study the noisy SIS model on a number of random graph models.
5 Mixing properties on random graphs

As before, we focus on the strong external infection regime (***) with parameters
\[
0 < \kappa < \frac{1}{4(n-1)^2}, \qquad a > 1 - \frac{1}{n^\alpha} > 1 - \kappa > \frac{1}{2}, \qquad \alpha > 1.
\]
For the choice of the strong external infection probability $a > 1 - \frac{1}{n^\alpha}$ with $\alpha > 1$, and $p^{(n)}(a, \lambda) := a + \lambda \cdot \Delta^{(n)}_{\max} < 1$, we require that
\[
\lambda \cdot \Delta^{(n)}_{\max} < \frac{1}{n^\alpha}. \tag{5.14}
\]
We consider three different models of random graphs, namely, Erdős–Rényi graphs, Galton–Watson trees, and regular multigraphs.
5.1 Mixing property on Erdős–Rényi graphs

Let $G(n, p)$ denote an Erdős–Rényi graph on $n$ vertices, where each unordered pair of vertices has an edge independently with probability $p$. We consider the case when $p \gg \frac{1}{n}$. Given a realisation of the Erdős–Rényi graph $G(n, p)$ (with $p \gg \frac{1}{n}$), which we continue to denote by $G(n, p)$, we run the Markov chain with the noisy SIS dynamics described in Section 2.1 on the random graph $G(n, p)$. The following theorem holds.

Theorem 5.1. Let $G(n, p)$ be an Erdős–Rényi graph, with $p \gg \frac{1}{n}$. Under the strong external infection regime (***), if we furthermore assume $\lambda \ll \frac{1}{n^{1+\alpha} p}$, then for any $\epsilon > 0$, for $n$ large enough, we have that
\[
\mathbb{P}\bigg( \frac{n}{2} \log n \,\big(1 - o(1)\big) \le t^{(n)}_{\mathrm{mix}}(\epsilon) \le 2n \log n \,\big(1 + o(1)\big) \bigg) \ge 1 - \epsilon.
\]
Proof of Theorem 5.1. By an application of the Chernoff bound to $\deg(v)$ for $v \in V_n$, we have that
\[
\mathbb{P}\big( |\deg(v) - np| \ge \delta np \big) = \mathbb{P}\big( |\deg(v) - \mathbb{E}[\deg(v)]| \ge \delta np \big) \le 2 \exp\Big( -\frac{\delta^2 np}{3} \Big),
\]
for any $0 < \delta < 1$. It follows that
\[
\mathbb{P}\big( (1 - \delta)np < \deg(v) < (1 + \delta)np \big) \ge 1 - 2 \exp\Big( -\frac{\delta^2 np}{3} \Big). \tag{5.15}
\]
When $p \gg \frac{1}{n}$, we get
\[
\lim_{n \to \infty} \bigg( 1 - 2 \exp\Big( -\frac{\delta^2 np}{3} \Big) \bigg)^n = 1.
\]
This means that for any $\epsilon > 0$, for $n$ large enough, we have
\[
\mathbb{P}\big( \forall v \in V_n : (1 - \delta)np < \deg(v) < (1 + \delta)np \big) \ge 1 - \epsilon.
\]
Therefore, it follows that for any $\epsilon > 0$, for $n$ large enough,
\[
\mathbb{P}\big( (1 - \delta)np < \Delta^{(n)}_{\max} < (1 + \delta)np \big) \ge 1 - \epsilon.
\]
If we furthermore take $\lambda \ll \frac{1}{n^{1+\alpha} p}$, then with high probability (5.14) is satisfied. That is,
\[
\mathbb{P}\Big( \lambda \cdot \Delta^{(n)}_{\max} < \frac{1}{n^\alpha} \Big) \ge 1 - \epsilon,
\]
for any $\epsilon > 0$ and $n$ large enough. Therefore, with high probability, the upper and lower bounds on the $\epsilon$-mixing time hold:
\[
\mathbb{P}\bigg( \frac{n}{2} \log n \,\big(1 - o(1)\big) \le t^{(n)}_{\mathrm{mix}}(\epsilon) \le 2n \log n \,\big(1 + o(1)\big) \bigg) \ge 1 - \epsilon.
\]
This completes the proof. □
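The degree concentration used in the proof is easy to check by simulation. The following sketch is ours: the choice of $p$ is an arbitrary example with $p \gg \frac{1}{n}$, and the factor $0.5$ on $\lambda$ is a safety margin standing in for the $\ll$ in the theorem.

```python
import numpy as np

rng = np.random.default_rng(0)

def max_degree_er(n, p):
    """Maximum degree of a sampled Erdos-Renyi graph G(n, p)."""
    upper = np.triu(rng.random((n, n)) < p, k=1)   # entries i < j are the edges
    deg = upper.sum(axis=0) + upper.sum(axis=1)    # degree = row + column sums
    return deg.max()

n, alpha = 2000, 1.1
p = 0.05                                  # an example with p >> 1/n
d_max = max_degree_er(n, p)
lam = 0.5 / (n ** (1 + alpha) * p)        # lambda well below 1/(n^{1+alpha} p)
print("Delta_max =", d_max, "np =", n * p,
      "(5.14) holds:", lam * d_max < n ** (-alpha))
```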
5.2 Mixing property on regular multigraphs

A multigraph is a graph in which more than one edge may be present between a pair of vertices. A multigraph $G_n = (V_n, E_n)$ is called regular if there exists a constant $d \in \mathbb{Z}_+$ such that for every vertex $x \in V_n$, the degree of $x$ is equal to $d$. That is, $\deg(x) = d$ for all $x \in V_n$.

We will follow the configuration model (CM) to build a regular multigraph. Here we provide a brief description of its construction, but we refer the readers to van der Hofstad (2017) for a more detailed account.

Let $n$ be the number of vertices and $d$, the degree of each vertex in a $d$-regular multigraph. To each vertex assign $d$ half-edges/stubs. A half-edge, as the name suggests, is half of an edge that has not yet been paired with another half-edge. The total number of half-edges is $dn$. Denote the set of half-edges by $H$. At each step, we pick two unpaired half-edges uniformly at random from the collection of all available half-edges and pair them to form an edge. We repeat this process until no half-edges are left unpaired. Denote by $\Omega$ the state space of all possible pairings of half-edges. Each pairing corresponds to an edge in the multigraph. The total number of ways to pair $dn$ half-edges is given by
\[
|\Omega| = \frac{(dn)!}{(dn/2)!\, 2^{dn/2}}.
\]
We assign uniform probability to each possible pairing of half-edges. That is, the probability assigned to each pairing is given by
\[
\frac{(dn/2)!}{\binom{dn}{2} \cdot \binom{dn-2}{2} \cdot \binom{dn-4}{2} \cdots \binom{2}{2}}.
\]
Since in the case of a regular multigraph we allow self-loops, i.e., edges $e = (x, x)$ with $x \in V_n$, as well as multiple edges $e = (x, y)$, $e' = (x, y)$ with $e \neq e'$ and $x, y \in V_n$, a modification of the definition of $n_{I,\sigma}(x)$, the number of infected neighbours of $x$, is required. Consider the following revised definition:
\[
n_{I,\sigma}(x) := |\{ e \in E_n : e = (x, y),\; x \neq y,\; \text{and } \sigma_y = 1 \}|.
\]
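The pairing scheme above is straightforward to implement. Here is a minimal sketch (our own): shuffling the list of stubs and pairing consecutive entries produces a uniformly random matching of the $dn$ half-edges.

```python
import random

def regular_multigraph(n, d):
    """Configuration-model pairing: edge list of a random d-regular multigraph
    on vertices 0..n-1 (self-loops and multiple edges kept); d*n must be even."""
    stubs = [v for v in range(n) for _ in range(d)]   # d half-edges per vertex
    random.shuffle(stubs)                             # uniform pairing of stubs
    return [(stubs[i], stubs[i + 1]) for i in range(0, n * d, 2)]

edges = regular_multigraph(n=1000, d=3)
loops = sum(1 for (x, y) in edges if x == y)
print(loops)   # concentrates near (d - 1)/2, cf. Lemma 5.1 below
```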
We are now ready to state and prove the bounds on the mixing time of the noisy SIS model on the random $d$-regular multigraph.

Theorem 5.2. Let $G_n$ be a random $d$-regular multigraph, with $d$ fixed and independent of $n$. Under the strong external infection regime (***), if we furthermore assume $\lambda \ll \frac{1}{d n^\alpha}$, then for any $\epsilon > 0$, for $n$ large enough, we have that
\[
\mathbb{P}\bigg( \frac{n}{2} \log n \,\big(1 - o(1)\big) \le t^{(n)}_{\mathrm{mix}}(\epsilon) \le 2n \log n \,\big(1 + o(1)\big) \bigg) \ge 1 - \epsilon.
\]
Before proving Theorem 5.2, we first analyze the concentration of the number of self-loops in a $d$-regular multigraph. A self-loop is a pairing (edge) in which two half-edges at the same vertex are chosen during the random pairing process. We aim to show that the number of self-loops concentrates around its expected value Durrett (2007); van der Hofstad (2017) using McDiarmid's inequality. To this end, let $S$ be the total number of self-loops in the $d$-regular multigraph. This can be written as the sum of indicator random variables $I_{st,j}$ with $1 \le s < t \le d$ and $1 \le j \le n$:
\[
S = \sum_{j=1}^{n} \sum_{1 \le s < t \le d} I_{st,j},
\]
where $n$ denotes the total number of vertices, and $I_{st,j}$ is an indicator random variable representing the pairing of half-edge $s$ with half-edge $t$, where $s$ and $t$ are half-edges associated with vertex $j$. That is,
\[
I_{st,j} = \begin{cases} 1, & \text{if } s \text{ and } t \text{ associated with vertex } j \text{ form a self-loop}, \\ 0, & \text{otherwise}. \end{cases}
\]
The following lemma is then immediate.

Lemma 5.1. For a random $d$-regular multigraph $G_n$ and any $\delta > 0$, we have
\[
\mathbb{P}\Big( \Big| S - \frac{d-1}{2} \Big| \ge \delta \Big) \le 2 \exp\Big( -\frac{2\delta^2}{n} \Big).
\]
Proof of Lemma 5.1. At each pairing, the probability of forming a self-loop with half-edges $s$ and $t$ associated with vertex $j$ in a single pairing is proportional to the number of ways to pair two half-edges:
\[
\mathbb{P}(s \text{ and } t \text{ form a self-loop}) = \frac{1}{nd - 1}.
\]
This probability does not depend on $s$ and $t$. The expected number of self-loops $\mathbb{E}[S]$ is
\[
\mathbb{E}[S] = n \cdot \binom{d}{2} \cdot \mathbb{P}(1 \text{ and } 2 \text{ form a self-loop}) = n \cdot \frac{d(d-1)}{2} \cdot \frac{1}{nd - 1} \approx \frac{d-1}{2}.
\]
See (Durrett, 2007, Theorem 3.1.2). Then, by McDiarmid's inequality, we have
\[
\mathbb{P}\Big( \Big| S - \frac{d-1}{2} \Big| \ge \delta \Big) \le 2 \exp\Big( -\frac{2\delta^2}{n} \Big),
\]
for any $\delta > 0$. □
Now, we are prepared to prove Theorem 5.2.

Proof of Theorem 5.2. By Lemma 5.1, with high probability, the total number of self-loops is approximately $\frac{d-1}{2}$, and the self-loops are uniformly distributed among the vertices. This implies that, with high probability, $\Delta^{(n)}_{\max}$ is of order $d$. This means that for any $\epsilon > 0$, for $n$ large enough,
\[
\mathbb{P}\big( (1 - \delta) d < \Delta^{(n)}_{\max} < (1 + \delta) d \big) \ge 1 - \epsilon.
\]
In addition, if we take $\lambda \ll \frac{1}{d n^\alpha}$, then with high probability (5.14) is satisfied. That is,
\[
\mathbb{P}\Big( \lambda \cdot \Delta^{(n)}_{\max} < \frac{1}{n^\alpha} \Big) \ge 1 - \epsilon,
\]
for any $\epsilon > 0$ and $n$ large enough. Therefore, with high probability, we find the following upper and lower bounds on the $\epsilon$-mixing time:
\[
\mathbb{P}\bigg( \frac{n}{2} \log n \,\big(1 - o(1)\big) \le t^{(n)}_{\mathrm{mix}}(\epsilon) \le 2n \log n \,\big(1 + o(1)\big) \bigg) \ge 1 - \epsilon.
\]
□
5.3 Mixing property on Galton–Watson trees

Let $\{Z_n\}_{n \ge 0}$ be a Galton–Watson branching process, where $Z_n$ denotes the number of individuals in the $n$-th generation. The process evolves according to the following rules: We start with a single individual at generation 0, i.e., $Z_0 = 1$. Each individual in generation $n$ independently produces offspring according to a random variable $X$, where $X$ takes values in $\mathbb{N}_0$ (the non-negative integers) with probability mass function $\{p_k\}_{k \ge 0}$, i.e.,
\[
p_k = \mathbb{P}(X = k), \quad k = 0, 1, 2, \ldots
\]
Given that generation $n$ consists of $Z_n$ individuals, the number of individuals in generation $n+1$, denoted $Z_{n+1}$, is the sum of $Z_n$ independent random variables $X_1, X_2, \ldots, X_{Z_n}$, where each $X_i$ has the same distribution as $X$. Hence, for $n \ge 0$,
\[
Z_{n+1} = \sum_{i=1}^{Z_n} X_i,
\]
where $X_1, X_2, \ldots$ are i.i.d. random variables with the same distribution as $X$.

We denote the Galton–Watson tree by $T_n := (V_n, E_n)$. Thus, the Galton–Watson tree is a random rooted tree where the offspring of each individual is determined by the random variable $X$, and the total number of individuals in each generation depends on the offspring distribution of all individuals in the previous generation. We consider two cases: a) the random variable $X$ has a binomial distribution; b) the random variable $X$ has a Poisson distribution. A simulation sketch of the tree construction follows below.
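Here is a generic breadth-first simulation of the tree. This is our own sketch: the truncation at `max_vertices` is a safety device, not part of the model, and the sample offspring parameters are arbitrary subcritical choices.

```python
from collections import deque
import numpy as np

rng = np.random.default_rng(1)

def galton_watson_tree(offspring, max_vertices=10**5):
    """Grow a Galton-Watson tree breadth-first; offspring() samples X ~ {p_k}.
    Returns children lists encoding the rooted tree, truncated for safety."""
    children, queue = [[]], deque([0])
    while queue and len(children) < max_vertices:
        v = queue.popleft()
        for _ in range(offspring()):
            children.append([])                       # new vertex
            children[v].append(len(children) - 1)     # attach it to its parent v
            queue.append(len(children) - 1)
    return children

# Case a): Binomial offspring; case b): Poisson offspring (both subcritical here)
tree_bin = galton_watson_tree(lambda: int(rng.binomial(20, 0.04)))
tree_poi = galton_watson_tree(lambda: int(rng.poisson(0.8)))
print(len(tree_bin), len(tree_poi))
```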
5.3.1 Binomial offspring

Suppose $X \sim \mathrm{Bin}(n, p)$ and $p \gg \frac{1}{n}$. We run the Markov chain with the noisy SIS dynamics described in Section 2.1 on the random tree $T_n$. Consequently, the mixing time, with high probability, is of order $O(n \log n)$ as well.

Theorem 5.3. Let $T_n$ be a Galton–Watson tree with offspring distributed according to a binomial law $\mathrm{Bin}(n, p)$, with $p \gg \frac{1}{n}$. Under the strong external infection regime (***), if we furthermore assume $\lambda \ll \frac{1}{n^{1+\alpha} p}$, then for any $\epsilon > 0$, for $n$ large enough, we have that
\[
\mathbb{P}\bigg( \frac{n}{2} \log n \,\big(1 - o(1)\big) \le t^{(n)}_{\mathrm{mix}}(\epsilon) \le 2n \log n \,\big(1 + o(1)\big) \bigg) \ge 1 - \epsilon.
\]
Proof of Theorem 5.3. We begin by providing a Chernoff bound for the concentration of $\deg(x)$ for $x \in V_n$. Notice that $\mathbb{E}[X] = np$; thus $\mathbb{E}[\deg(x)] = \mathbb{E}[X + 1] = np + 1 = O(np)$. In the same way as in Section 5.1, by Chernoff's inequality, we have
\[
\mathbb{P}\big( |\deg(x) - np| \ge \delta np \big) = \mathbb{P}\big( |\deg(x) - \mathbb{E}[\deg(x)]| \ge \delta np \big) \le 2 \exp\Big( -\frac{\delta^2 np}{3} \Big),
\]
for any $0 < \delta < 1$. This implies that
\[
\mathbb{P}\big( (1 - \delta)np < \deg(x) < (1 + \delta)np \big) \ge 1 - 2 \exp\Big( -\frac{\delta^2 np}{3} \Big).
\]
In the case when $p \gg \frac{1}{n}$, we obtain
\[
\lim_{n \to \infty} \bigg( 1 - 2 \exp\Big( -\frac{\delta^2 np}{3} \Big) \bigg)^n = 1.
\]
Thus, for any $\epsilon > 0$, there exists a sufficiently large $n$ such that
\[
\mathbb{P}\big( \forall x \in V_n : (1 - \delta)np < \deg(x) < (1 + \delta)np \big) \ge 1 - \epsilon.
\]
Therefore, it follows that for any $\epsilon > 0$, for large enough $n$,
\[
\mathbb{P}\big( (1 - \delta)np < \Delta^{(n)}_{\max} < (1 + \delta)np \big) \ge 1 - \epsilon.
\]
Furthermore, if we take $\lambda \ll \frac{1}{n^{1+\alpha} p}$, then with high probability (5.14) is satisfied. This implies that
\[
\mathbb{P}\Big( \lambda \cdot \Delta^{(n)}_{\max} < \frac{1}{n^\alpha} \Big) \ge 1 - \epsilon,
\]
for any $\epsilon > 0$ and $n$ large enough. Consequently, we have
\[
\mathbb{P}\bigg( \frac{n}{2} \log n \,\big(1 - o(1)\big) \le t^{(n)}_{\mathrm{mix}}(\epsilon) \le 2n \log n \,\big(1 + o(1)\big) \bigg) \ge 1 - \epsilon.
\]
□
5.3.2 Poisson offspring

Let us now assume $X \sim \mathrm{Poi}(\theta)$ for some $\theta > 0$. We run the Markov chain with the noisy SIS dynamics described in Section 2.1 on the random tree $T_n$. The following theorem describes the behaviour of the mixing time.

Theorem 5.4. Let $T_n$ be a Galton–Watson tree with offspring distributed according to a Poisson law $\mathrm{Poi}(\theta)$, with $\theta > 0$. Under the strong external infection regime (***), if we furthermore assume $\lambda \ll \frac{\log \log n}{n^\alpha \log n}$, then for any $\epsilon > 0$ and $n$ large enough, we have that
\[
\mathbb{P}\bigg( \frac{n}{2} \log n \,\big(1 - o(1)\big) \le t^{(n)}_{\mathrm{mix}}(\epsilon) \le 2n \log n \,\big(1 + o(1)\big) \bigg) \ge 1 - \epsilon.
\]
Proof of Theorem 5.4. Let $X_1, X_2, \ldots, X_n$ be independent random variables with $X_i \sim \mathrm{Poi}(\theta)$. Instead of using the Chernoff bound as in the previous sections, we use the asymptotic behaviour of the maximum of a set of independent, identically distributed Poisson random variables given in Kimber (1983). As shown in Kimber (1983), for any $\epsilon > 0$, there exists an $n$ large enough such that
\[
\mathbb{P}\bigg( \max_{1 \le i \le n} X_i \le \frac{\log n}{\log \log n} \bigg) \ge 1 - \epsilon.
\]
It follows that for any $\epsilon > 0$ and $n$ large enough, we have
\[
\mathbb{P}\bigg( \Delta^{(n)}_{\max} \le \frac{\log n}{\log \log n} \bigg) \ge 1 - \epsilon.
\]
Now if we take $\lambda \ll \frac{\log \log n}{n^\alpha \log n}$, then with high probability (5.14) is satisfied. This can be restated as
\[
\mathbb{P}\Big( \lambda \cdot \Delta^{(n)}_{\max} < \frac{1}{n^\alpha} \Big) \ge 1 - \epsilon,
\]
for any $\epsilon > 0$ and $n$ large enough. As a result, with high probability, the upper and lower bounds on the $\epsilon$-mixing time hold:
\[
\mathbb{P}\bigg( \frac{n}{2} \log n \,\big(1 - o(1)\big) \le t^{(n)}_{\mathrm{mix}}(\epsilon) \le 2n \log n \,\big(1 + o(1)\big) \bigg) \ge 1 - \epsilon.
\]
□
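As a quick sanity check on the $\frac{\log n}{\log \log n}$ rate (our own numerical experiment, not from Kimber (1983)), one can compare the sample maximum of $n$ i.i.d. Poisson variables with this benchmark:

```python
import numpy as np

rng = np.random.default_rng(2)
theta = 0.8
for n in (10**3, 10**5, 10**7):
    sample_max = rng.poisson(theta, size=n).max()
    print(n, sample_max, np.log(n) / np.log(np.log(n)))  # benchmark log n / log log n
```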
Statements and Declarations
Competing Interests
The authors declare that they have no competing interests.
Ethics approval and consent to participate
Not applicable.
Consent for publication
Not applicable.
Data availability
Not applicable.
Materials availability
Not applicable.
Code availability
Not applicable.
Author contribution
Both authors contributed equally to the work.
Funding
The work was supported by the Engineering and Physical Sciences Research Council
(EPSRC) under grant number EP/Y027795/1.
References

Andersson, H., Britton, T.: Stochastic Epidemic Models and Their Statistical Analysis. Lecture Notes in Statistics, vol. 151. Springer (2000)

Aldous, D., Diaconis, P.: Shuffling cards and stopping times. Am. Math. Mon. 93, 333–348 (1986). https://doi.org/10.2307/2323590

Aljovin, E., Jara, M., Xiang, Y.: Thermalization and convergence to equilibrium of the noisy voter model. ArXiv preprint: 2409.05722 (2024)

Ball, F.: Central limit theorems for SIR epidemics and percolation on configuration model random graphs. Ann. Appl. Probab. 31(5), 2091–2142 (2021). https://doi.org/10.1214/20-aap1642

Bandyopadhyay, A., Sajadi, F.: On the expected total number of infections for virus spread on a finite network. Ann. Appl. Probab. 25(2), 663–674 (2015). https://doi.org/10.1214/14-AAP1007

Cui, K., KhudaBukhsh, W.R., Koeppl, H.: Motif-based mean-field approximation of interacting particles on clustered networks. Phys. Rev. E 105(4), 042301 (2022). https://doi.org/10.1103/physreve.105.l042301

Cox, J.T., Peres, Y., Steif, J.E.: Cutoff for the noisy voter model. Ann. Appl. Probab. 26(2), 917–932 (2016). https://doi.org/10.1214/15-AAP1108

Decreusefond, L., Dhersin, J.-S., Moyal, P., Tran, V.C.: Large graph limit for an SIR process in random network with heterogeneous connectivity. Ann. Appl. Probab. 22(2), 541–575 (2012). https://doi.org/10.1214/11-AAP773

Durrett, R.: Random Graph Dynamics. Cambridge Series in Statistical and Probabilistic Mathematics, vol. 20. Cambridge University Press, Cambridge (2007)

He, R., Luczak, M., Ross, N.: Cutoff for the logistic SIS epidemic model with self-infection. ArXiv preprint: 2407.18446 (2024)

Hermon, J., Sly, A., Sousi, P.: Universality of cutoff for graphs with an added random matching. Ann. Probab. 50(1), 203–240 (2022). https://doi.org/10.1214/21-aop1532

KhudaBukhsh, W.R., Auddy, A., Disser, Y., Koeppl, H.: Approximate lumpability for Markovian agent-based models using local symmetries. J. Appl. Probab. 56(3), 647–671 (2019). https://doi.org/10.1017/jpr.2019.44

Kimber, A.C.: A note on Poisson maxima. Z. Wahrsch. Verw. Gebiete 63(4), 551–552 (1983). https://doi.org/10.1007/BF00533727

Kipnis, C., Landim, C.: Scaling Limits of Interacting Particle Systems. Springer (1999). https://doi.org/10.1007/978-3-662-03752-2

Kiss, I.Z., Miller, J.C., Simon, P.L.: Mathematics of Epidemics on Networks: From Exact to Approximate Models. Interdisciplinary Applied Mathematics, vol. 46. Springer (2017). https://doi.org/10.1007/978-3-319-50806-1

KhudaBukhsh, W.R., Woroszylo, C., Rempała, G.A., Koeppl, H.: A functional central limit theorem for SI processes on configuration model graphs. Adv. Appl. Probab. 54(3), 880–912 (2022). https://doi.org/10.1017/apr.2022.52

Liggett, T.M.: Interacting Particle Systems. Grundlehren der mathematischen Wissenschaften, vol. 276. Springer (1985). https://doi.org/10.1007/978-1-4613-8542-4

Levin, D.A., Peres, Y.: Markov Chains and Mixing Times, 2nd edn. American Mathematical Society, Providence, RI (2017). With contributions by Elizabeth L. Wilmer, and a chapter on "Coupling from the past" by James G. Propp and David B. Wilson. https://doi.org/10.1090/mbk/107

Norris, J.R.: Markov Chains. Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press (1997). https://doi.org/10.1017/cbo9780511810633

Sharker, Y., Diallo, Z., KhudaBukhsh, W.R., Kenah, E.: Pairwise accelerated failure time regression models for infectious disease transmission in close-contact groups with external sources of infection. Stat. Med. 43(27), 5138–5154 (2024)

Sousi, P., Thomas, S.: Cutoff for random walk on dynamical Erdős–Rényi graph. Ann. Inst. Henri Poincaré Probab. Stat. 56(4), 2745–2773 (2020). https://doi.org/10.1214/20-AIHP1057

van der Hofstad, R.: Random Graphs and Complex Networks, Vol. 1. Cambridge Series in Statistical and Probabilistic Mathematics, vol. 43. Cambridge University Press, Cambridge (2017). https://doi.org/10.1017/9781316779422

Wilson, D.B.: Mixing times of Lozenge tiling and card shuffling Markov chains. Ann. Appl. Probab. 14(1), 274–325 (2004). https://doi.org/10.1214/aoap/1075828054