
# A brief overview of pseudo-random number generators and testing of our own simple generator


## Abstract

Most random numbers used in computer programs are pseudorandom, which means they are generated in a predictable fashion using a mathematical formula. This is acceptable for many purposes, sometimes even desirable. In this paper we take a look at a few popular generators producing pseudorandom integers from a uniform distribution. Then we use such a generator to implement a generator producing numbers from the interval ]0, 1[ and, on its basis, generators of numbers from the Bernoulli, binomial, Poisson, exponential and normal distributions.
Jakub Łukasiewicz
https://orcid.org/0000-0002-4938-504X
May 3, 2022
Keywords: overview, pseudo, random, number, generator, testing
Contents

1 Introduction
2 Definition
  2.1 Mathematical definition (L'Ecuyer)
3 Examples of existing PRNG
  3.1 Middle-square method
  3.2 Linear congruential generator (LCG)
    3.2.1 Lehmer random number generator
  3.3 Lagged Fibonacci generator (LFG)
  3.4 Linear-feedback shift register (LFSR)
    3.4.1 Generalized feedback shift register (GFSR)
  3.5 ACORN
  3.6 Mersenne Twister (MT)
  3.7 Xorshift
4 Own generator
5 Distributions
  5.1 Uniform
  5.2 Bernoulli
  5.3 Binomial
  5.4 Poisson
  5.5 Exponential
  5.6 Normal
6 Implementation in C++11
7 Tests
  7.1 Diehard
  7.2 ENT
  7.3 Visual test
8 Conclusions
References
1 Introduction

Random number generators (RNGs) are needed for practically all kinds of computer applications, such as simulation of stochastic systems, numerical analysis, probabilistic algorithms, secure communications, computer games, and gambling machines, to name a few. [12]

One way of achieving randomness is by using entropy from the "outside" world. For example, in the 1940s one could get a large deck of punched cards filled with random sampling digits. Those cards could be placed in the data section of a program. [10]

However, for many use cases reading random numbers from external storage devices was too slow, and the size of main memory was much too limited to store large tables of random digits. Thus two types of solutions emerged to produce random numbers on the fly, in real time:

- using a fast physical device that produces/collects random noise
- a purely deterministic algorithm producing a sequence imitating randomness.
2 Denition
Pseudorandom number generator 1(PRNG) – a deterministic algorithm that has one or
more inputs called ”seeds”, and it outputs a sequence of values that appears to be random. [8]
2.1 Mathematical definition (L'Ecuyer)

A generator is a structure G = (S, s_0, T, U, G), where S is a finite set of states, s_0 ∈ S is the initial state (seed), T : S → S is the transition function, U is a finite set of output symbols and G : S → U is the output function. A generator operates as follows: start from the seed s_0 and let u_0 := G(s_0). Then, for i := 1, 2, ... let s_i = T(s_{i-1}) and u_i = G(s_i) ∈ U. We assume that efficient procedures are available to compute T and G. The sequence {u_i} is the output of the generator and its elements are called the observations. For pseudorandom number generators, one would expect the observations to behave from the outside as if they were the values of independent and identically distributed random variables, uniformly distributed over U. The set U is often a set of integers of the form {0, ..., m - 1} or a finite set of values between 0 and 1 to approximate the U(0, 1) distribution. [11]
Period and transient

Since S is finite, the sequence of states is ultimately periodic. The period is the smallest positive integer ρ such that s_{ρ+n} = s_n for some integer τ ≥ 0 and for all n ≥ τ. The smallest τ with this property is called the transient. When τ = 0, the sequence is said to be purely periodic. [11]
3 Examples of existing PRNG

3.1 Middle-square method

The method was invented by John von Neumann, and was described at a conference in 1949. [21] To generate a sequence of n-digit pseudorandom numbers, an n-digit seed is created and squared, producing a 2n-digit number. If the result has fewer than 2n digits, leading zeroes are added to compensate. The middle n digits of the result are taken as the next number in the sequence. This process is then repeated to generate more numbers. [21, 27]
3.2 Linear congruential generator (LCG)

By far the most popular random number generators in use today are special cases of the following scheme, introduced by D. H. Lehmer in 1949. [14] As we read in [9], to create an LCG we need four integers:

- the modulus m (0 < m)
- the multiplier a (0 ≤ a < m)
- the increment c (0 ≤ c < m)
- the seed X_0 (0 ≤ X_0 < m)

The desired sequence of random numbers is then obtained by setting:

X_{n+1} = (a · X_n + c) mod m    (1)
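A minimal sketch of scheme (1); the constants a = 1664525, c = 1013904223, m = 2^32 are the well-known "Numerical Recipes" parameters, an illustrative choice not taken from this paper:

```cpp
#include <cstdint>

// LCG per eq. (1): X_{n+1} = (a*X_n + c) mod m, with m = 2^32
// implicit in the wrap-around of uint32_t arithmetic.
struct LCG {
    uint32_t state;
    explicit LCG(uint32_t seed) : state(seed) {}
    uint32_t next() {
        state = 1664525u * state + 1013904223u;
        return state;
    }
};
```

With seed X_0 = 0 the first output is simply c = 1013904223, which illustrates why c = 0 requires a non-zero seed.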
3.2.1 Lehmer random number generator

The special case of (1) with c = 0 deserves explicit mention, since it is Lehmer's original method (and the number generation process is a little faster [9]):

X_{k+1} = a · X_k mod m    (2)

The terms multiplicative congruential method and mixed congruential method are used by many authors to denote linear congruential sequences with c = 0 and c ≠ 0 respectively.
3.3 Lagged Fibonacci generator (LFG)

Fibonacci generators are a class of random number generators aimed at being an improvement on the "standard" linear congruential generator. They are based on a generalisation of the Fibonacci sequence, hence the formula:

X_n = (X_{n-1} + X_{n-2}) mod m,    n ≥ 2

The Fibonacci generator has good quality compared to other linear generators, but requires more computation. Its disadvantage is high correlation between the elements of the sequence: the sequences satisfy the decomposition condition but do not satisfy the independence condition. This disadvantage can be eliminated by generalizing the formula to a form called the lagged Fibonacci generator (LFG):

X_n = (X_{n-p} ⋄ X_{n-q}) mod m,    n ≥ p > q ≥ 1    (3)

where ⋄ is some binary operator (e.g. addition, subtraction, XOR). [6]
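A sketch of an additive LFG, taking the operator in (3) to be addition with m = 2^32; the lags (p, q) = (55, 24) are a classic choice from the literature, and the small LCG used to fill the initial lag table is an illustrative detail — neither is taken from this paper:

```cpp
#include <array>
#include <cstddef>
#include <cstdint>

// Additive lagged Fibonacci generator: X_n = (X_{n-55} + X_{n-24}) mod 2^32.
// The last 55 values are kept in a ring buffer.
class LaggedFib {
    std::array<uint32_t, 55> X{};
    std::size_t n = 0; // how many values have been generated so far
public:
    explicit LaggedFib(uint32_t seed) {
        for (auto& x : X) { // seed the lag table with a simple LCG
            seed = 1664525u * seed + 1013904223u;
            x = seed;
        }
    }
    uint32_t next() {
        std::size_t p = n % 55;             // slot holding X_{n-55}
        std::size_t q = (n + 55 - 24) % 55; // slot holding X_{n-24}
        X[p] += X[q];                       // unsigned overflow gives mod 2^32 for free
        ++n;
        return X[p];
    }
};
```

The new value overwrites the oldest slot, so the buffer always holds the most recent 55 outputs.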
3.4 Linear-feedback shift register (LFSR)

LFSR is a shift register whose input bit is a linear function of its previous state. Robert C. Tausworthe in 1965 defined [10, 31] the LFSR generator via:

X_n = (a_1 X_{n-1} + ··· + a_k X_{n-k}) mod 2

u_i = Σ_{l=1}^{w} X_{is+l-1} · 2^{-l}    (4)

where a_1, ..., a_k ∈ F_2, a_k = 1 (F_2 is the Galois field of two elements) and w, s are positive integers. It takes a block of w successive bits every s steps of the linear recurrence and constructs the output u_i from that.
3.4.1 Generalized feedback shift register (GFSR)

The GFSR generator [15] is a widely used pseudorandom number generator based on the linear recurring equation:

X_{l+n} = X_{l+m} ⊕ X_l,    l ≥ 0    (5)

where each X_l is a word of size w with components 0 or 1, and ⊕ denotes the bitwise exclusive-or operation. [18]
3.5 ACORN

The Additive Congruential Random Number (ACORN) generator, introduced by R. S. Wikramaratna [35], was originally designed for use in geostatistical and geophysical Monte Carlo simulations, and later extended for use on parallel computers. [36]

We define [36] the kth order ACORN generator X^k_j recursively from a seed X^0_0 (where 0 < X^0_0 < M, with M the modulus) and a set of k initial values X^m_0 (where m = 1, ..., k and 0 ≤ X^m_0 < M) by:

X^0_n = X^0_{n-1},    n ≥ 1
X^m_n = (X^{m-1}_n + X^m_{n-1}) mod M,    n ≥ 1, m = 1, ..., k    (6)

The interested reader is encouraged to visit the official website [36].
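A sketch of recursion (6); the modulus M = 2^30, the order k and the seed handling are illustrative choices (Wikramaratna's papers discuss suitable parameter ranges), not values prescribed here:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// k-th order ACORN per eq. (6). X[0] stays fixed (X^0_n = X^0_{n-1});
// updating X[1..k] in ascending order means X[m-1] already holds the
// new X^{m-1}_n when X[m] is computed.
class Acorn {
    std::vector<uint64_t> X;
    static constexpr uint64_t M = 1ull << 30;
public:
    Acorn(uint64_t seed, std::size_t k) : X(k + 1, 0) {
        X[0] = (seed % (M - 1)) | 1; // ensures 0 < X^0_0 < M; an odd seed helps the period
    }
    double next() { // returns X^k_n / M, a value in [0, 1)
        for (std::size_t m = 1; m < X.size(); ++m)
            X[m] = (X[m] + X[m - 1]) % M;
        return static_cast<double>(X.back()) / static_cast<double>(M);
    }
};
```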
3.6 Mersenne Twister (MT)

It is by far the most widely used general-purpose PRNG. Its name derives from the fact that its period length is chosen to be a Mersenne prime. MT was developed in 1997 by Makoto Matsumoto and Takuji Nishimura as a new variant of the twisted GFSR (TGFSR) [19].

The most commonly used version of the Mersenne Twister algorithm is based on the Mersenne prime 2^19937 - 1. The standard implementation of it, MT19937, uses a 32-bit word length. Due to its size, the mathematical definition is omitted here. The interested reader is encouraged to read the original paper [20] and to visit the official website [17].
3.7 Xorshift

Xorshift RNGs are a class of PRNGs discovered by George Marsaglia. [16] They are a subset of LFSRs which allow a particularly efficient implementation in software without using excessively sparse polynomials. They generate the next number in their sequence by repeatedly taking the exclusive or of a number with a bit-shifted version of itself.

The original paper [16] does not contain a straightforward mathematical definition; the interested reader is encouraged to also read [22] and [2]. In place of a mathematical definition, an example based on the implementation provided in the original paper [16] is presented (with shift amounts a, b, c = 13, 17, 5):

```cpp
uint32_t xorshift32() {
    static uint32_t x = 2463534242;
    x ^= (x << 13);
    x ^= (x >> 17);
    return (x ^= (x << 5));
}
```
4 Own generator

Without putting much thought into it, let us make our generator a combination of LCG, LFG and Xorshift. First let us combine (1) and (3) into:

X_n = (a(X_{n-p} ⋄ X_{n-q}) + c) mod m,    n ≥ p > q ≥ 1    (7)

Now we need to handle the ⋄. Let it be + for X_{n-q} even and - for X_{n-q} odd. Thus (7) transforms into:

X_n = (a(X_{n-p} + X_{n-q}) + c) mod m    if 2 | X_{n-q}
X_n = (a(X_{n-p} - X_{n-q}) + c) mod m    if 2 ∤ X_{n-q}
for n ≥ p > q ≥ 1    (8)

Let us use Marsaglia's favourite Xorshift values in (8) too (with m = 2^b), thus:

X_n = (13(X_{n-p} + X_{n-q}) + 5) mod 2^17    if 2 | X_{n-q}
X_n = (13(X_{n-p} - X_{n-q}) + 5) mod 2^17    if 2 ∤ X_{n-q}
for n ≥ p > q ≥ 1    (9)

The remaining variables are p and q. Let p = 7 and q = 3, thus the pre-final formula is:

X_n = (13(X_{n-7} + X_{n-3}) + 5) mod 2^17    if 2 | X_{n-3}
X_n = (13(X_{n-7} - X_{n-3}) + 5) mod 2^17    if 2 ∤ X_{n-3}
for n ≥ 7    (10)

The initial values will be generated by Xorshift with a "twist" applied to the seed: s ← s + (s mod 1000) · b. In the end the generator looks like:

X_n = Xorshift(s + (s mod 1000) · b)    for n < 7
X_n = (13(X_{n-7} + X_{n-3}) + 5) mod 2^17    if 2 | X_{n-3}, n ≥ 7
X_n = (13(X_{n-7} - X_{n-3}) + 5) mod 2^17    if 2 ∤ X_{n-3}, n ≥ 7    (11)
5 Distributions

5.1 Uniform

As the operation mod m is used in our generator, none of the random numbers X will be greater than or equal to m. Thus to obtain a uniform distribution on the interval ]0, 1[ we just need to divide the result of generator (11) by the value m. [9]

U_n = X_n / m    (12)

[Histograms of generated samples for N = 100, 1000, 100000]
5.2 Bernoulli

The Bernoulli distribution is a discrete probability distribution of a random variable which takes the value 1 with probability p and the value 0 with probability 1 - p. We will use for this the previously defined uniform distribution (12):

B_n(p) = 0    if U_n > p
B_n(p) = 1    if U_n ≤ p    (13)

[Histograms of generated samples for N = 100, 1000, 100000, with p = 0.6]
5.3 Binomial

The binomial distribution with parameters p, K is the discrete probability distribution of the number of successes in a sequence of K Bernoulli trials (13). [26]

B_n(p, K) = Σ_{i=1}^{K} B_i(p)    (14)

[Histograms of generated samples for N = 100, 1000, 100000, with p = 0.5, K = 10]
5.4 Poisson

Knuth's algorithm [5, 9]:

L ← e^{-λ}, k ← 0, p ← 1
repeat
    k ← k + 1
    p ← p · u()    { where u() returns a uniform random number in ]0, 1[ }
until p ≤ L
return k - 1
[Histograms of generated samples for N = 100, 1000, 100000]
5.5 Exponential

To generate an exponentially distributed number we will yet again use a uniform one [28, 4]:

E_n(λ) = -ln(1 - U_n) / λ    (15)

[Histograms of generated samples for N = 100, 1000, 100000]
5.6 Normal

One of the commonly used methods to generate Gaussian-distributed numbers from a regular random generator is the Box–Muller transform. The method uses two independent random numbers U and V distributed uniformly on ]0, 1[. Then the two random variables Y_1 and Y_2 are:

Y_1 = √(-2 ln U) · cos(2πV),    Y_2 = √(-2 ln U) · sin(2πV)    (16)

One is returned and the other saved for the next request for a random number. [30]

[Histograms of generated samples for N = 100, 1000, 100000]
6 Implementation in C++11
Combining the information from sections 4 and 5, we can write an implementation of our PRNG in C++11:

```cpp
#include <cstdint>
#include <cstddef>
#include <cmath>
#include <cfloat>

class PRNG {
    using base_type = uint32_t;
    base_type X[7]; // for convenience, 'p' is defined here as the array size!
    std::size_t N;
    base_type a, b, c, m, p, q;
    double nextNormal;
public:
    PRNG(base_type seed = 2463534242) :
        N(sizeof(X)/sizeof(X[0])),
        a(13), b(20), c(5), m(1 << b), p(N), q(3),
        nextNormal(0)
    {
        seed += (seed % 1000) * b; // "twist"
        for (auto& x : X) { // Xorshift
            seed ^= (seed << a);
            seed ^= (seed >> b);
            x = (seed ^= (seed << c));
        }
    }

    base_type operator()()
    {
        auto P = (N - p) % p; // lagged elements per eq. (11)
        auto Q = (N - q) % p;
        if (X[Q] % 2 == 0) {
            X[P] = (a*(X[P] + X[Q]) + c) % m; // even X[Q]: add, per eq. (8)
        } else {
            X[P] = (a*(X[P] - X[Q]) + c) % m; // odd X[Q]: subtract, per eq. (8)
        }
        return X[P];
    }

    double uniform() { return static_cast<double>((*this)()) / m; }

    bool bernoulli(double P) { return uniform() <= P; }

    double exponential(double l) { return std::log(1 - uniform()) / -l; }

    std::size_t binomial(double P, std::size_t n)
    {
        std::size_t val = 0;
        while (n--) {
            val += bernoulli(P);
        }
        return val;
    }

    base_type poisson(double l)
    {
        double L = std::exp(-l);
        base_type k = 0;
        double p = 1;
        do {
            ++k;
            p *= uniform();
        } while (p > L);
        return k - 1;
    }

    double normal()
    {
        if (nextNormal != 0) {
            auto temp = nextNormal;
            nextNormal = 0;
            return temp;
        }
        auto r = std::sqrt(-2 * std::log(uniform()));
        auto s = 2 * M_PI * uniform();
        nextNormal = r * std::sin(s) + DBL_MIN; // DBL_MIN keeps the saved value from colliding with the 0 sentinel
        return r * std::cos(s);
    }
};
```
7 Tests

7.1 Diehard

10^8 samples (6.6 GiB) in a formatted [32] data file.

Command: dieharder -a -g 202 -f out/data/testing.dat

Unfortunately, the generator seems to fail the Dieharder tests. The process was manually terminated after the 58th FAIL. Generated output:
#=============================================================================#
# dieharder version 3.31.1 Copyright 2003 Robert G. Brown #
#=============================================================================#
rng_name | filename |rands/second|
file_input| out/data/testing.dat| 1.76e+06 |
#=============================================================================#
test_name |ntup| tsamples |psamples| p-value |Assessment
#=============================================================================#
diehard_birthdays| 0| 100| 100|0.00000000| FAILED
diehard_operm5| 0| 1000000| 100|0.00000000| FAILED
diehard_rank_32x32| 0| 40000| 100|0.00000000| FAILED
diehard_rank_6x8| 0| 100000| 100|0.00000000| FAILED
diehard_bitstream| 0| 2097152| 100|0.00000000| FAILED
diehard_opso| 0| 2097152| 100|0.00000000| FAILED
diehard_oqso| 0| 2097152| 100|0.00000000| FAILED
diehard_dna| 0| 2097152| 100|0.00000000| FAILED
diehard_count_1s_str| 0| 256000| 100|0.00000000| FAILED
diehard_count_1s_byt| 0| 256000| 100|0.00000000| FAILED
diehard_parking_lot| 0| 12000| 100|0.00000000| FAILED
diehard_2dsphere| 2| 8000| 100|0.00000000| FAILED
diehard_3dsphere| 3| 4000| 100|0.00000000| FAILED
diehard_squeeze| 0| 100000| 100|0.00000000| FAILED
diehard_sums| 0| 100| 100|0.00000000| FAILED
diehard_runs| 0| 100000| 100|0.00000000| FAILED
diehard_runs| 0| 100000| 100|0.00000000| FAILED
diehard_craps| 0| 200000| 100|0.00000000| FAILED
diehard_craps| 0| 200000| 100|0.00000000| FAILED
marsaglia_tsang_gcd| 0| 10000000| 100|0.00000000| FAILED
marsaglia_tsang_gcd| 0| 10000000| 100|0.00000000| FAILED
sts_monobit| 1| 100000| 100|0.00000000| FAILED
sts_runs| 2| 100000| 100|0.00000000| FAILED
sts_serial| 1| 100000| 100|0.00000000| FAILED
sts_serial| 2| 100000| 100|0.00000000| FAILED
sts_serial| 3| 100000| 100|0.00000000| FAILED
sts_serial| 3| 100000| 100|0.00000000| FAILED
sts_serial| 4| 100000| 100|0.00000000| FAILED
sts_serial| 4| 100000| 100|0.00000000| FAILED
sts_serial| 5| 100000| 100|0.00000000| FAILED
sts_serial| 5| 100000| 100|0.00000000| FAILED
sts_serial| 6| 100000| 100|0.00000000| FAILED
sts_serial| 6| 100000| 100|0.00000000| FAILED
sts_serial| 7| 100000| 100|0.00000000| FAILED
sts_serial| 7| 100000| 100|0.00000000| FAILED
sts_serial| 8| 100000| 100|0.00000000| FAILED
sts_serial| 8| 100000| 100|0.00000000| FAILED
sts_serial| 9| 100000| 100|0.00000000| FAILED
sts_serial| 9| 100000| 100|0.00000000| FAILED
sts_serial| 10| 100000| 100|0.00000000| FAILED
sts_serial| 10| 100000| 100|0.00000000| FAILED
sts_serial| 11| 100000| 100|0.00000000| FAILED
sts_serial| 11| 100000| 100|0.00000000| FAILED
sts_serial| 12| 100000| 100|0.00000000| FAILED
sts_serial| 12| 100000| 100|0.00000000| FAILED
sts_serial| 13| 100000| 100|0.00000000| FAILED
sts_serial| 13| 100000| 100|0.00000000| FAILED
sts_serial| 14| 100000| 100|0.00000000| FAILED
sts_serial| 14| 100000| 100|0.00000000| FAILED
sts_serial| 15| 100000| 100|0.00000000| FAILED
sts_serial| 15| 100000| 100|0.00000000| FAILED
sts_serial| 16| 100000| 100|0.00000000| FAILED
sts_serial| 16| 100000| 100|0.00000000| FAILED
rgb_bitdist| 1| 100000| 100|0.00000000| FAILED
rgb_bitdist| 2| 100000| 100|0.00000000| FAILED
rgb_bitdist| 3| 100000| 100|0.00000000| FAILED
rgb_bitdist| 4| 100000| 100|0.00000000| FAILED
rgb_bitdist| 5| 100000| 100|0.00000000| FAILED
7.2 ENT

The ENT [34] program, although not ideal for the given input, yields much more interesting results:
Entropy = 3.524750 bits per byte.
Optimum compression would reduce the size
of this 7000000207 byte file by 55 percent.
Chi square distribution for 7000000207 samples is 156174869624.00, and randomly
would exceed this value less than 0.01 percent of the times.
Arithmetic mean value of data bytes is 43.7601 (127.5 = random).
Monte Carlo value for Pi is 4.000000000 (error 27.32 percent).
Serial correlation coefficient is 0.117267 (totally uncorrelated = 0.0).
7.3 Visual test

This test brutally shows what is wrong with our generator. It has a pattern: numbers alternate between even and odd. The first 10 numbers:
18431
82898
3465
19412
95651
38182
77517
64584
27527
7007
8 Conclusions

Despite succeeding at creating all the desired distributions, we ultimately failed at creating a proper pseudorandom number generator. Although at first glance the numbers could appear plausible, it took our third test – ironically, the visual one – to discover its fatal flaw. It just shows that pseudorandomness is no trivial matter and even the simplest algorithms actually have a lot of thought put behind them.
References
[1] adsk. How to test a random generator. StackOverflow. url: https://stackoverflow.com/q/2130621.
[2] Richard P. Brent. “Note on Marsaglia’s Xorshift Random Number Generators”. In: Journal of Statistical
Software, Articles 11.5 (2004), pp. 1–5. issn: 1548-7660. doi:10.18637/jss.v011.i05.
[3] Robert G. Brown. Dieharder: A Random Number Test Suite. Duke University Physics Department. url:
https://webhome.phy.duke.edu/~rgb/General/dieharder.php.
[4] Tomasz Chwiej. Generatory liczb pseudolosowych [Pseudorandom number generators]. AGH. url: http://home.agh.edu.pl/~chwiej/mn/generatory_16.pdf (visited on 2021-05-16).
[5] Wikipedia contributors. Generating Poisson-distributed random variables. 2020-06-01. url: https://en.wikipedia.org/wiki/Poisson_distribution#Generating_Poisson-distributed_random_variables (visited on 2021-06-03).
[6] Wikipedia contributors. Generator Fibonacciego [Fibonacci generator]. 2020-03-28. url: https://pl.wikipedia.org/wiki/Generator_Fibonacciego (visited on 2021-05-30).
[7] Wikipedia contributors. Pseudorandom number generator. 2021. url: https://en.wikipedia.org/w/index.php?title=Pseudorandom_number_generator (visited on 2021-05-15).
[8] CSRC. Pseudorandom Number (or Bit) Generator. url: https://csrc.nist.gov/glossary/term/Pseudorandom_Number_Generator.
[9] Donald E. Knuth. The art of computer programming. Seminumerical Algorithms. 3rd ed. Vol. 2. Addison-
Wesley Longman Publishing Co., Inc., 1997. Chap. 3. isbn: 0201896842. doi:10.5555/270146.
[10] Pierre L'Ecuyer. "History of Uniform Random Number Generation". In: Proceedings of the 2017 Winter Simulation Conference. IEEE Press, 2017, pp. 202–230. url: https://www.informs-sim.org/wsc17papers/includes/files/016.pdf.
[11] Pierre L’Ecuyer. “Uniform random number generation”. In: Annals of Operations Research 53.1 (1994),
pp. 77–120. issn: 1572-9338. doi:10.1007/BF02136827.
[12] Pierre L’Ecuyer and Richard Simard. “TestU01: A C Library for Empirical Testing of Random Number
Generators”. In: ACM Trans. Math. Softw. 33.4 (2007-08). issn: 0098-3500. doi:10.1145/1268776.1268777.
[13] Lawrence L. Leemis and Stephen K. Park. Discrete-Event Simulation: A First Course. 2007, pp. 37–99.
[14] D. H. Lehmer. "Mathematical Methods in Large-scale Computing Units". In: Proceedings of the Second Symposium on Large Scale Digital Computing Machinery. 1949, pp. 141–146. url: https://archive.org/details/proceedings_of_a_second_symposium_on_large-scale_/page/n178.
[15] T. G. Lewis and W. H. Payne. “Generalized Feedback Shift Register Pseudorandom Number Algorithm”. In:
J. ACM 20.3 (1973-06), pp. 456–468. issn: 0004-5411. doi:10.1145/321765.321777.
[16] George Marsaglia. “Xorshift RNGs”. In: Journal of Statistical Software, Articles 8.14 (2003), pp. 1–6. issn:
1548-7660. doi:10.18637/jss.v008.i14.
[17] Mersenne Twister official website. url: http://www.math.sci.hiroshima-u.ac.jp/~m-mat/MT/emt.html (visited on 2021-05-31).
[18] Makoto Matsumoto and Yoshiharu Kurita. "Twisted GFSR Generators". In: ACM Trans. Model. Comput. Simul. 2.3 (1992-07), pp. 179–194. issn: 1049-3301. doi: 10.1145/146382.146383. url: http://www.math.sci.hiroshima-u.ac.jp/~m-mat/MT/ARTICLES/tgfsr3.pdf.
[19] Makoto Matsumoto and Yoshiharu Kurita. "Twisted GFSR Generators II". In: ACM Trans. Model. Comput. Simul. 4.3 (1994-07), pp. 254–266. issn: 1049-3301. doi: 10.1145/189443.189445. url: http://www.math.sci.hiroshima-u.ac.jp/~m-mat/MT/ARTICLES/ttgfsr7.pdf.
[20] Makoto Matsumoto and Takuji Nishimura. "Mersenne twister: a 623-dimensionally equidistributed uniform pseudo-random number generator". In: ACM Transactions on Modeling and Computer Simulation 8.1 (1998), pp. 3–30. doi: 10.1145/272991.272995. url: http://www.math.sci.hiroshima-u.ac.jp/~m-mat/MT/ARTICLES/mt.pdf.
[21] John von Neumann. “Various Techniques Used in Connection with Random Digits”. In: Monte Carlo Method.
Ed. by A. S. Householder, G. E. Forsythe, and H. H. Germond. Vol. 12. National Bureau of Standards
Applied Mathematics Series. Washington, DC: US Government Printing Office, 1951. Chap. 13, pp. 36–38.
url:https://mcnp.lanl.gov/pdf_files/nbs_vonneumann.pdf.
[22] François Panneton and Pierre L’Ecuyer. “On the Xorshift Random Number Generators”. In: ACM Trans.
Model. Comput. Simul. 15.4 (2005-10), pp. 346–361. issn: 1049-3301. doi:10.1145/1113316.1113319.url:
https://www.iro.umontreal.ca/~lecuyer/myftp/papers/xorshift.pdf.
[23] W. H. Payne, J. R. Rabung, and T. P. Bogyo. “Coding the Lehmer Pseudo-Random Number Generator”.
In: Commun. ACM 12.2 (1969-02), pp. 85–86. issn: 0001-0782. doi:10.1145/362848.362860.
[24] Pseudo-random number generation algorithms. MathOverflow. 2010-06-26. url: https://mathoverflow.net/q/29494.
[25] RANDOM.org – Simple Visual Analysis.url:https://www.random.org/analysis/#visual.
[26] Edward Ross. "From Bernoulli to Binomial Distributions". url: https://skeptric.com/bernoulli-binomial/ (visited on 2021-06-09).
[27] PBS Infinite Series. How to Generate Pseudorandom Numbers. 2017. url: https://youtu.be/C82JyCmtKWg.
[28] Alok Singhal. Pseudorandom Number Generator - Exponential Distribution. StackOverflow. url: https://stackoverflow.com/a/2106564.
[29] Shobhit Sinha, SK Hazul Islam, and Mohammad S. Obaidat. "A comparative study and analysis of some pseudorandom number generator algorithms". In: Security and Privacy 1.6 (2018), e46. doi: 10.1002/spy2.46.
[30] S.Lott. Generate random numbers following a normal distribution in C/C++. StackOverflow. url: https://stackoverflow.com/a/2325531.
[31] Robert Tausworthe. "Random Numbers Generated by Linear Recurrence Modulo Two". In: Mathematics of Computation 19 (1965-05), pp. 201–209. doi: 10.2307/2003345.
[32] Paul Uszak. How can I make my input file suitable for Dieharder? Cryptography Stack Exchange. url:
https://crypto.stackexchange.com/a/87122.
[33] John Viega. "Practical Random Number Generation in Software". In: Proc. 19th Annual Computer Security Applications Conference. 2003. url: https://www.acsac.org/2003/papers/79.pdf.
[34] John Walker. ENT – A Pseudorandom Number Sequence Test Program. Fourmilab, 2008-01-28. url: https://www.fourmilab.ch/random.
[35] Roy S. Wikramaratna. “ACORN – A new method for generating sequences of uniformly distributed Pseudo-
random Numbers”. In: Journal of Computational Physics 83.1 (1989), pp. 16–31. issn: 0021-9991. doi:
10.1016/0021-9991(89)90221-0.
[36] Roy S. Wikramaratna. ACORN random numbers. 2019-03-31. url: http://acorn.wikramaratna.org (visited on 2021-05-30).
Source code

All source files (LaTeX, BibTeX, C++, Makefile) are available on GitHub under the URL:
https://github.com/Jorengarenar/PRNG-paper