A randomized algorithm is an algorithm that employs a degree of randomness as part of its logic. Efficient randomized algorithms play a significant role in various fields and help us solve complex problems. In this paper we discuss the main concepts and related work, along with several research applications of randomized algorithms.
Mohammed Saleh1 Dr. Muhammad Shettima Daluma2
1Department of Accounting, College of Business and Management Studies, Konduga, Borno State,
2Department of Economics, University of Maiduguri, Nigeria.
Received: 07 January 2021 / Revised: 08 January 2021 / Accepted: 09 September 2020 © Sahara
Journal Science+ Business Media, LLC, part of Sahara Nature 2021
ISSN 222-195 (Paper) ISSN 222-239 (Online) Vol.4, No.11, 2021
A regular algorithm follows prescribed steps: it has a predefined set of inputs and a required output. Any algorithm with predefined inputs, a required output, and a fixed sequence of prescribed steps is called a deterministic algorithm. We can define a deterministic algorithm graphically: an input is applied to a definite set of rules (the algorithm), and a definite, required output is obtained. Below is the graphical representation of a deterministic algorithm.
Input → Algorithm → Output
This is the usual picture of a deterministic algorithm: the output and the running time are functions of the input alone. When we relax this determinism, we arrive at the concept of randomized algorithms. A randomized algorithm uses randomness as a computing tool in the design of the algorithm: the program logic treats randomness as one of its components and uses it to find a solution to the given problem. Randomized algorithms are also called probabilistic algorithms. Graphically, we still have the input, but into the algorithm we introduce so-called random bits. A random bit is a bit generated by a process or an algorithm designed to produce random numbers or random inputs. These random bits are combined with the input and change the behaviour of the algorithm and the output it produces.
A randomized algorithm may also be defined as one in which "the output and the running time are functions of the input and of the random bits chosen".
These algorithms make use of a randomizer, such as a random number generator, and the algorithm's decisions are based on the randomizer's output. The output can differ in an unpredictable way, or differ from run to run on the same input. Likewise, the execution time may differ from run to run on the same input.
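This run-to-run variation is easy to demonstrate. The following Python sketch (the function `las_vegas_find` and its setup are our own illustration, not from the paper) probes random positions until it finds the target: the answer is always correct, but the number of probes changes from one run to the next.

```python
import random

def las_vegas_find(arr, target):
    """Probe random positions until the target is found.
    The returned index is always correct, but the number of
    probes is a random variable that varies between runs."""
    probes = 0
    while True:
        i = random.randrange(len(arr))  # the random bits
        probes += 1
        if arr[i] == target:
            return i, probes

arr = [7, 3, 9, 3, 7, 1]
idx, probes = las_vegas_find(arr, 9)
assert arr[idx] == 9   # the same correct output on every run
# `probes` differs from run to run on the same input
```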
These algorithms are broadly divided into two classes:
Randomized Las Vegas algorithms
Randomized Monte Carlo algorithms
Randomized Las Vegas Algorithm:
A randomized Las Vegas algorithm always produces the correct output for a given input. The execution of the algorithm depends on the sequence of outputs of the randomizer, so the running time is a random variable. The time and space complexities of such an algorithm are denoted with notations such as Õ(·), Θ̃(·), and Ω̃(·). An example of this class is randomized quick sort.
Randomized Quick Sort:
In the randomized version of quick sort we impose a distribution on the inputs by randomizing the choice of pivot. This does not improve the worst-case running time, but it makes the average-case running time likely on every input.
In this version we choose a random key as the pivot. Assume that the procedure Random(a, b) returns a random integer in the range [a, b]; there are b − a + 1 integers in that range, and the procedure is equally likely to return each of them. The new partition procedure simply performs the random swap before actually partitioning.
Pseudocode for randomized quick sort:

Randomized_Partition(A, p, r)
1. i ← Random(p, r)
2. exchange A[p] ↔ A[i]
3. return Partition(A, p, r)

The pseudocode for the randomized quicksort algorithm has the same structure as quick sort, except that it calls the randomized version of the partition procedure.

Randomized_Quicksort(A, p, r)
1. if p < r then
2.     q ← Randomized_Partition(A, p, r)
3.     Randomized_Quicksort(A, p, q)
4.     Randomized_Quicksort(A, q + 1, r)
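The pseudocode above can be rendered as runnable Python. This is a hedged sketch: it uses the common Lomuto partition scheme (so the recursion is on q − 1 and q + 1) rather than reproducing the unspecified Partition procedure verbatim.

```python
import random

def randomized_partition(a, p, r):
    # swap a random element into the pivot position before partitioning
    i = random.randint(p, r)   # Random(p, r): each index equally likely
    a[i], a[r] = a[r], a[i]
    pivot = a[r]
    store = p
    for j in range(p, r):      # Lomuto partition around the pivot
        if a[j] <= pivot:
            a[store], a[j] = a[j], a[store]
            store += 1
    a[store], a[r] = a[r], a[store]
    return store               # final position of the pivot

def randomized_quicksort(a, p, r):
    if p < r:
        q = randomized_partition(a, p, r)
        randomized_quicksort(a, p, q - 1)
        randomized_quicksort(a, q + 1, r)

data = [5, 2, 9, 1, 5, 6]
randomized_quicksort(data, 0, len(data) - 1)
# data is now [1, 2, 5, 5, 6, 9]
```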
Time Complexity:
The best-case and expected (average-case) running time of randomized quicksort is O(n log n).
The worst-case running time of randomized quicksort is O(n^2).
Note: the worst case of randomized quick sort occurs, for example, when all the given elements are equal.
It can also be shown that the probability that the running time of randomized quick sort exceeds twice its expected value is vanishingly small.
Randomized Monte Carlo Algorithm:
A randomized Monte Carlo algorithm differs from a randomized Las Vegas algorithm in that its output may be incorrect with some probability, while its running time is deterministic. An example is the randomized algorithm for the approximate median.
Consider sorting as an example. Deterministic algorithms for sorting include:
Heap sort
Merge sort
In heap sort we take a list of numbers and build a heap from them; in merge sort we apply divide and conquer. When we instead apply a randomized Las Vegas algorithm, namely randomized quick sort, it almost always outperforms heap sort and merge sort in practice.
What is the minimum cut problem, and how is it used to illustrate randomized algorithms?
Problem Definition: Given a connected graph G(V, E) on n vertices and m edges, compute the smallest set of edges whose removal disconnects G. The best deterministic algorithm, due to Stoer and Wagner (1997), has time complexity O(mn), while the randomized Monte Carlo algorithm of Karger (1993) has time complexity O(m log n). This shows that the Monte Carlo algorithm performs better than the deterministic one, but the Monte Carlo algorithm has an error probability of n^(-c), where c is any constant we desire: the error can be made as small as we like at the cost of more repetitions.
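One common way to implement Karger's contraction idea is with a union-find structure and a random edge order. The sketch below is our own illustration (the names and the repetition count are assumptions), not Karger's original presentation.

```python
import random

def karger_trial(edges, n):
    """One trial of Karger's contraction algorithm on a graph with
    vertices 0..n-1. Contracts random edges until two super-vertices
    remain, then returns the size of the resulting cut."""
    parent = list(range(n))
    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    components = n
    order = edges[:]
    random.shuffle(order)             # random edge order = random contractions
    for u, v in order:
        if components == 2:
            break
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv           # contract u and v
            components -= 1
    # edges still crossing the two super-vertices form the cut
    return sum(1 for u, v in edges if find(u) != find(v))

def karger_min_cut(edges, n, trials=100):
    # independent repetitions drive the error probability down
    return min(karger_trial(edges, n) for _ in range(trials))

# a 4-cycle with one diagonal: the minimum cut has size 2
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
cut = karger_min_cut(edges, 4)
# cut == 2 with overwhelming probability
```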
What is primality testing, and how is it used to illustrate randomized algorithms?
Problem Definition: Given an n-bit integer, determine whether or not it is prime. Applications of primality testing include:
1. The RSA cryptosystem
2. Algebraic algorithms
The best deterministic algorithm, by Agrawal, Kayal, and Saxena (2002), has time complexity O(n^6), while the randomized Monte Carlo algorithm by Rabin (1980) has time complexity O(kn^2), with error probability 2^(-k) for any k we desire.
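Rabin's randomized test is the Miller-Rabin test. The Python sketch below is our own rendering; each random round catches a composite number with probability at least 3/4, so k rounds give an error probability consistent with the bound above.

```python
import random

def is_probably_prime(n, k=20):
    """Miller-Rabin primality test with k independent random rounds.
    Always correct on primes; wrong on a composite with probability
    at most 4**(-k)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):   # quick trial division
        if n % p == 0:
            return n == p
    # write n - 1 = 2^r * d with d odd
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(k):
        a = random.randrange(2, n - 1)   # random base for this round
        x = pow(a, d, n)                 # modular exponentiation
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False   # a is a witness: n is definitely composite
    return True            # probably prime
```

For example, `is_probably_prime(2**61 - 1)` returns True (a Mersenne prime), while `is_probably_prime(91)` returns False.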
All of these deterministic algorithms have higher time complexity than their randomized counterparts. This illustrates that adding random bits to the computation can improve both the quality of the output and the time complexity needed to obtain the best result.
Characteristics of randomized algorithms:
The output depends on the random bits chosen
The output is probabilistic
The error is negligible in the long run
Generation of random numbers in randomized algorithms:
A random number generator is used to generate random numbers. There are two types of random numbers:
True random numbers
Pseudo-random numbers
True random numbers: A true random number is generated from a physical source such as radioactive decay or the flip of a coin. Once such a number is generated, the sequence cannot be recreated: reproduction is not possible.
Pseudo-random numbers: A pseudo-random number is generated by a software procedure. The sequence can be recreated if the formula and the seed are known, and this is sufficient for most purposes. The main property of pseudo-random numbers is that they are deterministic in all cases: the same seed always produces the same sequence.
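That determinism can be seen in a minimal linear congruential generator, a classic pseudo-random number generator (the class name is ours; the constants are the widely used Numerical Recipes values):

```python
class LCG:
    """A minimal linear congruential generator: a classic pseudo-random
    number generator. It is deterministic: the same seed recreates the
    same sequence, exactly the property described above."""
    def __init__(self, seed):
        self.state = seed
        # multiplier, increment, and modulus from Numerical Recipes
        self.a, self.c, self.m = 1664525, 1013904223, 2**32

    def next(self):
        self.state = (self.a * self.state + self.c) % self.m
        return self.state

g1 = LCG(seed=42)
g2 = LCG(seed=42)
# identical seeds recreate the identical sequence
assert [g1.next() for _ in range(5)] == [g2.next() for _ in range(5)]
```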
Design principles of randomized algorithms:
Some of the design principles of randomized algorithms are as follows:
Random trials
Primality testing, to check whether a number is prime or not
Checking whether a given input X has a property Y
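The last of these principles, checking by random trials whether an input X has a property Y, is nicely illustrated by Freivalds' algorithm, a classic randomized check of whether a matrix C equals the product AB. The sketch below is our own example, not from the paper.

```python
import random

def freivalds(A, B, C, k=30):
    """Freivalds' check: does A @ B == C? Multiply both sides by a
    random 0/1 vector; a wrong C is caught with probability >= 1/2
    per trial, so the error probability after k trials is <= 2**(-k)."""
    n = len(A)
    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    for _ in range(k):
        r = [random.randint(0, 1) for _ in range(n)]
        # compare A(Br) with Cr instead of computing AB in full
        if matvec(A, matvec(B, r)) != matvec(C, r):
            return False   # definitely not the product
    return True            # probably the product

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[19, 22], [43, 50]]       # the correct product A @ B
C_bad = [[19, 22], [43, 51]]   # off by one entry
# freivalds(A, B, C) is True; freivalds(A, B, C_bad) is almost surely False
```

Each trial costs only O(n^2) time, versus O(n^3) for recomputing the product naively, which is exactly the trade-off randomized checking buys.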
Advantages of randomized algorithms:
Simplicity: randomized algorithms are simple in nature; they make use of pseudo-random numbers.
Efficiency: the output is a function of the input and the random bits chosen, which often makes these algorithms very efficient.
Computational complexity: the computational complexity is often better than that of a deterministic algorithm, because a deterministic algorithm follows a fixed path from input to output and can take a long time to produce an output, whereas a randomized algorithm has no fixed path from input to output and often produces an output in less time.
Disadvantages of randomized algorithms:
The main disadvantage of randomized algorithms is reliability: the algorithm may lack certain guaranteed solutions.
The second issue is quality: the result depends on the quality of the random number generator used as part of the algorithm.
The third disadvantage of randomized algorithms is hardware failure.
Proposed work and conclusion:
An efficient randomized algorithm can be used for detecting circles in digital image processing. With its help we can easily find candidate circles in an image and then apply an evidence-gathering process to determine whether each possible circle is true or false.
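A minimal sketch of such a randomized circle detector, assuming the usual scheme of sampling three edge points, fitting the circle through them, and then gathering evidence by counting nearby points (all function names, parameters, and thresholds below are illustrative assumptions):

```python
import math
import random

def detect_circle(points, trials=200, tol=1.0, min_votes=10):
    """Randomized circle detection sketch: pick three random edge
    points, derive the unique circle through them, then gather
    evidence by counting how many points lie near that circle."""
    best = None
    for _ in range(trials):
        (x1, y1), (x2, y2), (x3, y3) = random.sample(points, 3)
        # circumcenter from the perpendicular-bisector equations
        d = 2 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
        if abs(d) < 1e-9:
            continue   # collinear sample: no circle through these points
        ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
              + (x3**2 + y3**2) * (y1 - y2)) / d
        uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
              + (x3**2 + y3**2) * (x2 - x1)) / d
        r = math.hypot(x1 - ux, y1 - uy)
        # evidence step: votes from points close to the candidate circle
        votes = sum(1 for (x, y) in points
                    if abs(math.hypot(x - ux, y - uy) - r) < tol)
        if votes >= min_votes and (best is None or votes > best[0]):
            best = (votes, ux, uy, r)
    return best

# synthetic edge points on a circle of radius 10 centred at the origin
pts = [(10 * math.cos(t / 8), 10 * math.sin(t / 8)) for t in range(50)]
result = detect_circle(pts)
# result[3] is approximately 10.0, the true radius
```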
Randomized algorithms can also help answer closest-point queries quickly, and besides this they can be used for matrix approximation. One important use of such algorithms is in randomized PCA.