Question
Asked 6th Feb, 2014

What is the time complexity of Multilayer Perceptron (MLP) and other neural networks?

Complexity of an algorithm is mostly represented in Big O notations that plays an important role in finding efficient algorithm. What is the time complexity of MLP and other ANN?

Most recent answer

2nd Dec, 2019
Uzair Iqbal
National University of Computer and Emerging Sciences
Dear Ahmad, please guide me on how to use this; I need some clarification.
1 Recommendation

Popular Answers (1)

6th Feb, 2014
Björn-Elmar Macek
Universität Kassel
This depends on the architecture of the network. For a trained MLP, the complexity of classification (the forward propagation) is roughly:
1. The number of multiplications needed to compute the activations of all neurons (a vector product) in the i-th layer of the net equals NumberOfNodesInLayer(i) * NumberOfNodesInLayer(i-1).
Since concave regions can be separated with 3 layers, let's assume we have at most 3. Given that the last layer contains only 1 neuron, which outputs the classification of the input, we have 2 layers over which to distribute the remaining n-1 neurons. The worst case (regarding complexity) is that the nodes are distributed equally, which yields ((n-1)/2)^2 \in O(n^2) multiplications.
2. Since the output functions of neurons are in general very cheap to compute, I would assume their cost to be constant per neuron. For a network with n neurons, this step is in O(n).
Given that the number of neurons n can be regarded as a constant for a given problem, the overall complexity of O(n^2) reduces to O(1).
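To make the multiplication count in point 1 concrete, here is a minimal Python sketch (the function name and example layer sizes are illustrative, not part of the original answer):

```python
def forward_pass_multiplications(layer_sizes):
    """Multiplications needed for one forward pass of an MLP.

    layer_sizes[0] is the input dimension; the weight matrix between
    layer i-1 and layer i contributes size(i) * size(i-1) products.
    """
    return sum(layer_sizes[i] * layer_sizes[i - 1]
               for i in range(1, len(layer_sizes)))

# Worst case from the answer: n-1 = 8 hidden neurons split evenly over
# two hidden layers (4 and 4), plus 1 output neuron. The 4*4 = 16 term
# dominates, matching ((n-1)/2)^2.
print(forward_pass_multiplications([4, 4, 1]))  # 4*4 + 4*1 = 20
```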
9 Recommendations

All Answers (10)

6th Feb, 2014
Niusha Shafiabady
University of Nottingham, Malaysia Campus
This depends on the structure and the number of layers in the neural network.
3 Recommendations
11th Feb, 2014
Andrea Peterkova Nemethova
Slovak University of Technology in Bratislava
Ferreira, C. (2006). "Designing Neural Networks Using Gene Expression Programming". In A. Abraham, B. de Baets, M. Köppen, and B. Nickolay (eds.), Applied Soft Computing Technologies: The Challenge of Complexity, pp. 517–536. Springer-Verlag.
1 Recommendation
24th Apr, 2014
Ahmad Hassanat
Mutah University
O(2^n) to converge to the optimal solution.
O(#epochs * #examples * #features * #neurons) for an approximate solution.
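A minimal back-of-envelope sketch of the approximate bound above (function name and example numbers are my own illustration):

```python
def training_cost(epochs, examples, features, neurons):
    """Rough operation count for MLP training, per the bound
    O(#epochs * #examples * #features * #neurons).

    Each epoch visits every example; each example's forward/backward
    pass is taken as proportional to features * neurons.
    """
    return epochs * examples * features * neurons

# e.g. 100 epochs, 10,000 examples, 20 features, 50 neurons:
print(training_cost(100, 10_000, 20, 50))  # 1_000_000_000 operations
```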
2 Recommendations
26th Apr, 2019
Erik Cuevas
University of Guadalajara
I agree with Ahmad's answer.
1 Recommendation
4th Jun, 2019
R.A. Antonio Carrasco
Newcastle University, UK
It depends on the number of layers and the amount of input data.
5th Jun, 2019
Erik Cuevas
University of Guadalajara
In Big-O analysis, each elementary step counts as one operation. Sorting, for example, is performed in O(n log n) time.
1 Recommendation
5th Jun, 2019
R.A. Antonio Carrasco
Newcastle University, UK
There is an algorithm to determine the complexity of neural networks.
