Machine Learning in Antenna Design: An Overview
on Machine Learning Concept and Algorithms
Hilal M. El Misilmani and Tarek Naous
Electrical & Computer Eng. Department
Beirut Arab University
Beirut, Lebanon
Abstract—With the growth and wide variety of available data,
advanced processing, and affordable data storage, machine learn-
ing is witnessing great attention in finding optimized solutions in
various fields. Machine learning techniques are currently taking
a major part of the ongoing research, and expected to be the
key player in today’s technologies. This paper introduces and
investigates the applications of machine learning in antenna
design. It covers the major aspects of machine learning, including
its basic concept, differentiation with artificial intelligence and
deep learning, learning algorithms, its wide applications in
various technologies, with a main focus on its usage in antenna
design. The review also includes a comparison of the results
using machine learning in antenna design, compared to the
conventional design methods.
Index Terms—Machine learning, artificial intelligence, antenna
design, neural networks, electromagnetic simulations
Artificial Intelligence (AI) is the art of enabling machines
to perform tasks that require human thinking abilities, such
as learning, decision making, and problem solving. In simple
words, AI is the implementation of human thinking ability in
machines. With the latest advances in big data availability,
software engineering capability, and affordable high computing
power, AI is becoming an essential part of today's research.
It is expected to affect most of our daily activities, bringing
fundamental changes to science and engineering and enormous
impacts on society, with the ability to create, transform, or
optimize different aspects and applications of our daily lives.
To be able to implement and achieve artificial intelligence,
several capabilities need to be possessed by the computer or
machine. For instance, natural language processing is required
for successful communication in English, and knowledge
representation is needed for storing information. To answer
questions and draw conclusions using the information stored,
automated reasoning is required, whereas machine learning is
necessary to derive patterns in data, make predictions, and
adapt to new conditions. For object perception and manipulation,
computer vision and robotics are needed. By using intelligent
algorithms and large amounts of data, iterative processing
allows AI software to learn automatically from patterns or
features [1].
Artificial Intelligence and Machine Learning (ML) are often
used interchangeably; however, as will be discussed hereafter
and investigated in this paper, machine learning is a large
subset of artificial intelligence, as shown in Fig. 1. In fact,
machine learning can be considered as an approach to achieving
AI applications.
Fig. 1: Relationship between Artificial Intelligence, Machine
Learning, and Deep Learning
Machine Learning is briefly described as getting useful
information out of data, achieved by developing reliable
prediction algorithms. These algorithms can be very powerful in
optimization, but their success relies on the condition and size
of the data collected. Therefore, machine learning is frequently
associated with statistics and data analysis [2], [3].
As for Neural Networks, they are defined as a class of
machine learning algorithms that try to imitate how the
human brain works. They consist of layers of interconnected
nodes, where each node produces a nonlinear function of its
input. Deep Neural Networks (DNNs) are neural networks that
have more than one hidden layer [4], and their use is usually
referred to as Deep Learning. Both are considered types of
machine learning algorithms.
This paper presents an overview of machine learning, with a
major focus on investigating its usage in antenna design
applications. The concept of machine learning is studied along
with its different learning algorithms. Next, an extensive
review of several antenna designs and electromagnetic
computational methods is presented. The methods used to design
the antennas using machine learning are described in each case,
along with the outcome of each algorithm used.
978-1-7281-4484-9/19/$31.00 ©2019 IEEE
TABLE I: The Three Main Categories of Learning

Learning Category      | Description
-----------------------|------------------------------------------------------
Supervised Learning    | A model is trained on a data set; predictions are made on new inputs
Unsupervised Learning  | A pattern is derived from the data after exploring it
Reinforcement Learning | The model takes decisions and learns from its actions
Fig. 2: Schematic of an artificial neural network
Machine learning is based on algorithms that can learn from
data without relying on rules-based programming [5]. It can be
generally divided into three key categories: supervised learning,
unsupervised learning, and reinforcement learning [2], [6]–[8],
listed in Table I. Additional learning scenarios that are less
widely used include semi-supervised learning, transductive
inference, online learning, and active learning [2]. Each
category has different learning algorithms that belong to it.
In supervised learning, what a correct output looks like is
already known. After training the learning algorithm on a given
data set, the algorithm generalizes to give accurate predictions
for all possible inputs [7]. Supervised learning algorithms
include, but are not limited to: linear regression, logistic
regression, artificial neural networks, and support vector machines.
Linear Regression [9]: consists of fitting a continuous
linear function through the data from which the algorithm
can make predictions on new inputs.
Logistic Regression [9]: used in classification tasks where
it predicts the probability that a certain input corresponds
to one of the already known classes.
Artificial Neural Networks [9], [10]: in neural networks,
large interconnections of "neurons", which are simple
computing cells, are employed to achieve good performance.
When complex functions with many features are involved,
neural networks offer an alternative way to perform ML.
Neural networks are made of multiple layers: the input layer,
the output layer, and hidden layers between the input and
output layers, as shown in Fig. 2. One of the many types of
neural networks is the feed-forward neural network, where
each neuron receives as input a weighted sum of the outputs
of the neurons connected to it.
Support Vector Machines (SVMs) [10], [11]: another type
of supervised learning algorithm, used in particular for
classification; it deals with the more difficult case of
non-linearly separable patterns by using kernel methods.
k Nearest Neighbors [9]: considered among the simplest
of all machine learning algorithms. After memorizing the
training set, the output of any new input is predicted by
the algorithm based on the outputs of its closest neighbors
in the training set.
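As a concrete illustration of the supervised algorithms listed above, the following sketch trains a k-nearest-neighbors classifier on synthetic data; the data set, feature count, and the choice k = 5 are illustrative assumptions, not taken from the paper.

```python
# Supervised learning sketch: k-nearest-neighbors classification on toy data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic labeled data standing in for a real (input, correct output) set.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each new input is classified from the outputs of its 5 closest training points.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
accuracy = knn.score(X_test, y_test)  # fraction of correct test predictions
```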
In unsupervised learning, what the results should look like is
not known; the algorithm derives a structure from the data after
identifying similarities in the inputs [7]. Unsupervised learning
algorithms include, but are not limited to: K-Means Clustering and
Dimensionality Reduction Algorithms.
K-Means Clustering [3]: k-unique clusters are automati-
cally formed by this algorithm. It is a type of unsuper-
vised learning where variables in the data are grouped
together based on relationships among them.
Dimensionality Reduction Algorithms [9]: such as the
Principal Component Analysis (PCA) algorithm, whose goal
is to minimize the projection error, i.e., the distance of
every data point to its projection onto a lower-dimensional
subspace, such as a projection line.
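The two unsupervised techniques above can be sketched in a few lines; the cluster count, sample size, and target dimensionality below are illustrative assumptions.

```python
# Unsupervised learning sketch: k-means clustering plus PCA dimensionality reduction.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.decomposition import PCA

# Unlabeled 5-dimensional data containing 3 hidden groups.
X, _ = make_blobs(n_samples=300, centers=3, n_features=5, random_state=1)

# Group the samples into k = 3 clusters based on similarity alone.
labels = KMeans(n_clusters=3, n_init=10, random_state=1).fit_predict(X)

# Project onto the 2 principal components that minimize the projection error.
X_2d = PCA(n_components=2).fit_transform(X)
```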
In reinforcement learning, no labeled data set is received by
the machine. Instead, information is collected by interacting
with the environment through different actions. The machine is
rewarded after each action; hence, its objective is to maximize
the expected average reward so that its actions become optimal.
An example of a reinforcement learning model is the Markov
Decision Process (MDP) [2], [8].
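For the MDP mentioned above, the optimal behavior can be computed by value iteration on the Bellman optimality equation; the two-state transition probabilities and rewards below are made-up illustrative numbers.

```python
# Reinforcement learning sketch: value iteration on a tiny 2-state, 2-action MDP.
# P[s][a] = list of (next_state, probability); R[s][a] = immediate reward.
P = {0: {0: [(0, 0.9), (1, 0.1)], 1: [(1, 1.0)]},
     1: {0: [(0, 1.0)],           1: [(1, 0.8), (0, 0.2)]}}
R = {0: {0: 0.0, 1: 1.0},
     1: {0: 0.0, 1: 2.0}}
gamma = 0.9            # discount factor on future rewards
V = [0.0, 0.0]         # value of each state, refined iteratively

# Repeatedly apply the Bellman optimality update until the values settle.
for _ in range(200):
    V = [max(R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])
             for a in P[s]) for s in P]

# The optimal policy picks, in each state, the action maximizing expected return.
policy = [max(P[s], key=lambda a: R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a]))
          for s in P]
```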
Other machine learning algorithms exist that are less widely
used, such as Decision Trees [12], Boosting [13], Naive
Bayes [9], Bayesian Regularization [14], Kriging [15], and
more [9].
When solving real-world machine learning problems, the
data set is split into three parts [9]: a training set used to train
the algorithm, a cross-validation set used for model selection,
and a test set used for testing the performance of the prediction
algorithm, that is, checking whether it succeeds in generalizing
to inputs not previously seen.
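A common way to realize this three-way split (here 60/20/20, an illustrative ratio) is two successive calls to scikit-learn's `train_test_split`:

```python
# Splitting a data set into training, cross-validation, and test subsets.
import numpy as np
from sklearn.model_selection import train_test_split

X, y = np.arange(100).reshape(50, 2), np.arange(50)

# First hold out 20% of the data as the final test set ...
X_rest, X_test, y_rest, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
# ... then split the remainder into training (60% overall) and cross-validation (20%).
X_train, X_cv, y_train, y_cv = train_test_split(X_rest, y_rest, test_size=0.25, random_state=0)
```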
Machine learning techniques have been widely employed in
communication technology, such as antenna selection in wireless
communications [16], [17], smart grid networks, where machine
learning can detect malicious events before they occur [18],
wireless networks, where ANNs can be used for predicting the
mobility patterns and content requests of wireless users [6],
speech recognition, where SVMs with kernel functions can be
useful to improve generalization performance [19], context-aware
computing in IoT [20], and much more [21].
Several papers have investigated the applications of machine
learning in antenna design. Machine learning is expected to
accelerate the antenna design process while maintaining high
accuracy levels, minimizing error and saving time, along with
enabling prediction of the antenna behavior, better computational
efficiency, and a reduced number of necessary simulations.
In general, in order to apply machine learning in antenna
design, the following steps can be followed:
1) Through multiple simulations, the electromagnetic
characteristics of an antenna are obtained.
2) These characteristics are stored in a database and used
as a data set for training a certain machine learning algorithm.
3) Depending on the needs of the designer, the algorithm
makes predictions and the antenna that gives the closest
results is designed.
Fig. 3: Reflectarray unit cell [26]
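The steps above can be sketched end to end. Since running a full-wave solver is out of scope here, a textbook first-order estimate of a rectangular patch's resonant frequency, f ≈ c / (2L√εr), stands in for the simulator; the geometry ranges and the random-forest model are illustrative assumptions.

```python
# End-to-end sketch of the simulate -> store -> train -> predict workflow.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
c = 3e8  # speed of light, m/s

# Step 1: "simulate" 500 candidate geometries (patch length in m, permittivity).
L = rng.uniform(0.01, 0.05, 500)
eps_r = rng.uniform(2.0, 6.0, 500)
f_res = c / (2 * L * np.sqrt(eps_r))  # analytical stand-in for full-wave results

# Step 2: store the (geometry -> response) pairs and train a model on them.
X = np.column_stack([L, eps_r])
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, f_res)

# Step 3: predict the response of an unseen geometry instead of re-simulating it.
f_pred = model.predict([[0.03, 4.4]])[0]
```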
In this section, a detailed review of several antenna design
papers using machine learning is provided. For a clear
investigation, the papers discussed here are divided into two
main categories: machine learning for parameter optimization,
and machine learning for enhancing evolutionary computation
algorithms.
A. Machine Learning For Parameter Optimization
One approach of using machine learning in antenna design
is training a learning algorithm on data collected from previous
simulations to optimize the antenna parameters.
Successful analysis and optimization of reflectarrays has
already been achieved using Full-Wave analysis based on Local
Periodicity (FW-LP) [22], [23]; however, for space applications,
faster computations are required. Although this has been addressed
previously using Artificial Neural Networks (ANNs) [24],
[25], the results were limited. In [26], the design of a shaped-beam
reflectarray unit cell using Support Vector Machines
(SVMs) has been presented. Two sets of four parallel dipoles
are used in the design, shown in Fig. 3. A machine learning
algorithm is used to accelerate the reflectarray antenna design.
The reflection coefficient matrix is first characterized using SVM,
and the scattering parameters of the unit cell dimensions are then
derived. The influence of the discretization of the angle of
incidence is also taken into account in the simulations. The
results obtained using SVM are then compared to those in
[27] based on the Method of Moments (MoM-LP) in terms
of speed and accuracy. The results showed that the sequential
design using SVM can be accelerated by a factor of 880, and
the parallel design by a factor of 566. Fig. 4 shows the gain
pattern comparison of the SVM and the MoM-LP using the
real angles of incidence.
Fig. 4: Gain pattern comparison of SVM and MoM (a)
elevation and (b) azimuth [26]
The design of a rectangular microstrip antenna using Support
Vector Machines (SVMs) has also been presented in [28].
SVMs are used here for regression, in which case they are
referred to as Support Vector Regressors (SVRs). The machine
learning algorithm is trained on a dataset that includes measured
values of the input impedance, operation bandwidth, and resonant
frequency of the antenna. Artificial Neural Networks are also
employed to compare the results with those obtained using SVR.
It was shown that employing machine learning algorithms in the
design of microstrip antennas gives a better and more accurate
characterization than the theoretical results. Comparing SVR and
ANN, it was shown that SVR provides better computational
efficiency, since it has a faster convergence rate and requires
less training and test time than the ANN.
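In the same spirit as the SVR characterization in [28], the sketch below regresses a toy resonant-frequency trend against one normalized dimension; the data, kernel, and hyperparameters are assumptions for illustration, not the paper's measured set.

```python
# Support Vector Regression sketch on synthetic antenna-like data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.uniform(0.8, 1.2, (200, 1))            # e.g. a normalized patch length
y = 2.4 / X[:, 0] + rng.normal(0, 0.01, 200)   # toy frequency trend plus noise

# RBF-kernel SVR, with inputs standardized as is usual for SVMs.
svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
svr.fit(X, y)
pred = svr.predict([[1.0]])[0]                 # estimate at an unseen input
```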
The design of a planar inverted-F antenna (PIFA), shown
in Fig. 5, with a magneto-dielectric nano-composite (MDNC),
using the Bayesian Regularization algorithm as the neural
network learning process, has been presented in [14]. Starting
with the nano-magnetic material's volume fraction and particle
radius, the different antenna parameters, such as radiation
efficiency, gain, resonant frequency, bandwidth, and others, can
be calculated with high accuracy using machine learning. For
this design, two databases are created. The first one contains
material properties (volume fraction and particle radius). The
second one contains electrical properties of the material,
such as electric and magnetic loss tangents, permittivity, and
permeability. To create a relation between the performance
of the antenna and its properties, the machine learning algorithm
is trained using the two databases as input and output. New
models for the permeability and permittivity of the composite
material are also proposed in [14]. Using these equations with
only 42 samples created, a total error of only 7% is obtained
between the machine learning data and the second database.
Fig. 6 shows a comparison of the machine learning output and
the target (all of the data stored in the second database)
regression curve. It was concluded that machine learning
techniques can be very useful in minimizing error and
accelerating the cycle time for new material synthetization, and
in predicting the behavior of the antenna without performing
extensive, time-consuming simulations. Optimized results can be
obtained by increasing the number of data samples used with
the algorithm.
Fig. 5: PIFA antenna structure: (a) top view, (b) side view [29]
Fig. 6: Machine learning output vs target [29]
Fig. 7: A comparison of the accuracy and time saving using
Neural Networks [15]
The design of reflectarrays using the Kriging algorithm
instead of classical full-wave solvers has been presented in
[15]. Machine learning is used here to predict the response of
complex unit cells in order to design high-performance
reflectarrays. The scattering matrix of complex reflectarray
elements is predicted while reducing the computational time.
The model is trained on a training set of N samples, where
N ∈ [500, 20000], to learn the input-output relationship.
A comparison of the accuracy and time saving using the
proposed method is shown in Fig. 7. It was concluded that,
using the Kriging model, a 99% time saving can be reached
while maintaining a prediction error below 5%.
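Kriging is essentially Gaussian-process regression, which is available off the shelf; the 1-D toy response below merely illustrates the key property that the model returns both a prediction and an uncertainty estimate (the kernel and data are assumptions, not the reflectarray model of [15]).

```python
# Kriging / Gaussian-process regression sketch on a toy 1-D response.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X_train = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y_train = np.sin(4.0 * X_train).ravel()        # stand-in for a unit-cell response

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-6)
gp.fit(X_train, y_train)

# Unlike plain regression, Kriging also reports its own predictive uncertainty.
mean, std = gp.predict(np.array([[0.5]]), return_std=True)
```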
The design of a W-Band slotted waveguide antenna array,
shown in Fig. 8, by employing artificial neural networks
has also been presented in [30]. The input layer of the
neural network consists of seven design parameters, which
are the lengths and orientation angles of the coupling slots
(lc, θ1, θ2, θ3, θ4, θ5), in addition to the length of the radiating
slots (lR). A data set of 189 examples is obtained by simulations
in HFSS and stored in a database, which is used for training,
cross-validation, and testing of the model. After obtaining the
optimized values of the design parameters, the antenna is
fabricated using the SLA 3D printing technique. The simulated
S11 characteristic as well as the gain of the antenna are
compared to the measured ones, and a good agreement with
some slight errors has been shown.
B. Machine Learning For Enhancing Evolutionary Computa-
tion Algorithms
Another approach of using machine learning in antenna design
is embedding it in Evolutionary Algorithms. Evolutionary
Algorithms (EAs) are attracting significant interest in the
computing sciences. Using the evolutionary mechanisms involved
in natural selection, evolutionary algorithms make it possible
to find approximate solutions to optimization problems. EAs have
been used in previous works [31]–[33] for the purpose of antenna
design and optimization. To improve their performance, machine
learning techniques have been used successfully in evolutionary
algorithms. When using EAs, databases of adequate size are
produced, which allows machine learning techniques to be
applied [34].
Fig. 8: Slotted Waveguide Antenna (a) top view, (b) bottom
view [30]
The Particle Swarm Optimization Algorithm (PSOA), a type of
evolutionary algorithm, is used in [35] to design a multi-band
patch antenna using artificial neural networks. After the
geometrical parameters of the antenna are decided by the
PSOA, a mapping function is built by the ANN so that the
frequencies and associated bandwidths can be related to the
dimensional parameters of the antenna. The flow chart of the
optimizer used, with four input parameters, is shown in Fig.
10. The results showed that the design process can be sped
up by eliminating the need for time-consuming simulations,
therefore reducing the computational burden significantly. The
designed antenna has also been fabricated and tested, with good
agreement between the measured and simulated results in [35].
Fig. 9: Regression plot [35]
Fig. 10: Flow chart of the proposed modified optimizer [35]
In [36], the design of an E-shaped antenna by combining
differential evolution (DE) with the Kriging algorithm has
been presented. Six antenna parameters are optimized: the feed
position Px, the slot position Ps, the patch width W, the slot
width Ws, the patch length L, and the slot length Ls, shown in
Fig. 11. It is suggested that results similar to those of other
optimization techniques can be obtained while reducing the
number of necessary simulations by 80%. The magnitude of the
S11 parameter is to be minimized at the frequencies 5.0 GHz
and 5.5 GHz. After the proposed algorithm was run 5 times,
optimum solutions were found, and good prediction accuracy
was exhibited by the model. The predicted and simulated S11
curves are shown in Fig. 12. Combining machine learning
with evolutionary algorithms has been shown to provide a
faster convergence rate with similar solution quality in
comparison to other optimization methods, namely self-adaptive
differential evolution [37] and wind driven optimization [38].
It was shown that the same optimization goals can be reached
while reducing the number of simulations needed by self-adaptive
differential evolution [37] and wind driven optimization [38]
by 82.3% and 77.9%, respectively.
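The general idea of embedding a learned model in an evolutionary search, as in [36] and [40], can be sketched as follows: a cheap surrogate replaces the expensive simulation inside differential evolution, and only the surrogate's optimum is verified by a real evaluation. The toy objective, budget, and loop structure are illustrative assumptions, not the published algorithms.

```python
# Surrogate-assisted differential evolution sketch on a toy 1-D objective.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_simulation(x):       # stand-in for a full-wave solver call
    return (x - 0.3) ** 2

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, 15).reshape(-1, 1)   # small budget of "real" simulations
y = expensive_simulation(X).ravel()

for _ in range(5):
    # Train a cheap surrogate on all simulations gathered so far ...
    surrogate = GaussianProcessRegressor().fit(X, y)
    # ... let differential evolution search the surrogate, not the simulator ...
    res = differential_evolution(lambda v: surrogate.predict(v.reshape(1, -1))[0],
                                 bounds=[(0.0, 1.0)], seed=0, maxiter=30)
    # ... and spend only ONE real simulation on the surrogate's optimum.
    x_new = res.x.reshape(1, -1)
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_simulation(x_new).ravel())

best = X[np.argmin(y)][0]          # best truly simulated design found
```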
TABLE II: Comparison of the different machine learning techniques used in the investigated papers

Paper | Antenna Type                     | Learning Algorithm Used | Compared To              | Results
[26]  | Reflectarray                     | SVM                     | MoM & ANN                | Accelerated design process while maintaining high accuracy levels
[14]  | Planar Inverted-F Antenna (PIFA) | Bayesian Regularization | —                        | Minimization of error and acceleration of cycle time for new materials synthetization
[15]  | Reflectarray                     | Kriging                 | —                        | Time saving can reach 99.9% while maintaining a prediction error below 5%
[29]  | Planar Inverted-F Antenna (PIFA) | ANN                     | Conventional Simulations | Possible prediction of antenna behavior without extensive electromagnetic simulations
[28]  | Rectangular Microstrip Antenna   | SVM                     | ANN                      | Better computation efficiency with a faster convergence rate
[30]  | Slotted Waveguide Antenna (SWA)  | ANN                     | Conventional Simulations | Computation of several antenna parameters with good agreement between simulated and fabricated results
[35]  | Patch Antenna                    | ANN                     | Conventional Simulations | Design process can be sped up by eliminating the need for time-consuming simulations
[36]  | Stacked Patch Antenna            | Kriging                 | Conventional Simulations | Similar results to other optimization techniques can be obtained while reducing the number of necessary simulations by 80%
[39]  | E-Shaped Antenna                 | Linear Regression       | Conventional Simulations | The optimum results were found without any necessary simulations
[40]  | Microstrip Antenna               | Gaussian Process ML     | Differential Evolution   | Speed of the design and optimization procedure increased by more than four times compared with differential evolution

Fig. 11: E-shaped patch antenna [36]
Fig. 12: The simulated and predicted S11 curves of the optimal
design [36]
The design of an ultrawide-band microstrip antenna by
combining regression with an evolutionary algorithm has been
presented in [39]. A machine learning technique is employed
here along with an evolutionary algorithm in the estimation of
the fitness function behaviors: the bandwidth (BW), the return
loss (RL), and the central frequency division (CFD). The
regressive machine learning algorithm used fits a curve through
a discrete set of known data points, which are the antenna
parameters obtained previously by simulations. A prototype
algorithm has been used to find the optimized values of
Ws and Ls, shown in Fig. 13, while keeping the other antenna
parameters constant and considering the design restrictions
(BW > 9.0 GHz, RL < -20 dB, and CFD < 0.37 Hz). It was shown
that, using 170 datasets, the optimum results were found
while meeting the required restrictions. The insight into the
behavior of the objectives (BW, RL, and CFD) increases as a
larger number of datasets is used.
Fig. 13: Microstrip antenna with investigated parameters [39]
The designs of an inter-chip antenna, a four-element linear
antenna array, and a two-dimensional array have been presented
in [40]. A new method for designing antennas, called surrogate
model assisted differential evolution for antenna synthesis
(SADEA), has been proposed. This method combines Gaussian
Process machine learning and a Differential Evolution (DE)
algorithm. Using this method in the design of the three different
antennas, the results showed that, using machine learning,
SADEA enhanced the speed of the design and optimization
procedure by more than four times compared with DE. In
[41], less time consumption has also been achieved in the
design of an isotropic wire antenna by combining a machine
learning technique with an evolutionary algorithm, the fitness
proportionate selection.
By employing Support Vector Machines (SVMs), the design
of a regular microstrip antenna has been presented in [42].
A better characterization with higher accuracy is achieved by
using SVMs. A comparison of the obtained results with the
results of a different approach that uses Artificial Neural
Networks (ANNs) is also presented.
The different papers using machine learning in antenna design
are summarized in Table II.
Although machine learning is very useful, it comes with
many challenges. The most common include, but are not
limited to:
1) The choice of learning algorithm: It is not easy to
decide what algorithm to choose as there are a great
number of them. This depends directly on what is being
predicted and also on the type of data acquired. A good
practice is to always visualize the data before choosing
the algorithm.
2) Problem Formulation: Beginning with wrong assumptions
usually leads to worthless results that can cost a
lot of time. It is necessary to know what area of the
problem is best to spend time working on.
3) Getting enough data: Some data can be hard to find
or obtain. In antenna design, multiple simulations are
needed in order to obtain a training set.
4) Pre-processing of data: To ensure that the learning
algorithm performs adequately, multiple steps need to
be performed on the data, such as data cleaning, nor-
malization and feature selection, which would cost time
in case of very large datasets.
5) Debugging the algorithm: Knowing what to do next can
also be challenging. When problems such as high bias
or high variance occur, it is crucial to know what
steps to take. This requires following some diagnosis
techniques, such as plotting the learning curves.
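One such diagnosis technique, the learning curve mentioned in item 5, can be generated directly with scikit-learn; the model and synthetic data below are illustrative assumptions.

```python
# Learning-curve sketch: diagnosing high bias vs. high variance.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import learning_curve

X, y = make_regression(n_samples=300, n_features=10, noise=5.0, random_state=0)

sizes, train_scores, val_scores = learning_curve(
    Ridge(alpha=1.0), X, y, train_sizes=np.linspace(0.1, 1.0, 5), cv=5)

# A large, persistent gap between the curves suggests high variance (get more
# data); two curves that plateau at a low score suggest high bias (richer model).
gap = train_scores.mean(axis=1) - val_scores.mean(axis=1)
```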
This paper presented an overview of machine learning, with
an investigation of its concepts, its differentiation from
artificial intelligence and deep learning, and its different
learning algorithms and techniques. An extensive investigation
was presented on the usage of machine learning in antenna
design, for which its advantages with respect to traditional
design and computational techniques were also studied. It was
seen that machine learning can provide an accelerated antenna
design process while maintaining high accuracy levels, with a
minimization of error and time saving, along with a possible
prediction of the antenna behavior, better computational
efficiency, and a reduced number of necessary simulations.
REFERENCES
[1] S. J. Russell and P. Norvig, Artificial Intelligence: A Modern Approach,
3rd Edition. Pearson Education, 2016.
[2] M. Mohri, A. Rostamizadeh and A. Talwalkar, Foundations of Machine
Learning. The MIT Press, 2012.
[3] P. Harrington, Machine Learning in Action. Manning Publications, 2012.
[4] R. Jiang, X. Wang, S. Cao, J. Zhao and X. Li, “Deep Neural Networks
for Channel Estimation in Underwater Acoustic OFDM Systems,” IEEE
Access, vol. 7, pp. 23579–23594, 2019.
[5] D. Pyle and C. S. Jose, “An executive’s guide to machine learning,”
tech. rep., McKinsey & Company, June 2015.
[6] M. Chen, U. Challita, W. Saad, C. Yin and M. Debbah, “Machine
learning for wireless networks with artificial intelligence: A tutorial on
neural networks,” CoRR, vol. abs/1710.02913, 2017.
[7] S. Marsland, Machine Learning: An Algorithmic Perspective, Second
Edition. Chapman and Hall/CRC, 2014.
[8] A. Burkov, The Hundred-Page Machine Learning Book. Andriy Burkov,
[9] S. Shalev-Shwartz and S. Ben-David, Understanding Machine Learning:
From Theory to Algorithms. Cambridge University Press, 2014.
[10] S. O. Haykin, Neural Networks and Learning Machines. Prentice Hall,
[11] K. Crammer and Y. Singer, “On the algorithmic implementation of
multiclass kernel-based vector machines,” Journal of Machine Learning
Research, vol. 2, pp. 265–292, 2001.
[12] L. Hyafil and R. L. Rivest, “Constructing optimal binary decision trees
is NP-complete,” Information Processing Letters, vol. 5, no. 1, p. 15–17,
[13] Y. Freund and R. Schapire, “A decision-theoretic generalization of on-
line learning and an application to boosting,” European Conference on
Computational Learning Theory (EuroCOLT), p. 23–37, 1995.
[14] C. Gianfagna, M. Swaminathan, P. M. Raj, R. Tummala and G. An-
tonini, “Enabling antenna design with nano-magnetic materials using
machine learning,” 2015 IEEE Nanotechnology Materials and Devices
Conference (NMDC), pp. 1–5, Anchorage, AK, 2015.
[15] L. Tenuti, G. Oliveri, D. Bresciani and A. Massa, “Advanced learning-
based approaches for reflectarrays design,” 2017 11th European Confer-
ence on Antennas and Propagation (EUCAP), pp. 84–87, Paris, 2017.
[16] X. Wang, P. Wang and X. Wang, “Adaptive Sparse Array Reconfigura-
tion based on Machine Learning Algorithms,” 2018 IEEE International
Conference on Acoustics, Speech and Signal Processing (ICASSP),
pp. 1159–1163, Calgary, AB, 2018.
[17] J. Joung, “Machine Learning-Based Antenna Selection in Wireless Com-
munications,” IEEE Communications Letters, vol. 20, no. 11, pp. 2241–
2244, Nov. 2016.
[18] Z. M. Fadlullah and M. M. Fouda and N. Kato and X. Shen and Y.
Nozaki, “An early warning system against malicious activities for smart
grid communications,” IEEE Network, vol. 25, no. 5, pp. 50–55, 2011.
[19] B. A. Sonkamble and D. D. Doye, “Use of Support Vector Machines
through Linear-Polynomial (LP) Kernel for Speech Recognition,” Inter-
national Conference on Advances in Mobile Network, Communication
and Its Applications, pp. 46–49, 2012.
[20] C. Perera, A. Zaslavsky, P. Christen and D. Georgakopoulos,
“Context Aware Computing for The Internet of Things: A Survey,” IEEE
Communications Surveys Tutorials, vol. 16, no. 1, pp. 414–454, 2014.
[21] W. Samek, S. Stanczak and T. Wiegand, “The convergence of machine learning
and communications,” CoRR, vol. abs/1708.08299, 2017.
[22] J. Huang and J. Antonio Encinar, Reflectarray Antennas. Wiley-IEEE
Press, 2007.
[23] D. R. Prado, M. Arrebola, M. R. Pino, R. Florencio, R. R. Boix, J. A.
Encinar and F. Las-Heras, “Efficient Crosspolar Optimization of Shaped-
Beam Dual-Polarized Reflectarrays Using Full- Wave Analysis for the
Antenna Element Characterization,” IEEE Transactions on Antennas and
Propagation, vol. 65, no. 2, pp. 623–635, 2017.
[24] P. Robustillo, J. Zapata, J. A. Encinar and J. Rubio, “ANN Charac-
terization of Multi-Layer Reflectarray Elements for Contoured-Beam
Space Antennas in the Ku-Band,” IEEE Transactions on Antennas and
Propagation, vol. 60, no. 7, pp. 3205–3214, 2012.
[25] V. Richard, R. Loison, R. Gillard, H. Legay and M. Romier, “Loss
analysis of a reflectarray cell using ANNs with accurate magnitude pre-
diction,” 2017 11th European Conference on Antennas and Propagation
(EUCAP), pp. 2396–2399, 2017.
[26] D. R. Prado, J. A. López-Fernández, M. Arrebola and G. Goussetis,
“Efficient Shaped-Beam Reflectarray Design Using Machine Learning
Techniques,” 2018 15th European Radar Conference (EuRAD), pp. 525–
528, Madrid, 2018.
[27] R. Florencio, R. R. Boix and J. A. Encinar, “Enhanced MoM Analysis
of the Scattering by Periodic Strip Gratings in Multilayered Substrates,”
IEEE Transactions on Antennas and Propagation, vol. 61, no. 10,
pp. 5088–5099, 2013.
[28] N. T. Tokan and F. Gunes, “Support Vector Characterization of the
Microstrip Antennas Based on Measurements,” Progress In Electromag-
netics Research B, vol. 5, pp. 49–61, 2008.
[29] C. Gianfagna, H. Yu, M. Swaminathan, R. Pulugurtha, R. Tum-
mala, and G. Antonini, “Machine-Learning Approach for Design
of Nanomagnetic-Based Antennas,” Journal of Electronic Materials,
vol. 46, no. 8, pp. 4963–4975, 2017.
[30] J. Tak, A. Kantemur, Y. Sharma and H. Xin, “A 3-D-Printed W-Band
Slotted Waveguide Array Antenna Optimized Using Machine Learning,”
IEEE Antennas and Wireless Propagation Letters, vol. 17, no. 11,
pp. 2008–2012, Nov. 2018.
[31] A. Hoorfar, “Evolutionary Programming in Electromagnetic Optimiza-
tion: A Review,” IEEE Transactions on Antennas and Propagation,
vol. 55, no. 3, pp. 523–537, March 2007.
[32] J. M. Johnson and Y. Rahmat-Samii, “Genetic algorithms and method
of moments (GA/MOM) for the design of integrated antennas,” IEEE
Transactions on Antennas and Propagation, vol. 47, no. 10, pp. 1606–
1614, Oct. 1999.
[33] M. D. Gregory, Z. Bayraktar and D. H. Werner, “Fast Optimization
of Electromagnetic Design Problems Using the Covariance Matrix
Adaptation Evolutionary Strategy,” IEEE Transactions on Antennas and
Propagation, vol. 59, no. 4, pp. 1275–1285, April 2011.
[34] J. Zhang, Z. Zhan, Y. Lin, N. Chen, Y. Gong, J. Zhong, H. S. H. Chung,
Y. Li and Y. Shi, “Evolutionary Computation Meets Machine Learning:
A Survey,” IEEE Computational Intelligence Magazine, vol. 6, no. 4,
pp. 68–75, Nov. 2011.
[35] S. K. Jain, “Bandwidth Enhancement Of Patch Antennas Using Neural
Network Dependent Modified Optimizer,” International Journal of
Microwave and Wireless Technologies, vol. 8, no. 7, pp. 1111–1119.
[36] X. H. Chen, X. X. Guo, J. M. Pei and W. Y. Man, “A hybrid algorithm of
differential evolution and machine learning for electromagnetic structure
optimization,” 2017 32nd Youth Academic Annual Conference of Chinese
Association of Automation (YAC), pp. 755–759, Hefei, 2017.
[37] M. D. Gregory, Z. Bayraktar and D. H. Werner, “Fast Optimization
of Electromagnetic Design Problems Using the Covariance Matrix
Adaptation Evolutionary Strategy,” IEEE Transactions on Antennas and
Propagation, vol. 59, no. 4, pp. 1275–1285, 2011.
[38] Z. Bayraktar, M. Komurcu, J. A. Bossard and D. H. Werner,
“The Wind Driven Optimization Technique and its Application in
Electromagnetics,” IEEE Transactions on Antennas and Propagation,
vol. 61, no. 5, pp. 2745–2757, 2013.
[39] C. R. Silva and S. R. Martins, “An Adaptive Evolutionary Algorithm
for UWB Microstrip Antennas Optimization Using a Machine Learning
Technique,” Microwave and Optical Technology Letters, vol. 55, no. 8,
pp. 1864–1868.
[40] B. Liu, H. Aliakbarian, Z. Ma, G. A. E. Vandenbosch, G. Gielen and
P. Excell, “An Efficient Method for Antenna Design Optimization Based
on Evolutionary Computation and Machine Learning Techniques,” IEEE
Transactions on Antennas and Propagation, vol. 62, no. 1, pp. 7–18, Jan.
[41] Saifullah and B. Ahmed, “Machine learning for isotropic antenna
design,” 2018 22nd International Microwave and Radar Conference
(MIKON), pp. 683–687, Poznan, 2018.
[42] N. T. Tokan and F. Gunes, “Support vector design of the microstrip
antenna,” 2008 IEEE 16th Signal Processing, Communication and
Applications Conference, pp. 1–4, Aydin, 2008.