Habib Izadkhah
University of Tabriz · Department of Computer Science

Ph.D. in Computer Science

About

100
Publications
8,945
Reads
345
Citations
Citations since 2017
88 Research Items
316 Citations
[Citation chart: citations per year, 2017–2023]
Introduction
A recent study identified Dr. Izadkhah as one of the top 5 most influential authors in software modularization research. Ref: Software Module Clustering: An In-Depth Literature Analysis, IEEE Transactions on Software Engineering, 2020.
Additional affiliations
December 2013 - present
University of Tabriz
Position
  • Professor (Assistant)
Education
September 2009 - September 2013
University of Tabriz
Field of study

Publications

Publications (100)
Chapter
A graph is a nonlinear data structure that has a wide range of applications. Most of the concepts in computer science and the real world can be visualized and represented in terms of graph data structure. This chapter provides 70 exercises for addressing different applications of graph algorithms. To this end, this chapter provides exercises on gra...
Chapter
In this chapter, we examine four basic data structures: arrays, stacks, queues, and linked lists. To this end, this chapter provides 191 exercises on the basic data structures, covering both theory and applications.
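As a minimal illustrative sketch (not code from the chapter), a stack and a queue can be built in Python on top of a list and collections.deque:

```python
# Minimal stack and queue sketches (illustrative only, not from the chapter).
from collections import deque

stack = []               # a Python list gives O(1) push/pop at the end
stack.append(1)          # push
stack.append(2)
top = stack.pop()        # pop -> 2 (LIFO order)

queue = deque()          # deque gives O(1) append and popleft
queue.append('a')        # enqueue
queue.append('b')
front = queue.popleft()  # dequeue -> 'a' (FIFO order)

print(top, front)        # 2 a
```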
Chapter
The purpose of sorting is to arrange the elements of a given list in a specific order (ascending or descending). Sorting algorithms are categorized by the number of comparisons, the number of swaps, memory (space) usage, recursion, stability, and adaptability. This chapter provides 431 exercises on the 35 sorting algorithms...
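A minimal sketch (not from the book) of insertion sort, which is stable, adaptive, and in-place, illustrating the categorization criteria above:

```python
# Insertion sort: stable, adaptive (fast on nearly sorted input), in-place.
def insertion_sort(a):
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one position to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```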
Chapter
The theory of computational complexity examines the difficulty of solving problems by computer (more precisely, algorithmically). In this theory, problems are classified into two sets: P, which denotes “Polynomial” time, and NP, which denotes “Non-deterministic Polynomial” time. There are also NP-Hard and NP-Complete sets...
Chapter
Recurrence relations are useful tools for analyzing recursive algorithms. This chapter provides exercises for developing skills in solving recurrence relations. You may be familiar with how to analyze the time complexity of algorithms. However, to analyze recursive algorithms, we need more sophisticated techniques, such as solving recurrence relatio...
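A representative worked example of the kind of recurrence such exercises target (the standard merge-sort recurrence, not a specific exercise from the chapter):

```latex
% Solving T(n) = 2T(n/2) + n with T(1) = 1 and n a power of 2, by unrolling:
\begin{align*}
T(n) &= 2T(n/2) + n = 4T(n/4) + 2n = \dots = 2^k T(n/2^k) + kn \\
     &= n\,T(1) + n\log_2 n \quad (\text{taking } k = \log_2 n) \\
     &= \Theta(n \log n).
\end{align*}
```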
Chapter
There is usually more than one algorithm for solving problems. The question that arises in such cases is which of these algorithms works best. The algorithms are compared based on runtime and memory usage. Therefore, it is an efficient algorithm that wastes running time and less memory. Because there is enough memory today, it is very important to...
Chapter
Mathematical induction is a technique for proving results or establishing statements for natural numbers. This chapter illustrates the method through a variety of examples and provides 50 exercises on mathematical induction. To this end, this chapter provides exercises on summations, inequalities, floors and ceilings, divisibility, postage stamps,...
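A representative worked example of the kind of induction proof these exercises ask for (a standard summation identity, not a specific exercise from the chapter):

```latex
% Claim: 1 + 2 + \dots + n = n(n+1)/2 for every natural number n >= 1.
% Base case n = 1:  1 = 1(1+1)/2.
% Inductive step: assume the claim holds for n = k; then
\[
  1 + 2 + \dots + k + (k+1) = \frac{k(k+1)}{2} + (k+1) = \frac{(k+1)(k+2)}{2},
\]
% which is the claim for n = k + 1, so by induction it holds for all n >= 1.
```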
Chapter
Searching algorithms are used to retrieve a given element from data structures. According to the search strategy, these algorithms are generally classified into two categories: sequential search and interval search. This chapter provides 76 exercises on search methods, including linear search, binary search, ternary search, binary search tree, F...
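As a small illustration of interval search (a sketch, not taken from the chapter), iterative binary search on a sorted list:

```python
# Iterative binary search on a sorted list; returns the index of x or -1.
def binary_search(a, x):
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == x:
            return mid
        elif a[mid] < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```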
Chapter
Greedy algorithms are simple algorithms used in optimization problems. A greedy algorithm makes the locally optimal choice at each step in the hope of finding the optimal solution to the whole problem. This chapter provides 169 exercises for addressing different aspects of greedy algorithms. To this end, this chapter provides exercises on activity selection prob...
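For the activity selection problem mentioned above, a minimal greedy sketch (illustrative, with made-up input intervals):

```python
# Greedy activity selection: repeatedly pick the compatible activity
# that finishes earliest.
def select_activities(activities):
    # activities: list of (start, finish) pairs
    chosen = []
    last_finish = float('-inf')
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:
            chosen.append((start, finish))
            last_finish = finish
    return chosen

print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (6, 10)]))
# [(1, 4), (5, 7)]
```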
Chapter
Dynamic programming is another way to design algorithms. Dynamic programming is similar to the divide and conquer technique in that the problem is divided into smaller sub-problems. But in this method, we first solve the smaller sub-problems, save the results, and then retrieve them whenever one of them is needed, instead of recalculating. Dynamic p...
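A minimal sketch (not from the chapter) of the "solve each sub-problem once, store, and reuse" idea, using the Fibonacci numbers:

```python
# Bottom-up dynamic programming for Fibonacci: each sub-problem is solved
# once, stored in a table, and reused instead of being recomputed.
def fib(n):
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

print(fib(10))  # 55
```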
Chapter
The divide and conquer method is used for solving problems. In this method, the main problem is divided into sub-problems that have the same form as the main problem but are smaller in size. The first phase of this method, as its name suggests, is to break or divide the problem into sub-problems, and the second phase is to solve the smaller proble...
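A small divide-and-conquer sketch (illustrative only, not from the chapter): computing x^n by splitting the exponent in half:

```python
# Divide and conquer exponentiation: x**n in O(log n) multiplications.
def power(x, n):
    if n == 0:
        return 1
    half = power(x, n // 2)   # solve the smaller sub-problem
    if n % 2 == 0:
        return half * half    # combine: even exponent
    return half * half * x    # combine: odd exponent

print(power(2, 10))  # 1024
```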
Chapter
Having come up with your new and innovative algorithm, how do you measure its efficiency? Certainly, we would prefer to design an algorithm that is as efficient as possible, so we need some method to show that it really performs as well as we expected. Also, some algorithms are more efficient than others. But what do we...
Chapter
Backtracking is a problem-solving technique that incrementally builds candidate solutions and goes back as soon as a partial solution is not appropriate (does not match the constraints of the problem), corrects itself, and tries a new path. This is unlike brute force, which tests all possibilities and then selects the answer among them. This algorithm is usually us...
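A compact backtracking sketch (not from the chapter): counting N-Queens placements, abandoning any partial placement that violates the constraints:

```python
# Backtracking for N-Queens: place queens row by row and backtrack
# as soon as a partial placement attacks itself.
def count_queens(n, cols=(), diag1=(), diag2=(), row=0):
    if row == n:
        return 1
    total = 0
    for col in range(n):
        if col not in cols and row - col not in diag1 and row + col not in diag2:
            total += count_queens(n, cols + (col,), diag1 + (row - col,),
                                  diag2 + (row + col,), row + 1)
    return total

print(count_queens(8))  # 92
```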
Conference Paper
Full-text available
Diagnosis of COVID-19 using deep learning on CT scan images can play an important role in helping doctors. In this research, deep transfer learning is used, with the EfficientNet-B2 and ViT_l_32 neural network architectures as a base; the proposed model is built by combining these models. For evaluation, the confusion matrix, precision, accur...
Article
Full-text available
Multiprocessor systems with parallel computing play an important role in data processing. Considering the optimal use of existing computing systems, scheduling on parallel systems has gained great significance. Usually, a sequential program to run on parallel systems is modeled by a task graph. Because scheduling of task graphs onto processors is c...
Article
In recent years, convolutional neural networks (CNNs) have outperformed conventional methods in end-to-end speaker identification (SI) systems. The CNN training time is considerably long due to the need for large amounts of training data and high costs of computation and memory consumption. This paper proposes a new CNN for text-independent SI insp...
Chapter
Breast cancer is one of the diseases that is gradually becoming more prevalent in today’s society. Machine learning is helping in the early detection of mammographic lesions. In this work, a fully connected neural network (FCNN) deep learning architecture is used to diagnose breast cancer. In addition, we studied the effect of different techniques t...
Chapter
We would like to take a step further and discuss building a system that analyzes patterns in data and can then learn to generate novel data based on learned information. This is a newly emerging field in deep learning. It has brought a great deal of success and attention in deep learning in recent years. This chapter introduces generative networks,...
Chapter
According to the World Health Organization (WHO), cardiovascular disease (CVD) is the leading cause of death and disability worldwide, killing more than 17 million people every year. This toll accounts for a third of all deaths worldwide and half of all noncommunicable disease-related deaths. More than 80% of these deaths occur in low- and middle-i...
Chapter
A large amount of the data such as speech, protein sequence, bases in a DNA molecule, time series (e.g., weather forecast data and financial data), videos, and texts are inherently serial (sequential). The current value of sequential data depends on previous values. Recurrent neural networks (RNNs, for short) are a powerful type of neural networks...
Chapter
This chapter discusses deep learning applications, which show that deep learning is embedded in all aspects of our lives. In fact, people deal with these applications on a daily basis. The chapter will then point out why deep learning is important in bioinformatics. Finally, the concepts presented in all chapters of the book will be reviewed.
Chapter
This chapter discusses the legendary deep learning architectures first. It then presents several deep learning applications in bioinformatics and discusses several challenges in deep learning as well as the methods of overcoming those challenges.
Chapter
Neural networks are a subset of machine learning, and they are at the heart of deep learning algorithms. This chapter introduces neural networks, which have been among the most widely used and successful models for machine learning. Most of the modern deep learning models are also based on neural networks. These networks have a long history of deve...
Chapter
In bioinformatics, classification of the input data is a very important task that refers to a predictive modeling problem where a class label is predicted for any given input data. An example is classifying breast cancer into benign and malignant categories. The Pima Indians Diabetes Dataset is employed to predict the onset of diabetes based o...
Chapter
Known as a high-level programming language, Python is easy to use and learn. It also works quickly; furthermore, it is very versatile and may become the dominant platform for machine learning. If you would like to set up a career in deep learning, you need to know Python programming along with the Python ecosystem. Creating effective deep learning...
Chapter
The state-of-the-art image classification techniques have emerged from submissions by academics and industry leaders to the ImageNet large-scale visual recognition challenge (ILSVRC). ILSVRC is an annual contest aimed at developing and improving computer vision techniques on a subset of the ImageNet dataset. This chapter addresses eight well-known...
Chapter
This chapter introduces the convolutional neural networks (CNNs, or ConvNets for short), deep neural networks with special image processing applications. These networks significantly improve the processing of information by providing several new concepts, and are still a great option in many problems involving continuously sampled data in a rectang...
Chapter
A single-layer neural network is the simplest type of neural network, laying the foundation for understanding other kinds of neural networks. There are no hidden layers in this network, which contains only input and output layers. After the advent of the single-layer neural network, it took nearly 30 years for the multilayer perceptron (MLP) n...
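A minimal single-layer perceptron sketch in NumPy (an illustration of the model described above, not code from the chapter), trained on the AND function with the classic perceptron learning rule:

```python
import numpy as np

# Single-layer perceptron trained with the perceptron learning rule on AND.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
for _ in range(10):                        # a few passes over the data
    for xi, target in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        w += 0.1 * (target - pred) * xi    # update weights only on mistakes
        b += 0.1 * (target - pred)

print([(1 if xi @ w + b > 0 else 0) for xi in X])  # [0, 0, 0, 1]
```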
Chapter
Ability to use deep learning is among the most highly sought skills in technology. In recent years, it has drawn a great deal of research attention and led to many practical applications. Deep learning is a specialized subset of machine learning that has improved the ability to classify, recognize, detect, and generate—in one word, understand. This...
Chapter
Deep learning and relevant techniques are classified as a subset of machine learning, whereas machine learning is considered a subset of artificial intelligence. Therefore, this chapter reviews the definitions of these concepts and their differences. It also outlines some major challenges in deep learning. Moreover, some of the basic mathematical c...
Conference Paper
Topological data analysis (TDA) is a novel and rapidly growing area of modern data science that uses topological, geometric, and algebraic tools to extract structural features from very complex and large-scale data that are usually incomplete and noisy. The primary motivation for studying this method was to study the shape of data, which has been c...
Article
Multiplex link prediction is the problem of finding missing links between nodes based on information from other layers. Although the link prediction problem in online social networks has been studied comprehensively, most approaches only employ internal features of the layer under prediction and do not consider additional link information from other...
Article
Nowadays, increasing the execution speed of scientific and computational programs is very important. In scientific programs, loops account for a large portion of the execution time; therefore, parallel execution of loop iterations reduces the runtime of the whole program. This paper addresses the parallelism of...
Article
Full-text available
Comprehensive analysis of proteins to evaluate their genetic diversity, study their differences, and examine their responses to stresses is the main subject of an interdisciplinary field of study called proteomics. The main objective of proteomics is to detect and quantify proteins and study their post-translational modifications and interactions using pro...
Article
Evaluating the reliability of component-based software systems from their architecture is of great importance. This paper proposes metrics to assess the reliability of software systems considering the self-healing effect of components on software reliability. A self-healing component, when broken, heals itself with some probability and returns to...
Article
Users of cloud computing technology can lease resources instead of spending an excessive charge for their ownership. For service delivery in the infrastructure-as-a-service model of the cloud computing paradigm, virtual machines (VMs) are created by the hypervisor. This software is installed on a bare-metal server, called the host, and acts as a b...
Article
Context: Clustering algorithms, as a modularization technique, are used to modularize a program aiming to understand large software systems as well as software refactoring. These algorithms partition the source code of the software system into smaller and easy-to-manage modules (clusters). The resulting decomposition is called the software system st...
Article
Full-text available
Software refactoring is a software maintenance action to improve the software internal quality without changing its external behavior. During the maintenance process, structural refactoring is performed by remodularizing the source code. Software clustering is a modularization technique to remodularize artifacts of source code aiming to improve rea...
Article
Full-text available
A software system evolves over time to meet the needs of users. Understanding a program is the most important step to apply new requirements. Clustering techniques, by dividing a program into small and meaningful parts, make it possible to understand the program. In general, clustering algorithms are classified into two categories: hierarchical and no...
Article
Full-text available
Clustering (modularisation) techniques are often employed for the meaningful decomposition of a program aiming to understand it. In the software clustering context, several external metrics are presented to evaluate and validate the resultant clustering obtained by an algorithm. These metrics use a ground‐truth decomposition to evaluate a resultant...
Article
Full-text available
In the literature, loop fusion is an effective optimization technique which tries to enhance parallelizing compilers’ performance via memory hierarchy management, and all its competing criteria create an NP-hard problem. This paper proposes an evolutionary algorithm that aims to achieve a profitable loop order which maximizes fusion taking into acc...
Article
Multiplex networks are the general representative of complex systems composed of distinct interactions between the same entities on multiple layers. Community detection in the multiplex networks is the problem of finding a shared structure under all layers, which combines the information of the entire network. Most of the existing methods for commu...
Article
Full-text available
Program comprehension plays a significant role in the maintenance of software systems. There has recently been a significant increase in large-scale applications written with a collaboration of several programming languages. Due to the indirect collaboration between different components of a multilingual program, it is difficult to understand such...
Article
Security is a growing concern in developing software systems. It is important to face unknown threats in order to make the system continue operating properly. Threats are vague and attack methods change frequently. Coping with such changes is a major feature of an adaptive software. Therefore, designing an adaptive secure software is an appropriate...
Article
Software clustering is usually used for program comprehension. Since it is an NP-complete problem, several genetic algorithms have been proposed to solve it. In the literature, there exist some objective functions (i.e., fitness functions) which are used by genetic algorithms for clustering. These...
Conference Paper
Full-text available
Software clustering is usually used for program comprehension. Since it is an NP-complete problem, several genetic algorithms have been proposed to solve it. In the literature, there exist some objective functions (i.e., fitness functions) which are used by genetic algorithms for clustering. These objective...
Article
A software system evolves over time to meet the user’s new requirements as well as to adapt to the environment. This causes it to deviate from its original and documented structure. Hence, after a while, due to its low understandability, making new changes will not be easy. Modularization is utilized to extract the software system structure from the...
Article
Full-text available
Software modularization techniques are employed to understand a software system. The purpose of modularization is to decompose a software system from a source code into meaningful and understandable subsystems (modules). Since modularization of a software system is an NP-hard problem, the modularization quality obtained using evolutionary algorithm...
Chapter
Clustering is used as an important technique to extract patterns from big data in various fields. Graph clustering as a subset of clustering has a lot of practical applications. Due to the NP-hardness of the graph clustering problem, many evolutionary algorithms, particularly the genetic algorithm have been presented. One of the most effective oper...
Article
Full-text available
Programs are under continuous attack for disclosing secret information, and defending against these attacks is becoming increasingly vital. An attractive approach for protection is to measure the amount of secret information that might leak to attackers. A fundamental issue in computing information leakage is that given a program and attackers with...
Conference Paper
Clustering is used as an important technique to extract patterns from big data in various fields. Graph clustering as a subset of clustering has a lot of practical applications. Due to the NP-hardness of the graph clustering problem, many evolutionary algorithms, particularly the genetic algorithm have been presented. One of the most effective oper...
Conference Paper
Graph clustering has wide applications in different areas such as machine learning, bioinformatics, data mining, social networks, and software understanding. Since it is an NP-hard problem, most approaches use meta-heuristic and search-based evolutionary methods for solving it. Inspired by optimization algorithms like krill herd (KH) and genet...
Conference Paper
Clustering techniques are usually utilized to partition a software system, aiming to understand it. Understanding a program helps to maintain the legacy source code. Since the partitioning of a software system is an NP-hard problem, using evolutionary approaches seems reasonable. The krill herd (KH) evolutionary algorithm is an effective...
Article
Full-text available
Nowadays, parallel and distributed environments are used extensively; hence, to use these environments effectively, scheduling techniques are employed. The scheduling algorithm aims to minimize the makespan (i.e., completion time) of a parallel program. Due to the NP-hardness of the scheduling problem, in the literature, several genetic al...
Article
Context: Software systems evolve over time to meet the new requirements of users. These new requirements, usually, are not reflected in the original documents of these software systems. Therefore, the new version of a software system deviates from the original and documented architecture. This way, it will be more difficult to understand it after a...
Article
Full-text available
Planning a powerful server imposes an enormous cost for providing ideal performance. Given that a server responding to web requests is more likely to consume RAM memory than other resources, it is desirable to provide an appropriate RAM capacity for optimal performance of the server in congested situations. This can be done through RAM usage modeling...
Chapter
A lack of up-to-date software documentation hinders the processes of software evolution and maintenance, as the structure and code of the software can be misunderstood. One approach to overcoming such problems is to extract and reconstruct the software architecture from the available source code so that it can be assessed against the required chang...
Chapter
In view of the very large search space in software system modularization, the use of evolutionary approaches and, especially, genetic algorithms seems reasonable. Research on these approaches has shown that they can achieve better results than hierarchical techniques, when used to modularize software systems to extract the architecture. Therefore,...
Chapter
This chapter discusses some advanced software modularization approaches that are based on the algebra of concepts. The following topics will be discussed in this chapter: the concept analysis algorithm; spectral modularization algorithms; latent semantic indexing; and modularization based on unifying syntactic and semantic features.
Chapter
In this chapter, we address the effect of architecture design on several different software quality attributes. The following quality attributes will be discussed: security, reliability, and performance.
Chapter
This chapter is focused on introducing the terms “software artefact” to describe the entities and elements modularized together and “features” to denote the attributes of these artefacts. Proximity of software artefacts is essential to the modularization of source code. In this chapter, various proximity measures (such as the Minkowski family, L1 f...
Chapter
In the literature, a large number of algorithms have been proposed for software modularization. As the software modularization process is unsupervised, there are neither predefined modules nor examples that can be used to validate the modules found by modularization algorithms. To compare the results of different modularization algorithms, it is ne...
Article
Full-text available
Contemporary applications are much more complex than before, and they need more time and resources while being executed. Researchers have proposed various strategies to improve the compilation process, aiming to improve execution speed. Hence, distinct transformations are proposed that are frequently used in modern compilers. One of them is loop fusion...
Conference Paper
Full-text available
In large software systems, more than 60 percent of the software cost is spent on maintenance. Program comprehension is an important factor in developing and maintaining a program. Modularization is a key activity in reverse engineering for extracting the software architecture. Reverse engineering provides program comprehension by building high-level mental models. The goal of the software modularization process is to decompose a softwa...
Article
Full-text available
Class cohesion or degree of the relations of class members is considered as one of the crucial quality criteria. A class with a high cohesion improves understandability, maintainability and reusability. The class cohesion metrics can be measured quantitatively and therefore can be used as a base for assessing the quality of design. The main objecti...
Article
This book presents source code modularization as a key activity in reverse engineering to extract the software architecture from the existing source code. To this end, it provides detailed techniques for source code modularization and discusses their effects on different software quality attributes. Nonetheless, it is not a mere survey of source co...
Article
Full-text available
Software clustering is usually used for program understanding. Since software clustering is an NP-complete problem, a number of genetic algorithms (GAs) have been proposed for solving it. In the literature, there are two well-known GAs for software clustering, namely Bunch and DAGC, that use genetic operators such as crossover and mutation t...
Article
Full-text available
Lack of up-to-date software documentation hinders the software evolution and maintenance processes, as the outdated software structure and code can easily be misunderstood. One approach to overcoming such problems is using software modularization, in which the software architecture is extracted from the available source code; such that dev...
Conference Paper
Full-text available
In large software systems, program comprehension is an important factor in maintenance and development, and more than 60 percent of the software cost is spent on maintenance. Reverse engineering is used to comprehend software; it provides program understanding by building high-level mental models. One of the steps of reverse engineering is clustering. Clustering, as an approach...
Article
Full-text available
Assessing software quality attributes (such as performance, reliability, and security) from source code is of the utmost importance. The performance of a software system can be improved by its parallel and distributed execution. The aim of the parallel and distributed execution is to speed up by providing the maximum possible concurrency in executi...
Article
Full-text available
Nowadays, evaluation of software security, as one of the important quality attributes, is of paramount importance. Many software systems have not considered security in their design; this makes them vulnerable to security risks. Architecture is the most important consideration in software design that affects the final quality of software. Qua...
Article
One way to speed up the execution of sequential programs is to divide them into concurrent segments and execute such segments in a parallel manner over a distributed computing environment. We argue that the execution speedup primarily depends on the concurrency degree between the identified segments as well as communication overhead between the seg...
Article
Most techniques used to assess the qualitative characteristics of software are applied in the testing phase of software development. Assessment of performance early in the software development process is particularly important for risk management. Software architecture, as the first product, plays an important role in the development of complex softwar...
Article
Full-text available
Nowadays, graphs and matrices are used extensively in computing. In this paper, an evolutionary approach is proposed to solve a matrix-related problem called the bandwidth minimization problem. Because this problem is difficult to solve, using evolutionary processing, and especially the genetic algorithm, is effective. In this paper, by ad...
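As a sketch of the objective such an evolutionary search would minimize (an illustration with a made-up matrix, not the paper's implementation), the bandwidth of a symmetric 0/1 matrix under a candidate row/column ordering:

```python
import numpy as np

# Bandwidth of a symmetric 0/1 matrix under a given row/column ordering:
# the largest distance |pos(i) - pos(j)| over all nonzero entries (i, j).
def bandwidth(matrix, order):
    pos = {v: i for i, v in enumerate(order)}   # original index -> new position
    n = len(order)
    return max((abs(pos[i] - pos[j])
                for i in range(n) for j in range(n) if matrix[i][j]),
               default=0)

A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
print(bandwidth(A, [0, 1, 2, 3]))  # 3 for the original ordering
print(bandwidth(A, [0, 1, 3, 2]))  # 2 after reordering; a GA searches such orderings
```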
Article
Most techniques used to assess the qualitative characteristics of software are applied in the testing phase of software development. Assessment of performance early in the software development process is particularly important for risk management. In this paper, we present a method based on interface automata and use queuing theory to predict software com...

Network

Cited By

Projects

Projects (6)
Project
Implicit Relation Discovery in Multiplex Networks for User Personality Type Prediction
Project
Clustering with optimization algorithms