Thesis (PDF available)

Evaluation and Analysis of Bio-Inspired Techniques for Resource Management and Load Balancing of Fog Computing


Abstract

With the evolution of fog computing, processing takes place locally on a virtualized platform rather than in a centralized cloud server. Fog computing combined with cloud computing is more efficient, as fog computing alone does not serve the purpose. Inefficient resource management and load balancing lead to degradation in quality of service as well as energy losses. Traffic overhead increases because all requests are sent to the main server, causing delays that cannot be tolerated in healthcare scenarios. To overcome this problem, the authors consolidate fog computing resources so that requests are handled by cloudlets and only critical requests are sent to the cloud for processing. Servers are placed locally in each city to handle nearby requests, utilizing resources efficiently and balancing load among all servers, which reduces latency and traffic overhead and improves quality of service. Due to the limited data storage capacity available to Internet service providers and large-scale enterprises, the concept of resource sharing arises. Services can be leased to enterprises through Service Level Agreements (SLAs). As an extension of cloud computing, the fog computing architecture brings resources near end users. To obtain services on lease, enterprises pay for the resources or services they use. To this end, four nature-inspired algorithms are analyzed to determine how to manage services and resources efficiently, so that the cost of resources can be reduced and billing can be computed from the utilized resources. Pigeon Inspired Optimization (PIO), Enhanced Differential Evolution (EDE), Binary Bat Algorithm (BBA) and Simple Human Learning Optimization (SHLO) are used to evaluate the energy consumed by the edge nodes or cloudlets, which in turn can be used to estimate the bill through the Time of Use pricing variable.
We evaluate the aforementioned techniques to analyze their performance in bill calculation on the basis of fog server usage. Simulation results demonstrate that the Binary Bat Algorithm gives significantly better results than the other three algorithms in terms of resource utilization and bill reduction.
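The Time-of-Use billing idea described in the abstract can be sketched as follows. The tariff table, rates, and usage figures below are hypothetical placeholders, not values from the thesis; the point is only that a cloudlet's bill is its hourly energy consumption weighted by the price in effect at each hour:

```python
# Hypothetical Time-of-Use tariff: (start_hour, end_hour, price per kWh).
# These bands and prices are illustrative, not from the thesis.
TOU_RATES = [(0, 7, 0.08), (7, 19, 0.15), (19, 24, 0.10)]

def rate_at(hour):
    """Return the price per kWh in effect at the given hour of day."""
    for start, end, price in TOU_RATES:
        if start <= hour < end:
            return price
    raise ValueError("hour out of range")

def bill(usage):
    """usage: list of (hour, kWh consumed by the cloudlet in that hour)."""
    return sum(kwh * rate_at(hour) for hour, kwh in usage)

# A cloudlet consuming 2 kWh at 06:00, 3 kWh at 12:00, and 1 kWh at 20:00
total = bill([(6, 2.0), (12, 3.0), (20, 1.0)])
```

In the thesis setting, the per-hour energy figures would come from the energy-consumption estimates produced by the optimization algorithms rather than being fixed constants.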
Article
Full-text available
The Internet of Things has been growing, and with it the number of user requests at the fog computing layer. Fog works in a real-time environment, and requests from connected devices need to be processed immediately. With the increase in user requests at the fog layer, the virtual machines (VMs) there become overloaded. A load balancing mechanism can distribute load among all the VMs in equal proportion. It has become a necessity in the fog layer to distribute the workload equally and equitably among the existing VMs in the segment. To date, many load balancing techniques have been proposed for fog computing. An empirical study of existing load balancing methods has been conducted, and a taxonomy is presented in hierarchical form. Besides, the article contains a year-wise comprehensive review and summary of research articles published in the area of load balancing from 2013 to 2020. Furthermore, the article also presents our proposed fog computing architecture to resolve the load balancing problem. It also covers current issues and challenges that can be addressed in future research. The paper concludes by providing future directions.
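A minimal sketch of the kind of load distribution this survey covers, assuming a greedy "least-loaded VM first" policy. This is one simple strategy among the many the article taxonomizes, not the article's own proposal; the request costs and VM count are illustrative:

```python
import heapq

def balance(requests, n_vms):
    """Assign each request (given as a processing cost) to the VM
    that currently carries the least load, using a min-heap of
    (current_load, vm_id) pairs."""
    heap = [(0.0, vm) for vm in range(n_vms)]
    heapq.heapify(heap)
    assignment = []
    for cost in requests:
        load, vm = heapq.heappop(heap)   # least-loaded VM
        assignment.append(vm)
        heapq.heappush(heap, (load + cost, vm))
    loads = {vm: 0.0 for vm in range(n_vms)}
    for vm, cost in zip(assignment, requests):
        loads[vm] += cost
    return assignment, loads

# Five requests of varying cost spread over two VMs
assignment, loads = balance([5, 3, 2, 7, 1], 2)
```

Each incoming request goes to whichever VM is least loaded at that moment, which keeps the per-VM loads close to each other without any global rebalancing step.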
Article
Full-text available
The growth of networks has made network management difficult. Recently, a concept called software-defined networking (SDN) has been proposed to address this issue, making network management more adaptable. The control and forwarding planes are separated in SDN. The control plane is a logically centralized controller that controls the network. The forwarding plane, which consists of transfer devices, is responsible for transmitting packets. Because network resources are limited, optimizing their use is an important issue. Load balancing improves the balanced distribution of load across multiple resources in order to maximize reliability and resource efficiency. SDN controllers can achieve better load balancing than traditional networks because they have a global view of the network. Since the load-balancing problem is NP-complete in nature, it can be solved using many different nature-inspired meta-heuristic techniques; hence, such techniques are important methods for solving the load balancing problem in SDN. However, to the best of our knowledge, there is no survey or systematic review studying these matters. Accordingly, this paper systematically reviews the nature-inspired meta-heuristic techniques in the area of load balancing in SDN. The study also discusses the advantages and disadvantages of the chosen nature-inspired meta-heuristic techniques and considers their algorithmic metrics. Moreover, to enable better load balancing techniques in the future, the important challenges of these techniques are investigated.
Article
Full-text available
Today, enterprise applications impose ever greater resource requirements to support a growing number of clients and to deliver them an acceptable Quality of Service (QoS). To ensure such requirements are met, it is essential to apply appropriate resource and application monitoring techniques. Such techniques collect data that enable predictions and actions which can offer better system performance. Typically, system administrators need to consider different data sources and correlate them by themselves. To address these gaps, and considering the context of general network-based systems, we propose a survey that combines a discussion of system monitoring, data prediction, and resource management procedures in a unified view. The article discusses resource and application monitoring, resource management, and data forecasting from both the performance and architectural perspectives of enterprise systems. Our idea is to describe consolidated subjects such as monitoring metrics and resource scheduling, together with novel trends, including cloud elasticity and artificial-intelligence-based load prediction algorithms. This survey links the aforesaid three pillars, emphasizing the relationships among them and pointing out opportunities and research challenges in the area.
Article
Full-text available
The most widely used technique for solving and optimizing real-life problems is linear programming (LP), due to its simplicity and efficiency. However, to handle impreciseness in the data, neutrosophic set theory plays a vital role: it simulates the human decision-making process by considering all aspects of a decision (i.e., agree, not sure and disagree). Building on these advantages, the present work introduces neutrosophic LP models whose parameters are represented by trapezoidal neutrosophic numbers, and presents a technique for solving them. The presented approach is illustrated with numerical examples, and its superiority over the state of the art is shown by comparison. Finally, we conclude that the proposed approach is simpler, more efficient and capable of solving LP models compared with other methods.
Article
Full-text available
The neutrosophic set is an excellent tool for dealing with vague and inconsistent information effectively. Consequently, by studying the concept of three-way decisions based on the neutrosophic set, we can find a suitable way to take a reasonable decision. In this article, we suggest two rules of three-way decisions based on the three membership degrees of the neutrosophic set. A new evaluation function is presented to calculate the weights of alternatives in order to choose the best one. We also study a supplier selection problem (selecting suppliers to obtain the indispensable materials for supporting the outputs of companies). The best suppliers need to be selected to enhance quality and service, reduce cost, and control time. The most widely used technique for determining the requirements of a company is Quality Function Deployment (QFD). Since the traditional QFD technique does not prioritize stakeholders' requirements and fails to deal with vague and inconsistent information, this research integrates it with the Analytic Hierarchy Process (AHP) in a neutrosophic environment. A case study is presented to illustrate the effectiveness of the proposed model.
Article
Full-text available
A key purpose of image forensic science is to determine whether a digital image is authentic. This can be a sensitive task when images are used as essential evidence that influences a judgment. Although several different manipulation attacks exist, copy-move is one of the most common and immediate: a region is copied within the same image in order to convey different information about the same scene, which raises an issue of information integrity. The detection of this kind of manipulation has recently been handled using methods based on SIFT, whose features are used to detect image keypoints and determine matched points. Clustering is a key step that always follows SIFT matching, in order to group similar matched points into clusters. The capability of an image forensic tool lies in estimating the transformation applied between the two duplicated copies of a region and locating them correctly. Detecting copy-move forgery is not a new approach, but a new clustering approach is proposed here: a 2-level clustering strategy based on the spatial and transformation domains, which requires no prior information about the investigated image or the number of clusters to be created. Results on different datasets show that the proposed method is able to identify the altered areas with high reliability and to deal with multiple cloning.
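The transformation-domain half of the clustering step can be illustrated with a simplified sketch: matched keypoint pairs whose translation vectors agree are likely to belong to the same cloned region. The function name, the incremental-centroid scheme, and the tolerance are illustrative assumptions, not the paper's actual 2-level algorithm (which also clusters in the spatial domain):

```python
def cluster_by_translation(matches, tol=2.0):
    """matches: list of ((x1, y1), (x2, y2)) matched keypoint pairs.
    Greedily group pairs whose translation vectors (x2-x1, y2-y1)
    agree within tol pixels; no cluster count is fixed in advance."""
    clusters = []
    for (x1, y1), (x2, y2) in matches:
        t = (x2 - x1, y2 - y1)
        for c in clusters:
            cx, cy = c["centroid"]
            if abs(t[0] - cx) <= tol and abs(t[1] - cy) <= tol:
                c["members"].append(t)
                n = len(c["members"])
                c["centroid"] = (sum(m[0] for m in c["members"]) / n,
                                 sum(m[1] for m in c["members"]) / n)
                break
        else:
            clusters.append({"centroid": t, "members": [t]})
    return clusters

# Two pairs shifted by ~(10, 0) suggest one cloned region;
# the third pair, shifted by (50, 50), stands alone.
clusters = cluster_by_translation(
    [((0, 0), (10, 0)), ((5, 5), (15.5, 5.2)), ((1, 1), (51, 51))])
```

A large cluster of nearly identical translation vectors is the signature of a copy-move forgery, since all points of the copied region move by the same offset.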
Article
Full-text available
The quadratic assignment problem (QAP) is considered one of the most significant combinatorial optimization problems due to its many important applications in real life, such as scheduling, production, computer manufacturing, chemistry, facility location, communication, and other fields. QAP is an NP-hard problem that cannot be solved in polynomial time as the problem size increases; hence heuristic and metaheuristic approaches are utilized instead of exact approaches, because they achieve good-quality solutions in short computation time. The objectives of this paper are to describe QAP in detail, showing its types, the nature of the problem, its complexity, its importance, and a simple example. QAP formulations, problems related to QAP, solution techniques, QAP benchmark instances, applications of QAP, and a survey of QAP research are also presented.
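The standard QAP objective can be made concrete in a few lines: given a flow matrix between facilities and a distance matrix between locations, an assignment is a permutation, and its cost sums flow times distance over all facility pairs. The brute-force search below is only feasible for toy sizes, which is exactly why the heuristic and metaheuristic approaches the abstract mentions are used in practice; the matrices are made-up illustrative data:

```python
from itertools import permutations

def qap_cost(flow, dist, perm):
    """Cost of placing facility i at location perm[i]:
    sum over all i, j of flow[i][j] * dist[perm[i]][perm[j]]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def brute_force(flow, dist):
    """Exact solver by enumeration -- O(n!), toy sizes only."""
    n = len(flow)
    return min(permutations(range(n)),
               key=lambda p: qap_cost(flow, dist, p))

# Illustrative 3-facility instance
flow = [[0, 5, 2], [5, 0, 3], [2, 3, 0]]
dist = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
best = brute_force(flow, dist)
```

The factorial blow-up of `brute_force` makes the NP-hardness tangible: at n = 20 there are already about 2.4 × 10^18 permutations.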
Article
Full-text available
The role of education, which propagates knowledge, has become increasingly significant in the past few years due to the explosive expansion of knowledge. Meanwhile, the model of the education process is undergoing a transformation in which the learning of different students needs to be accomplished in different ways. Therefore, the smart education environment is encouraged. It incorporates different information and communication technologies to activate the learning process and adapt to the requirements of different students. The quality of the learning process can be enhanced by continually monitoring and analyzing the state and activities of students via information sensing devices and information processing platforms, which offer feedback about each student's learning process. The Internet of Things promises to bring great changes to everyday life, to the quality of individuals' lives, and to organizations' productivity. Through a vastly distributed, locally intelligent network of smart objects, the IoT has the chance to enable expansions and improvements to essential utilities in various fields, while introducing a novel ecosystem for application development. Applying the concept of the Internet of Things in any education environment will increase the quality of the education process, because students will learn rapidly and teachers will fulfill their jobs efficiently. This paper is designed to illustrate the basic concepts, definitions, characteristics, technology, and challenges of the Internet of Things. We also illustrate the role of the Internet of Things in building a smart educational process and in making efficient and effective decisions, which is vital in our daily life.
Article
Full-text available
The bin packing problem (BPP) is a classical combinatorial optimization problem that arises in a wide range of fields. The main aim of this paper is to propose a new variant of the whale optimization algorithm, named the improved Lévy-based whale optimization algorithm (ILWOA). The proposed ILWOA adapts WOA to search the combinatorial search space of BPP. The performance of ILWOA is evaluated through two experiments on benchmarks of varying difficulty and on BPP case studies. The experimental results confirm the proficiency of the proposed algorithm in finding the optimal solution and its convergence speed. Further, the obtained results are discussed and analyzed according to the problem size.
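For readers unfamiliar with BPP, the classic first-fit decreasing heuristic below shows the problem's structure: pack items of given sizes into as few fixed-capacity bins as possible. This is a standard textbook baseline, not the ILWOA algorithm of the paper, and the item sizes are illustrative:

```python
def first_fit_decreasing(items, capacity):
    """First-fit decreasing heuristic for bin packing:
    sort items largest-first, place each into the first bin
    with enough remaining capacity, opening a new bin if none fits."""
    free = []     # remaining capacity of each open bin
    packing = []  # items placed in each bin
    for item in sorted(items, reverse=True):
        for i, space in enumerate(free):
            if item <= space:
                free[i] -= item
                packing[i].append(item)
                break
        else:
            free.append(capacity - item)
            packing.append([item])
    return packing

# Items summing to 20 fit in two bins of capacity 10
bins = first_fit_decreasing([4, 8, 1, 4, 2, 1], capacity=10)
```

Metaheuristics such as ILWOA aim to beat simple constructive rules like this on hard instances, at the cost of a population-based search over candidate packings.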
Article
Full-text available
The flow shop scheduling problem is one of the most important types of scheduling, with a large number of real-world applications. In this paper, we propose a new algorithm that integrates the Whale Optimization Algorithm (WOA) with a local search strategy for tackling the permutation flow shop scheduling problem. The Largest Rank Value (LRV) rule enables the algorithm to deal with the discrete search space of the problem. The diversity of candidate schedules is improved using a swap mutation operation as well. In addition, the insert-reversed block operation is adopted to escape from local optima. The proposed hybrid whale algorithm (HWA) is combined with the Nawaz–Enscore–Ham (NEH) heuristic to improve its performance. It is observed that HWA gives competitive results compared with existing algorithms.
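The Largest Rank Value rule mentioned above maps a continuous position vector (which is what WOA naturally evolves) onto a job permutation: the dimension with the largest value is ranked first, the next largest second, and so on. A minimal sketch, with an illustrative position vector:

```python
def largest_rank_value(position):
    """LRV decoding: convert a continuous position vector into a
    job permutation by ranking dimensions in descending value order.
    The dimension holding the largest value becomes the first job."""
    return sorted(range(len(position)), key=lambda j: -position[j])

# Four jobs; dimension 1 holds the largest value, so job 1 runs first
schedule = largest_rank_value([0.2, 1.5, -0.3, 0.9])
```

This decoding lets the continuous update equations of WOA operate unchanged, with discretization deferred to the moment a candidate schedule must be evaluated.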
Article
Mobile devices supporting the "Internet of Things" often have limited capabilities in computation, battery energy, and storage space, especially when supporting resource-intensive applications involving virtual reality, augmented reality, multimedia delivery, and artificial intelligence, which can require high bandwidth, low response latency, and large computational power. Edge cloud or edge computing is an emerging topic and a technology that can tackle the deficiencies of the currently centralized-only cloud computing model and move computation and storage resources closer to the devices in support of the above-mentioned applications. To make this happen, efficient coordination mechanisms and "offloading" algorithms are needed to allow mobile devices and the edge cloud to work together smoothly. In this survey article, we investigate the key issues, methods, and various state-of-the-art efforts related to the offloading problem. We adopt a new characterizing model to study the whole process of offloading from mobile devices to the edge cloud. Through comprehensive discussions, we aim to draw an overall "big picture" of the existing efforts and research directions. Our study also indicates that offloading algorithms in the edge cloud have demonstrated profound potential for future technology and application development.