About
118 Publications
85,963 Reads
4,754 Citations
Additional affiliations
September 2014 - November 2016
September 2006 - July 2010
July 2010 - January 2015
Publications (118)
Pay-per-use service by Cloud service providers has attracted customers in the recent past and is still evolving. Since the resources being dealt with in Clouds are non-storable and the physical resources need to be replaced very often, pricing the service in a way that would return profit on the initial capital investments to the service providers h...
A brief review of the Internet history reveals the fact that the Internet evolved after the formation of primarily independent networks. Similarly, interconnected Clouds, also called Inter-Cloud, can be viewed as a natural evolution of Cloud computing. Recent studies show the benefits in utilizing multiple Clouds and present attempts for the realiz...
Infrastructure-as-a-Service cloud providers offer diverse purchasing options and pricing plans, namely on-demand, reservation, and spot market plans. This allows them to efficiently target a variety of customer groups with distinct preferences and to generate more revenue accordingly. An important consequence of this diversification, however, is tha...
Born from a need for a pure "pay-per-use" model and highly scalable platform, the "Serverless" paradigm emerged and has the potential to become a dominant way of building cloud applications. Although it was originally designed for cloud environments, Serverless is finding its position in the Edge Computing landscape, aiming to bring computational r...
In the emerging era of Internet of Things (IoT), fog computing plays a critical role in serving delay-sensitive and location-aware applications. As a result, fog nodes are envisioned to be heavily deployed and form future distributed data centers. Powering fog nodes with green energy sources (such as solar and wind), not only helps in environmental...
Alternative route planning requires finding $k$ alternative paths (including the shortest path) between a given source and target. These paths should be significantly different from each other and meaningful/natural (e.g., must not contain loops or unnecessary detours). While there exist many works on finding high-quality alternative paths, the e...
Modern applications, such as autonomous vehicles, require deploying deep learning algorithms on resource-constrained edge devices for real-time image and video processing. However, there is limited understanding of the efficiency and performance of various object detection models on these devices. In this paper, we evaluate state-of-the-art object...
Eco-friendly navigation (aka eco-routing) finds a route from A to B in a road network that minimizes the greenhouse gas (GHG) emission or fuel/energy consumption of the traveling vehicle. As road transport is a major contributor to GHG emissions, eco-routing has received considerable research attention in the past decade, mainly on two research the...
Serverless edge computing is a specialized system design tailored for Internet of Things (IoT) applications. It leverages serverless computing to minimize operational management and enhance resource efficiency, and utilizes the concept of edge computing to allow code execution near the data sources. However, edge devices powered by renewable energy...
The emerging latency-sensitive applications and Internet of Things technology have resulted in the development of Edge computing. Therefore, improving Quality-of-Service (QoS) requirements such as response time is a fundamental goal in Edge environments. However, as edge devices are heterogeneous and resource-constrained, placing replicas of softwa...
Field-captured video allows for detailed studies of spatiotemporal aspects of animal locomotion, decision-making, and environmental interactions. However, despite the affordability of data capture with mass-produced hardware, storage, processing, and transmission overheads pose a significant hurdle to acquiring high-resolution video from field-depl...
Cloud native solutions are widely applied in various fields, placing higher demands on the efficient management and utilization of resource platforms. To achieve the efficiency, load forecasting and elastic scaling have become crucial technologies for dynamically adjusting cloud resources to meet user demands and minimizing resource waste. However,...
Networking technology is slowly but steadily adopting two key principles from cloud computing, namely, softwarization and modularization. This has resulted in the success of technologies such as Software Defined Networking (SDN) and Network Function Virtualization (NFV), which are being adopted by mobile networking as well. In the meantime, cloud c...
Serverless edge systems simplify the deployment of real-time AI-based Internet of Things (IoT) applications at the edge. However, the heterogeneity of edge computing nodes, in terms of both hardware and software, makes load balancing challenging in these systems. In this paper, we propose a performance-driven, empirical weight-tuning approach to achi...
The shift towards renewable energy sources for powering data centers is increasingly important in the era of cloud computing. However, integrating renewable energy sources into cloud data centers presents a challenge due to their variable and intermittent nature. The unpredictable workload demands in cloud data centers further complicate this probl...
Video games feature a dynamic environment where locations of objects (e.g., characters, equipment, weapons, vehicles etc.) frequently change within the game world. Although searching for relevant nearby objects in such a dynamic setting is a fundamental operation, this problem has received little research attention. In this paper, we propose a simp...
Efficient utilization of renewable energy when powering Cloud Data Centers is a challenging problem due to the variable and intermittent nature of both workload demand and renewable energy supply. This work aims to develop an innovative dynamic resource management algorithm to provide energy flexibility to data center operators for shaping their en...
Rapid growth in the popularity of smart vehicles and increasing demand for vehicle autonomy brings new opportunities for vehicular edge computing (VEC). VEC aims at offloading the time-sensitive computational load of connected vehicles to edge devices, e.g., roadside units. However, VEC offloading raises complex resource management challenges and t...
The rapid development of emerging vehicular edge computing (VEC) brings new opportunities and challenges for dynamic resource management. The increasing number of edge data centers, roadside units (RSUs), and network devices, however, makes resource management a complex task in VEC. On the other hand, the exponential growth of service applications...
Resource management in computing is a very challenging problem that involves making sequential decisions. Resource limitations, resource heterogeneity, the dynamic and diverse nature of workloads, and the unpredictability of fog/edge computing environments have made resource management even more challenging in the fog landscape. Recentl...
The rapid development of emerging vehicular edge computing (VEC) brings new opportunities and challenges for dynamic resource management. The increasing number of edge data centers, roadside units (RSUs), and network devices, however, makes resource management a complex task in VEC. On the other hand, the exponential growth of service applications...
Road crashes cost over a million lives each year. Consequently, researchers and transport engineers continue their efforts to improve road safety and minimize road crashes. With the increasing availability of various sensor technologies to capture road safety-related data and the recent breakthrough in modern data-driven techniques, in particular M...
Services provided by mobile edge clouds offer low-latency responses for large-scale and real-time applications. Dynamic service management algorithms generate live service migration requests to support user mobility and ensure service latency in mobile edge clouds. To handle these migration requests, multiple migration planning and scheduling algor...
The alarming rate of increase in energy demand and carbon footprint of Fog environments has become a critical issue. It is, therefore, necessary to reduce the percentage of brown energy consumption in these systems and integrate renewable energy use into Fog. Renewables, however, are prone to availability fluctuations due to their variable and inte...
Next generation technologies such as smart healthcare, self-driving cars, and smart cities require new approaches to deal with the network traffic generated by the Internet of Things (IoT) devices, as well as efficient programming models to deploy machine learning techniques. Serverless edge computing is an emerging computing paradigm from the inte...
Data centres in contemporary times are essential as the supply of data increases. Data centres are areas where computing systems are concentrated for facilitating data processing, transfer and storage. At present, traditional data centres have moved more towards the cloud model—thereby making the processing, storage and harnessing of data more mana...
Traditional vehicle routing algorithms aim to find the fastest or shortest route, whereas eco-friendly routing algorithms aim to find the route that minimizes vehicle fuel consumption or greenhouse gas (GHG) emissions. To accurately estimate fuel consumption and emissions along a route, a detailed mobility profile of the vehicle traveling on the ro...
In this paper, we present energy-aware scheduling for Serverless edge computing. Energy awareness is critical since edge nodes, in many Internet of Things (IoT) domains, are meant to be powered by renewable energy sources that are variable, making low-powered and/or overloaded (bottleneck) nodes unavailable and not operating their services. This aw...
The exponential growth of Internet of Things (IoT) has given rise to a new wave of edge computing due to the need to process data on the edge, closer to where it is being produced and attempting to move away from a cloud-centric architecture. This provides its own opportunity to decrease latency and address data privacy concerns along with the abil...
Edge and Fog computing paradigms overcome the limitations of cloud-centric execution for different latency-sensitive Internet of Things (IoT) applications by offering computing resources closer to the data sources. Small single-board computers (SBCs) like Raspberry Pis (RPis) are widely used as computing nodes in both paradigms. These devices are u...
Distributed computing paradigms such as cloud, mobile, Internet of Things, and Fog have enabled new modalities for building enterprise architectures through service composition. The fundamental premise is that the application can benefit from functionally equivalent services that can be traded in the cloud or services repositories. These services c...
The enforcement of the Movement Control Order to curtail the spread of COVID-19 has affected home energy consumption, especially HVAC systems. Occupancy detection and estimation have been recognized as key contributors to improving building energy efficiency. Several solutions have been proposed over the past decade to improve the precision performa...
By integrating Software-Defined Networking and cloud computing, virtualized networking and computing resources can be dynamically reallocated through live migration of Virtual Machines (VMs). Dynamic resource management such as load balancing and energy-saving policies can request multiple migrations when the algorithms are triggered periodically....
One of the main challenges in keeping the edge computing dream alive is to efficiently manage the energy consumption of highly resource-limited nodes. Past studies have a limited or often simplistic focus on energy consumption factors, considering computation- or communication-only solutions, questioned by either costly hardware instrumentation or inaccurate...
Scheduling when, where, and under what conditions to re-charge an electric vehicle poses unique challenges absent in internal combustion vehicles. Charging scheduling of an electric vehicle for time- and cost-efficiency depends on many variables in a dynamic environment, such as the time-of-use price and the availability of charging piles at a char...
One of the main challenges in keeping the edge computing dream alive is to efficiently manage the energy consumption of highly resource-limited nodes. Past studies have a limited or often simplistic focus on energy consumption factors, considering computation- or communication-only solutions, questioned by either costly hardware instrumentation or inaccurate...
By integrating Software-Defined Networking and cloud computing, virtualized networking and computing resources can be dynamically reallocated through live migration of Virtual Machines (VMs). Dynamic resource management such as load balancing and energy-saving policies can request multiple migrations when the algorithms are triggered periodically....
The containerized services allocated in the mobile edge clouds bring up the opportunity for large-scale and real-time applications to have low latency responses. Meanwhile, live container migration is introduced to support dynamic resource management and users' mobility. However, with the expansion of network topology scale and increasing migration...
The exponential growth of Internet of Things (IoT) has given rise to a new wave of edge computing due to the need to process data on the edge, closer to where it is being produced and attempting to move away from a cloud-centric architecture. This provides its own opportunity to decrease latency and address data privacy concerns along with the abil...
The deployment of fog computing resources in industrial internet of things (IIoT) is essential to support time-sensitive applications. To utilize resources efficiently, a brand-new request dispatcher is required to sit between the IIoT devices and the pool of fog resources. The need for such a dispatcher stems from the challenges specific to these...
Data centres in contemporary times are essential as the supply of data increases. Data centres are areas where computing systems are concentrated for facilitating data processing, transfer and storage. At present, traditional data centres have moved more towards the cloud model, thereby making the processing, storage and harnessing of data more manag...
Occupancy-driven application research has been an active research area for a decade, focusing on improving or replacing building infrastructure to improve building energy efficiency. Existing approaches for HVAC energy saving put more emphasis on occupancy detection, estimation, and localization to trade off between energy consumption and th...
Auto-scaling of Web applications is an extensively investigated issue in cloud computing. To evaluate auto-scaling mechanisms, the cloud community is facing considerable challenges on either real cloud platforms or custom test-beds. Challenges include, but are not limited to, deployment impediments, the complexity of setting parameters, and most impo...
Many modern navigation systems and map-based services not only provide the fastest route from a source location $s$ to a target location $t$ but also provide a few alternative routes to the users as more options to choose from. Consequently, computing alternative paths has received significant research attention. However, it is unclear which of...
In Software-Defined Networking (SDN)-enabled cloud data centers, live migration is a key approach used for the reallocation of Virtual Machines (VMs) in cloud services and Virtual Network Functions (VNFs) in Service Function Chaining. Using live migration, cloud providers can address their dynamic resource management and fault tolerance objectives...
Software Defined Networking (SDN) has emerged as a programmable approach for provisioning and managing network resources by defining a clear separation between the control and data forwarding planes. Nowadays, SDN has gained significant attention in the military domain. Its use in battlefield communication facilitates the end-to-end interactions...
The MapReduce model is widely used to store and process big data in a distributed manner. MapReduce was originally developed for a single tightly coupled cluster of computers. Approaches such as Hierarchical and Geo-Hadoop are designed to address geo-distributed MapReduce processing. However, these methods still suffer from high inter-cluster data...
In Software-Defined Networking (SDN)-enabled cloud data centers, live migration is a key approach used for the reallocation of Virtual Machines (VMs) in cloud services and Virtual Network Functions (VNFs) in Service Function Chaining (SFC). Using live migration methods, cloud providers can address their dynamic resource management and fault toleran...
Edge and Fog computing paradigms overcome the limitations of Cloud-centric execution for different latency-sensitive Internet of Things (IoT) applications by offering computing resources closer to the data sources. In both paradigms, single-board small computers like Raspberry Pis (RPis) are widely used as the computing nodes. RPis are usually equi...
Software Defined Networking (SDN) has emerged as a programmable approach for provisioning and managing network resources by defining a clear separation between the control and data forwarding planes. Nowadays, SDN has gained significant attention in the military domain. Its use in battlefield communication facilitates the end-to-end interactions...
The rapid adoption of Cloud computing for hosting services and its success are primarily attributed to its attractive features such as elasticity, availability and the pay-as-you-go pricing model. However, the huge amount of energy consumed by cloud data centers makes them one of the fastest-growing sources of carbon emissions. Approaches for improving...
The rapid adoption of Cloud computing for hosting services and its success are primarily attributed to its attractive features such as elasticity, availability and the pay-as-you-go pricing model. However, the huge amount of energy consumed by cloud data centers makes them one of the fastest-growing sources of carbon emissions. Approaches for improving...
Optimization is an inseparable part of Cloud computing, particularly with the emergence of Fog and Edge paradigms. Not only do these emerging paradigms demand reevaluating cloud-native optimizations and exploring Fog- and Edge-based solutions, but the objectives also require a significant shift from considering only latency to energy, security, reliabili...
Due to the popularity of smartphones, cheap wireless networks and availability of road network data, navigation applications have become a part of our everyday life. Many modern navigation systems and map-based services not only provide the fastest route from a source location s to a target location t but also provide a few alternative routes to...
Fog computing introduces a distributed processing capability close to end-users. The proximity of computing to end-users leads to lower service time and bandwidth requirements. Energy consumption is a matter of concern in such a system with a large number of computing nodes. Renewable energy sources can be utilized to lessen the burden on the main...
Current cloud computing frameworks host millions of physical servers that utilize cloud computing resources in the form of different virtual machines (VMs). Cloud Data Center (CDC) infrastructures require significant amounts of energy to deliver large scale computational services. Computing nodes generate large volumes of heat, requiring cooling uni...
Current cloud computing frameworks host millions of physical servers that utilize cloud computing resources in the form of different virtual machines. Cloud Data Center (CDC) infrastructures require significant amounts of energy to deliver large scale computational services. Moreover, computing nodes generate large volumes of heat, requiring coolin...
The fourth industrial revolution, widely known as Industry 4.0, is realizable through widespread deployment of Internet of Things (IoT) devices across the industrial ambiance. Due to communication latency and geographical distribution, Cloud-centric IoT models often fail to satisfy the Quality of Service (QoS) requirements of different IoT applicat...
The huge energy consumption of cloud data centers not only increases costs but also carbon emissions associated with such data centers. Powering data centers with renewable or green sources of energy can reduce brown energy use and consequently carbon emissions. However, powering data centers with these energy sources is challenging, as they are va...
Purpose: Human or machine, which one is more intelligent and powerful for performing computing and processing tasks? Over the years, researchers and scientists have spent significant amounts of money and effort to answer this question. Nonetheless, despite some outstanding achievements, replacing humans in intellectual tasks is not yet a realit...
In Software-Defined Networking (SDN)-enabled cloud data centers, live VM migration is a key technology to facilitate resource management and fault tolerance. Despite much research focusing on the network-aware live migration of VMs in cloud computing, some parameters that affect live migration performance are neglected to a large extent. Furthermo...
It is anticipated that future networks will support network functions, such as firewalls, load balancers and intrusion prevention systems, in a fully automated, flexible, and efficient manner. In cloud computing environments, network functions virtualization (NFV) aims to reduce cost and simplify operations of such network services through the virtualiza...
This chapter reviews the state‐of‐the‐art literature on network slicing in 5G, edge/fog, and cloud computing. It identifies the spectrum challenges and obstacles that must be addressed to achieve the ultimate realization of the concept. The chapter provides a brief introduction of 5G, edge/fog, and clouds and their interplay. It outlines the 5G vis...
Network slicing allows network operators to build multiple isolated virtual networks on a shared physical network to accommodate a wide variety of services and applications. With network slicing, service providers can provide a cost-efficient solution towards meeting diverse performance requirements of deployed applications and services. Despite sl...