In this demo, we introduce the RAINBOW Analytics Stack, which offers a holistic approach to real-time data management and processing for fog environments. Moreover, it provides a distributed solution encapsulating pluggable task schedulers that optimize user-defined trade-offs among performance indicators such as energy consumption, processing latency, and data quality. Finally, the RAINBOW declarative query model enables users to express streaming analytics queries, leaving the RAINBOW stack to compile, optimize, and execute them.
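To make the idea of a declarative query model concrete, the sketch below shows how such a streaming query might be expressed and handed to a compiler stage. All names (`StreamQuery`, `window`, `aggregate`, `compile`) are illustrative assumptions, not the actual RAINBOW API:

```python
# Hypothetical sketch of a declarative streaming query, in the spirit of the
# RAINBOW query model described above. Names and syntax are illustrative only.
from dataclasses import dataclass, field

@dataclass
class StreamQuery:
    """Minimal builder: the user declares *what* to compute, not *how*."""
    source: str
    ops: list = field(default_factory=list)

    def window(self, seconds):
        self.ops.append(("window", seconds))
        return self

    def aggregate(self, fn, column):
        self.ops.append(("aggregate", fn, column))
        return self

    def compile(self):
        # A real stack would translate this logical plan into engine-specific
        # tasks and place them according to the user-defined trade-offs.
        return [("source", self.source)] + self.ops

plan = (StreamQuery("sensors/temperature")
        .window(30)
        .aggregate("avg", "value")
        .compile())
```

The user only declares the pipeline; scheduling and placement decisions stay inside the stack, which is what allows trade-offs such as energy versus latency to be optimized transparently.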
Micro-datacenters (micro-DCs) are emerging as key enablers for Edge computing and 5G mobile networks by providing processing power closer to IoT devices to extract timely analytic insights. However, the performance evaluation of data stream processing on micro-DCs is a daunting task due to the time-consuming setup and configuration, and the heterogeneity, of the underlying environment. To address these challenges, we introduce BenchPilot, a modular and highly customizable benchmarking framework for edge micro-DCs. BenchPilot provides a high-level declarative model for describing experiment testbeds and scenarios that automates the benchmarking process on Streaming Distributed Processing Engines (SDPEs), enabling users to focus on performance analysis instead of dealing with the complex and time-consuming setup. BenchPilot instantiates the underlying cluster, performs repeatable experimentation, and provides a unified monitoring stack on heterogeneous micro-DCs. To highlight the usability of BenchPilot, we conduct experiments on two popular streaming engines, namely Apache Storm and Apache Flink, comparing them in terms of performance, CPU utilization, energy consumption, temperature, and network I/O.
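A declarative experiment description of the kind mentioned above might look like the following sketch, where one high-level description is expanded into the repeatable runs the framework would execute. The field names (`cluster`, `engine`, `workload`, `repetitions`) are hypothetical and do not reflect BenchPilot's actual schema:

```python
# Hypothetical sketch of a BenchPilot-style declarative experiment description.
# Field names are illustrative assumptions, not the framework's real model.
experiment = {
    "cluster": {"nodes": ["node-1", "node-2"], "monitoring": True},
    "engine": {"name": "storm", "version": "2.4.0"},
    "workload": {"benchmark": "streaming-etl", "duration_min": 10},
    "repetitions": 3,
}

def expand_runs(desc):
    """Turn one declarative description into the list of repeatable runs
    a benchmarking framework would deploy, execute, and monitor."""
    return [
        {"run": i + 1,
         "engine": desc["engine"]["name"],
         "benchmark": desc["workload"]["benchmark"]}
        for i in range(desc["repetitions"])
    ]

runs = expand_runs(experiment)
```

Keeping repetitions explicit in the description is what makes the experimentation repeatable: the same file always expands to the same sequence of runs.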
Micro-datacenters have emerged as key enablers for Edge computing, bringing processing power closer to IoT devices while reducing network pressure to extract useful analytic insights. Nonetheless, due to time-consuming setup, configuration, and infrastructure heterogeneity, performance evaluation on micro-DCs is a daunting task. To address these challenges, we present BenchPilot, a customizable benchmarking framework for edge micro-DCs. BenchPilot performs repeatable and reproducible benchmarking while offering a unified monitoring stack, allowing users to focus on performance analysis instead of dealing with the complex, time-consuming setup.
The majority of IoT devices disseminate harvested data through the internet for analysis by cloud services. However, emerging applications, such as autonomous vehicle navigation, are impacted by the round-trip time (RTT) between IoT devices and the cloud. 5G networks and edge computing promise shorter RTTs by bringing compute and network resources closer to the IoT devices. Network slicing is a key enabler for 5G networks, dividing a physical network among a variety of services according to their individual needs. However, the design of a network slice impacts the performance of mobile IoT applications, leaving owners puzzled by the numerous slice configurations and options. For instance, the placement of network access points and available compute nodes, the wireless protocols, and the midhaul and backhaul QoS are crucial factors impacting service performance, and the mobility of the involved entities makes this issue even more daunting. Operators could address this challenge by purchasing a high-performance 5G network slice that provides radio units with numerous antenna elements and powerful compute nodes fully covering their operational areas; however, this is unrealistic due to the increased operational costs. Consequently, users have to select the minimum number of computing and network components capable of handling the volatile mobile workload and place them so as to maximize network coverage. Thus, users need to perform multiple trials on diverse slices, which is a time-consuming and costly procedure. In each trial, users have to design and lease the respective network slice, configure their physical IoT devices, deploy the IoT services, and monitor various KPIs.
To alleviate the difficulties of setting up real-world 5G testbeds, we will demonstrate 5G-Slicer, an emulation framework that facilitates the definition of mobile network slices through modeling abstractions for radio units, mobile nodes, trajectories, etc., while offering realistic network QoS by dynamically altering signal strength at runtime. Moreover, 5G-Slicer provides an already realized city-scale deployment scenario that smart-city researchers can simply configure through a ``ready-to-use'' template, leaving 5G-Slicer responsible for translating it into an emulated environment.
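The modeling abstractions above (radio units, mobile nodes, trajectories, runtime-varying signal strength) can be sketched as follows. The path-loss model here is a deliberately toy linear falloff, purely for illustration; 5G-Slicer's actual QoS adaptation is far more realistic, and all class and function names are assumptions:

```python
# Illustrative sketch of slice modeling abstractions, NOT the 5G-Slicer API.
# A radio unit covers a circular area; a mobile node follows waypoints, and
# its signal strength is recomputed as it moves (toy linear path-loss model).
import math
from dataclasses import dataclass

@dataclass
class RadioUnit:
    x: float
    y: float
    tx_range: float  # coverage radius in meters

def signal_strength(ru, px, py):
    """Normalized strength in [0, 1]: 1 at the unit, 0 at/beyond its range."""
    d = math.hypot(px - ru.x, py - ru.y)
    return max(0.0, 1.0 - d / ru.tx_range)

ru = RadioUnit(0.0, 0.0, tx_range=100.0)
# Waypoints of a mobile node's trajectory (meters).
trajectory = [(0, 0), (30, 40), (60, 80), (120, 160)]
strengths = [signal_strength(ru, x, y) for x, y in trajectory]
```

An emulator built on such a model would recompute strengths as entities move and throttle the emulated links accordingly, which is how mobility translates into dynamic network QoS.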
5G is emerging as a key mobile network technology, offering Gbps transmission rates, lower communication latency, and support for 10-100x more connected devices. The full exploitation of 5G relies on network slicing, a network virtualization technique in which operators split a physical network among a wide variety of services in accordance with their individual needs. However, experimentation with 5G-enabled services and measurement of key performance indicators (KPIs) over network slices is extremely challenging, as it requires the deployment and coordination of numerous physical devices, including edge and cloud resources. In this paper, we introduce 5G-Slicer, an open and extensible framework for modeling and rapid experimentation of 5G-enabled services via a scalable network-slicing emulator. Through modeling abstractions, our solution eases the definition of 5G network slices, virtual and physical fog resources, and the mobility of the involved entities. With the blueprint of an emulated testbed at hand, users can create reproducible experiments to evaluate application functionality and KPIs by injecting load and faults, and even by changing runtime configurations. To show the wide applicability of 5G-Slicer, we introduce a proof-of-concept use-case that encompasses different scenarios for capacity management in a city-scale intelligent transportation service. Evaluation results exploiting real 5G data show that 5G-Slicer presents, at most, an 11.7% deviation when comparing actual and emulated network Quality of Service (QoS).
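The fidelity metric in the last sentence (maximum percentage deviation between actual and emulated QoS) can be computed as in the following sketch. The sample values are invented for illustration and are not the paper's measurements:

```python
# Illustrative fidelity check (made-up numbers, not the paper's data):
# maximum relative deviation, in percent, between measured and emulated
# QoS samples, e.g. throughput in Mbps.
def max_deviation(actual, emulated):
    return max(abs(a - e) / a * 100 for a, e in zip(actual, emulated))

actual = [100.0, 80.0, 120.0]    # hypothetical measured QoS samples
emulated = [95.0, 82.0, 110.0]   # hypothetical emulated counterparts
dev = max_deviation(actual, emulated)
```

A reported bound such as "at most 11.7% deviation" corresponds to this worst-case sample, rather than an average over all samples.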