Conference Paper · PDF available

Towards Enabling Scientific Workflows for the Future Internet of Things

Abstract

Cloud computing offers on-demand access to computational, infrastructure and data resources operated from a remote source. This novel technology has opened new ways of flexible resource provisioning for businesses to manage applications and data in response to new demands from customers. In the current web application scenario, a rapidly growing number of powerful devices join the Internet, significantly impacting the global traffic volume and foreshadowing a world of smart devices, or things, in the Internet of Things (IoT) perspective. This trend calls for an ecosystem that provides the means to interconnect and control these devices. In this position paper we envision the integration of IoT into Cloud-enabled scientific workflows to support the proliferation of IoT with the help of cloud technologies. These enhanced workflows will enable the creation and management of user applications that bring clouds and IoT closer to users by hiding the complexity and cumbersome utilization of virtualized resources, data sources and things. The goal of this approach is to ease the lives of users and foster scientific work by engaging the Internet of Things.
... The application of BPM technologies in the context of CPS and IoT poses a new research field that raises various novel research challenges. Several works discuss this topic and its new challenges in detail [LMM15, CSB15, CSB16, MBBF17, GGAAPE+11, MRH15, KP15]. The work by Janiesch et al. presents a good high-level overview of 16 challenges related to the interaction between IoT and BPM that are partially relevant for our work (cf. Figure 2.16) [JKM+17]. ...
Thesis
Workflows are a well-established concept for describing business logic and processes in web-based applications and enterprise application integration scenarios on an abstract, implementation-agnostic level. Applying Business Process Management (BPM) technologies to increase autonomy and automate sequences of activities in Cyber-physical Systems (CPS) promises various advantages, including higher flexibility and simplified programming, more efficient resource usage, and easier integration and orchestration of CPS devices. However, traditional BPM notations and engines have not been designed for use in the context of CPS, which raises new research questions arising from the close coupling of the virtual and physical worlds. Among these challenges are the interaction with complex compounds of heterogeneous sensors, actuators, things and humans; the detection and handling of errors in the physical world; and the synchronization of the cyber-physical process execution models. Novel factors related to the interaction with the physical world, including real-world obstacles, inconsistencies and inaccuracies, may jeopardize the successful execution of workflows in CPS and may lead to unanticipated situations. This thesis investigates properties and requirements of CPS relevant for the introduction of BPM technologies into cyber-physical domains. We discuss existing BPM systems and related work regarding the integration of sensors and actuators into workflows, the development of a Workflow Management System (WfMS) for CPS, and the synchronization of the virtual and physical process execution as part of self-* capabilities for WfMSes. Based on the identified research gap, we present concepts and prototypes for the development of a CPS WfMS covering all phases of the BPM lifecycle.
First, we introduce a CPS workflow notation that supports the modelling of the interaction of complex sensors, actuators, humans, dynamic services and WfMSes on the business process level. In addition, the effects of the workflow execution can be specified in the form of goals defining success and error criteria for the execution of individual process steps. Along with that, we introduce the notion of Cyber-physical Consistency. Following this, we present a system architecture for a corresponding WfMS (PROtEUS) to execute the modelled processes, also in distributed execution settings and with a focus on interactive process management. Subsequently, the integration of a cyber-physical feedback loop to increase resilience of the process execution at runtime is discussed. Within this MAPE-K loop, sensor and context data are related to the effects of the process execution, deviations from expected behaviour are detected, and compensations are planned and executed. The execution of this feedback loop can be scaled depending on the required level of precision and consistency. Our implementation of the MAPE-K loop proves to be a general framework for adding self-* capabilities to WfMSes. The evaluation of our concepts within a smart home case study shows expected behaviour, reasonable execution times, reduced error rates and high coverage of the identified requirements, which makes our CPS WfMS a suitable system for introducing workflows on top of systems, devices, things and applications of CPS.
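The MAPE-K cycle described in this abstract can be sketched in a few lines. The snippet below is a minimal illustration, not the thesis's implementation: the sensor and compensation callables, the tolerance-based analysis, and the proportional plan are all assumptions made for clarity.

```python
# Minimal sketch of one Monitor-Analyze-Plan-Execute iteration over a shared
# knowledge base; read_sensor and compensate stand in for real CPS bindings.

def mape_k_step(read_sensor, expected, tolerance, compensate, knowledge):
    """Run one MAPE-K iteration and report the consistency outcome."""
    # Monitor: collect sensor and context data into the knowledge base
    value = read_sensor()
    knowledge["history"].append(value)
    # Analyze: relate observed data to the expected effect of the process step
    deviation = value - expected
    if abs(deviation) <= tolerance:
        return "consistent"
    # Plan: choose a compensation proportional to the detected deviation
    action = -deviation
    # Execute: apply the compensation in the physical world (stubbed here)
    compensate(action)
    return "compensated"

knowledge = {"history": []}
readings = iter([21.0, 25.5])  # e.g. room temperatures in a smart home
result = mape_k_step(lambda: next(readings), expected=21.0,
                     tolerance=1.0, compensate=lambda a: None,
                     knowledge=knowledge)
```

Scaling the loop, as the abstract suggests, would then amount to running `mape_k_step` more or less frequently, or with tighter or looser tolerances, depending on the consistency level required.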
Article
Full-text available
Cloud computing and the Internet of Things (IoT), two very different technologies, are both already part of our life. Their massive adoption and use is expected to increase further, making them important components of the Future Internet. A novel paradigm where Cloud and IoT are merged together is foreseen as disruptive and as an enabler of a large number of application scenarios. In this paper we focus our attention on the integration of Cloud and IoT, which we call the CloudIoT paradigm. Many works in the literature have surveyed Cloud and IoT separately: their main properties, features, underlying technologies, and open issues. However, to the best of our knowledge, these works lack a detailed analysis of the CloudIoT paradigm. To bridge this gap, in this paper we review the literature about the integration of Cloud and IoT. We start by analyzing and discussing the need for integrating them, the challenges deriving from such integration, and how these issues have been tackled in the literature. We then describe application scenarios that have been presented in the literature, as well as platforms, both commercial and open source, and projects implementing the CloudIoT paradigm. Finally, we identify open issues, main challenges and future directions in this promising field.
Conference Paper
Full-text available
Cloud computing is converging ever more strongly with the Internet of Things (IoT), offering novel techniques for IoT infrastructure virtualization and its management on the cloud. However, system designers and operations managers face numerous challenges in realizing IoT cloud systems in practice, mainly due to the complexity involved in provisioning large-scale IoT cloud systems and the diversity of their requirements in terms of IoT resource consumption, customization of IoT capabilities and runtime governance. In this paper, we introduce the concept of software-defined IoT units, a novel approach to IoT cloud computing that encapsulates fine-grained IoT resources and IoT capabilities in well-defined APIs in order to provide a unified view on accessing, configuring and operating IoT cloud systems. Our software-defined IoT units are the fundamental building blocks of software-defined IoT cloud systems. We present our framework for dynamic, on-demand provisioning and deployment of such software-defined IoT cloud systems. By automating provisioning processes and supporting managed configuration models, our framework simplifies provisioning and enables flexible runtime customizations of software-defined IoT cloud systems. We demonstrate its advantages on a real-world IoT cloud system for managing electric fleet vehicles.
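The core idea of a software-defined IoT unit — a fine-grained IoT capability behind a uniform provision/configure/invoke API — can be illustrated with a small sketch. The class and method names below are assumptions chosen for the example, not the paper's actual API.

```python
# Hedged sketch: an IoT capability wrapped in a well-defined API with a
# managed (declarative) configuration model and on-demand provisioning.

class SoftwareDefinedIoTUnit:
    """Encapsulates an IoT capability behind a uniform lifecycle API."""

    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = set(capabilities)
        self.config = {}
        self.provisioned = False

    def configure(self, **settings):
        # Managed configuration: settings are recorded declaratively and
        # applied when the unit is provisioned
        self.config.update(settings)
        return self

    def provision(self):
        # On-demand provisioning of the underlying IoT resources (stubbed)
        self.provisioned = True
        return self

    def invoke(self, capability):
        if not self.provisioned:
            raise RuntimeError("unit must be provisioned before use")
        if capability not in self.capabilities:
            raise ValueError(f"unknown capability: {capability}")
        return f"{self.name}:{capability}"

# Units compose into a software-defined IoT cloud system, e.g. fleet tracking
gps = SoftwareDefinedIoTUnit("fleet-gps", ["locate"]).configure(rate_hz=1).provision()
```

The uniform API is the point: whether the unit wraps a gateway, a sensor feed or a virtual appliance, the system operates it through the same small set of calls.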
Chapter
Full-text available
Cloud Computing offers on-demand access to computational, infrastructure and data resources operated from a remote source. This novel technology has opened new ways of flexible resource provisioning for businesses to manage IT applications and data in response to new demands from customers. In this chapter, we provide a general insight into the formation and interoperability issues of Cloud Federations, which envisage a distributed, heterogeneous environment consisting of various cloud infrastructures, formed by aggregating different Infrastructure-as-a-Service (IaaS) provider capabilities from both the commercial and academic areas. These multi-cloud infrastructures are also used to avoid provider lock-in for users who frequently utilize different clouds. We characterize and classify recent solutions that arose from both research projects and individual research groups, and show how they attempt to hide the diversity of multiple clouds and form a unified federation on top of them. As they still need to cope with several open issues concerning interoperability, we also provide guidelines addressing related topics such as service monitoring, data protection and privacy, data management and energy efficiency.
Conference Paper
Full-text available
In the last 20 years, quite a few mature workflow engines and workflow editors have been developed to support communities in managing workflows. While providers of workflow engines tend to ease the creation of workflows tailored to their specific workflow system, the management tools still often require a deep understanding of workflow concepts and languages. This paper describes an approach that targets various workflow systems and builds a single user interface for editing and monitoring workflows, taking into consideration aspects such as optimization and data provenance. The design employs agile Web frameworks and novel technologies to build a workflow dashboard that runs in a web browser and connects seamlessly to available workflow systems and to external resources such as Cloud infrastructures. The user interface eliminates the need to become acquainted with diverse layouts, greatly increasing usability across the various aspects of managing workflows.
Article
Full-text available
This paper discusses approaches and environments for carrying out analytics on Clouds for Big Data applications. It revolves around four important areas of analytics and Big Data, namely (i) data management and supporting architectures; (ii) model development and scoring; (iii) visualisation and user interaction; and (iv) business models. Through a detailed survey, we identify possible gaps in technology and provide recommendations for the research community on future directions on Cloud-supported Big Data computing and analytics solutions.
Article
Full-text available
In the current worldwide ICT scenario, a constantly growing number of ever more powerful devices (smartphones, sensors, household appliances, RFID devices, etc.) join the Internet, significantly impacting the global traffic volume (data sharing, voice, multimedia, etc.) and foreshadowing a world of (more or less) smart devices, or “things” in the Internet of Things (IoT) perspective. Heterogeneous resources can be aggregated and abstracted according to tailored thing-like semantics, thus enabling Things as a Service paradigm, or better a “Cloud of Things”. In the Future Internet initiatives, sensor networks will assume even more of a crucial role, especially for making smarter cities. Smarter sensors will be the peripheral elements of a complex future ICT world. However, due to differences in the “appliances” being sensed, smart sensors are very heterogeneous in terms of communication technologies, sensing features and elaboration capabilities. This article intends to contribute to the design of a pervasive infrastructure where new generation services interact with the surrounding environment, thus creating new opportunities for contextualization and geo-awareness. The architecture proposal is based on Sensor Web Enablement standard specifications and makes use of the Contiki Operating System for accomplishing the IoT. Smart cities are assumed as the reference scenario.
Article
Full-text available
Cloud computing has been gaining importance in the recent past due to the conjunction of well-known key features, such as virtualization and pay-per-use, which together form an innovative concept. Even though cloud computing does not have a widely accepted definition, it has been used by many companies to deploy their infrastructures and promote their business. However, the lack of standards is a drawback with respect to interoperability and optimization. Convenient actions such as changing cloud providers, or exchanging data and information between clouds, may be arduous work for customers. Therefore, this paper presents major considerations regarding the lack of cloud standards and points out why this is considered to be a problem. Furthermore, it discusses scenarios that would benefit from cloud interoperability, as well as current initiatives addressing cloud standards issues. This leads to a set of important observations towards a solution to the interoperability and standardization problem.
Conference Paper
The workflow interoperability problem was successfully solved by the SHIWA project for workflows running in the same grid infrastructure. However, in the more generic case, when the workflows run in different infrastructures, the problem had not yet been solved. In this paper we show a solution to this problem by introducing a new type of workflow called the infrastructure-aware workflow. These are scientific workflows extended with new node types that enable the on-the-fly creation and destruction of the required infrastructures in the clouds. The paper presents the semantics of these new types of nodes and workflows, and shows how they solve the workflow interoperability problem. It also describes how these new types of workflows can be implemented by a new service called the One Click Cloud Orchestrator, and how this service can be integrated with existing SHIWA Simulation Platform services, such as the WS-PGRADE/gUSE portal, to provide the functionality required to solve the workflow interoperability problem.
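The semantics of such infrastructure-aware workflows — ordinary task nodes bracketed by nodes that create and destroy the cloud infrastructure they need — can be sketched as a small interpreter. The node kinds and naming scheme below are illustrative assumptions, not the paper's actual notation.

```python
# Illustrative sketch: a workflow is a sequence of (kind, name) nodes, where
# deploy/undeploy nodes manage infrastructures on the fly and task nodes are
# bound to an infrastructure via a "task@infrastructure" naming convention.

def run_workflow(nodes):
    """Execute nodes in order, tracking which infrastructures are live."""
    live = set()   # infrastructures currently deployed
    log = []
    for kind, name in nodes:
        if kind == "deploy":        # on-the-fly infrastructure creation
            live.add(name)
            log.append(f"deployed {name}")
        elif kind == "task":        # task bound to a deployed infrastructure
            infra = name.split("@")[1]
            if infra not in live:
                raise RuntimeError(f"task requires undeployed infrastructure {infra}")
            log.append(f"ran {name}")
        elif kind == "undeploy":    # on-the-fly infrastructure destruction
            live.discard(name)
            log.append(f"destroyed {name}")
    return log, live

log, live = run_workflow([
    ("deploy", "clusterA"),
    ("task", "simulate@clusterA"),
    ("undeploy", "clusterA"),
])
```

Because each task carries its infrastructure requirement with it, two workflows written for different infrastructures can be composed into one sequence, which is the interoperability benefit the paper argues for.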
Article
We introduce a novel approach for describing sensors and their capa- bilities. Although existing standards for describing sensors and their ca- pabilities as well as their measurements, produced by Open Geospatial Consortium's Sensor Web Enablement activities (OGC SWE), achieve syntactic interoperability, they do not provide facilities for computer logic and reasoning. We argue that ontologies are an adequate methodol- ogy to model sensors and their capabilities. Ontologies enable reasoning, classication and other types of automation to extend the SWE stan- dards of the OGC. A semantic sensor network would allow the network and its components to be organised, queried, and controlled through high-level specications. The ontology proposed here is not an ontology that organises all the facets and concepts of sensing, but rather one that provides a language to describe sensors in terms of their capabilities and operations. This paper introduces an initial version.
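The benefit of describing sensors by their capabilities, rather than only syntactically, is that high-level queries and classification become possible. The toy sketch below illustrates that idea with plain dictionaries; the vocabulary (`measures`, `accuracy`, `unit`) is an assumption for the example, not the ontology proposed in the paper.

```python
# Toy capability descriptions: each sensor is described by what it measures
# and how well, so that selection can be expressed as a high-level query.

sensors = {
    "therm-1": {"measures": "temperature", "accuracy": 0.5, "unit": "celsius"},
    "gps-7":   {"measures": "location",    "accuracy": 3.0, "unit": "metre"},
    "therm-2": {"measures": "temperature", "accuracy": 0.1, "unit": "celsius"},
}

def query(measures=None, max_accuracy=None):
    """Select sensors by capability, the kind of query a sensor ontology enables."""
    return sorted(
        name for name, caps in sensors.items()
        if (measures is None or caps["measures"] == measures)
        and (max_accuracy is None or caps["accuracy"] <= max_accuracy)
    )

# e.g. query(measures="temperature", max_accuracy=0.2) selects only therm-2
```

A real ontology adds what this sketch cannot: shared vocabulary, subclass hierarchies and logical inference, so that, for instance, a "thermometer" is automatically classified as a temperature sensor.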