• Newbury, United Kingdom
Recent publications
6G is expected to go beyond communication, providing integrated sensing and computing capabilities for a vision of Connected Intelligence in which everything is connected, everything is sensed, and everything is intelligent. Integrated sensing and communication will play a vital role in the fusion of the physical and cyber worlds. The exploration of higher frequency bands, larger bandwidths, and more advanced large-antenna technologies is paving the way towards this goal. In particular, the study of THz opens the possibility of high-resolution sensing and imaging on a communication mobile device. In this paper, we take a step in this direction and demonstrate this possibility by building a THz sensing prototype with millimeter-level imaging resolution while respecting the physical aperture constraint of a typical mobile device.
Background Older people receive care from multiple providers, which often results in a lack of coordination. The Information and Communication Technology (ICT) enabled value-based methodology for integrated care (ValueCare) project aims to develop and implement efficient, outcome-based, integrated health and social care for older people with multimorbidity, and/or frailty, and/or mild to moderate cognitive impairment in seven sites (Athens, Greece; Coimbra, Portugal; Cork/Kerry, Ireland; Rijeka, Croatia; Rotterdam, the Netherlands; Treviso, Italy; and Valencia, Spain). We will evaluate the implementation and the outcomes of the ValueCare approach. This paper presents the study protocol of the ValueCare project: a protocol for a pre-post controlled study in seven large-scale sites in Europe over the period between 2021 and 2023. Methods A pre-post controlled study design including three time points (baseline, post-intervention after 12 months, and follow-up after 18 months) and two groups (intervention and control) will be utilised. In each site, (net) 240 older people (120 in the intervention group and 120 in the control group), 50–70 informal caregivers (e.g. relatives, friends), and 30–40 health and social care practitioners will be invited to participate and provide informed consent. Self-reported outcomes will be measured in multiple domains: for older people, health, wellbeing, quality of life, lifestyle behaviour, and health and social care use; for informal caregivers and health and social care practitioners, wellbeing, perceived burden, and (job) satisfaction. In addition, implementation outcomes will be measured in terms of acceptability, appropriateness, feasibility, fidelity, and costs. To evaluate differences in outcomes between the intervention and control groups, (multilevel) logistic and linear regression analyses will be used. Qualitative analysis will be performed on the focus group data.
Discussion This study will provide new insights into the feasibility and effectiveness of a value-based methodology for integrated care supported by ICT for older people, their informal caregivers, and health and social care practitioners in seven different European settings. Trial registration ISRCTN registry number is 25089186. Date of trial registration is 16/11/2021.
Can you still remember a world without digital services? If so, you were probably born before 1990. Your first music collection most likely consisted of CDs or vinyl rather than a Spotify playlist. On Friday evenings, a trip to the local video rental store was a fixed part of preparing for the weekend, to rent one of the new blockbusters on VHS or DVD. The digitalisation of services is not only changing our daily lives. It also opens up an enormous number of opportunities for companies across different sectors and industries to learn from these examples and to harness the best approaches, thought patterns, and concepts for themselves. In particular, the various disciplines of strategic and operational marketing are confronted with entirely new questions and challenges by these possibilities. This chapter addresses these challenges from the perspective of customer value management, with a focus on business-to-consumer (B2C) markets and on the challenges facing decision-makers and decision-making processes within an organisation.
Moving object monitoring is becoming essential for companies and organizations that need to manage thousands or even millions of commercial vehicles or vessels, detect dangerous situations (e.g., collisions or malfunctions) and optimize their behavior. It is a task that must be executed in real-time, reporting any such situations or opportunities as soon as they appear. Given the growing sizes of fleets worldwide, a monitoring system must be highly efficient and scalable. It is becoming an increasingly common requirement that such monitoring systems should be able to automatically detect complex situations, possibly involving multiple moving objects and requiring extensive background knowledge. Building a monitoring system that is both expressive and scalable is a significant challenge. Typically, the more expressive a system is, the less flexible it becomes in terms of its parallelization potential. We present a system that strikes a balance between expressiveness and scalability. Going beyond event detection, we also present an approach towards event forecasting. We show how event patterns may be given a probabilistic description so that our system can forecast when a complex event is expected to occur. Our proposed system employs a formalism that allows analysts to define complex patterns in a user-friendly manner while maintaining unambiguous semantics and avoiding ad hoc constructs. At the same time, depending on the problem at hand, it can employ different parallelization strategies in order to address the issue of scalability. It can also employ different training strategies in order to fine-tune the probabilistic models constructed for event forecasting. Our experimental results show that our system can detect complex patterns over moving entities with minimal latency, even when the load on our system surpasses what is to be realistically expected in real-world scenarios.
It is widely expected that future networks of 6G and beyond will substantially improve on 5G. Technologies such as the Internet of Skills and Industry 4.0 will become stable and viable as a direct consequence of networks that offer sustained and reliable mobile performance. The primary challenges for future technologies are not just low latency and high bandwidth. The more critical problem Mobile Service Providers (MSPs) will face is balancing the inflated demands of network connections with customers’ trust in the network service; that is, interconnecting billions of unique devices while adhering to the agreed terms of Service Level Agreements (SLAs). To meet these targets, it is self-evident that MSPs cannot operate in a solitary environment. They must cooperate with one another in a manner that ensures trust, both among themselves and with customers. In this study, we present the BEAT (Blockchain-Enabled Accountable and Transparent) infrastructure sharing architecture. BEAT exploits the inherent properties of permissioned distributed ledgers to deliver accountability and transparency whenever infrastructure needs to be shared between providers. We also propose a lightweight method that enables device-level accountability. BEAT is designed to be deployable with only minor software upgrades to network devices such as routers. Our simulations on a resource-limited device show that BEAT adds only a few seconds of processing overhead; with the latest state-of-the-art network devices, much lower overheads can reasonably be anticipated.
Telecom operators' infrastructure is sustained by optical communication networks that provide the means for exchanging large amounts of information, which is essential for many needs of modern society. Optical networks are characterized by rapid breakthroughs in a variety of technologies. Notably, the last decade encompassed remarkable advances in the optical networking subfields of signal processing, electronics, photonics, communications, protocols, and control-plane architectures. These advancements unlocked unprecedented transmission capacities, reconfigurability, and programmability, entailing an evolution in the way networks are designed, planned, and analyzed. In this paper, we review the history of optical planning and design tools by focusing on the major enabling technologies and relevant landmarks of the last decades. We begin by pinpointing the major breakthroughs in the optical data plane, in the estimation models capturing the behavior of the transmission medium, and in the control plane. We then distil the implications of these advancements for the landscape of optical network design and analysis tools, which commonly sit “on top” of the control plane or as fully separate entities. We then offer our view of the future, in which automatic validation of optical network operations and dimensioning, jointly with learning/artificial-intelligence mechanisms, will permit zero-touch optical networking: updating, provisioning, and upgrading network capacities through automation with minimal human intervention. We conclude by proposing an architecture that encompasses the data and control planes in a comprehensive manner, paving the way towards zero-touch optical networking.
Balancing traffic among cellular networks is very challenging due to many factors. Nevertheless, the explosive growth of mobile data traffic necessitates addressing this problem. Due to the problem complexity, data-driven self-optimized load balancing techniques are leading contenders. In this work, we propose a comprehensive deep reinforcement learning (RL) framework for steering the cell individual offset (CIO) as a means for mobility load management. The state of the LTE network is represented via a subset of key performance indicators (KPIs), all of which are readily available to network operators. We provide a diverse set of reward functions to satisfy the operators' needs. For a small number of cells, we propose using a deep Q-learning technique. We then introduce various enhancements to the vanilla deep Q-learning to reduce bias and generalization errors. Next, we propose the use of actor-critic RL methods, including Deep Deterministic Policy Gradient (DDPG) and twin delayed deep deterministic policy gradient (TD3) schemes, for optimizing CIOs for a large number of cells. We provide extensive simulation results to assess the efficacy of our methods. Our results show substantial improvements in terms of downlink throughput and non-blocked users at the expense of negligible channel quality degradation.
This paper describes a recursive procedure to estimate the smallest eigenvalue of an nth-order boundary value problem under a wide set of boundary conditions. The procedure yields lower and upper bounds for that eigenvalue as well as an estimation of the associated eigenfunction, both of which are shown to converge to their exact values as the recursion index grows. A simpler version of the procedure is also displayed for the self-adjoint case. KEYWORDS cone theory, eigenvalue, Green function, invariant subspace, Lyapunov inequality, nth-order linear boundary value problem, operator norm MSC CLASSIFICATION 34B05; 34B09; 34B27; 34L15; 34L16; 47A15; 47A30; 47B65; 47G10; 47N20
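The flavour of such a bounding recursion can be shown with a finite-dimensional stand-in. The paper iterates the Green operator of the boundary value problem in a cone of functions; the analogous classical device for a positive matrix is the Collatz-Wielandt iteration, which yields lower and upper bounds on the dominant eigenvalue that tighten as the recursion index grows (the smallest BVP eigenvalue would then be the reciprocal of the Green operator's dominant one). The matrix below is an arbitrary positive example, not taken from the paper.

```python
def collatz_wielandt(matvec, x, iters=60):
    # For a positive operator A and positive vector x, Collatz-Wielandt gives
    #   min_i (Ax)_i / x_i  <=  lambda_max  <=  max_i (Ax)_i / x_i,
    # and iterating x <- Ax tightens both bounds.
    lo, hi = 0.0, float("inf")
    for _ in range(iters):
        y = matvec(x)
        ratios = [yi / xi for yi, xi in zip(y, x)]
        lo, hi = max(lo, min(ratios)), min(hi, max(ratios))
        s = sum(y)
        x = [yi / s for yi in y]  # normalise to avoid overflow
    return lo, hi

# Arbitrary positive symmetric matrix standing in for a (discretised) Green operator.
A = [[2.0, 1.0, 0.5],
     [1.0, 2.0, 1.0],
     [0.5, 1.0, 2.0]]

def apply_A(x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

lo, hi = collatz_wielandt(apply_A, [1.0, 1.0, 1.0])
# lo and hi bracket the dominant eigenvalue of A; the smallest eigenvalue of
# the corresponding boundary value problem would be bracketed by 1/hi and 1/lo.
```

The normalised iterates converge to the dominant eigenvector, which parallels the paper's estimation of the associated eigenfunction.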
Mobile phones have been used to monitor mobility changes during the COVID-19 pandemic, but surprisingly few studies have addressed in detail the implementation of practical applications involving whole populations. We report a method of generating a “mobility index” and a “stay-at-home/resting index” based on aggregated, anonymous Call Detail Records of almost all subscribers in Hungary, which covers all phones; we examine its strengths and weaknesses and compare it with Google's Community Mobility Reports, which are limited to smartphone data. The impact of policy changes, such as school closures, could be identified with sufficient granularity to capture a rush to shops prior to the imposition of restrictions. Anecdotal reports of large-scale movement of Hungarians to holiday homes were confirmed. At the national level, our results correlated well with Google mobility data, although there were some differences at weekends and on national holidays, which can be explained by methodological differences. Mobile phones offer a means to analyse population movement, but there are several technical and privacy issues. Overcoming these, our method is a practical and inexpensive way forward, achieving high levels of accuracy and resolution, especially where uptake of smartphones is modest, although it is not an alternative to smartphone-based solutions used for contact tracing and quarantine monitoring.
Small cells are low-power, short-range radio access nodes that can be used to increase network capacity in small areas with high traffic demand and to provide coverage to small isolated zones. As their number is expected to be very high, connecting them to the aggregation point in a cost-efficient manner is a key challenge, and there is consensus that none of the current technologies is valid for all scenarios. Since lamp-posts are among the most convenient locations for outdoor small cells, this work analyzes the use of power line communications (PLC) over outdoor public lighting networks (OPLN) as a backhaul technology for outdoor small cell deployments. To this end, the characteristics of the PLC channels established in OPLN are first assessed. The analysis combines noise measurements performed in existing OPLN and channel responses obtained from a multiconductor transmission line (MTL) model. The attenuation, delay spread, and spatial correlation of the multiple-input multiple-output (MIMO) channels are investigated, and their influence on the physical layer parameters of PLC systems is discussed. Afterward, the performance achieved by state-of-the-art PLC systems is assessed using estimations obtained by means of simulations and measurements taken in actual networks. Data throughput achieved by PLC systems based on the ITU-T G.hn standard, as well as the expected improvements obtained by 3×3 MIMO systems, is given. Results indicate that PLC can be an interesting technology for coverage-driven small cell deployments with backhaul lengths shorter than 150 m, offering throughput values similar to existing sub-6 GHz wireless solutions in non-line-of-sight (NLOS) conditions and to wired ones such as G.fast.
Better relaxing lockdown together Even during a pandemic, all countries—even islands—are dependent in one way or another on their neighbors. Without coordinated relaxation of nonpharmaceutical interventions (NPIs) among the most closely connected countries, it is difficult to envisage maintaining control of infectious viruses such as severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Ruktanonchai et al. used mobility data from smartphones to estimate movements between administrative units across Europe before and after the implementation of NPIs for coronavirus disease 2019 (COVID-19). Modeling disease dynamics under alternative scenarios of countries releasing NPIs, in particular stay-at-home orders, showed that if countries do not coordinate their NPIs when they relax lockdown, resurgence of disease occurs sooner. Coordination of on-off NPIs would significantly increase their effectiveness at reducing transmission across Europe. Science , this issue p. 1465
Access to financial institutions is difficult in developing economies and especially for the poor. However, the widespread adoption of mobile phones has enabled the development of mobile money systems that deliver financial services through the mobile phone network. Despite the success of mobile money, there is a lack of quantitative studies that unveil which factors contribute to the adoption and sustained usage of such services. In this paper, we describe the results of a quantitative study that analyzes data from the world's leading mobile money service, M-Pesa. We analyzed millions of anonymized mobile phone communications and M-Pesa transactions in an African country. Our contributions are threefold: (1) we analyze the customers' usage of M-Pesa and report large-scale patterns of behavior; (2) we present the results of applying machine learning models to predict mobile money adoption (AUC=0.691), and mobile money spending (AUC=0.619) using multiple data sources: mobile phone data, M-Pesa agent information, the number of M-Pesa friends in the user's social network, and the characterization of the user's geographic location; (3) we discuss the most predictive features in both models and draw key implications for the design of mobile money services in a developing country. We find that the most predictive features are related to mobile phone activity, to the presence of M-Pesa users in a customer's ego-network and to mobility. We believe that our work will contribute to the understanding of the factors playing a role in the adoption and sustained usage of mobile money services in developing economies.
The combination of the increased availability of large amounts of fine-grained human behavioral data and advances in machine learning is presiding over a growing reliance on algorithms to address complex societal problems. Algorithmic decision-making processes might lead to more objective and thus potentially fairer decisions than those made by humans, who may be influenced by greed, prejudice, fatigue, or hunger. However, algorithmic decision-making has been criticized for its potential to enhance discrimination, information and power asymmetry, and opacity. In this paper, we provide an overview of available technical solutions to enhance fairness, accountability, and transparency in algorithmic decision-making. We also highlight the criticality and urgency of engaging multi-disciplinary teams of researchers, practitioners, policy-makers, and citizens to co-develop, deploy, and evaluate real-world algorithmic decision-making processes designed to maximize fairness and transparency. In doing so, we describe the Open Algorithms (OPAL) project as a step towards realizing the vision of a world where data and algorithms are used as lenses and levers in support of democracy and development.
Data visualisation is one of the most common mechanisms to explore data. It is therefore no surprise that a broad array of techniques and tools is available today to visually explore data. However, data may also be perceived through other sensory channels, such as touch, taste, or sound. In this paper we propose Musical Data, a novel interactive demo that transforms mobile usage data into music. In the same way that there is a visual language for interpreting data visualisations, we can draw on the musical language to interpret the music generated from the data. Musical Data offers two key advantages: first, it enables visually impaired individuals to make sense of complex data; second, Musical Data, used by itself or combined with data visualisations, opens new possibilities in terms of customer understanding and human-computer interaction, as musical patterns may provide a novel perspective for understanding the behavior of mobile users.
Introduction: This accompanying editorial provides a brief introduction to this focus theme on “Machine Learning and Data Analytics in Pervasive Health”. Objective: The innovative use of machine learning technologies, combining small and big data analytics, will support better provisioning of healthcare to citizens. This focus theme aims to present contributions at the crossroads of pervasive health technologies and data analytics as key enablers for achieving personalised medicine for diagnosis and treatment purposes. Methods: A call for papers was announced to all participants of the “11th International Conference on Pervasive Computing Technologies for Healthcare”, to different working groups of the International Medical Informatics Association (IMIA) and the European Federation for Medical Informatics (EFMI), and was published in June 2017 on the website of Methods of Information in Medicine. A peer review process was conducted to select the papers for this focus theme. Results: Four papers were selected for this focus theme. The paper topics cover a broad range of machine learning and data analytics applications in healthcare, including detection of injurious subtypes of patient-ventilator asynchrony, early detection of cognitive impairment, effective use of small data sets for estimating the performance of radiotherapy in bladder cancer treatment, and the use of negation detection in, and information extraction from, unstructured medical texts. Conclusions: The use of machine learning and data analytics technologies in healthcare is receiving renewed impetus due to the availability of large amounts and new sources of human behavioral and physiological data, such as those captured by mobile and pervasive devices traditionally considered non-mainstream for healthcare provision and management.
This article describes how the alignment of business and information technology (IT) strategies impacts organisational performance. Alignment involves an entire organisation; however, much of the research has focused on the factors affecting alignment at the senior executive level, and less attention appears to have been paid to factors affecting the lower, operational levels. This article attempts to address this gap in the literature through a case study of a healthcare organisation. Semi-structured interviews with ten employees at an operational level were qualitatively analysed to elucidate these factors. Organisational culture, management expectations, communication, and the provision and recognition of skills were identified as the main factors that may affect the alignment of business and IT strategies at the lower levels.
With 5th generation (5G) cellular systems, both traditional services from earlier cellular generations and new services will be provided. In preparation for roll‐out and also the evaluation of options, a number of technology and service trials are taking place. This chapter presents the roadmap of the expected standardization activities towards a full 5G system design. It identifies the main standardization bodies and timelines for a two‐phase approach, where the first phase focuses only on a sub‐set of use cases and technology solutions, while the second phase reflects the full 5G standard addressing all main services types. The chapter then covers trials and early commercialization plans in the three regions Europe, Americas and Asia. A number of technical and service‐oriented tests have been performed and are planned, and an early commercial launch will happen as soon as the first phase standard solution is agreed and equipment is available.
The focus of this study is the performance of high-density truck platooning achieved with different wireless technologies for vehicle-to-vehicle (V2V) communications. Platooning brings advantages such as lower fuel consumption and better traffic efficiency, which are maximized when the inter-vehicle spacing can be steadily maintained at a feasible minimum. This can be achieved with Cooperative Adaptive Cruise Control, an automated cruise controller that relies on the complex interplay among V2V communications, on-board sensing, and actuation. This work provides a clear mapping between the performance of the V2V communications, which is measured in terms of latency and reliability, and of the platoon, which is measured in terms of achievable inter-truck spacing. Two families of radio technologies are compared: IEEE 802.11p and 3GPP Cellular-V2X (C-V2X). The C-V2X technology considered in this work is based on the Release 14 of the LTE standard, which includes two modes for V2V communications: Mode 3 (base-station-scheduled) and Mode 4 (autonomously-scheduled). Results show that C-V2X in both modes allows for shorter inter-truck distances than IEEE 802.11p due to more reliable communications performance under increasing congestion on the wireless channel caused by surrounding vehicles.
This paper is focused on techniques for maximizing utility across all users within a total network transit cost budget. We present a new method for selecting between replicated servers distributed over the Internet. First, we introduce a novel utility framework that factors in quality of service metrics. Then we design an optimization algorithm, solvable in polynomial time, to allocate user requests to servers based on utility while satisfying network transit cost constraints, mapping service names to service instance locators. We then describe an efficient, low overhead distributed model which only requires knowledge of a fraction of the data required by the global optimization formulation. Next, a load-balancing variant of the algorithm is explored that substantially reduces blocking caused by congested servers. Extensive simulations show that our method is scalable and leads to higher user utility compared with mapping user requests to the closest service replica, while meeting network traffic cost constraints. We discuss several options for real-world deployment that require no changes to end-systems based on either the use of SDN controllers or extensions to the current DNS system.
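The allocation problem described above can be sketched in miniature. The paper solves it as a polynomial-time optimization; the greedy upgrade heuristic, utilities, and costs below are illustrative assumptions only, not the authors' algorithm.

```python
def allocate(utility, cost, budget):
    """Assign each user to one server replica, maximizing total utility
    while keeping total transit cost within the budget.
    utility[u][s] and cost[u][s] are per user u and server replica s."""
    n_users = len(utility)
    # Start each user on the cheapest replica, then spend the remaining
    # budget on the upgrades with the best utility gain per unit of extra cost.
    choice = [min(range(len(cost[u])), key=lambda s: cost[u][s]) for u in range(n_users)]
    spent = sum(cost[u][choice[u]] for u in range(n_users))
    while True:
        best = None
        for u in range(n_users):
            for s in range(len(cost[u])):
                d_cost = cost[u][s] - cost[u][choice[u]]
                d_util = utility[u][s] - utility[u][choice[u]]
                if d_util > 0 and spent + d_cost <= budget:
                    score = d_util / max(d_cost, 1e-9)   # gain per extra cost
                    if best is None or score > best[0]:
                        best = (score, u, s, d_cost)
        if best is None:          # no affordable improvement remains
            break
        _, u, s, d_cost = best
        spent += d_cost
        choice[u] = s
    return choice, spent

# Two users, two replicas: replica 1 is better but costs more, and the
# budget only covers one upgrade.
utility = [[1, 5], [1, 5]]
cost = [[1, 3], [1, 3]]
choice, spent = allocate(utility, cost, budget=4)
```

A greedy pass like this trades optimality for speed; the closest-replica baseline the paper compares against corresponds to skipping the upgrade loop entirely.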
497 members
CiacuGrasu Nicoleta
  • Department of Communication
Mourad Kara
  • Vodafone Group Technology
Ahmed Samir Roshdy
  • Department of Information Technology
Muslim Elkotob
  • Enterprise