Article

Structural Vibration Signal Denoising Using Stacking Ensemble of Hybrid CNN-RNN


Abstract

Vibration signals have been increasingly utilized in various engineering fields for analysis and monitoring purposes, including structural health monitoring, fault diagnosis, and damage detection, where vibration signals can provide valuable information about the condition and integrity of structures. In recent years, there has been a growing trend towards the use of vibration signals in the field of bioengineering. Activity-induced structural vibrations, particularly footstep-induced signals, are useful for analyzing the movement of biological systems such as the human body and animals; they provide valuable information regarding an individual’s gait, body mass, and posture, making them an attractive tool for health monitoring, security, and human-computer interaction. However, the presence of various types of noise can compromise the accuracy of footstep-induced signal analysis. In this paper, we propose a novel ensemble model that leverages both an ensemble of multiple signals and an ensemble of recurrent and convolutional neural network predictions. The proposed model consists of three stages: preprocessing, hybrid modeling, and ensemble. In the preprocessing stage, features are extracted using the Fast Fourier Transform (FFT) and wavelet transform to capture the underlying physics-governed dynamics of the system and extract spatial and temporal features. In the hybrid modeling stage, a bi-directional LSTM is used to denoise the noisy signal concatenated with the FFT results, and a CNN is used to obtain a condensed feature representation of the signal. In the ensemble stage, three layers of a fully-connected neural network produce the final denoised signal. The proposed model addresses the challenges associated with structural vibration signals and outperforms prevailing algorithms across a wide range of noise levels, as evaluated using PSNR, SNR, and WMAPE.
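The three evaluation metrics named in the abstract can be made concrete. Below is a minimal sketch, assuming NumPy, of how PSNR, SNR, and WMAPE might be computed between a clean reference and a denoised estimate; the signals and function names here are illustrative stand-ins, not the paper's actual data or code:

```python
import numpy as np

def snr_db(clean, estimate):
    # Signal-to-noise ratio: signal power over residual power, in dB
    residual = clean - estimate
    return 10 * np.log10(np.sum(clean ** 2) / np.sum(residual ** 2))

def psnr_db(clean, estimate):
    # Peak signal-to-noise ratio: peak amplitude squared over MSE, in dB
    mse = np.mean((clean - estimate) ** 2)
    return 10 * np.log10(np.max(np.abs(clean)) ** 2 / mse)

def wmape(clean, estimate):
    # Weighted mean absolute percentage error (lower is better)
    return np.sum(np.abs(clean - estimate)) / np.sum(np.abs(clean))

# Stand-in for a footstep-induced vibration signal and a model output
t = np.linspace(0, 1, 1000)
clean = np.sin(2 * np.pi * 5 * t)
denoised = clean + 0.01 * np.cos(2 * np.pi * 50 * t)

print(snr_db(clean, denoised), psnr_db(clean, denoised), wmape(clean, denoised))
```

Higher PSNR/SNR and lower WMAPE indicate better denoising; a perfect reconstruction gives WMAPE of exactly zero.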


... Privacy considerations, including data anonymization and consent management, play a pivotal role in compliance [5], [6], [7], [8]. ...
... Conclusion and Implications: The study concludes that while organizations have made strides in securing data in cloud-based data warehousing, challenges persist. The implications of data breaches, compliance violations, and evolving threats underscore the importance of a proactive and adaptive approach to data security [8]. ...
Research Proposal
Full-text available
As organizations increasingly leverage cloud-based data warehousing solutions to manage and analyze vast datasets, ensuring robust data security becomes paramount. This paper explores the challenges, strategies, and technologies involved in safeguarding sensitive information within cloud-based data warehousing systems. The abstract outlines key considerations for implementing effective security measures, including encryption, access controls, and monitoring mechanisms. By examining current practices and emerging trends, this study contributes valuable insights to the ongoing discourse on fortifying data security in the era of cloud-based data warehousing.
... The literature discusses various ML algorithms, including supervised learning for intrusion detection and unsupervised learning for anomaly detection, showcasing their efficacy in addressing specific security challenges. [5], [6], [7], [8]. ...
... Prepare a detailed report outlining the research process, findings, and implications for future work [8]. This methodology provides a systematic approach to integrating machine learning into cloud computing security, ensuring a thorough exploration of the selected security challenges and the evaluation of the proposed solution [26], [27], [28], [29], [30]. ...
Research Proposal
Full-text available
Cloud computing has become an integral part of modern IT infrastructures, offering scalable resources and services to users globally. However, the increased reliance on cloud technologies brings forth new security challenges. This paper explores the evolving landscape of security threats in cloud computing and proposes a novel approach leveraging machine learning techniques for enhanced security measures. The integration of machine learning algorithms aims to proactively identify and mitigate potential risks, providing a robust defense mechanism against emerging threats. The study delves into the intricacies of implementing machine learning in cloud security, addressing issues such as data privacy, unauthorized access, and advanced persistent threats. Through a comprehensive analysis, this paper contributes valuable insights into fortifying cloud computing environments against evolving cybersecurity challenges.
... The subsequent sections of this paper will build upon this foundation, delving into specific technologies, applications, and future trends within the realm of power electronics in electric vehicles. [5], [6], [7], [8]. ...
... The results emphasize the need for robust cooling mechanisms and materials that can withstand the rigors of high-performance power electronics, ensuring longevity and reliability in electric vehicles [8]. Studies exploring cost-effectiveness and scalability reveal the challenges of balancing performance requirements with cost considerations. Researchers advocate for advancements that make electric vehicles more economically viable and scalable, promoting mass adoption and market competitiveness. ...
Research Proposal
Full-text available
The electrification of transportation has witnessed a paradigm shift with the widespread adoption of electric vehicles (EVs). At the core of EV technology lies power electronics, a key enabler for efficient energy conversion and management. This paper provides a comprehensive overview of the role, advancements, and challenges of power electronics in electric vehicles. From the powertrain to charging infrastructure, the abstract explores how power electronics enhances performance, extends range, and contributes to the sustainability of electric mobility. Key technologies, such as inverters, converters, and energy management systems, are examined in the context of EVs. Additionally, the abstract delves into emerging trends, standards, and future directions shaping the landscape of power electronics in the rapidly evolving domain of electric vehicles.
... Studies investigate the incorporation of machine learning models that possess the ability to change and adjust to ever-shifting threat environments. By addressing the issue of threat evolution, this strategy makes sure that security measures continue to be effective against novel and sophisticated attacks [5], [6], [7], [8]. ...
Article
Full-text available
The increasing intricacy of cyber threats within cloud computing environments demands novel strategies for strong security protocols. To strengthen cloud computing security, this paper investigates the proactive approach of integrating machine learning algorithms. Within the framework of cloud security, the abstract explores the various uses of machine learning, such as anomaly detection, threat identification, and behavioral analysis. The study assesses the effectiveness of both supervised and unsupervised learning models, emphasizing how flexible they are in response to changing threat environments. The abstract also covers the potential for continuous learning to keep up with changing security challenges and how machine learning can improve real-time incident response. By looking at how cloud security and machine learning work together, this paper attempts to give a thorough overview of contemporary approaches and insights into how secure cloud computing is developing.
... The literature discusses various ML algorithms, including supervised learning for intrusion detection and unsupervised learning for anomaly detection, showcasing their efficacy in addressing specific security challenges. [5], [6], [7], [8]. ...
... Literature explores methodologies for maintaining comprehensive metadata repositories and ensuring the quality of data inputs, thereby enhancing the reliability and interpretability of AI-driven insights. [5], [6], [7], [8]. ...
Article
Full-text available
The synergy between data warehousing and artificial intelligence (AI) has become increasingly vital in the era of data-driven decision-making. This paper explores the recent advancements in data warehousing techniques tailored for optimal integration with AI applications. The abstract discusses the evolution of traditional data warehousing approaches, highlighting the specific requirements and challenges posed by the demands of AI. The paper delves into innovative solutions, such as parallel processing, advanced indexing, and in-memory databases, to efficiently handle large-scale datasets crucial for AI model training and inference. Additionally, it discusses the importance of data quality, metadata management, and scalability in the context of AI-driven analytics. By examining the intersection of data warehousing and AI, this paper aims to provide a comprehensive overview of state-of-the-art methodologies, fostering a deeper understanding of how modern data warehousing practices can fuel advancements in artificial intelligence applications.
... As we move forward in this exploration, the subsequent sections will delve deeper into specific aspects of technology management, shedding light on the strategies and practices that define success in an era shaped by rapid technological evolution. [10], [11], [12], [13]. ...
Article
Full-text available
As technology continues to drive the modern business landscape, effective technology management becomes paramount for organizations seeking a delicate balance between fostering innovation and managing costs. This paper explores the economic perspectives inherent in technology management, shedding light on strategies that enable businesses to navigate the dynamic terrain of innovation while maintaining financial sustainability. By examining case studies, theoretical frameworks, and industry best practices, this research aims to provide insights into the intricate interplay between economic considerations and technological advancements. From investment decisions to risk management, the abstract delves into the multifaceted realm of technology management, offering a nuanced understanding of how organizations can thrive in an era defined by rapid technological evolution.
... This data can then be used to refine product descriptions, modify pricing tactics, or even guide inventory choices. The aim is to generate a more reactive and flexible retail setting that constantly progresses depending on customer sentiment [20]. ...
Article
Full-text available
This research aims to comprehensively review the current state of artificial intelligence techniques for emotional recognition and their potential applications in optimizing digital advertising strategies. A systematic literature review was conducted involving searches of academic databases and screening of papers on topics relating to emotion recognition using facial analysis, sentiment analysis, computational advertising, and measuring digital engagement.
... Studies underscore the critical role of foundational practices such as data management and model development in achieving success in AI/ML-driven software engineering projects. Challenges in AI/ML Integration: Despite the potential benefits of AI and ML technologies, their integration into software engineering processes presents various challenges. Liang et al. (2022) identify data quality and availability as significant hurdles, particularly in domains with sparse or unstructured data. The interpretability and explainability of AI models are highlighted as key concerns by Yang et al. (2020), who emphasize the importance of transparent and interpretable AI systems, especially in safety-critical applic ...
Article
In the rapidly evolving landscape of software engineering, the integration of artificial intelligence (AI) and machine learning (ML) technologies presents both opportunities and challenges. This paper explores the best practices and challenges associated with leveraging AI and ML in software engineering processes. We discuss the importance of efficient software engineering practices in the era of AI and ML, highlighting key considerations such as data management, algorithm selection, model deployment, and ethical considerations. By addressing these challenges and adopting best practices, software engineers can harness the power of AI and ML to develop robust and scalable software solutions that meet the evolving needs of modern organizations. Introduction: In today's digital age, software engineering is undergoing a paradigm shift with the widespread adoption of artificial intelligence (AI) and machine learning (ML) technologies. This introduction sets the stage for the discussion by highlighting the increasing importance of efficient software engineering practices in leveraging AI and ML. We delve into the transformative potential of AI and ML in software development, emphasizing the need for best practices to address associated challenges effectively. We then consider the significance of efficient software engineering practices in the context of AI and ML integration, discussing how AI and ML technologies are reshaping traditional software development processes and driving demand for more agile, scalable, and adaptive approaches to software engineering. The section emphasizes the role of efficient software engineering practices in maximizing the value and impact of AI and ML solutions. Here, we delineate the scope of AI and ML applications in software engineering, encompassing areas such as predictive analytics, natural language processing, computer vision, and automated software testing.
We highlight the diverse range of AI and ML techniques that can enhance software development processes, from intelligent code generation to automated bug detection and resolution. The paper then identifies and discusses key challenges associated with integrating AI and ML into software engineering practices, including data quality and availability, algorithm selection, interpretability and explainability of AI models, scalability and performance optimization, and ethical considerations related to bias, fairness, and privacy. Objectives of the Review:
... • Key Findings: Cloud-based learning environments are lauded for enhancing accessibility, enabling collaborative projects, and facilitating hands-on experiences with scalable computing resources. [5], [6], [7], [8]. ...
Research Proposal
Full-text available
The rapid evolution of Artificial Intelligence (AI) and Cloud Computing has profound implications for computer science education. This paper explores the integration of AI and Cloud Computing in educational frameworks, aiming to enhance the learning experience and prepare students for the demands of the modern computing landscape. The abstract reviews current trends, pedagogical approaches, and the impact of integrating AI and Cloud Computing in computer science curricula. By addressing challenges and highlighting best practices, this study contributes to the ongoing discourse on shaping future-ready computer science education.
... The subsequent sections will build upon these insights, exploring GAN architectures, methodologies, and their implications for anomaly detection in medical images. [5], [6], [7], [8]. ...
Research Proposal
Full-text available
Medical imaging plays a pivotal role in diagnostics, but the early detection of anomalies remains a significant challenge. This paper presents a comprehensive study on the application of Generative Adversarial Networks (GANs) for anomaly detection in medical images. GANs, known for their ability to generate realistic data, have shown promise in learning normal patterns and identifying anomalies. The abstract reviews the current state of anomaly detection in medical imaging, introduces GANs as a powerful tool, and delves into various architectures and methodologies employed. The study explores the strengths, limitations, and potential advancements in leveraging GANs for enhanced anomaly detection in medical images, contributing valuable insights to the intersection of deep learning and medical diagnostics.
... Multi-antenna configurations offer flexibility in adapting to the dynamic nature of the automotive environment. [5], [6], [7], [8]. ...
Research Proposal
Full-text available
As electric vehicles (EVs) become integral to the future of transportation, ensuring reliable and high-performance connectivity is paramount. This paper presents an in-depth exploration of antenna design strategies aimed at enhancing connectivity in electric vehicles. The abstract reviews key challenges associated with in-vehicle communication, such as signal attenuation and interference, and delves into innovative antenna solutions. From diversity antennas to advanced beamforming techniques, the paper investigates how antenna design can optimize communication between EVs and external networks, paving the way for improved navigation, telematics, and over-the-air updates. The findings contribute to the growing body of knowledge essential for achieving seamless connectivity in the rapidly evolving landscape of electric mobility.
... The subsequent sections will build upon this foundation, exploring specific applications and implications of AI in enhancing analytics within data warehousing. [5], [6], [7], [8]. ...
Research Proposal
Full-text available
The fusion of Artificial Intelligence (AI) with data warehousing presents a transformative paradigm for analytics, offering unprecedented insights and efficiency. This paper explores the symbiotic relationship between AI and data warehousing, highlighting the advancements, applications, and impact on analytical capabilities. The abstract delves into key AI techniques such as machine learning, natural language processing, and predictive analytics within the context of data warehousing. Through case studies and industry examples, the paper showcases how AI elevates data warehousing, enabling intelligent data processing, pattern recognition, and adaptive analytics. The discussion encompasses challenges, ethical considerations, and future trajectories, providing a comprehensive overview of the evolving landscape where AI and data warehousing converge to redefine the boundaries of analytics.
... This approach ensures that security measures remain effective against new and sophisticated attacks, addressing the challenge of threat evolution. [5], [6], [7], [8]. ...
Research Proposal
Full-text available
The escalating complexity of cyber threats in cloud computing environments necessitates innovative approaches for robust security measures. This paper explores the integration of machine learning algorithms as a proactive strategy to fortify cloud computing security. The abstract delves into the diverse applications of machine learning, including anomaly detection, threat identification, and behavioral analysis, within the context of cloud security. The paper evaluates the efficacy of supervised and unsupervised learning models, highlighting their adaptability to dynamic threat landscapes. Additionally, the abstract discusses the role of machine learning in enhancing real-time incident response and the potential for continual learning to stay abreast of evolving security challenges. By examining the symbiotic relationship between machine learning and cloud security, this paper aims to provide a comprehensive overview of state-of-the-art methodologies, offering insights into the evolving landscape of secure cloud computing.
... Researchers explore the use of novel materials and nanostructuring techniques to increase energy density, contributing to the development of batteries with extended ranges and reduced weight. [5], [6], [7], [8]. ...
Research Proposal
Full-text available
The acceleration of electric vehicle (EV) adoption hinges on breakthroughs in battery technology, particularly advancements in battery materials. This paper provides a comprehensive overview of recent innovations in battery materials, exploring their potential to revolutionize the next generation of electric vehicles. From cathode and anode materials to electrolytes and beyond, the abstract delves into cutting-edge developments that aim to enhance energy density, lifespan, charging speed, and overall performance of electric vehicle batteries. By scrutinizing the latest research and innovations, this paper aims to shed light on the pivotal role played by battery materials in shaping the future landscape of electric mobility.
... Integration enables seamless communication and enhances overall system efficiency. [10], [11], [12], [13], [14], [15]. ...
Research Proposal
Full-text available
The electrification of vehicles has spurred advancements in power electronics and battery management systems, playing a pivotal role in enhancing the performance and efficiency of electric vehicles (EVs). This paper explores recent developments in power electronics for EVs, with a specific focus on innovations in battery management. The abstract provides a concise overview of the research, methodologies employed, and key findings, offering insights into the evolving landscape of electric vehicle technology.
... As we embark on this exploration of AGVs, it becomes evident that these autonomous entities are not merely machines; they represent a paradigm shift in how we interact with our environment, automate tasks, and extend our reach into the far reaches of our solar system and beyond. [6], [7], [8], [9], [10]. ...
Research Proposal
Full-text available
Robotics, particularly the realm of autonomous ground vehicles (AGVs), is experiencing a profound transformation, with innovations reshaping industries and daily life. This paper explores the current state and future potential of AGVs, investigating their applications, challenges, and technological advancements. From unmanned ground vehicles (UGVs) enhancing logistics to planetary exploration rovers expanding our understanding of distant worlds, the abstract provides a concise overview of the vast landscape where robotics is in action.
... • The literature explores the diverse applications of antennas in wireless communication, spanning cellular networks, Wi-Fi, satellite communication, and emerging technologies like 5G connectivity. [9], [10], [11]. ...
Research Proposal
Full-text available
Advancements in antenna design play a pivotal role in meeting the increasing demands of modern communication systems. This paper explores innovations in antenna design with a specific focus on materials, aiming to enhance overall performance. The abstract provides a succinct overview of the research, highlighting the significance of materials in pushing the boundaries of antenna capabilities and addressing the challenges posed by evolving communication technologies.
... RNNs and LSTMs, specifically, are designed to recognize sequences and remember patterns over long durations [89]-[91]. In financial sectors, where time-series data is abundant, this is a game-changer. ...
Article
Full-text available
Banking fraud prevention and risk management are paramount in the modern financial landscape, and the integration of Artificial Intelligence (AI) offers a promising avenue for advancements in these areas. This research delves into the multifaceted applications of AI in detecting, preventing, and managing fraudulent activities within the banking sector. Traditional fraud detection systems, predominantly rule-based, often fall short in real-time detection capabilities. In contrast, AI can swiftly analyze extensive transactional data, pinpointing anomalies and potentially fraudulent activities as they transpire. One of the standout methodologies includes the use of deep learning, particularly neural networks, which, when trained on historical fraud data, can discern intricate patterns and predict fraudulent transactions with remarkable precision. Furthermore, the enhancement of Know Your Customer (KYC) processes is achievable through Natural Language Processing (NLP), where AI scrutinizes textual data from various sources, ensuring customer authenticity. Graph analytics offers a unique perspective by visualizing transactional relationships, potentially highlighting suspicious activities such as rapid fund transfers indicative of money laundering. Predictive analytics, transcending traditional credit scoring methods, incorporates a diverse data set, offering a more comprehensive insight into a customer's creditworthiness. The research also underscores the importance of user-friendly interfaces like AI-powered chatbots for immediate reporting of suspicious activities and the integration of advanced biometric verifications, including facial and voice recognition. Geospatial analysis and behavioral biometrics further bolster security by analyzing transaction locations and user interaction patterns, respectively. A significant advantage of AI lies in its adaptability. 
Self-learning systems ensure that as fraudulent tactics evolve, the AI mechanisms remain updated, maintaining their efficacy. This adaptability extends to phishing detection, IoT integration, and cross-channel analysis, providing a comprehensive defense against multifaceted fraudulent attempts. Moreover, AI's capability to simulate economic scenarios aids in proactive risk management.
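The abstract above surveys many detection techniques without implementation detail. As a hypothetical stand-in for the real-time anomaly-flagging idea (not any system from the cited work), a rolling z-score over recent transaction amounts can flag a transaction that deviates sharply from a customer's recent history; the window size and threshold here are arbitrary illustrative choices:

```python
import statistics

def flag_anomalies(amounts, window=5, threshold=3.0):
    # Flag a transaction whose amount deviates strongly from the recent window
    flags = []
    for i, amt in enumerate(amounts):
        history = amounts[max(0, i - window):i]
        if len(history) >= 2:
            mu = statistics.mean(history)
            sd = statistics.pstdev(history) or 1.0  # avoid division by zero
            flags.append(abs(amt - mu) / sd > threshold)
        else:
            flags.append(False)  # not enough history to judge
    return flags

amounts = [20, 22, 19, 21, 20, 500, 23, 21]
print(flag_anomalies(amounts))
```

A production system would replace this single-feature heuristic with the learned models the abstract describes, but the streaming score-then-flag structure is the same.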
... Traditional denoising methods, while effective to a degree, sometimes struggle to cater to the unique intricacies of multimodal structural vibrations [6]. Such challenges underscore the necessity for novel denoising techniques tailored to the specific nature of these signals [7]. Enter the realm of synergistic signal denoising-a cutting-edge approach that promises a more holistic treatment of multimodal vibration data [8]. ...
Preprint
Full-text available
Structural Health Monitoring (SHM) plays an indispensable role in ensuring the longevity and safety of infrastructure. With the rapid growth of sensor technology, the volume of data generated from various structures has seen an unprecedented surge, bringing forth challenges in efficient analysis and interpretation. This paper introduces a novel deep learning algorithm tailored for the complexities inherent in multimodal vibration signals prevalent in SHM. By amalgamating convolutional and recurrent architectures, the algorithm adeptly captures both localized and prolonged structural behaviors. The pivotal integration of attention mechanisms further enhances the model's capability, allowing it to discern and prioritize salient structural responses from extraneous noise. Our results showcase significant improvements in predictive accuracy, early damage detection, and adaptability across multiple SHM scenarios. In light of the critical nature of SHM, the proposed approach not only offers a robust analytical tool but also paves the way for more transparent and interpretable AI-driven SHM solutions. Future prospects include real-time processing, integration with external environmental factors, and a deeper emphasis on model interpretability.
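The attention mechanism highlighted in the abstract above can be sketched in miniature. Assuming NumPy, with randomly generated stand-ins for recurrent hidden states (the actual model's dimensions, query, and training are not specified in the abstract), dot-product attention scores each time step, normalizes the scores with a softmax, and pools the states into a single context vector:

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D score vector
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

def attention_pool(states, query):
    # Score each time step against a query vector, then weight-and-sum
    scores = states @ query        # shape (T,)
    weights = softmax(scores)      # attention distribution over time steps
    context = weights @ states     # shape (D,) pooled representation
    return context, weights

rng = np.random.default_rng(1)
T, D = 8, 4
states = rng.standard_normal((T, D))  # stand-in for recurrent hidden states
query = rng.standard_normal(D)        # stand-in for a learned query

context, weights = attention_pool(states, query)
print(weights)
```

The weights form a probability distribution over time steps, which is what lets such a model "prioritize salient structural responses" over noise-dominated segments.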
... This section explores how blockchain enhances protocol adherence and auditability, ensuring research rigor and transparency. [61], [62], [63], [64], [65]. ...
Article
Full-text available
Blockchain technology has emerged as a transformative force in ensuring data integrity within clinical trials. Its decentralized, tamper-proof, and transparent nature addresses critical challenges related to data manipulation, security breaches, and trust deficits. This paper explores the applications of blockchain in clinical trials, focusing on its role in enhancing data integrity. Through a comprehensive review of literature, case studies, and industry trends, this paper demonstrates how blockchain can revolutionize data management, data sharing, and participant consent processes. By leveraging blockchain's immutable data records and cryptographic security, clinical trials can establish an unprecedented level of trust, transparency, and accountability, ushering in a new era of data integrity in medical research.
Article
Full-text available
This research provides procedural insights into Cross-Cultural Management Research through the lens of a model inspired by Cultural Ecology. The model incorporates a comprehensive approach to understanding the intricacies of cross-cultural dynamics within organizational settings. By amalgamating principles from Cultural Ecology, the research outlines a systematic framework for conducting cross-cultural studies. The model encompasses key procedural steps, including literature review, study design, data collection, analysis, and interpretation. Through this holistic approach, the research aims to enhance the methodological rigor of cross-cultural management research and contribute valuable insights into the interplay between organizational dynamics and cultural diversity.
Article
Full-text available
In the era of cloud computing, where data is ubiquitously stored and processed in distributed environments, ensuring robust data security is paramount. This research delves into the application of machine learning techniques for enhancing data security in cloud computing environments. By leveraging the power of machine learning algorithms, this study aims to address emerging threats, detect anomalies, and fortify the confidentiality and integrity of sensitive information within cloud infrastructures. Through a synthesis of theoretical frameworks, empirical studies, and real-world implementations, the research provides a comprehensive exploration of the intersection between machine learning and data security in the context of cloud computing.
Article
Full-text available
AlphaZero, an artificial intelligence system developed by DeepMind, has demonstrated strategic brilliance and mastery in various board games, including Gomoku. This paper explores the lessons learned from AlphaZero's Gomoku gameplay, highlighting the strategic insights and techniques employed by the AI. By analyzing AlphaZero's decision-making processes, strategic adaptations, and overall gameplay in Gomoku, this study aims to extract valuable lessons applicable to broader contexts in artificial intelligence, game theory, and strategic decision-making.
Article
Full-text available
This paper delves into the transformative realm of solid-state pump technologies, with a specific focus on the integration of Electro-rheological (ER) fluids at their core. Solid-state pumps, driven by the controllable rheological properties of ER fluids under the influence of an electric field, offer unique advantages in various engineering applications. The study explores the fundamental principles of ER fluids, their application in solid-state pump designs, and the potential implications for fluid transport and actuation systems. Through a combination of theoretical analysis and experimental validation, this research aims to contribute to the advancement of solid-state pump technologies, emphasizing the role of ER fluids in achieving efficient and adaptive fluid manipulation.
Article
Full-text available
Structural vibration analysis is crucial for monitoring and ensuring the integrity of various engineering systems. However, the collected vibration signals often suffer from noise, making it challenging to extract meaningful information. This paper proposes an advanced signal denoising approach by leveraging the combined power of Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs). The proposed model, referred to as CNN-RNN Stack, aims to exploit the spatial and temporal characteristics of structural vibration signals for enhanced denoising performance. The CNN component is designed to capture spatial features within short signal segments, effectively identifying patterns and relevant information. The RNN component, on the other hand, is employed to model the temporal dependencies inherent in structural vibrations, enabling the network to grasp long-term patterns and contextual information. By stacking these two architectures, our model synergistically combines their strengths, resulting in a comprehensive denoising solution for structural vibration signals. To validate the effectiveness of the proposed approach, extensive experiments are conducted on real-world vibration datasets, demonstrating superior denoising performance compared to standalone CNN or RNN models. The CNN-RNN Stack exhibits remarkable noise reduction capabilities, making it a promising solution for enhancing the accuracy of structural vibration analysis in various engineering applications.
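The stacking idea described above, a spatially oriented learner combined with a temporally oriented one through a trained combiner, can be illustrated with a minimal NumPy sketch. The two base learners here are deliberately simple stand-ins (a smoothing convolution for the CNN's local feature extraction and a recursive exponential filter for the RNN's temporal modeling, not the paper's trained networks), and the meta-learner is an ordinary least-squares combiner; the signal, noise level, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a clean two-tone "vibration" signal plus Gaussian noise.
t = np.linspace(0, 4 * np.pi, 512)
clean = np.sin(t) + 0.5 * np.sin(3 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)

def conv_denoiser(x, width=7):
    """CNN-like base learner: a local (spatial) smoothing convolution."""
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

def recurrent_denoiser(x, alpha=0.15):
    """RNN-like base learner: a recursive (temporal) exponential filter."""
    y = np.empty_like(x)
    y[0] = x[0]
    for i in range(1, x.size):
        y[i] = alpha * x[i] + (1 - alpha) * y[i - 1]
    return y

# Stacking: fit a linear meta-learner (least squares) that combines the
# two base predictions against the clean reference signal.
base = np.column_stack([conv_denoiser(noisy), recurrent_denoiser(noisy)])
w, *_ = np.linalg.lstsq(base, clean, rcond=None)
stacked = base @ w

mse = lambda y: np.mean((y - clean) ** 2)
print(mse(noisy), mse(stacked))  # the stacked output should have lower MSE
```

The point of the sketch is the architecture of the ensemble, not the base learners themselves: replacing the two filters with trained CNN and RNN denoisers leaves the stacking step unchanged.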
Article
Full-text available
Magnetorheological fluids (MRFs) represent a class of smart materials that undergo reversible changes in rheological properties in response to an external magnetic field. This paper explores the dynamic applications of MRFs, focusing on their role in sealing mechanisms. The study investigates the feasibility and efficacy of utilizing MRFs in dynamic sealing systems, emphasizing their adaptability to varying operational conditions. Through an interdisciplinary approach involving material science, fluid dynamics, and engineering, the research evaluates the sealing performance and controllability of MRFs, offering insights into their potential to enhance the reliability and efficiency of dynamic sealing technologies.
Article
Full-text available
Sealing technology plays a pivotal role in numerous engineering applications, ranging from automotive systems to industrial machinery. This paper explores the revolutionary potential of Magnetorheological (MR) fluids in sealing applications. MR fluids, known for their controllable rheological properties in response to magnetic fields, offer unique advantages in enhancing the performance and adaptability of seals. The study investigates the application of MR fluids in sealing mechanisms, examining their tunable characteristics, responsiveness to varying conditions, and potential for improved sealing efficiency. Through experimental validation and analysis, the paper aims to contribute valuable insights into the transformative impact of MR fluids on sealing technology.
Article
Full-text available
This study delves into the genetic profiling of deadly cancers, aiming to analyze gene expression patterns across different cancer types. Leveraging high-throughput genomic data, the research employs advanced bioinformatics and computational methods to identify commonalities and distinctions in gene expression profiles among various lethal cancers. The analysis focuses on unveiling potential biomarkers, pathways, and molecular signatures that could contribute to a deeper understanding of these malignancies. The findings from this study not only provide insights into the shared genetic landscape of deadly cancers but also pave the way for the development of targeted therapeutic approaches and precision medicine strategies.
Article
Full-text available
This study explores the excellence of a stacking ensemble approach comprising Convolutional Neural Network (CNN) and Recurrent Neural Network (RNN) models for the denoising of vibration signals. Vibration signals are crucial indicators in various industrial and mechanical systems, but they often suffer from noise and interference that can obscure meaningful information. The proposed ensemble model leverages the strengths of both CNN and RNN architectures to effectively filter out noise and enhance the fidelity of vibration signals. The study provides a comprehensive analysis of the ensemble model's performance, comparing it with individual CNN and RNN models, and demonstrates its superior denoising capabilities. The findings present a significant advancement in signal processing techniques for vibration analysis, with potential applications in predictive maintenance and fault detection.
Article
Cloud computing has emerged as a transformative technology in healthcare, offering unprecedented opportunities for innovation, efficiency, and collaboration. This paper explores the opportunities, risks, and compliance considerations associated with the adoption of cloud computing in the healthcare sector. Drawing on a comprehensive review of existing literature and case studies, we provide insights into the benefits and challenges of cloud computing in healthcare and offer practical recommendations for healthcare organizations seeking to leverage cloud technology effectively. The opportunities afforded by cloud computing in healthcare are manifold. Cloud-based solutions enable healthcare providers to streamline operations, improve access to patient data, and enhance collaboration among care teams. Additionally, cloud computing offers scalability and flexibility, allowing healthcare organizations to adapt to changing demands and efficiently manage resources. Furthermore, cloud-based analytics and machine learning algorithms empower healthcare providers to derive actionable insights from vast amounts of data, facilitating personalized patient care and predictive analytics for disease management. However, the adoption of cloud computing in healthcare also presents inherent risks and challenges. Data security and privacy concerns are paramount, given the sensitivity of healthcare data and the regulatory requirements governing its protection. Healthcare organizations must navigate complex compliance frameworks, such as the Health Insurance Portability and Accountability Act (HIPAA), to ensure the confidentiality, integrity, and availability of patient information in the cloud. Moreover, interoperability and data portability issues may arise when integrating cloud-based systems with existing healthcare IT infrastructure, necessitating careful planning and integration strategies. 
Addressing these challenges requires a comprehensive approach to cloud adoption that encompasses risk management, regulatory compliance, and security best practices. Healthcare organizations must prioritize data security and privacy measures, such as encryption, access controls, and regular audits, to mitigate the risks associated with cloud computing. Additionally, robust contingency plans and data backup strategies are essential to ensure business continuity and data resilience in the event of security incidents or service disruptions. In conclusion, cloud computing holds immense promise for transforming healthcare delivery and improving patient outcomes.
Article
Reinforcement learning (RL) has emerged as a powerful paradigm for training autonomous software agents to make decisions in complex and dynamic environments. This abstract explores recent advances and applications of RL in diverse domains, highlighting its transformative potential and current challenges. Recent advances in RL algorithms, particularly deep reinforcement learning (DRL), have enabled significant breakthroughs in autonomous decision-making tasks. By leveraging deep neural networks, DRL algorithms can learn complex representations of state-action spaces, facilitating more effective exploration and exploitation strategies. Additionally, innovations in algorithmic improvements, such as prioritized experience replay and distributional RL, have enhanced the stability and sample efficiency of RL algorithms, enabling their deployment in real-world applications. The applications of RL span a wide range of domains, including robotics, autonomous vehicles, game playing, finance, and healthcare. In robotics, RL enables autonomous agents to learn locomotion, manipulation, and navigation tasks in complex and unstructured environments. Autonomous vehicles leverage RL for decision-making in dynamic traffic scenarios, improving safety and efficiency on the road. In finance, RL algorithms are employed for portfolio optimization, algorithmic trading, and risk management, enhancing investment strategies and decision-making processes. Moreover, in healthcare, RL facilitates personalized treatment planning, clinical decision support, and medical image analysis, empowering clinicians to deliver tailored care to patients. Despite the promising advancements and applications, RL still faces several challenges that limit its widespread adoption and scalability. These challenges include sample inefficiency, exploration-exploitation trade-offs, safety and reliability concerns, and the need for explainability and interpretability in decision-making processes. 
Addressing these challenges requires interdisciplinary collaboration, continued research into algorithmic advances, and the development of robust evaluation frameworks.
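As a concrete illustration of the core RL loop surveyed above, the following sketch trains a tabular Q-learning agent on a toy five-state chain. It is a minimal textbook example, not one of the deep RL methods (DRL, prioritized replay, distributional RL) the abstract discusses, and the environment, hyperparameters, and episode counts are all illustrative assumptions.

```python
import numpy as np

# A five-state chain MDP: the agent starts in state 0; action 1 moves right,
# action 0 moves left, and reaching state 4 pays reward +1 and ends the episode.
N_STATES, N_ACTIONS, GOAL = 5, 2, 4
rng = np.random.default_rng(1)

def step(state, action):
    nxt = min(state + 1, GOAL) if action == 1 else max(state - 1, 0)
    return nxt, float(nxt == GOAL), nxt == GOAL

# Optimistic initialization encourages early exploration of both actions.
Q = np.ones((N_STATES, N_ACTIONS))
alpha, gamma, eps = 0.5, 0.9, 0.2

for _ in range(500):                      # training episodes
    s, done = 0, False
    for _t in range(100):                 # safety cap on episode length
        a = int(rng.integers(N_ACTIONS)) if rng.random() < eps else int(np.argmax(Q[s]))
        s2, r, done = step(s, a)
        # Q-learning update: bootstrap on the greedy value of the next state.
        target = r if done else r + gamma * Q[s2].max()
        Q[s, a] += alpha * (target - Q[s, a])
        s = s2
        if done:
            break

policy = Q.argmax(axis=1)  # greedy policy: should move right in states 0-3
```

The exploration-exploitation trade-off the abstract mentions appears directly in the `eps`-greedy action choice, and sample inefficiency shows up as the hundreds of episodes needed even for this trivial task.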
Article
In the era of rapid advancements in artificial intelligence (AI), evolving computer architectures play a pivotal role in meeting the demands of AI-intensive workloads. This paper explores the challenges and innovations associated with designing computer architectures tailored for AI applications. Firstly, we examine the growing complexity and scale of AI models, which pose significant challenges in terms of computational efficiency, memory bandwidth, and power consumption. Next, we discuss the emergence of specialized hardware accelerators, such as graphics processing units (GPUs), tensor processing units (TPUs), and neuromorphic chips, designed to optimize AI workloads. Additionally, we explore novel architectural paradigms, including heterogeneous computing, in-memory computing, and reconfigurable architectures, which aim to address the unique requirements of AI algorithms. Furthermore, we investigate the role of software-hardware co-design methodologies in optimizing performance and energy efficiency for AI tasks. Despite significant progress, several challenges remain, including the need for scalable and programmable architectures, efficient memory hierarchy designs, and effective utilization of emerging technologies such as quantum computing and photonic computing. By addressing these challenges and embracing innovations in computer architecture, we can unlock the full potential of AI technologies and drive transformative advances in various domains, including healthcare, finance, autonomous systems, and beyond.
Article
Machine learning (ML) has emerged as a powerful tool for enhancing various aspects of software development, revolutionizing traditional practices and opening new avenues for innovation. This paper provides an overview of the current state of the art in ML-enhanced software development and outlines potential future directions in this rapidly evolving field. The abstract begins by highlighting the transformative impact of ML on software development processes, including requirements elicitation, design, implementation, testing, and maintenance. By leveraging large volumes of data and sophisticated algorithms, ML techniques have enabled automated code generation, intelligent bug detection, and predictive maintenance, streamlining development workflows and improving software quality. Next, the abstract discusses key challenges and limitations associated with ML-enhanced software development, such as data quality issues, algorithmic biases, and interpretability concerns. While ML algorithms have demonstrated remarkable performance in certain tasks, their effectiveness may be limited by the availability and quality of training data, as well as the interpretability of model outputs. Furthermore, the abstract explores emerging trends and future directions in ML-enhanced software development, including the integration of ML models into development tools and environments, the adoption of federated learning approaches for collaborative model training, and the exploration of ethical and societal implications of ML-powered software systems. Overall, this paper provides valuable insights into the current state of ML-enhanced software development and offers a roadmap for future research and innovation in this dynamic and rapidly evolving field. By harnessing the potential of ML technologies, software developers can unlock new capabilities, accelerate development cycles, and create more intelligent and adaptive software systems.
Article
Recent years have witnessed significant advancements in deep learning techniques for natural language processing (NLP), leading to transformative applications in various software domains. This paper provides a comprehensive review of the latest developments in leveraging deep learning for NLP tasks and their integration into software applications. We explore key areas such as sentiment analysis, named entity recognition, machine translation, text generation, and question answering, highlighting the novel architectures and methodologies that have propelled these advancements. Additionally, we discuss the challenges and opportunities associated with deploying deep learning models in real-world software applications, including issues related to data privacy, model interpretability, and scalability. Through a synthesis of recent research findings and case studies, we demonstrate the transformative impact of deep learning in enhancing the capabilities of software systems to understand and generate human-like text. Furthermore, we examine the implications of these advancements for industries such as healthcare, finance, e-commerce, and customer service, where NLP-powered software applications are driving innovation and improving user experiences. Finally, we discuss future research directions and emerging trends in deep learning for NLP, emphasizing the need for interdisciplinary collaboration and ethical considerations to ensure responsible and beneficial deployment of these technologies.
This paper provides valuable insights for researchers, practitioners, and policymakers seeking to leverage deep learning for NLP in software applications across various domains.
Article
In the domain of drug discovery, computational approaches play a pivotal role in accelerating the identification and development of novel therapeutic compounds. This study focuses on optimizing computer architectures to enhance the performance of drug discovery workflows, aiming to expedite the process of drug candidate screening and evaluation. By leveraging advanced parallel computing techniques, including GPU acceleration and distributed computing frameworks, we aim to maximize the computational efficiency and throughput of drug discovery pipelines. Our research investigates the design and implementation of tailored computer architectures capable of handling the computational demands of large-scale molecular simulations, virtual screening, and molecular dynamics simulations. Through a comprehensive evaluation of different hardware configurations, software optimizations, and parallelization strategies, we aim to identify the most effective approaches for accelerating drug discovery workflows while minimizing computational costs. Additionally, we explore the integration of emerging technologies such as deep learning and quantum computing to further enhance the predictive accuracy and efficiency of drug discovery models. The findings of this study have significant implications for pharmaceutical companies, academic research institutions, and computational biologists seeking to leverage cutting-edge computational technologies to streamline the drug discovery process. By optimizing computer architectures for high-performance drug discovery workflows, we can accelerate the pace of drug development, facilitate the discovery of new therapeutic agents, and ultimately improve patient outcomes in the field of healthcare.
Article
Ethical considerations play a pivotal role in the development and deployment of AI-driven software systems, shaping their impact on individuals, societies, and the broader global community. This abstract delves into the multifaceted ethical challenges inherent in AI technology, offering insights into the complex interplay between technical innovation, societal values, and ethical principles. The rapid advancements in AI technology have ushered in unprecedented opportunities for innovation and transformation across various sectors. However, alongside these opportunities come ethical dilemmas and concerns that necessitate careful consideration and proactive mitigation strategies. This abstract explores key ethical considerations spanning data privacy, algorithmic bias, transparency, accountability, and the societal implications of AI-driven systems. Data privacy emerges as a critical ethical concern, given the vast amounts of data generated, collected, and processed by AI systems. Safeguarding individuals' privacy rights and ensuring responsible data handling practices are paramount to building trust and fostering user confidence in AI technologies. Moreover, addressing algorithmic bias and ensuring fairness and equity in AI decision-making processes are essential for mitigating the risks of perpetuating or exacerbating existing societal inequalities. Transparency and accountability are foundational principles in ethical AI development, empowering stakeholders to understand, scrutinize, and challenge the decisions and actions of AI systems. By promoting transparency in algorithmic processes and providing mechanisms for accountability, developers can enhance the trustworthiness and reliability of AI-driven software systems. Furthermore, this abstract examines the broader societal implications of AI technology, including its impact on employment, autonomy, and human dignity. 
Ethical considerations extend beyond technical design choices to encompass the ethical and moral dimensions of AI deployment, prompting critical reflection on the ethical responsibilities of AI developers, policymakers, and end-users. In conclusion, integrating ethical considerations into all stages of AI development and deployment is imperative. By prioritizing ethical principles such as privacy, fairness, transparency, and accountability, stakeholders can navigate the complex ethical landscape of AI technology and ensure that AI-driven software systems contribute to the greater good while upholding fundamental human values and rights. This abstract provides a comprehensive overview of the ethical considerations surrounding AI-driven software systems, highlighting their importance and offering insights into potential approaches for addressing them.
Article
In recent years, the proliferation of edge computing and the Internet of Things (IoT) has revolutionized the way we interact with and deploy software in connected environments. This paradigm shift presents both challenges and opportunities for software engineering practitioners. This study explores the intricacies of software engineering for edge and IoT systems, focusing on the unique challenges posed by resource-constrained devices, heterogeneous architectures, and distributed computing environments. One of the primary challenges in software engineering for edge and IoT is the need to design and develop applications that can operate efficiently and reliably in resource-constrained environments. With limited processing power, memory, and energy resources, developers must adopt innovative approaches to optimize code size, minimize energy consumption, and ensure robustness in the face of intermittent connectivity. Another significant challenge is the heterogeneity of edge and IoT architectures, which encompass a diverse array of devices, platforms, and communication protocols. This diversity introduces complexity into the software development process, requiring developers to navigate interoperability issues, compatibility constraints, and platform-specific optimizations. Despite these challenges, software engineering for edge and IoT also presents numerous opportunities for innovation and advancement. Edge computing enables real-time processing and decision-making at the network edge, reducing latency and enhancing scalability for latency-sensitive applications. Additionally, IoT technologies facilitate the collection of vast amounts of data from distributed sensors and devices, opening new avenues for data-driven insights and intelligent automation. In conclusion, software engineering for edge and IoT represents a frontier of innovation and exploration, offering the potential to transform industries, enhance user experiences, and drive societal progress. 
By addressing the challenges and embracing the opportunities presented by edge and IoT environments, software engineers can unlock the full potential of connected systems in our increasingly interconnected world.
Article
The integration of artificial intelligence (AI) systems into various domains brings forth unprecedented opportunities for innovation and efficiency. However, alongside these advancements, concerns regarding trust and security have emerged as critical challenges that must be addressed to ensure the safe and responsible deployment of AI technologies. This abstract explores the multifaceted landscape of trust and security in AI, highlighting the challenges faced and proposing potential solutions to mitigate risks and foster trustworthiness. The rapid proliferation of AI technologies across sectors such as healthcare, finance, autonomous systems, and cybersecurity has underscored the importance of ensuring trust and security in AI systems. Key challenges include the vulnerability of AI models to adversarial attacks, the lack of transparency and interpretability in AI decision-making processes, and the potential for bias and discrimination in AI algorithms. These challenges pose significant risks to the reliability, fairness, and safety of AI systems, undermining user confidence and hindering widespread adoption. To address these challenges, a holistic approach to trust and security in AI is essential, encompassing technical, regulatory, and ethical dimensions. Technical solutions such as robustness testing, adversarial training, and model explainability techniques can enhance the resilience and transparency of AI systems, enabling stakeholders to better understand and trust AI-driven decisions. Additionally, regulatory frameworks and standards play a crucial role in ensuring compliance with ethical principles, data privacy regulations, and accountability mechanisms. Furthermore, fostering a culture of responsible AI development and deployment requires collaboration among stakeholders, including researchers, policymakers, industry practitioners, and civil society organizations. 
Education and awareness initiatives can empower individuals to make informed decisions about AI usage and advocate for ethical AI practices. Moreover, international cooperation and knowledge sharing are vital for addressing global challenges related to AI trust and security, promoting best practices, and building trust across borders. In conclusion, addressing the challenges of trust and security in AI requires a concerted effort from various stakeholders, encompassing technological innovation, regulatory frameworks, and societal engagement. By prioritizing transparency, fairness, and accountability in AI development and deployment, stakeholders can build AI systems that are both trustworthy and secure.
Research
This research presents a comprehensive model for Cross-Cultural Management (CCM) research, integrating principles from Cultural Ecology. Cultural Ecology, traditionally applied in anthropology and sociology, offers a unique lens to understand the dynamic interplay between human cultures and their environments. In the context of CCM, this model provides a procedural framework that guides researchers in exploring how cultural factors influence management practices, organizational behaviors, and intercultural interactions. By fusing Cultural Ecology principles with established procedures in CCM, the model enhances the depth and breadth of cross-cultural research methodologies, facilitating a more nuanced understanding of the complex relationships within global organizations.
Article
Full-text available
This comprehensive overview explores the transformative impact of Cloud Computing, Artificial Intelligence (AI), and Automation on various industries. As these technologies continue to mature, their integration is reshaping traditional business models, processes, and interactions. The study delves into the applications, benefits, challenges, and future prospects of this technological convergence, offering insights into how organizations can harness these innovations to achieve efficiency, agility, and competitiveness across diverse sectors.
Article
Full-text available
This research explores innovative power solutions at the intersection of photovoltaic technology, solar cells, and cutting-edge radio frequency (RF) and millimeter-wave antennas. The integration of these technologies holds the potential to revolutionize power generation, storage, and wireless communication systems. The research investigates advancements in photovoltaic power plants, novel solar cell materials, and state-of-the-art RF/millimeter-wave antennas. Through a comprehensive examination of the synergies between these domains, this study aims to contribute insights that drive sustainable energy practices, enhance power efficiency, and propel advancements in wireless communication technologies.
Article
Full-text available
This study delves into the intricate interplay between Cultural Ecology and Economic Confidence, offering a holistic perspective that transcends traditional economic analyses. Cultural Ecology, the study of the dynamic relationship between human cultures and their environments, is explored as a lens through which to understand the nuanced influences on Economic Confidence. By examining the cultural factors that shape perceptions, values, and behaviors related to economic matters, this research contributes to a more comprehensive understanding of the multifaceted forces at play in the economic landscape.
Article
Full-text available
Kirchhoff–Love plate theory is widely used in structural engineering. In this paper, efficient and accurate numerical algorithms are developed to solve a generalized Kirchhoff–Love plate model subject to three common physical boundary conditions: (i) clamped; (ii) simply supported; and (iii) free. The generalization stems from the inclusion of additional physics to the classical Kirchhoff–Love model that accounts for bending only. We solve the model equation by discretizing the spatial derivatives using second-order finite-difference schemes, and then advancing the semi-discrete problem in time with either an explicit predictor–corrector or an implicit Newmark-Beta time-stepping algorithm. Stability analysis is conducted for the schemes, and the results are used to determine stable time steps in practice. A series of carefully chosen test problems are solved to demonstrate the properties and applications of our numerical approaches. The numerical results confirm the stability and second-order accuracy of the algorithms and are also comparable with experiments for similar thin plates. As an application, we illustrate a strategy to identify the natural frequencies of a plate using our numerical methods in conjunction with a fast Fourier transform power spectrum analysis of the computed data. Then we take advantage of one of the computed natural frequencies to simulate the interesting physical phenomena known as resonance and beat for a generalized Kirchhoff–Love plate.
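The frequency-identification strategy mentioned above, running an FFT power spectrum over computed response data and reading off spectral peaks, can be sketched as follows. Here a damped sinusoid stands in for the solver's output time series; the sampling rate, damping, and frequency are illustrative assumptions.

```python
import numpy as np

# Synthetic "computed plate response": a damped oscillation at f0 = 12 Hz
# sampled at 256 Hz, standing in for time-series data from the solver.
fs, f0 = 256.0, 12.0
t = np.arange(0, 8.0, 1.0 / fs)
signal = np.exp(-0.1 * t) * np.sin(2 * np.pi * f0 * t)

# Power spectrum via the real FFT; rfft keeps non-negative frequencies only.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)

# The natural frequency is read off as the location of the spectral peak.
f_est = freqs[np.argmax(spectrum)]
print(f_est)
```

The frequency resolution is `fs / N` (here 0.125 Hz), so longer simulated records sharpen the peak estimate; light damping broadens the peak but does not move its location appreciably.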
Article
Full-text available
Conventional damage detection techniques are gradually being replaced by state-of-the-art smart monitoring and decision-making solutions. Near real-time and online damage assessment in structural health monitoring (SHM) systems is a promising transition toward bridging the gaps between the past’s applicative inefficiencies and the emerging technologies of the future. In the age of the smart city, Internet of Things (IoT), and big data analytics, the complex nature of data-driven civil infrastructures monitoring frameworks has not been fully matured. Machine learning (ML) algorithms are thus providing the necessary tools to augment the capabilities of SHM systems and provide intelligent solutions for the challenges of the past. This article aims to clarify and review the ML frontiers involved in modern SHM systems. A detailed analysis of the ML pipelines is provided, and the in-demand methods and algorithms are summarized in augmentative tables and figures. Connecting the ubiquitous sensing and big data processing of critical information in infrastructures through the IoT paradigm is the future of SHM systems. In line with these digital advancements, considering the next-generation SHM and ML combinations, recent breakthroughs in (1) mobile device-assisted, (2) unmanned aerial vehicles, (3) virtual/augmented reality, and (4) digital twins are discussed at length. Finally, the current and future challenges and open research issues in SHM-ML conjunction are examined. The roadmap of utilizing emerging technologies within ML-engaged SHM is still in its infancy; thus, the article offers an outlook on the future of monitoring systems in assessing civil infrastructure integrity.
Article
Full-text available
Background. The most common and successful technique for signal denoising with non-stationary signals, such as the electroencephalogram (EEG) and electrocardiogram (ECG), is the wavelet transform (WT). The success of WT depends on the optimal configuration of its control parameters, which are often set experimentally. Fortunately, the optimality of the combination of these parameters can be measured in advance using the mean squared error (MSE) function. Method. In this paper, five powerful metaheuristic algorithms are proposed to find the optimal WT parameters for EEG signal denoising: harmony search (HS), β-hill climbing (β-hc), particle swarm optimization (PSO), the genetic algorithm (GA), and the flower pollination algorithm (FPA). It is worth mentioning that this is the initial investigation of using optimization methods for WT parameter configuration. This paper then examines which algorithm obtains the minimum MSE and the best WT parameter configurations. Result. The performance of the proposed algorithms is tested using two standard EEG datasets, namely, Kiern's EEG dataset and the EEG Motor Movement/Imagery dataset. The results of the proposed algorithms are evaluated using five common criteria: signal-to-noise ratio (SNR), SNR improvement, mean square error (MSE), root mean square error (RMSE), and percentage root mean square difference (PRD). Interestingly, for almost all evaluation criteria, FPA achieves the best parameter configuration for WT and empowers this technique to efficiently denoise the EEG signals for almost all used datasets. To further validate the FPA results, a comparative study between the FPA results and the results of two previous studies is conducted, and the findings favor FPA. Conclusion. In conclusion, the results show that the proposed methods for EEG signal denoising can produce better results than manual configurations based on an ad hoc strategy.
Therefore, using metaheuristic approaches to optimize the WT parameters for EEG signals positively affects the denoising performance of the WT method.
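The parameter-tuning idea above can be illustrated with a minimal pure-Python sketch. This is not the paper's method: it assumes a one-level Haar transform with soft thresholding, and replaces the metaheuristics with a plain grid search over a single threshold parameter, scored by MSE against a known clean signal.

```python
import math
import random

def haar_dwt(x):
    """One-level Haar transform: (approximation, detail) coefficient lists."""
    a = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def haar_idwt(a, d):
    """Inverse of haar_dwt (perfect reconstruction)."""
    x = []
    for ai, di in zip(a, d):
        x.append((ai + di) / math.sqrt(2))
        x.append((ai - di) / math.sqrt(2))
    return x

def denoise(x, t):
    """Soft-threshold the detail coefficients with threshold t."""
    a, d = haar_dwt(x)
    d = [math.copysign(max(abs(v) - t, 0.0), v) for v in d]
    return haar_idwt(a, d)

def mse(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

random.seed(0)
clean = [math.sin(2 * math.pi * i / 32) for i in range(128)]
noisy = [c + random.gauss(0, 0.3) for c in clean]

# stand-in for the metaheuristic search: a grid over candidate thresholds,
# scored by MSE against the known clean reference
best_t = min((t / 10 for t in range(21)), key=lambda t: mse(denoise(noisy, t), clean))
```

A real study would search a richer parameter set (wavelet family, decomposition level, thresholding rule) with the metaheuristics named above; the grid search here only illustrates the MSE-driven selection criterion.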
Article
Full-text available
Gait analysis is an invaluable tool in diagnosing and monitoring human health. Current techniques often rely on specialists or expensive gait measurement systems. There is a clear space in the field for a simple, inexpensive, quantitative way to measure various gait parameters. This study investigates if useful quantitative gait parameters can be extracted from floor acceleration measurements produced by the input of foot falls. A total of 17 participants walked along a 115-ft-long hallway while underfloor mounted accelerometers measured the vertical acceleration of the floor. Signal-energy-based algorithms detect the heel strike of each step during trials. From the detected footsteps, gait parameters such as the average stride length, the time between steps, and the step signal energy were calculated. In this study, a single accelerometer was shown to be enough to detect steps over a 115-ft corridor. Distributions for all gait parameters measured were generated for each participant, showing a normal distribution with low standard deviation. The success of gait analysis using underfloor accelerometers presents possibilities in the widespread adaptation of gait measurements. The ease of installation and operation offers an opportunity to gather long-term gait measurements. Such data will augment current gait diagnostic approaches by filling the gaps between specialist visits.
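The signal-energy-based step detection described above can be sketched roughly as follows. This is an illustrative stand-in, not the study's algorithm: the window length, the median-based threshold multiplier, and the refractory gap are all assumed parameters.

```python
import random

def moving_energy(signal, win):
    """Short-time signal energy over a sliding window."""
    return [sum(v * v for v in signal[i:i + win])
            for i in range(len(signal) - win + 1)]

def detect_steps(signal, win=8, k=10.0, refractory=20):
    """Flag window starts whose energy exceeds k times the median window
    energy, with a refractory gap so each heel strike fires once."""
    e = moving_energy(signal, win)
    med = sorted(e)[len(e) // 2]
    steps, last = [], -refractory
    for i, v in enumerate(e):
        if v > k * med and i - last >= refractory:
            steps.append(i)
            last = i
    return steps

# synthetic floor record: sensor noise plus two impulsive heel strikes
random.seed(1)
sig = [random.gauss(0, 0.05) for _ in range(200)]
for c in (50, 130):
    for j in range(5):
        sig[c + j] += 1.0

steps = detect_steps(sig)
stride_time = steps[1] - steps[0]   # samples between consecutive heel strikes
```

With the sampling rate known, `stride_time` converts directly to the time between steps, one of the gait parameters the study extracts.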
Article
Full-text available
A vibroacoustic numerical method employing an implicit finite-difference time-domain (FDTD) method, in which the target architecture is modeled as a composition of two-dimensional plate elements, is proposed in this paper. While structure-borne sound is a difficult phenomenon to predict owing to the complexity of the vibration mechanism in the building structure, wave-based numerical techniques may enable its accurate prediction by virtue of their flexibility in modeling the object. However, with current PC performance, prediction for a large-scale problem is still difficult. To solve such a problem, we model the target structure as a composition of plate elements to reduce the simulated field to two dimensions, in contrast to the discretization of the field into three-dimensional solid elements. This results in memory-saving and faster simulation. In this paper, the basic theory of vibroacoustic analysis for a model with plate elements is described, and the results of a case study for a box-type structure are discussed.
Article
Full-text available
The dynamic characteristics of a structure are commonly defined by its modal properties: modal frequencies, damping ratios, and mode shapes. Significant changes in modal properties of a structure after an extreme event, such as an earthquake, or during its service life can be strongly related to damage in the structures. This makes it crucial that the modal properties are accurately estimated and continuously tracked to detect any changes by the structural health monitoring (SHM) system. This paper introduces an algorithm and a MATLAB-based software that includes modules for real-time data processing, modal identification, damage detection, and stakeholder warnings for vibration-based SHM systems. The data processing and modal identification techniques used are based on the classical and stochastic techniques, and utilize running time windows to keep track of time variations in real-time data. The damage detection algorithm makes use of inter-story drifts to detect and locate damage. Since the calculation of inter-story drifts involves double integration and subtraction of acceleration signals, it is extremely hard to get accurate values of inter-story drifts in real-time monitoring. To improve the accuracy, inter-story drifts are calculated for each mode of the structure separately, and then combined synchronously. The displacements at non-instrumented floors are estimated by assuming that the mode shapes can be approximated as a linear combination of those of a shear beam and a bending beam. A software package, REC_MIDS, is developed for this purpose, and it has been operating in a large number of different structures with SHM systems in Turkey (tall buildings, suspension bridges, mosques, museums), and in seven high-rise buildings in UAE.
Article
Full-text available
Although most of its popular applications have been in discrete-time signal processing for over two decades, wavelet transform theory offers a methodology to generate continuous-time compact support orthogonal filter banks through the design of discrete-time finite length filter banks with multiple time and frequency resolutions. In this paper, we first highlight inherently built-in approximation errors of discrete-time signal processing techniques employing wavelet transform framework. Then, we present an overview of emerging analog signal processing applications of wavelet transform along with its still active research topics in more matured discrete-time processing applications. It is shown that analog wavelet transform is successfully implemented in biomedical signal processing for design of low-power pacemakers and also in ultra-wideband (UWB) wireless communications. The engineering details of analog circuit implementation for these continuous-time wavelet transform applications are provided for further studies. We expect a flurry of new research and technology development activities in the coming years utilizing still promising and almost untapped analog wavelet transform and multiresolution signal representation techniques.
Article
Full-text available
Most of the derivations of the mechanical behavior of a plate as the limit behavior of a three-dimensional solid whose thickness tends to zero deal with stationary homogeneous linear boundary conditions on the lateral boundary. Here, in the framework of small strains, we rigorously determine a large class of steady-state or transient nonlinear boundary conditions which provide asymptotic kinematics of Kirchhoff-Love type.
Article
Full-text available
The acoustic signature of a footstep is one of several signatures that can be exploited for human recognition. Early research showed the maximum force of multiple footsteps to lie in the frequency band of 1-4 Hz. This paper reports on the broadband frequency-dependent vibration and sound pressure responses of human footsteps in buildings. The low-frequency band (below 500 Hz) is well documented in the literature and is generated by the force normal to the ground/floor. The seismic particle velocity response to footsteps was shown to be site specific, with a characteristic frequency band of 20-90 Hz. In this paper, the high-frequency band (above 500 Hz) is investigated. The high-frequency band of the vibration and sound of a human footstep is shown to be generated by the force tangential to the floor and the floor reaction, or friction force. The vibration signals, as a function of floor covering and walking style, were studied in a broadband frequency range. Different walking styles result in different vibration signatures in the low-frequency range. However, for the walking styles tested, the magnitudes in the high-frequency range are comparable and independent of walking style.
Article
Full-text available
The generation of impact sound by human walking involves two factors: the character of the footfall and the shape of the induced floor deflection. The footfall noise is created by the impact excitation, and the character of the footfall depends on the footwear (the heels) and the frequency of the footfalls. The shape of the floor deflection depends rather on the geometrical walking pattern and the construction of the floor structure. In this investigation, the vibration pattern of the lightweight-construction floor is created by the same walking subject, a male of average height. The excitation applied by the person to the floor in the FE simulations is a function of the foot length and the weight of the walking subject. The geometrical time history of the footstep allows it to take different directions in the room. Since the excitation is assumed to be deterministic, differences between the excitation frequencies are estimated from video records. The goal of this investigation is to determine the difference in floor-structure deflections between two walking paths: one perpendicular to the bearing beams and the other diagonal.
Article
Full-text available
The problem of noise reduction has attracted a considerable amount of research attention over the past several decades. Among the numerous techniques that were developed, the optimal Wiener filter can be considered as one of the most fundamental noise reduction approaches, which has been delineated in different forms and adopted in various applications. Although it is not a secret that the Wiener filter may cause some detrimental effects to the speech signal (appreciable or even significant degradation in quality or intelligibility), few efforts have been reported to show the inherent relationship between noise reduction and speech distortion. By defining a speech-distortion index to measure the degree to which the speech signal is deformed and two noise-reduction factors to quantify the amount of noise being attenuated, this paper studies the quantitative performance behavior of the Wiener filter in the context of noise reduction. We show that in the single-channel case the a posteriori signal-to-noise ratio (SNR) (defined after the Wiener filter) is greater than or equal to the a priori SNR (defined before the Wiener filter), indicating that the Wiener filter is always able to achieve noise reduction. However, the amount of noise reduction is in general proportional to the amount of speech degradation. This may seem discouraging as we always expect an algorithm to have maximal noise reduction without much speech distortion. Fortunately, we show that speech distortion can be better managed in three different ways. If we have some a priori knowledge (such as the linear prediction coefficients) of the clean speech signal, this a priori knowledge can be exploited to achieve noise reduction while maintaining a low level of speech distortion. When no a priori knowledge is available, we can still achieve a better control of noise reduction and speech distortion by properly manipulating the Wiener filter, resulting in a suboptimal Wiener filter. 
When multiple microphone sensors are available, the multiple observations of the speech signal can be used to reduce noise with less or even no speech distortion.
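The single-channel tradeoff described above can be demonstrated with a minimal single-tap sketch, assuming zero-mean stationary signals with known variances (the paper's analysis is far more general):

```python
import random

def wiener_gain(var_s, var_n):
    """Optimal single-tap Wiener gain for a zero-mean signal in additive noise."""
    return var_s / (var_s + var_n)

def power(x):
    return sum(v * v for v in x) / len(x)

random.seed(2)
n = 10000
s = [random.gauss(0, 1.0) for _ in range(n)]   # clean "speech", variance 1
w = [random.gauss(0, 0.5) for _ in range(n)]   # additive noise, variance 0.25
y = [si + wi for si, wi in zip(s, w)]          # observed noisy signal

H = wiener_gain(1.0, 0.25)                     # = 0.8
est = [H * yi for yi in y]

mse_raw = power([yi - si for yi, si in zip(y, s)])      # error before filtering
mse_wiener = power([ei - si for ei, si in zip(est, s)]) # error after filtering

noise_reduction = 1.0 / H ** 2   # residual noise power is attenuated by H^2
distortion = (1.0 - H) ** 2      # speech-distortion index: (H-1)^2 * var_s / var_s
```

The sketch makes the paper's point concrete: the Wiener gain always reduces the mean squared error, but any `H < 1` necessarily distorts the clean signal, so noise reduction and speech distortion are traded against each other.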
Article
The accurate verification of beam parameters is mandatory for treatment safety in particle therapy; therefore, the systematic uncertainties of the segmented ionization chambers and detector arrays used as beam monitors must be measured. In this study, a Fast Fourier Transform (FFT) based denoising method is proposed to remove noise from the pencil-beam profile signal in both the spatial and spatial-frequency domains. In the spatial domain, the bias caused by noise was corrected, and the noise outside the 4σ range of the spot center was discarded. Then, in the spatial-frequency domain, thanks to the Gaussian shape of the signal's magnitude spectrum, the 4σ range of frequency components was extracted to eliminate the signal fluctuation caused by noise. In addition, low-amplitude high-frequency components were finely removed by thresholding to reduce the distortion of the beam profile signals. The performance of the proposed method was verified through simulation experiments in terms of the coefficient of determination (R²) and signal-to-noise ratio (SNR). Results show that the SNRs of the denoised signals are more than six times higher than those of the original signals, and the R² of the denoised signals is larger than 0.9. The denoising capability of the presented method was further verified by an X-ray radiation experiment: R² of the denoised signal improves from 0.60 to 0.96, and the fitting errors of the beam parameters (μ and σ) also decrease through denoising without increasing the radiation intensity or the integration time.
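The spectral-domain step can be sketched in pure Python with a naive DFT. This only illustrates magnitude-based component selection: the keep-ratio rule below stands in for the paper's 4σ band extraction and is an assumption, not the published procedure.

```python
import cmath
import math
import random

def dft(x):
    """Naive O(N^2) discrete Fourier transform."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    """Inverse DFT, returning the real part."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)).real / n
            for t in range(n)]

def spectral_denoise(x, keep_ratio=0.15):
    """Zero all but the strongest spectral components and invert.
    keep_ratio is an assumed parameter standing in for the 4-sigma band."""
    X = dft(x)
    mags = sorted((abs(v) for v in X), reverse=True)
    thr = mags[max(int(len(X) * keep_ratio) - 1, 0)]
    return idft([v if abs(v) >= thr else 0.0 for v in X])

random.seed(3)
n = 64
clean = [math.exp(-((t - 32) ** 2) / 50.0) for t in range(n)]  # Gaussian profile
noisy = [c + random.gauss(0, 0.05) for c in clean]
den = spectral_denoise(noisy)

err_noisy = sum((a - b) ** 2 for a, b in zip(noisy, clean))
err_den = sum((a - b) ** 2 for a, b in zip(den, clean))
```

Because a Gaussian beam profile concentrates its spectrum in a few low-frequency bins, discarding the weak bins removes most of the broadband noise while leaving the profile shape nearly intact.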
Article
Gait assessments are commonly used for clinical evaluations of neurocognitive disease progression and general wellness. However, gait measurements in clinical settings do not accurately reflect gait in daily life. We present a non-wearable and unobtrusive method of detecting gait parameters in the home through the vibrations in the floor created by footfalls. Gait characteristics and gait asymmetry are estimated despite a low sensor density of 6.7 m²/sensor. Features from each footfall vibration signal are extracted and used to estimate gait parameters with gradient boosting regression and probabilistic models. Temporal gait asymmetry, locations of the footfalls, and peak tibial acceleration asymmetry can be predicted with root mean square errors of 0.013 s, 0.42 m, and 0.34 g, respectively. This system allows for continuous at-home monitoring of gait, which aids in early detection of gait anomalies.
Article
Mechanical systems usually operate in harsh environments, and the monitored vibration signal faces substantial noise interference, which poses great challenges to robust fault diagnosis. This paper proposes a novel attention-guided joint-learning convolutional neural network (JL-CNN) for mechanical equipment condition monitoring. The fault diagnosis task (FD-Task) and the signal denoising task (SD-Task) are integrated into an end-to-end CNN architecture, achieving good noise robustness through dual-task joint learning. JL-CNN mainly includes a joint feature-encoding network and two attention-based encoder networks. This architecture allows FD-Task and SD-Task to achieve deep cooperation and mutual learning. JL-CNN is evaluated on a wheelset bearing dataset and a motor bearing dataset, which shows that JL-CNN has excellent fault diagnosis and signal denoising ability, and that it performs well under strong noise and unknown noise.
Article
Occupant detection and recognition support functional goals such as security, healthcare, and energy management in buildings. Typical sensing approaches, such as smartphones and cameras, undermine the privacy of building occupants and inherently affect their behavior. To overcome these drawbacks, a non-intrusive technique using floor-vibration measurements, induced by human footsteps, is outlined. Detection of human-footstep impacts is an essential step to estimate the number of occupants, recognize their identities and provide an estimate of their probable locations. Detecting the presence of occupants on a floor is challenging due to ambient noise that may mask footstep-induced floor vibrations. Also, signals from multiple occupants walking simultaneously overlap, which may lead to inaccurate event separation. Signals corresponding to events, once extracted, can be used to identify the number of occupants and their locations. Spurious events such as door closing, chair dragging and falling objects may produce vibrations similar to footstep-impacts. Signals from such spurious events have to be discarded as outliers to prevent inaccurate interpretations of floor vibrations for occupant detection. Walking styles differ among occupants due to their anatomies, walking speed, shoe type, health and mood. Thus, footstep-impact vibrations from the same person may vary significantly, which adds uncertainty and complicates occupant recognition. In this paper, efficient strategies for event-detection and event-signal extraction have been described. These strategies are based on variations in standard deviations over time of measured signals (using a moving window) that have been filtered to contain only low-frequency components. Methods described in this paper for event detection and event-signal extraction perform better than existing threshold-based methods (fewer false positives and false negatives). 
Support vector machine classifiers are used successfully to distinguish footsteps from other events and to determine the number of occupants on a floor. Convolutional neural networks help recognize the identity of occupants using footstep-induced floor vibrations. The utility of these strategies for footstep-event detection, occupant counting, and recognition is validated successfully using two full-scale case studies.
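The moving-window standard-deviation event detection described in this abstract can be sketched as follows. The window length and the median-based multiplier are assumed parameters, and the low-frequency pre-filtering step the authors apply is omitted for brevity.

```python
import math
import random

def moving_std(x, win):
    """Standard deviation of each sliding window."""
    out = []
    for i in range(len(x) - win + 1):
        w = x[i:i + win]
        m = sum(w) / win
        out.append(math.sqrt(sum((v - m) ** 2 for v in w) / win))
    return out

def detect_events(x, win=16, k=5.0):
    """Window starts whose standard deviation exceeds k times the median
    windowed standard deviation: candidate footstep-impact events."""
    s = moving_std(x, win)
    med = sorted(s)[len(s) // 2]
    return [i for i, v in enumerate(s) if v > k * med]

# ambient noise with one vibration burst standing in for a footstep impact
random.seed(4)
sig = [random.gauss(0, 0.02) for _ in range(300)]
for i in range(100, 131):
    sig[i] += random.gauss(0, 0.5)

events = detect_events(sig)
```

Using the median of the windowed standard deviations as the noise baseline makes the threshold adapt to the ambient level, which is one reason this style of detector produces fewer false positives than a fixed threshold.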
Article
Electrocardiogram (ECG) signals are used to diagnose cardiovascular diseases. During ECG signal acquisition, various noises such as power line interference, baseline wander, motion artifacts, and electromyogram noise corrupt the ECG signal. As an ECG signal is non-stationary, removing these noises from the recorded ECG signal is quite tricky. In this paper, along with the proposed denoising technique using the stationary wavelet transform, various denoising techniques, including lowpass filtering, highpass filtering, empirical mode decomposition, the Fourier decomposition method, and the discrete wavelet transform, are studied for denoising an ECG signal corrupted with noise. Signal-to-noise ratio, percentage root-mean-square difference, and root mean square error are used to compare ECG signal denoising performance. The experimental results showed that the proposed stationary wavelet transform based ECG denoising technique outperformed the other techniques, as more ECG signal components are preserved than with the other denoising algorithms.
Article
Background Artifact contamination reduces the accuracy of various EEG-based neuroengineering applications. Over time, biomedical signal denoising has become one of the most prominent research areas. The noise-reducing algorithm should therefore be carefully deployed, since artifacts result in degraded performance. Motivation Artifact reduction or denoising in degraded EEG signals still requires considerable improvement. The main aim of this paper is to present the investigation carried out to suppress the noise found in EEG signals of depression. Method The focus is to compare the effectiveness of physiological signal denoising approaches based on the discrete wavelet transform (DWT) and wavelet packet transform (WPT) combined with variational mode decomposition (VMD), namely VMD-DWT and VMD-WPT, with other approaches. In these approaches, detrended fluctuation analysis (DFA) is used to define the mode selection criteria. First, VMD decomposes the signal into various components; then DWT and WPT denoise the artifactual components, rather than rejecting them completely, with DFA as the mode selection basis. Simulations have been carried out on artificially contaminated and real databases of depression to demonstrate the effectiveness of the proposed technique using performance parameters such as SNR, PSNR, and MSE. Contribution Sufficient removal of artifacts is achieved by VMD-DFA-WPT and VMD-DFA-DWT, though VMD-DFA-WPT outperforms VMD-DFA-DWT and the other approaches. Such an artifact removal system may offer an effective solution for clinicians as a crucial pre-processing stage and may prevent delays in the diagnosis of depression.
Article
The article introduces a novel integrated moving element method (IMEM) for the hydroelastic analysis of infinitely extended floating plates under moving loads in shallow-water conditions. The floating plate is modeled via the Kirchhoff-Love theory, while the linearized shallow-water equation is adopted for the hydrodynamic modeling. Both the fluid and structure computational domains are concurrently discretized into "moving elements" whose coordinate system moves along with the applied loads. Accordingly, the paradigm eliminates the force-vector update procedure caused by the change of contact point with discretized elements, not only for the plate but also for the fluid. Furthermore, the IMEM requires fewer discrete elements than the standard finite element method (FEM), because the mesh is independent of the distance traveled by the moving load. Results obtained in several numerical examples are compared with those of the Fourier Transform Method (FTM) to validate the accuracy and effectiveness of the proposed methodology. In addition, the influence of water depth, load speed, multiple contact points, and the distance between axles on the dynamic amplification factor of plate displacement and the loading's critical speed is examined in great detail.
Article
Intelligent fault diagnosis (IFD) refers to the application of machine learning theories to machine fault diagnosis. This is a promising way to reduce the reliance on human labor and automatically recognize the health states of machines; thus, it has attracted much attention in the last two or three decades. Although IFD has achieved a considerable number of successes, no review has yet systematically covered the development of IFD from the cradle to the bloom, and few provide potential guidelines for future development. To bridge the gap, this article presents a review and roadmap that systematically covers the development of IFD following the progress of machine learning theories and offers a future perspective. In the past, traditional machine learning theories began to reduce the contribution of human labor and brought the era of artificial intelligence to machine fault diagnosis. In recent years, the advent of deep learning theories has reformed IFD since the 2010s by further reducing the need for human assistance, encouraging end-to-end diagnosis procedures that directly map the ever-growing monitoring data to the health states of machines. In the future, transfer learning theories will attempt to reuse the diagnosis knowledge from one or multiple diagnosis tasks in other related ones, prospectively overcoming the obstacles to applying IFD in engineering scenarios. Finally, the roadmap of IFD is drawn to show potential research trends in light of the challenges in this field.
Article
In this paper, we develop a virtual element method (VEM) of high order to solve the fourth order plate buckling eigenvalue problem on polygonal meshes. We write a variational formulation based on the Kirchhoff–Love model depending on the transverse displacement of the plate. We propose a C1 conforming virtual element discretization of arbitrary order k≥2 and we use the so-called Babuška–Osborn abstract spectral approximation theory to show that the resulting scheme provides a correct approximation of the spectrum and prove optimal order error estimates for the buckling modes (eigenfunctions) and a double order for the buckling coefficients (eigenvalues). Finally, we report some numerical experiments illustrating the behavior of the proposed scheme and confirming our theoretical results on different families of meshes.
Conference Paper
Identification of occupant presence and location inside buildings is essential to functional goals such as security, healthcare, and energy management. Floor-vibration measurements, induced by footstep impacts, provide a non-intrusive sensing method for occupant identification, unlike cameras and smartphones. Detecting the presence of an occupant is a necessary first step for occupant location identification. A challenge for occupant detection is ambient noise that may hide footstep-induced floor-vibration signatures. Also, spurious events such as door closing, chair dragging and falling objects may result in vibrations that have similarities with footstep-impact events. In this paper, an accurate occupant-detection strategy for structures with varying rigidity is outlined. Event detection is based on computing the standard deviation of a moving window over measurements at various frequency ranges. Using a classification method, footsteps are distinguished from other events. This strategy enhances detection of footstep-impact events compared with methods that employ only thresholds, thereby reducing false positives (incorrect detection) and false negatives (undetected events). Footstep-impact events may then be used for footstep impact localization using model-based approaches. Finally, the utility of this strategy for footstep-event detection is evaluated using a full-scale case study.
Article
Instead of using traditional intrusive surveillance technologies such as cameras or body sensors, we propose a smart-home system that senses footstep-induced vibrations to provide the same functions for assisted-living purposes. Based on distributed sensing and processing units, we adopt a distributed computing approach to pre-process the time-series data, recognize footstep signals, and extract vibration features locally. Through communication over the sensor network, our system is capable of estimating occupancy by counting the number of occupants. In addition, based on the multi-component seismometer sensing system, we propose a novel indoor footstep localization method called ATDOA that relies on both the angle and arrival information of the recorded waveforms. From the separated pedestrian locations, the trajectories of multiple people can be tracked. From the location-tracking history, a resident's daily activities and social interactions can be inferred. In our experiments, the proposed system obtains promising results: the location error is 0.14 meters with a 0.11-meter standard deviation, and the multi-person identification accuracy is above 87.83%.
Article
This paper introduces an adaptive filtering process based on shrinking the wavelet coefficients of the corresponding signal wavelet representation. The filtering procedure uses a threshold determined by an iterative algorithm inspired by control charts, a tool of statistical process control (SPC). The proposed method, called SpcShrink, is able to discriminate wavelet coefficients that significantly represent the signal of interest. SpcShrink is algorithmically presented and numerically evaluated in Monte Carlo simulations. Two empirical applications to real biomedical data filtering are also included and discussed. SpcShrink shows superior performance when compared with competing algorithms.
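The control-chart flavor of the thresholding can be sketched loosely as follows; this is not the published SpcShrink algorithm, only an iterative mean ± L·σ control-limit re-estimation on a toy coefficient set:

```python
import math

def spc_threshold(coeffs, L=3.0, max_iter=50):
    """Re-estimate mean +/- L*sigma control limits on the 'in-control'
    (noise-like) coefficients until the limits stop changing.
    L and max_iter are assumed parameters, not the paper's settings."""
    noise = list(coeffs)
    for _ in range(max_iter):
        m = sum(noise) / len(noise)
        s = math.sqrt(sum((c - m) ** 2 for c in noise) / len(noise))
        kept = [c for c in noise if abs(c - m) <= L * s]
        if len(kept) == len(noise):
            break           # limits are stable: remaining coefficients are noise
        noise = kept
    return m, L * s

# toy detail coefficients: small noise plus two signal-bearing outliers
small = [0.01, -0.02, 0.015, 0.0, -0.01, 0.02, -0.005, 0.01, -0.015, 0.005]
coeffs = small * 2 + [2.0, -2.0]
m, t = spc_threshold(coeffs)
denoised = [c if abs(c - m) > t else 0.0 for c in coeffs]
```

Each pass tightens the control limits as out-of-control (signal-bearing) coefficients are excluded, so the final threshold reflects only the noise floor; coefficients outside the limits are kept as signal.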
Article
In bearing health assessment, using adaptive nonstationary vibration-signal processing methods in the time-frequency domain improves early fault detection. On the other hand, the noise and random impulses that contaminate the input data are a major challenge in extracting fault-related features. The main goal of this paper is to improve the ensemble empirical mode decomposition (EEMD) algorithm and combine it with a newly proposed denoising process and higher-order spectra to increase the accuracy and speed of fault-severity and fault-type detection. The main approach is to use statistical features without any dimension reduction or data training. To eliminate unrelated components from the faulty condition, the best combination of wavelet-transform-based denoising parameters is determined by a proposed performance index. To enhance the efficiency of the EEMD algorithm, a systematic method is presented to determine the proper amplitude of the additive noise and the intrinsic mode function (IMF) selection scheme. Fault-occurrence detection and fault-severity identification are performed via a fault severity index (FSI) defined from the energy level of the combined fault-sensitive IMF (CFSIMF) envelope using the central limit theorem. Also, taking advantage of a bispectrum analysis of the CFSIMF envelope, fault-type recognition is achieved by quantifying a fault type index (FTI). Finally, the proposed method is validated using experimental datasets from two different test rigs, and the roles of the optimal denoising process and the systematic selection of the EEMD parameters in estimating a consistent degradation pattern, regardless of fault type, are described.
Article
We propose an algorithm for minimizing the total variation of an image, and provide a proof of convergence. We show applications to image denoising, zooming, and the computation of the mean curvature motion of interfaces.
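The total-variation (ROF-type) denoising idea can be illustrated compactly. This numpy sketch minimizes a smoothed 1D TV objective by plain gradient descent — not the dual projection algorithm with proven convergence that the paper proposes — and all parameter values are illustrative:

```python
import numpy as np

def tv_denoise_1d(f, lam=0.3, step=0.02, eps=1e-2, iters=1000):
    """Gradient descent on a smoothed total-variation objective:
       E(u) = 0.5 * ||u - f||^2 + lam * sum_i sqrt((u[i+1] - u[i])^2 + eps)."""
    u = f.copy()
    for _ in range(iters):
        du = np.diff(u)                      # forward differences D u
        w = du / np.sqrt(du * du + eps)      # smoothed derivative of |.|
        grad_tv = np.zeros_like(u)           # adjoint: (D^T w)[j] = w[j-1] - w[j]
        grad_tv[1:] += w
        grad_tv[:-1] -= w
        u -= step * ((u - f) + lam * grad_tv)
    return u

# Piecewise-constant test signal with additive Gaussian noise
rng = np.random.default_rng(1)
clean = np.concatenate([np.zeros(100), np.ones(100)])
noisy = clean + 0.3 * rng.standard_normal(200)
denoised = tv_denoise_1d(noisy)
```

The TV penalty suppresses oscillations while preserving the jump, which is why the estimate stays piecewise-flat rather than blurred.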
Article
A sound understanding of a structure's normal condition, including its response to normal environmental and operational variations, is desirable for structural health monitoring and necessary for performance monitoring of civil structures. The current paper outlines the extensive monitoring campaign of the Tamar Suspension Bridge, as well as analysis carried out in an attempt to understand the bridge's normal condition. Specifically, the effects of temperature, traffic loading, and wind speed on the structure's dynamic response are investigated. Finally, initial steps towards the development of a structural health monitoring system for the Tamar Bridge are addressed.
Article
The research approach is to construct a numerical model of floor vibration induced by footsteps using the finite element method (FEM). The numerical model is validated by comparing the vibration values obtained from the prediction model with measurements, using regression analysis. An R² value of 0.8168 is obtained, which indicates that the FEM model is acceptable for predicting footstep-induced floor vibration. This numerical model may provide designers with helpful information for reducing floor vibration.
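The validation metric used above, the coefficient of determination R², is computed as follows; the measured/predicted arrays are illustrative placeholders, not the study's data:

```python
import numpy as np

def r_squared(measured, predicted):
    """Coefficient of determination between measurements and model output:
    R^2 = 1 - SS_res / SS_tot."""
    ss_res = np.sum((measured - predicted) ** 2)
    ss_tot = np.sum((measured - np.mean(measured)) ** 2)
    return 1.0 - ss_res / ss_tot

measured = np.array([1.0, 2.0, 3.0, 4.0])    # illustrative vibration measurements
predicted = np.array([1.1, 1.9, 3.2, 3.8])   # illustrative FEM predictions
```

A value near 1 (such as the reported 0.8168) means the model explains most of the variance in the measured response.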
Conference Paper
Footsteps, whether in heels, sneakers, leather shoes, or even bare feet, occur on different ground surfaces such as concrete and wood. Recognizing human footsteps is useful not only for surveillance; a home service center can also use this information, together with techniques such as sound localization, to understand the motion of people. Thus, if footsteps are discriminable, applications in various fields can be considered. In this paper, feature extraction from footsteps is investigated, focusing on mel-cepstrum analysis, the walking interval, and the degree of similarity of the spectrum envelope. The proposed method is shown to be effective for personal identification.
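The cepstral features above can be sketched minimally. This sketch computes the real cepstrum of one frame (inverse FFT of the log magnitude spectrum) as a simplified stand-in for the mel-cepstrum — it omits the mel filterbank and DCT of a full MFCC pipeline, and the frame here is random placeholder data:

```python
import numpy as np

def real_cepstrum(frame, n_coeffs=13):
    """Low-quefrency real cepstrum of one windowed frame: the inverse FFT of
    the log magnitude spectrum. The leading coefficients summarize the
    spectrum envelope, which is what footstep identification compares."""
    windowed = frame * np.hanning(len(frame))
    log_mag = np.log(np.abs(np.fft.rfft(windowed)) + 1e-10)  # avoid log(0)
    cepstrum = np.fft.irfft(log_mag)
    return cepstrum[:n_coeffs]

rng = np.random.default_rng(2)
frame = rng.standard_normal(512)         # placeholder footstep-sound frame
coeffs = real_cepstrum(frame)
```

Per-person templates would then be built from such coefficient vectors, with walking interval used as an additional feature.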
Structural Health Monitoring (SHM)-Overview on Technologies Under Development
Speckmann H, Henrich R. Structural Health Monitoring (SHM)-Overview on Technologies Under Development. In: Proceedings of the 16th world conference on NDT. 2004;1.
Dynamic Characterization of Arrows Through Stochastic Perturbation
Fish R, Liang Y, Saleeby K, Spirnak J, Sun M, et al. Dynamic Characterization of Arrows Through Stochastic Perturbation. 2019. arXiv preprint: https://arxiv.org/ftp/arxiv/papers/1909/1909.08186.pdf
Subsensory Vibrations to the Feet Reduce Gait Variability in Elderly Fallers
Galica AM, Kang HG, Priplata AA, D'Andrea SE, Starobinets OV, et al. Subsensory Vibrations to the Feet Reduce Gait Variability in Elderly Fallers. Gait Posture. 2009;30:383-387.
Design of Floor Structures for Human Induced Vibrations
Feldmann M, Heinemeyer C, Butz C, Caetano E, Cunha A, et al. Design of Floor Structures for Human Induced Vibrations. JRC-ECCS joint report. 2009:45.
Wavelet Transform and Signal Denoising Using Wavelet Method
Dautov CP, Ozerdem MS. Wavelet Transform and Signal Denoising Using Wavelet Method. In: 26th Signal Processing and Communications Applications Conference (SIU). IEEE Publications; 2018:1-4.
A Multivariate Triple-Regression Forecasting Algorithm for Long-Term Customized Allergy Season Prediction
Wu X, Bai Z, Jia J, Liang Y. A Multivariate Triple-Regression Forecasting Algorithm for Long-Term Customized Allergy Season Prediction. 2020. arXiv preprint: https://arxiv.org/pdf/2005.04557.pdf
Mental Task Classification Using Electroencephalogram Signal
Bai Z, Yang R, Liang Y. Mental Task Classification Using Electroencephalogram Signal. 2019. arXiv preprint: https://arxiv.org/ftp/arxiv/papers/1910/1910.03023.pdf