
Henrique Madeira, PhD
- Professor (Full) at University of Coimbra
About
265
Publications
81,100
Reads
5,200
Citations
Introduction
Research topics: experimental evaluation of dependable computing systems, including security evaluation and benchmarking, fault injection techniques, error detection mechanisms, and transactional systems dependability.
Publications (265)
Measuring code understandability is both highly relevant and exceptionally challenging. This paper proposes a dynamic code understandability assessment method, which estimates a personalized code understandability score from the perspective of the specific programmer handling the code. The method consists of dynamically dividing the code unit under...
The significant increase in urban UAVs, due to their benefits and commercial potential, will increase drone density and collision risks. To manage this, Unmanned Aircraft Systems Traffic Management (UTM), European implementation of UTM (U-space), and Air Traffic Management (ATM) are being developed for safe integration with other air traffic. Nonet...
Regulatory frameworks are essential to ensure safe (i.e., collision-free) and efficient operations of UAVs. In Europe, U-Space addresses these needs through traffic management services. However, to maximize the effectiveness of these services in preventing and mitigating safety risks, particularly the risk of collision, a comprehensive safety risk...
Assessing cognitive load using pupillography frequency features presents a persistent challenge due to the lack of consensus on optimal frequency limits. This study aims to address this challenge by exploring pupillography frequency bands and seeking clarity in defining the most effective ranges for cognitive load assessment. From a controlled expe...
Comprehending digital content written in natural language online is vital for many aspects of life, including learning, professional tasks, and decision-making. However, facing comprehension difficulties can have negative consequences for learning outcomes, critical thinking skills, decision-making, error rate, and productivity. This paper introduc...
Unmanned Aerial Vehicles (UAVs) are on the rise across a wide range of application domains like shipping and delivery, precise agriculture, geographic mapping, and search and rescue. Thus, ensuring UAVs’ safe operations and reliable integration into civilian airspace is essential. These unmanned vehicles face various potential hazards and threats,...
Cognitive human error and a recent cognitive taxonomy of human error causes of software defects support the intuitive idea that, for instance, mental overload, attention slips, and working memory overload are important human causes of software bugs. In this paper, we approach EEG as a reliable surrogate for an MRI-based reference of the programmer’s...
In this study, we present iMind, a novel prototype tool designed to predict individuals’ comprehension difficulties with online content at local levels, such as paragraphs or multiple lines. iMind seamlessly captures the physiological responses of the Autonomic Nervous System (ANS) using commercially available wearable sensors (wristbands) in conju...
Unmanned Aerial Vehicles (UAVs) are susceptible to various hazards (e.g., software or hardware failures or security attacks) that may hinder mission completion or compromise safety by violating the separation minima (i.e., the minimum distance that must be maintained between UAVs in order to ensure safe and efficient operations). To address this is...
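The separation-minima notion above lends itself to a minimal illustration. The sketch below checks the 3-D distance between two UAVs against a fixed threshold; the threshold value and the coordinate representation are illustrative assumptions, not taken from the publication:

```python
import math

# Illustrative separation minimum in metres (assumed value, not from the paper).
SEPARATION_MINIMUM_M = 50.0

def separation_violated(pos_a, pos_b, minimum=SEPARATION_MINIMUM_M):
    """Return True if two UAVs at 3-D positions (x, y, z), in metres,
    are closer than the required separation minimum."""
    return math.dist(pos_a, pos_b) < minimum

# Two UAVs 30 m apart horizontally: the separation minimum is violated.
print(separation_violated((0, 0, 100), (30, 0, 100)))  # True
```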
In recent years, UAVs have been increasingly used in urban environments due to their agility in movement, simplicity in mechanics, low price, and ability to access locations that are difficult or impossible for humans to reach. A significant number of drones are expected to fly in the urban sky in the near future. The profitable nature of commercial UAVs/drone app...
Software programming is an acquired evolutionary skill originating from consolidated cognitive functions (i.e., attentive, logical, coordination, mathematic calculation, and language comprehension), but the underlying neurophysiological processes are still not completely known. In the present study, we investigated and compared the brain activities...
In the coming years, there will be a significant increase in the use of Unmanned Aircraft Vehicles (UAVs), particularly drones, in urban environments, where safety due to the large number of drones operating is the primary concern. Therefore, the UAS traffic management (UTM) and its European implementation, U-space services, are being created to gu...
This research aims to develop a more resilient system for unmanned aerial vehicles (UAVs) that can maintain their safe operations and availability, even in the presence of unexpected events such as GPS failures. To achieve this goal, this paper proposes a fault-tolerance mechanism using machine learning models and approaches to mitigate abnormal an...
Cloud Management Platforms (CMPs) have a critical role in supporting private and public cloud computing as a tool to manage, provision and track resources and their usage. These platforms, like cloud computing, tend to be complex distributed systems spread across multiple nodes, thus network faults are a threat that can lead to failures to provide...
Complexity is the key element of software quality. This article investigates the problem of measuring code complexity and discusses the results of a controlled experiment to compare different views and methods to measure code complexity. Participants (27 programmers) were asked to read and (try to) understand a set of programs, while the complexity...
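For context on the "different views and methods" of complexity compared in such experiments, the sketch below approximates one classic structural metric, McCabe's cyclomatic complexity, by counting decision points in Python source. The node set and counting rule are a simplified assumption for illustration, not the paper's instrumentation:

```python
import ast

# Node types counted as decision points (simplified McCabe counting).
DECISION_NODES = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source):
    """Approximate cyclomatic complexity: 1 + number of decision points."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))

code = """
def classify(x):
    if x < 0:
        return "negative"
    for _ in range(x):
        if x % 2 == 0 and x > 10:
            return "big even"
    return "other"
"""
# Two ifs + one for + one boolean operator -> 4 decision points + 1.
print(cyclomatic_complexity(code))  # 5
```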
Ultra-short-term HRV features assess minor autonomous nervous system variations such as variations resulting from cognitive stress peaks during demanding tasks. Several studies compare ultra-short-term and short-term HRV measurements to investigate their reliability. However, existing experiments are conducted in low cognitively demanding environme...
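As an example of the kind of time-domain feature that ultra-short-term HRV analysis relies on, the sketch below computes RMSSD (root mean square of successive RR-interval differences) over a short window; the sample RR values are invented for illustration:

```python
import math

def rmssd(rr_intervals_ms):
    """RMSSD: root mean square of successive differences between adjacent
    RR intervals (in milliseconds) — a standard time-domain HRV feature
    that remains usable on ultra-short windows."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A 10-beat window of invented RR intervals (ms).
window = [812, 790, 805, 821, 798, 810, 795, 808, 800, 815]
print(round(rmssd(window), 1))  # 16.1
```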
Unmanned Aerial Vehicles (UAVs) have gained notable importance in civil airspace. To ensure their safe operation, U-space services are being defined. We argue that the target level of safety of UAS can be threatened by faults, abnormal conditions, and security attacks. In this paper, we propose a fault-injection-based approach to build a framework...
Machine Learning has been extensively used in studies that utilized neurophysiological data (e.g., fMRI, EEG, ECG) in capturing brain and the Autonomic Nervous System (ANS) activity in various software engineering (SE) tasks such as code comprehension and development. However, there is a lack of research efforts to review machine learning technique...
The present paper describes the use of nonintrusive biofeedback sensors (e.g., ECG) and eye-tracker to study the cognitive load (CL) associated with two mental tasks: a) content reading and comprehension b) code review. The paper addresses the theoretical underpinnings of the comprehension assessment included in content reading (for understanding)...
Unmanned Aerial Vehicles (UAVs) have gained notable importance in civil airspace. To ensure their safe operation, U-space services are being defined. In this paper, we argue that the target level of safety of UAS can be threatened by faults, abnormal conditions, and security attacks, and we propose a fault-injection-based approach to b...
Code review is an essential practice in software engineering to spot code defects in the early stages of software development. Modern code reviews (e.g., acceptance or rejection of pull requests with Git) have become less formal than classic Fagan's inspections, lightweight, and more reliant on individuals (i.e., reviewers). However, reviewers may...
The neural correlates of software programming skills have been the target of an increasing number of studies in the past few years. Some of those studies focused on error monitoring during software code inspection; others have studied task-related cognitive load as measured by distinct neurophysiological measures. Most studies addressed only syntax errors...
As a direct cause of software defects, human error is the key to understanding and identifying defects. We propose a new code inspection method: targeted code inspection based on human error mechanisms of software engineers. Based on the common erroneous mechanisms of human cognition, the method targets error-prone codes with high efficiency and mi...
Unmanned Aerial Vehicles (UAVs) have gained significant importance in diverse sectors, mainly for military use. Recently, we can see a growth in the acceptance of drones in civilian spaces as well. Thus, a profound safety risk analysis/assessment to prevent any possible damage to themselves, the environment, and humans is fundamental for building and utili...
Code reviews and software inspections are essential for building reliable software. However, current code reviews practice in the software industry (e.g., acceptance or rejection of pull requests with Git) deviates considerably from classic (and expensive) Fagan's inspections. Modern code reviews are lightweight and asynchronous and do not rely on...
As our dependence on automated systems grows, so does the need for guaranteeing their safety, cybersecurity, and privacy (SCP). Dedicated methods for verification and validation (V&V) must be used to this end and it is necessary that the methods and their characteristics can be clearly differentiated. This can be achieved via method classifications...
Software programming is a modern activity that poses strong challenges to the human brain. The neural mechanisms that support this novel cognitive faculty are still unknown. On the other hand, reading and calculation abilities represent slightly less recent human activities, in which neural correlates are relatively well understood. We hypothesize...
An emergent research area in software engineering and software reliability is the use of wearable biosensors to monitor the cognitive state of software developers during software development tasks. The goal is to gather physiologic manifestations that can be linked to error-prone scenarios related to programmers’ cognitive states. In this paper we...
Assessing comprehension difficulties requires the ability to assess cognitive load. Changes in cognitive load induced by comprehension difficulties could be detected with an adequate time resolution using different biofeedback measures (e.g., changes in the pupil diameter). However, identifying the spatio-temporal sources of content comprehension d...
Many organizations are moving their systems to the cloud, where providers consolidate multiple clients using virtualization, which creates challenges to business-critical applications. Research has shown that hypervisors fail, often causing common-mode failures that may abruptly disrupt dozens of virtual machines simultaneously. We hypothesize and...
Blockchain has become particularly popular due to its promise to support business-critical services in very different domains (e.g., retail, healthcare). Blockchain systems rely on complex middleware, like Ethereum or Hyperledger Fabric, that allow running smart contracts, which specify business logic in cooperative applications. The presence of so...
Blockchain has become particularly popular due to its promise to support business-critical services in very different domains (e.g., retail, supply chains, healthcare). Blockchain systems rely on complex middleware, like Ethereum or Hyperledger Fabric, that allow running smart contracts, which specify business logic in cooperative applications. The...
Virtualized servers compose the majority of cloud computing environments, where these nodes are used to host multiple clients over the same hardware. Many organizations run online applications by hiring elastic computing resources in order to match demand while reducing fixed costs. However, such organizations are unlikely to take advantage of thes...
All computer programs have flaws, some of which can be exploited to gain unauthorized access to computer systems. We conducted a field study on publicly reported vulnerabilities affecting three open source software projects in widespread use. This paper highlights the main observations and conclusions from the field data collected in the study.
We present a versatile architecture for AI-powered self-adaptive self-certifiable critical systems. It aims at supporting semi-automated low-cost re-certification for self-adaptive systems after each adaptation of their behavior to a persistent change in their operational environment throughout their lifecycle.
This paper provides a study using Electroencephalography (EEG) to investigate the brain activity during code comprehension tasks. Three different code complexity levels according to five complexity metrics were considered. The use of EEG for this purpose is relevant, since the existing studies were mostly focused on neuroimaging techniques. Using L...
Software programming is a complex and relatively recent human activity, involving the integration of mathematical, recursive thinking and language processing. The neural correlates of this recent human activity are still poorly understood. Error monitoring during this type of task, requiring the integration of language, logical symbol manipulation...
Security vulnerabilities are a concern in systems and software exposed via networked interfaces. Previous research has shown that only a minority of vulnerabilities can be emulated through software fault injection techniques. This paper aims to accurately emulate software security vulnerabilities. To this end, the paper provides a field-data study...
This chapter addresses cyber-physical systems resilience with a focus on Internet of Things as a particularly prominent example of large-scale cyber-physical systems. The emphasis is on current and future network architectures and systems, highlighting main research issues, as well as technological trends. This chapter opens by discussing and contr...
Hardware manufacturing advances along with the popularisation of energy saving techniques are predicted to cause an increase in the soft error rate, which in turn will transfer part of the responsibility for tolerating these errors to the software layer. Since the programming language and its supporting implementation have a determinant impact in t...
Critical systems are nowadays being deployed as services or web applications, and are being used to provide enterprise-level business-critical operations. These systems are supported by complex middleware, which often links different systems, and where a failure can bring disastrous consequences for both clients and service providers. In thi...
With the rise of software complexity, software-related accidents represent a significant threat for computer-based systems. Software Fault Injection is a method to anticipate worst-case scenarios caused by faulty software through the deliberate injection of software faults. This survey provides a comprehensive overview of the state of the art on So...
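The core idea of deliberately injecting software faults can be sketched minimally as below. The fault model here — forcing a target function to raise an error with a given probability — is a deliberately simplified assumption for illustration, not the technique surveyed:

```python
import functools
import random

def inject_fault(error_type=RuntimeError, probability=1.0, rng=random):
    """Decorator emulating a software fault: with the given probability,
    the wrapped function raises error_type instead of running normally."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if rng.random() < probability:
                raise error_type("injected fault in " + func.__name__)
            return func(*args, **kwargs)
        return wrapper
    return decorator

@inject_fault(probability=1.0)  # always fault, for a deterministic demo
def read_sensor():
    return 42

# Observing system behaviour under the injected fault:
try:
    read_sensor()
except RuntimeError as exc:
    print("caught:", exc)
```

A fault-injection campaign would wrap many target functions this way and record how the rest of the system reacts (crash, hang, error propagation, or graceful handling).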
Cloud infrastructures provide elastic computing resources to client organizations, enabling them to build online applications while avoiding the fixed costs associated to a complete IT infrastructure. However, such organizations are unlikely to fully trust the cloud for the most critical applications. Among other threats, soft errors are expected t...
The importance of temporal information (TI) is increasing in several Information Retrieval (IR) tasks. CHAVE, available from Linguateca’s site, is the only ad hoc IR test collection with Portuguese texts. So, the research question of this work is whether this collection is sufficiently rich to be used in Temporal IR evaluation. The obtained answer...
The security of software-based systems is one of the most difficult issues when accessing the suitability of systems to most application scenarios. However, security is very hard to evaluate and quantify, and there are no standard methods to benchmark the security of software systems. This work proposes a novel methodology for benchmarking the secu...
There is a lot of recent work aimed at improving the effectiveness of Information Retrieval results based on temporal information extracted from texts. Some works use all dates but others use only document creation or modification timestamps. However, no previous work explicitly focuses on the use of dates within the document content to establis...
In this paper we propose a methodology and a prototype tool to evaluate web application security mechanisms. The methodology is based on the idea that injecting realistic vulnerabilities in a web application and attacking them automatically can be used to support the assessment of existing security mechanisms and tools in custom setup scenarios. To...
There is a plethora of information on the Web. Even the top commercial search engines cannot download and index all the available information. So, in recent years, there have been several research works on the design and implementation of focused topic crawlers and also on geographic scope crawlers. Unlike other areas of information retrieval,...
The use of temporal data extracted from text, to improve the effectiveness of Information Retrieval systems, has recently been the focus of important research work. Our research hypothesis is that the usage of the temporal relationship between words improves the Information Retrieval results. For this purpose, the texts are temporally segmented to...
Most web applications have critical bugs (faults) affecting their security, which makes them vulnerable to attacks by hackers and organized crime. To prevent these security problems from occurring it is of utmost importance to understand the typical software faults. This paper contributes to this body of knowledge by presenting a field study on two...
Fault injection has been successfully used in the past to support the generation of realistic failure data for offline training of failure prediction algorithms. However, runtime computer systems evolution requires the online generation of training data. The problem is that using fault injection in a production environment is unacceptable. Virtuali...
Software fault injection techniques are beginning to gain a strong recognition in critical domains. Their adoption is already recommended in several safety critical standards, such as automotive, avionics, and aerospace systems. This paper aims to provide an overview on software fault injection, suited both for researchers and practitioners in the...
The injection of software faults in software components to assess the impact of these faults on other components or on the system as a whole, allowing the evaluation of fault tolerance, is relatively new compared to decades of research on hardware fault injection. This paper presents an extensive experimental study (more than 3.8 million individual...
Computer benchmarks are standard tools that allow evaluating and comparing different systems or components according to specific characteristics (performance, dependability, security, etc.). Resilience encompasses all attributes of the quality of 'working well in a changing world that includes faults, failures, errors and attacks'. This way, resi...
Robustness is an attribute of resilience that measures the behaviour of the system under non-standard conditions. Robustness is defined as the degree to which a system operates correctly in the presence of exceptional inputs or stressful environmental conditions. As triggering robustness faults could in the worst case scenario even crash the system...
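The robustness-testing idea above — exercising a component with exceptional inputs and observing whether it behaves correctly or fails — can be sketched as follows; the function under test and the input set are invented for illustration:

```python
def parse_port(value):
    """Function under test: parse a TCP port number from a string."""
    port = int(value)
    if not 0 <= port <= 65535:
        raise ValueError(f"port out of range: {port}")
    return port

# Exceptional and boundary inputs (an illustrative, not exhaustive, set).
EXCEPTIONAL_INPUTS = ["80", "-1", "70000", "abc", "", None]

def robustness_test(func, inputs):
    """Run func on each input and classify the outcome:
    'OK' (value returned) or 'ERROR' (exception type caught)."""
    results = {}
    for value in inputs:
        try:
            results[value] = ("OK", func(value))
        except Exception as exc:
            results[value] = ("ERROR", type(exc).__name__)
    return results

for value, outcome in robustness_test(parse_port, EXCEPTIONAL_INPUTS).items():
    print(repr(value), "->", outcome)
```

In a real robustness benchmark the observed failure modes would be graded on a scale (e.g., from correct error handling up to crash or hang) rather than merely recorded.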
This chapter is devoted to field studies and the aspects related to this kind of measurements. The importance of measurements collected from the operational scenarios is discussed, and two case studies are presented. Field measurements are closely tied to data repositories, and this chapter presents an overview of some field data repositories avail...
This chapter provides a condensed description of a roadmap for research in technologies for assessment, measurement and benchmarking (AMB) of the resilience of information, computer and communication systems. The research roadmap is the result of the EU-funded AMBER Coordination Action, integrating the consortium experience in the field with the in...
Resilient systems are designed to operate at acceptable levels even in the presence of faults and other adverse events. Assessing the resilience of a given system therefore requires that the effects of such events can be measured and examined in detail, which in turn requires the ability to introduce faults and observe the subsequent behaviour of t...
Developing robust web services is a difficult task. Field studies show that a large number of web services are deployed with robustness problems (i.e., presenting unexpected behaviors in the presence of invalid inputs). Although several techniques for the identification of robustness problems have been proposed in the past, there is no practical ap...
The Data Warehouse Striping (DWS) technique is a data partitioning approach especially designed for distributed data warehousing environments. In DWS the fact tables are distributed over an arbitrary number of low-cost computers and each query is executed in parallel by all the computers, guaranteeing nearly optimal speed-up and scale-up. Data loadi...
The use of Web services in enterprise applications is quickly increasing. In a Web services environment, providers supply a set of services for consumers. However, although Web services are being used in business-critical environments, there are no practical means to test or compare their robustness to invalid and malicious inputs. In fact, client...
In this work, we introduce a software testbed for temporal processing of Portuguese texts, composed by several building blocks: identification, classification and resolution of temporal expressions and temporal text segmentation. Starting from a simple document, we can reach a set of temporally annotated segments, which enables the establishment of...
Developing web services with timing requirements is a difficult task, as existing technology does not provide standard mechanisms to support real-time execution, or even to detect and predict timing violations. However, in business-critical environments, an operation that does not conclude on due time may be completely useless, and may result in se...
Software reuse is the practice of using existing artifacts (code, architecture, requirements, etc.) in new projects. The advantages of using previously developed software in new projects are easily understood. However, reusing artifacts is usually done in an ad-hoc and incipient way, requiring an important effort of adaptation, so developers freque...