Article

The Contribution of Latent Human Failures to the Breakdown of Complex Systems


Abstract

Several recent accidents in complex high-risk technologies had their primary origins in a variety of delayed-action human failures committed long before an emergency state could be recognized. These disasters were due to the adverse conjunction of a large number of causal factors, each one necessary but singly insufficient to achieve the catastrophic outcome. Although the errors and violations of those at the immediate human-system interface often feature large in the post-accident investigations, it is evident that these 'front-line' operators are rarely the principal instigators of system breakdown. Their part is often to provide just those local triggering conditions necessary to manifest systemic weaknesses created by fallible decisions made earlier in the organizational and managerial spheres. The challenge facing the human reliability community is to find ways of identifying and neutralizing these latent failures before they combine with local triggering events to breach the system's defences. New methods of risk assessment and risk management are needed if we are to achieve any significant improvements in the safety of complex, well-defended, socio-technical systems. This paper distinguishes between active and latent human failures and proposes a general framework for understanding the dynamics of accident causation. It also suggests ways in which current methods of protection may be enhanced, and concludes by discussing the unusual structural features of 'high-reliability' organizations.

... where both virtual and real experiments are typical. The Swiss Cheese [58,70,85] model is one approach to experimental research safety which represents a system as sequentially stacked barriers protecting against failure. While any one safety evaluation step might have holes (limitations or failure points) that would lead to harmful outcomes, the safety assessment protocol is designed to ensure these holes do not align and thus potential harmful outcomes are prevented. ...
... Kuespert [58] is a reasonable starting point for learning about different models of safety, with a focus on research laboratory safety. In best practices for safety assessment frameworks [39,58,85] systems are evaluated and must be proven in earlier evaluation stages, such as simulation, before a physical instance of the system is created or run. When simulation is an appropriate model, a failure in simulation is assumed to imply a failure on the real physical system. ...
... One example of a human process model of experimental research safety is to sequentially stack protections according to a Swiss Cheese [58,85] model. While any one safety evaluation step might have holes (limitations or failure points) that would lead to harmful outcomes, the safety protocol is designed to ensure these holes do not align, and thus the harmful outcome is prevented. ...
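The layered-barrier logic sketched in these excerpts can be made concrete with a toy simulation. The snippet below is a minimal illustration only (the number of layers and the per-layer 'hole' probabilities are assumed values, not taken from the cited works); it estimates how often a hazard slips through every defence when the layers fail independently.

```python
import random

def hazard_penetrates(hole_probs):
    """Return True if the hazard passes through a hole in every layer."""
    return all(random.random() < p for p in hole_probs)

def breach_rate(hole_probs, trials=100_000):
    """Monte Carlo estimate of the probability that all holes 'line up'."""
    breaches = sum(hazard_penetrates(hole_probs) for _ in range(trials))
    return breaches / trials

if __name__ == "__main__":
    random.seed(1)
    # Assumed per-layer failure (hole) probabilities, for illustration only.
    layers = [0.10, 0.20, 0.05, 0.15]
    for n in (1, 2, 4):
        print(f"{n} layer(s): breach rate ~ {breach_rate(layers[:n]):.5f}")
```

With independent layers the breach probability is simply the product of the per-layer hole probabilities; correlated weaknesses of the kind the abstract calls latent failures would make the holes line up far more often than this toy model suggests.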
Preprint
Full-text available
Stereotypes, bias, and discrimination have been extensively documented in Machine Learning (ML) methods such as Computer Vision (CV) [18, 80], Natural Language Processing (NLP) [6], or both, in the case of large image and caption models such as OpenAI CLIP [14]. In this paper, we evaluate how ML bias manifests in robots that physically and autonomously act within the world. We audit one of several recently published CLIP-powered robotic manipulation methods, presenting it with objects that have pictures of human faces on the surface which vary across race and gender, alongside task descriptions that contain terms associated with common stereotypes. Our experiments definitively show robots acting out toxic stereotypes with respect to gender, race, and scientifically-discredited physiognomy, at scale. Furthermore, the audited methods are less likely to recognize Women and People of Color. Our interdisciplinary sociotechnical analysis synthesizes across fields and applications such as Science Technology and Society (STS), Critical Studies, History, Safety, Robotics, and AI. We find that robots powered by large datasets and Dissolution Models (sometimes called "foundation models", e.g. CLIP) that contain humans risk physically amplifying malignant stereotypes in general; and that merely correcting disparities will be insufficient for the complexity and scale of the problem. Instead, we recommend that robot learning methods that physically manifest stereotypes or other harmful outcomes be paused, reworked, or even wound down when appropriate, until outcomes can be proven safe, effective, and just. Finally, we discuss comprehensive policy changes and the potential of new interdisciplinary research on topics like Identity Safety Assessment Frameworks and Design Justice to better understand and address these harms.
... Rather, they tend to focus on the presence of positive capacities within a RE paradigm, termed Safety-II (Hollnagel, 2014a; Provan et al., 2020). Pillay (2017, p150) states 'RE is concerned with operating as close as possible to the boundaries of failure as part of normal work', a philosophy diametrically opposed to the traditional 'defences-in-depth' principles of Safety-1 (Reason, 1990). ...
... When problems arise, HOP focuses on identifying system weaknesses, not operator errors, by looking for situational Human Error traps (Petersen, 1980; Reason, 1990; Cooper & Findley, 2013) and removing them or building defences against them before they lead to a problem or incident (Bea, 2002). Consistent with safety culture research (Cooper, 2016), HOP's focus is on optimising systems and situations to optimise behaviour. ...
... Such ratios illustrate the vast scope for error residing in the remaining 0.01% from focusing on what goes right 99.9% of the time: they also illustrate a positive focus is not necessarily going to reduce incident rates, and also undermine the idea that safety is solely the presence of positives. Certainly, even when the vast majority of things are going right, it only takes a few small system features to go wrong, combined with people's ineffective behaviour (Meng et al., 2019) to destroy a facility, crash an airplane, or sink a vessel (Reason, 1990). ...
Preprint
Organizational Performance (HOP) and Safety Differently. Collectively termed New-View, they have created a stir amongst OSH practitioners by challenging them to view key areas of occupational safety in a different way: [1] how safety is defined; [2] the role of people in safety; and [3] how businesses focus on safety. When subject to critical scrutiny, New-View's major tenets are shown to be a collection of untested propositions (ideas, rules, and principles). New-View's underlying RE philosophy is predicated on repeatedly testing the boundary limitations of systems until a failure occurs, which paradoxically requires more risk controls that create the very problems New-View criticizes and attempts to address: constraints, complexity, rigidity, and bureaucracy. This continuous threat-rigidity cycle indicates New-View's raison d'être is somewhat circular. New-View entirely lacks any new associated practical methodologies for improving safety performance: it uses traditional Safety-1 methodologies to tackle actual safety problems. Moreover, no published, peer-reviewed empirical evidence demonstrates whether or not any aspect of New-View's propositions is valid. Currently we do not know how, or if, New-View improves safety performance per se, or if it reduces or eliminates incidents/injuries. The extant Safety-1 literature suggests that New-View's propositions lack substance. The inescapable conclusion, therefore, is 'the emperor has no clothes' and that ideology and emotion have triumphed over science and practice. It is also clear that the OSH profession has an immense crisis of ethics across its entire landscape.
... lack of leadership, poor training, lack of staff support). Accidents are complex and result from the unforeseen concatenation of multiple factors (Reason, 1990). However, this culture of blame is prevalent in the hospitality industry (de Freitas et al., 2020; Pereira et al., 2021), but it can be mitigated by fair treatment (Wiśniewska et al., 2019). ...
... The gaps in the defence systems arise from active failures and latent conditions (Reason, 2000). Active failures are breaches that have an immediate negative impact and are generally associated with 'frontline' forces (Reason, 1990), i.e., they are acts - slips, omissions, errors, system violations - committed by those in direct contact with a system, and these acts have a direct impact on the integrity of that system's defences (Reason, 2000). On the other hand, latent conditions arise from top management decisions or actions. ...
... Latent conditions have no immediate negative impact but, in combination with local trigger factors, can break through the system's defences (e.g. climate and organisational factors) (Reason, 1990). These conditions can remain dormant/latent in that system for a long period of time and may trigger a violation (or loss) when combined with other triggers or active failures (Reason, 2000). ...
Article
Full-text available
This study aims to discuss the use of multiple layers of defence to prevent foodborne illness in restaurants. A defence model was developed based on Reason's Swiss Cheese Model. Reason's model was extended by adding the concept of Hazard Analysis and Critical Control Points, as well as Five Keys to Safer Food. The defence system was divided into seven layers of defence: 1) adequate facilities and 2) training as administrative controls; 3) safe ingredients and water; 4) environmental hygiene; 5) personal and food hygiene and 6) safe food temperature as behavioural controls; and 7) control and systems. The hypothesis was that the layers would act as barriers to prevent hazards from causing losses. To test the model, a dataset (secondary data) of food safety assessments from 1536 different restaurant establishments in Brazil was used. A checklist of 51 items was organised into the seven layers of the defence system. The model was tested with a Partial Least Square Structural Equation Model. Errors in administrative controls (facilities and training) led to errors in behavioural controls. A ‘cascade effect’ was observed where errors in distal behavioural controls (safe ingredients and water, and environmental hygiene) impacted proximal controls (personal and food hygiene, and safe food temperature) and system controls. It was discussed how latent conditions and active failures can string together and cause a foodborne illness incident. The Swiss Cheese Model of food safety incidents is proposed as a new perspective for food safety. This model can be used for risk management and food safety education.
... The cracks may initially be merely microcracks that are invisible to the casual observer (cracks of the first kind). These stabilization cracks will allow moisture to enter the asphalt layer and flow with greater ease horizontally [1,6] or at the interface between the top of the base and the bottom of the thin asphalt and seal surfacing. This water invariably crosses the wheel path, following the cross fall and longitudinal gradient, which dictate the flow 'strive lines'. ...
... It is now known that moisture entering the top of the granular base may flow 10 to 30 times more easily in the horizontal direction [1]. The water flows in the horizontal plane inside the thin asphalt surfacing or at the interface between the granular base and the surfacing [6]. If the water then follows the normal gradient or superelevation and the longitudinal gradient, obeying the laws of gravity, it has to pass the wheel path zones at some stage. ...
... The contact pressure due to a passing wheel in wet weather, normally exceeding the tire inflation pressure (TiP), can also cause a squirting action that can force water through such microcracks, multi-dimensionally [1,2]. This has led to the development of a High-Pressure Permeability Test (HPPT) [4,6] to simulate this high-water pressure in the asphalt layer. The HPPT has proven to be very discriminative regarding asphalt sensitivity to high-pressure permeability. ...
Article
Full-text available
The root cause of premature failures in relatively thin (<50 mm) asphalt-surfaced roads is often a challenge to solve during forensic investigations in South Africa (SA). This description is based on peer-review-type discussions with forensic investigation experts as well as published research papers. The areas of ignorance or areas where research effort is needed are identified. These observations serve to identify areas of new knowledge needed in terms of actual verification with measurements, measurement technology, and modeling of the observed phenomena. The main objectives of this discussion paper are to highlight the evolution of distress development in asphalt layers, starting from the identification and description of the kind of microcracks, the effects of microcracks on the debonding of the asphalt layer from the lower layer, and the vehicle–pavement interactions (VPI) of moving truck wheels, focusing on the thin asphalt layer. Specific reference is made to the need to measure and model the effects of bow waves in front of and next to the loaded rolling wheel.
... The challenges we face with this pandemic remind us that there is no silver bullet to control most pathogens, but that, instead, it is the combination of multiple approaches that will result in their successful reduction and subsequent elimination. A good analogy is the Swiss Cheese model of Respiratory Pandemic Defense developed by Ian Mackay and colleagues at the University of Queensland, Australia (virologydownunder.com), based on Reason's model of accident causation (Reason 1990). The model illustrates that all control strategies have deficits, and only when they are used together do we maximize the chances of decreasing, if not eliminating, pathogen transmission. ...
... The Swiss Cheese model of Vector-Borne Disease Defense. Adapted from Ian Mackay (2020; virologydownunder.com) and based on James T. Reason's model of accident causation (Reason 1990). Modified for vector-borne diseases by Andrea Gloria-Soria. ...
Article
Life remained far from normal as we completed the first year of the Covid-19 pandemic and entered a second year. Despite the challenges faced worldwide, together we continue to move the field of Medical Entomology forward. Here, I reflect on parallels between control of Covid-19 and vector-borne disease control, discuss the advantages and caveats of using new genotyping technologies for the study of invasive species, and proceed to highlight papers that were published between 2020 and 2021 with a focus on those related to mosquito surveillance and population genetics of mosquito vectors.
... According to the accident causation model [44], accidents arise from failures in the interactions of workers with their workplace, materials and equipment [45,46]. Thus, accident root causes can be identified as originating influences (client requirements, economic constraints and construction education), shaping factors (design and specification), worker factors (errors and violations by workers), site factors (layout, lighting, site constraints, scheduling and housekeeping), material and equipment factors [44], distal factors (project conditions and management decisions), proximal factors (inappropriate site conditions or actions) [47], work technology (tools, materials and actions required to perform a specific task), physical conditions (working environment), surrounding activities (such as falling objects from upper elevations and vibrations due to piling), human factors (all environmental, organisational and human characteristics which affect health and safety) [48], organisational factors (poor safety and management) [49,50] and natural factors (natural processes or phenomena) [51]. In particular, the existing literature focuses more on the involvement of human factors in occupational accidents. ...
... Worker factor: every possible error, violation, mental and health issue, lack of skill, and behaviour of the workers inside the site environment [49, 58-60]. ...
Article
Full-text available
Construction is an industry well known for its very high rate of injuries and accidents around the world. Even though many researchers are engaged in analysing the risks of this industry using various techniques, construction accidents still require much attention in safety science. According to the existing literature, hazards related to workers, technology, natural factors, surrounding activities and organisational factors are primary causes of accidents. Yet, there has been limited research aimed at ascertaining the extent of these hazards based on the actual reported accidents. Therefore, the study presented in this paper was conducted with the purpose of devising an approach to extract sources of hazards from publicly available injury reports by using Text Mining (TM) and Natural Language Processing (NLP) techniques. This paper presents a methodology to develop a rule-based extraction tool by providing full details of lexicon building, devising extraction rules and the iterative process of testing and validation. In addition, the developed rule-based classifier was compared with, and found to outperform, existing statistical classifiers such as Support Vector Machine (SVM), Kernel SVM, K-nearest neighbours, Naïve Bayesian classifier and Random Forest classifier. The findings obtained using the developed tool identified the worker factor as the highest contributor to construction site accidents, followed by the technological factor, surrounding activities, organisational factor, and natural factor (1%). The developed tool could be used to quickly extract the sources of hazards by converting the large volume of available unstructured digital accident data into structured attributes, allowing better data-driven safety management.
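The abstract describes a lexicon- and rule-based extraction pipeline without reproducing the rules themselves. The following sketch shows the general shape of such a classifier; the lexicon entries and the sample report are invented for illustration and are not those used in the cited study.

```python
import re

# Hypothetical lexicon: hazard source -> indicative keywords (illustrative only,
# not the lexicon built in the cited study).
LEXICON = {
    "worker factor": ["slipped", "ignored", "untrained", "fatigue"],
    "technological factor": ["scaffold", "crane", "ladder", "power tool"],
    "surrounding activities": ["falling object", "adjacent crew", "vibration"],
    "organisational factor": ["no supervision", "schedule pressure", "missing permit"],
    "natural factor": ["rain", "wind", "lightning"],
}

def extract_sources(report: str) -> list[str]:
    """Return the hazard sources whose keywords appear in an injury report."""
    text = report.lower()
    return [source for source, keywords in LEXICON.items()
            if any(re.search(rf"\b{re.escape(kw)}\b", text) for kw in keywords)]

if __name__ == "__main__":
    report = ("Worker slipped from an unsecured ladder during heavy rain; "
              "no supervision was present on site.")
    print(extract_sources(report))
    # -> ['worker factor', 'technological factor', 'organisational factor', 'natural factor']
```

A real rule set would also handle negation, synonyms and multi-word variants, and would be refined through the iterative testing and validation the abstract mentions.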
... Conversely, the company finds itself in a situation of uncertainty, the least controlled zone. However, 'zero risk', that is, absolute certainty, does not exist, as the work of Reason (1990, 2013) reminds us; a synthesis of that work is provided by Larouzée, Guarnieri and Besnard (2014). Despite the absence of absolute certainty, the whole task consists in finding ways of moving from a context of uncontrolled uncertainty to a more or less 'controlled' risk situation, from an information-and-communication standpoint. ...
Thesis
Full-text available
Many studies highlight the role played by an innovation ecosystem (ESI) in the development of innovation. Very few of these studies, however, explore information-and-communication processes as such, as 'ecological' relationships between protagonists 'forming a system'. Yet some recent work stresses the importance of qualifying the 'meaning' that protagonists attach to their innovation context in order to better understand their approaches. Using a bottom-up methodology, the originality of this thesis lies in an idiographic approach to a territorial innovation ecosystem. Within a socio-constructivist paradigm, the study draws on decisional mediation. The aim is to better understand how an innovation ecosystem can shed light on the perceived risks underlying the decisions of organizations facing innovation problems. Through 15 semi-structured interviews with fashion and textile companies in the Hauts-de-France region, the study develops a thematic analysis, a cluster analysis and then a factor analysis, inspired by the triadic method. Following an axiomatic-inductive approach, the work develops a set of hypotheses around an original formalization of the innovation ecosystem as perceived by the protagonists. It is argued that such a formalization can be summarized as key signifying elements arranged in a triadic logic (the Apparatus, the Networks, the Aspirations, the Territory, the Infrastructure) embodied through bivalent attributes (Trust/Mistrust, Informing/Disorienting, Feasibility/Contestable, Difference/Banality). The results then draw up a typology of meaning constructs that govern decision-making processes. The study thus provides a better grasp of the information-and-communication processes at work within an innovation ecosystem, notably by characterizing these constructs. The work offers methodological and operational tools concerning the way innovation issues are perceived. For public decision-makers, the study highlights the often underestimated role of intangible elements such as the aspirations of innovators.
... The focus of relative or net survival analysis is the distribution of the latent event time for death from disease. This endpoint has been advocated by many practitioners (Slud et al., 1988; Reason, 1990; Louzada et al., 2015), as it removes the impact of other-cause mortality on the risk of disease-specific mortality, permitting comparisons across populations with different background mortality. As an alternative, other work has considered estimation of the crude disease-specific survival, C_k(t), using the relative survival estimates and the known reference hazard for other-cause mortality (Cronin and Feuer, 2000). ...
Preprint
With known cause of death (CoD), competing risk survival methods are applicable in estimating disease-specific survival. Relative survival analysis may be used to estimate disease-specific survival when cause of death is either unknown or subject to misspecification and not reliable for practical usage. This method is popular for population-based cancer survival studies using registry data and does not require CoD information. The standard estimator is the ratio of all-cause survival in the cancer cohort group to the known expected survival from a general reference population. Disease-specific death competes with other causes of mortality, potentially creating dependence among the CoD. The standard ratio estimate is only valid when death from disease and death from other causes are independent. To relax the independence assumption, we formulate dependence using a copula-based model. A likelihood-based parametric method is used to fit the distribution of disease-specific death without CoD information, where the copula is assumed known and the distribution of other-cause mortality is derived from the reference population. We propose a sensitivity analysis, where the analysis is conducted across a range of assumed dependence structures. We demonstrate the utility of our method through simulation studies and an application to French breast cancer data.
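In the notation commonly used for relative survival (a sketch consistent with this abstract rather than the authors' exact formulation), the standard estimator and its copula-based relaxation can be written as:

```latex
% Standard relative survival ratio, valid when the latent times are independent:
R(t) \;=\; \frac{S_{\mathrm{obs}}(t)}{S_{\mathrm{pop}}(t)},
\qquad S_{\mathrm{obs}}(t) \;=\; S_D(t)\,S_O(t) \quad \text{(independence)}.

% Copula-based relaxation: joint survival of the latent disease-specific time T_D
% and the other-cause time T_O, with dependence parameter \theta:
\Pr(T_D > t,\; T_O > t) \;=\; C_\theta\!\bigl(S_D(t),\, S_O(t)\bigr).
```

Here S_O is taken from the reference population, S_D is modelled parametrically, and the sensitivity analysis repeats the fit over a range of assumed dependence parameters θ.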
... In the meantime, more accident causation models were developed to demystify accident propagation behavior, including human-based, statistics-based, energy-based, system-based, linear, and nonlinear accident models (Fu et al., 2020). Accident proneness (Greenwood and Woods, 1919), the domino accident model (Heinrich, 1931), accident liability (Mintz and Blum, 1949), energy transfer theory (Haddon, 1964), Bird's model (Bird Jr, 1966), Surry's model (Surry, 1969), the Swiss cheese model (Reason, 1990), and the Systems-Theoretic Accident Model and Processes (Leveson, 2004) were some of the notable tools in this context. The subjective nature of these qualitative techniques limited their use. ...
Article
With the rapid realization of process digitization, data-driven methods are being increasingly adopted in process safety analysis. However, the use of data-driven methods is accompanied by a varied degree of myths and misconceptions, that is, the application of a method or the representation of data without following proper scientific practice. These myths and misconceptions cause significant errors in terms of results and their interpretability. Hence, this study sets out to analyze the most common myths and misconceptions of data-driven methods observed in the recent literature. In the current work, we have analyzed 500 public domain articles from 1990-2020, published in 10 renowned safety journals. The analysis attempts to address the following questions: (i) What are the key data in process safety analysis? (ii) What are the sources of data? (iii) What does the data-driven method mean? (iv) What are the common myths and misconceptions of data-driven methods? and (v) How frequently do such myths and misconceptions occur? After analyzing the 500 articles, it is observed that most of the myths are related to improper data representation, missing appropriate assumptions, and blanket use of methods without a detailed understanding of their limitations. The authors believe this work will help peers to avoid the myths studied here, use data-driven methods with scientific rigor, and present findings in a meaningful way.
... Precisely these are to be avoided by identifying different thresholds and stages that entail specific impacts and problems. This cannot and should not lead to the elimination of residual risk, since that would likely require disproportionate effort or is simply not possible, in line with the Swiss cheese model of Reason (1990). It does, however, raise awareness and makes it possible to draw up minimum-supply concepts so that the capacity to act is preserved in exceptional events. ...
Thesis
Full-text available
Past events have shown that, despite legal requirements and technical rules governing the emergency power supply of hospitals, incidents with loss of life have occurred as a consequence of power outages. This master's thesis therefore develops an assessment method for the emergency planning of German hospitals in the area of energy supply, taking technical and organizational requirements into account. It is intended for use in hospital alarm and emergency response planning. The goal is to develop a method that can be used by practitioners with different levels of expertise in hospital alarm and emergency response planning and in risk analysis. Two scoping reviews, one on hospital alarm and emergency response planning and one on the energy supply of hospitals, identify existing requirements and concepts as well as requirements for the method. Based on this information, a tiered assessment model is proposed. It is a synthesis of a vulnerability-analysis procedure of the Federal Office of Civil Protection and Disaster Assistance with methods of multi-criteria decision support (a satisficing procedure and the Analytic Network Process). An exemplary application provided empirical validation, and practical relevance and applicability were examined in interviews with experts. As a result, the tiered assessment model is found to be suitable as a planning model for assessing the emergency energy supply of hospitals. It is divided into a beginner, an advanced and an expert level so that it can be used by all users regardless of their expertise. Further work with close involvement of hospital operators is required to develop, on the basis of the model, prioritized checklists with general technical and organizational requirements for the advanced level. The exemplary application showed that the tiered assessment model can also be used to identify dependencies and the criticality of the system under examination. Transferring the Analytic Network Process into a rating and prioritization system for the technical and organizational requirements in connection with vulnerability proved to be a challenge. The tiering of the assessment model keeps its limits of application small, but further work, such as the preparation of a guideline, is necessary to enable practical implementation. urn:nbn:de:hbz:832-epub4-18671
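The tiered model described above couples a satisficing step with the Analytic Network Process. A heavily simplified sketch of that two-step logic follows; the criteria, thresholds and weights are invented for illustration, and a plain weighted sum stands in for the full ANP.

```python
# Hypothetical criteria for a hospital's emergency power planning (illustrative only).
# Step 1: satisficing - discard options that miss any minimum requirement.
# Step 2: weighted scoring as a crude stand-in for the Analytic Network Process.

MINIMUM_REQUIREMENTS = {"fuel_autonomy_h": 24, "tests_per_year": 1}
WEIGHTS = {"fuel_autonomy_h": 0.5, "tests_per_year": 0.2, "staff_trained_pct": 0.3}
SCALE = {"fuel_autonomy_h": 72, "tests_per_year": 4, "staff_trained_pct": 100}

def satisfies_minimums(option: dict) -> bool:
    """Step 1: keep only options meeting every minimum threshold."""
    return all(option[k] >= v for k, v in MINIMUM_REQUIREMENTS.items())

def weighted_score(option: dict) -> float:
    """Step 2: weighted sum over normalised criteria (each capped at 1.0)."""
    return sum(w * min(option[k] / SCALE[k], 1.0) for k, w in WEIGHTS.items())

options = {
    "A": {"fuel_autonomy_h": 48, "tests_per_year": 2, "staff_trained_pct": 60},
    "B": {"fuel_autonomy_h": 12, "tests_per_year": 4, "staff_trained_pct": 90},
}
viable = {name: o for name, o in options.items() if satisfies_minimums(o)}
ranking = sorted(viable, key=lambda n: weighted_score(viable[n]), reverse=True)
print(ranking)  # option B is dropped at the satisficing step; only A is ranked
```

In the full model the ANP would additionally capture interdependencies between criteria and tie the scores back to the vulnerability analysis, which a plain weighted sum cannot do.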
... The main challenge is the difficulty of tracking the factors that lead to human errors. Reason (1990a, 1990b) named these factors active and latent factors, stated that they take effect only in the presence of a trigger, and defined the model, also known as the Swiss Cheese Model, in which accidents can occur only as a result of the combination of more than one fault arising in various layers. Shappell and Wiegmann (2001) developed the Human Factors Analysis and Classification System (HFACS) based on the Swiss cheese model. ...
Article
Full-text available
Aviation accidents are caused by a chain of errors in many steps. Detection and classification of human factors in accidents are critical for taking effective precautions. This study aims to investigate the factors affecting civil aviation accidents in Turkey and increase aviation safety by raising awareness of the contributing factors in the accidents. Final accident reports of Turkish Civil Aviation Accidents, including fatalities or injuries between 2003 and 2017, were analysed retrospectively using the Human Factors Analysis and Classification System (HFACS). 59 aviation accidents were included in this study. Crew Resource Management (CRM) (41.4%), Loss of Situational Awareness (LSA) (39.0%), and meteorology (29.2%) were found to be the most contributing factors in 41 Plane, Helicopter, Glider (PHG) accidents, while meteorology (77.7%) and CRM (61.1%) were found to be the most contributing factors in 18 Balloon accidents. The rates of HFACS levels in the PHG/Balloon accidents were found to be 90.2%/66.6% in Level-1 (Unsafe Acts), 95.1%/100% in Level-2 (Preconditions for Unsafe Acts), 78.0%/94.4% in Level-3 (Unsafe Supervision), and 58.5%/83.3% in Level-4 (Organizational Influences). These findings show that human factors are still major contributing factors in aviation accidents. Academic training like CRM, Aviation Meteorology and LSA should be given more frequently to aviators to prevent accidents. Including practical training on Spatial Disorientation, hypoxia, and night vision in civilian pilot training and integrating HFACS into the "Aviation Safety Management System" might help to reduce aviation accident rates.
... However, these objectives are not achieved solely as a function of smoke control but are instead achieved as a function of a series of fire safety measures (or "protection layers"), where it should be demonstrated that these measures can work together in an integrated system (Gerges et al., 2018). In crude terms, the impact of these measures in relation to the performance objective may be represented using the Reason (1990, 2000) Swiss cheese model, with an example for stair protection presented in Figure 4a. ...
Article
Purpose - The proposed use of unlatched, reverse swing flappy doors is becoming widespread in the design of residential common corridor smoke control systems. This article explores the conceptual arguments for and against the use of these systems. Design/methodology/approach - This article relies on industry experience, with reference to relevant building design practices, standards and research literature, to categorise arguments. These are collated into four common areas of concern relating to compartmentation, reliability, depressurisation and modelling practices. A final comparison is made between different common corridor smoke control system types for these four areas. Findings - The article highlights several concerns around the use of flappy door systems, including the enforced breaches in stair compartmentation, uncertainties around system reliability, the reliance on door closers as a single point of failure, the impact of day-today building use on the system performance and the false confidence that modelling assessments can provide in demonstrating adequacy. The article concludes in suggesting that alternative smoke control options be considered in place of flappy door systems. Originality/value - Discussion on the use of flappy door smoke control systems has been ongoing within the fire engineering community for several years, but there is limited public literature available on the topic. By collating the common arguments relating to these systems into a single article, a better understanding of their benefits and pitfalls has been provided for consideration by building design and construction professionals.
... However, our study identifies several issues listed below, which are still open concerning infectiology, immunology and biosecurity of TiLV infection in disease-resistant strains. These concerns should be addressed by incorporating the disease-resistant strains into an integrative approach with a multi-layered security system in accordance with Reason's Swiss cheese model [28]. The effective strategy limiting the impact of TiLV has to combine the use of the disease-resistant fish with limiting the prevalence of the virus with strict diagnostics, quarantine and increased biosecurity of fish farms, good farm practice increasing the welfare of animals and limiting stress, as well as protecting the susceptible strains with vaccination when it is available. ...
Article
The emergence of viral diseases affecting fish and causing very high mortality can lead to the disruption of aquaculture production. Recently, this occurred in Nile tilapia aquaculture where a disease caused by a systemic infection with a novel virus named tilapia lake virus (TiLV) caused havoc in cultured populations. With mortality surpassing 90% in young tilapia, the disease caused by TiLV has become a serious challenge for global tilapia aquaculture. In order to partly mitigate the losses, we explored the natural resistance to TiLV-induced disease in three genetic strains of tilapia which were kept at the University of Göttingen, Germany. We used two strains originating from Nilotic regions (Lake Mansala (MAN) and Lake Turkana (ELM)) and one from an unknown location (DRE). We were able to show that the virus is capable of overcoming the natural resistance of tilapia when injected, providing inaccurate mortality results that might complicate finding the resistant strains. Using the cohabitation infection model, we found an ELM strain that did not develop any clinical signs of the infection, which resulted in a nearly 100% survival rate. The other two strains (DRE and MAN) showed severe clinical signs and much lower survival rates of 29.3% in the DRE strain and 6.7% in the MAN strain. The disease resistance of tilapia from the ELM strain was correlated with lower viral loads both at the mucosa and in internal tissues. Our results suggest that the lower viral load could be caused by a higher magnitude of a mx1-based antiviral response in the initial phase of infection. The lower pro-inflammatory responses also found in the resistant strain might additionally contribute to its protection from developing pathological changes related to the disease. In conclusion, our results suggest the possibility of using TiLV-resistant strains as an ad hoc, cost-effective solution to the TiLV challenge. However, as the fish from the disease-resistant strain still retained significant virus loads in liver and brain and thus could become persistent virus carriers, they should be used within an integrative approach also combining biosecurity, diagnostics and vaccination measures.
... Additional levels of controls, symbols, structures, and routines can help address gaps in the overall safety performance of KYTC. Reason's (1990) Swiss Cheese Model visualizes this need (Figure 6.1). ...
Technical Report
Full-text available
Highway work zones can be dangerous and unpredictable. Between 2003 and 2017, over 1,800 workers died on road construction sites. Eliminating injuries and deaths requires state transportation agencies to adopt robust safety cultures as there is a clear relationship between these cultures and worker behaviors. The Kentucky Transportation Cabinet (KYTC) is committed to improving safety performance by nurturing a positive safety climate among highway maintenance crews. To understand the safety cultures of KYTC maintenance crews, researchers administered a survey based on the Safety Climate Assessment Tool (S-CAT) developed by the Center for Construction Research and Training (CPWR). This is the first tool developed for the construction industry. The survey was used to quantify the existing safety climate and evaluate how effective safety programs and controls are at reducing workplace hazards. Survey respondents answered questions on 37 indicators across eight safety climate categories: employee risk perception, management commitment, aligning and integrating safety as a value, ensuring accountability at all levels, improving supervisory leadership, empowering and involving employees, improving communication, and safety training. For each indicator, respondents were assigned a rating on a five-point Likert scale: Inattentive (1), Reactive (2), Compliant (3), Proactive (4), Exemplary (5). Analysis of survey responses at the statewide and district levels found that KYTC's safety culture can be characterized as between compliant and proactive. Focus groups with maintenance superintendents generated recommendations to improve safety cultures and install multiple layers of preventive measures to further reduce the number and threat of job site hazards.
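The S-CAT scoring described above maps naturally onto a small aggregation routine. The sketch below is a minimal illustration with invented responses for two of the eight categories; it averages the 1-5 indicator ratings and attaches the maturity labels used in the report.

```python
from statistics import mean

# Maturity labels from the S-CAT rating scale cited above.
MATURITY = {1: "Inattentive", 2: "Reactive", 3: "Compliant", 4: "Proactive", 5: "Exemplary"}

def category_rating(scores: list[int]) -> tuple[float, str]:
    """Average the 1-5 indicator ratings and map to the nearest maturity label."""
    avg = mean(scores)
    return avg, MATURITY[min(5, max(1, round(avg)))]

# Invented responses for two of the eight safety-climate categories.
responses = {
    "management commitment": [3, 4, 3, 4, 3],
    "safety training": [4, 4, 5, 4],
}
for category, scores in responses.items():
    avg, label = category_rating(scores)
    print(f"{category}: {avg:.2f} ({label})")
```

A statewide analysis would simply repeat this aggregation per district and per category before comparing the resulting profiles.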
... The best known of these is Reason's (Reason, 1990) identification of critical but permeable barriers designed to prevent specific situations that can lead to problems, but that can be breached or compromised in operation. These concepts are now routinely used to demonstrate safety in so-called BOW TIE diagrams (Gower-Jones, van der Graaf & Milne, 1996). ...
Article
Full-text available
The very ideas that underpin our traditional understanding of organization and workplace are being critically explored by an increasingly large number of people. Equally, the discussion is also embracing new ways of thinking about organization, through a range of perspectives. It is as if a new school of organizational thought and practice is appearing and is crying out for closer investigation, as there are, apparently, no easy solutions as to how organizational failure can be remedied so that similar incidents do not continue to occur at regular intervals. It seems too superficial to simply hold the entire organization to "blame". More recently, attempts to address the fallibility of such organizations have tended to focus on their "culture" and style of leadership. Given human frailties, are very large, dispersed organizations then doomed to fail? Hopkins argues that a good organisation should not allow these deviations to get out of control. A number of other authors have addressed this problem and held that it could be solved by decentralizing authority structures. But still others point out that organizational drift is natural, but not unavoidable. It can be reversed, but the organization and the people holding responsible positions need to be constantly aware of the propensity and design the structure appropriately.
... Several scientific fields have addressed this concept, but a particularly tangible analogy can be found in the field of accident prevention in risk management, known as the Swiss cheese model. In this model, no individual prevention strategy can avert all accidents, but, by layering different systems, each with their own holes, it becomes less likely that an experimental accident will occur, such as an ineffective or unsafe therapy reaching clinical trial stages (Reason, 1990). This Perspective will use this framework to evaluate recent achievements, as well as present what we consider is still needed to generate a robust ecosystem of mouse models of cancer. ...
Article
Full-text available
In a recent study, Sargent et al. characterise several novel Rag1−/− mouse strains and demonstrate that genetic background strongly influences xenograft development and phenotype. Here, we discuss this work within the broader context of cancer mouse modelling. We argue that new technologies will enable insights into how specific models align with human disease states and that this knowledge can be used to develop a diverse ecosystem of complementary mouse models of cancer. By utilising these diverse, well-characterised models to provide multiple perspectives on specific cancers, it should be possible to reduce the inappropriate attrition of sound hypotheses while protecting against false positives. Furthermore, careful re-introduction of biological variation, be that through outbred populations, environmental diversity or including animals of both sexes, can ensure that results are more broadly applicable and are less impacted by particular traits of homogeneous experimental populations. Thus, careful characterisation and judicious use of an array of mouse models provides an opportunity to address some of the issues surrounding both the reproducibility and translatability crises often referenced in pre-clinical cancer research.
... Understanding our management systems, performing QA programs, establishing standardization procedures, and participating in voluntary internal/external audits are the preventive measures [11, 13, 17-25]. Proactive risk assessment, reactive risk analysis, review of reported adverse events, and establishing reporting and learning systems are other important measures adopted in managing errors, as described in [14, 26-28]. Error reporting and learning systems form the backbone of most modern RT centers and associations, enabling them to constantly learn and improve the safety and quality of patient care, and are described further in detail here. ...
Article
Full-text available
PURPOSE To present an overview of quality and safety in radiotherapy in the context of low- and middle-income countries, on the basis of a recently conducted annual meeting of our institution and our experience of implementing an error management system at our center. METHODS The minutes of the recently concluded annual Evidence-Based Medicine (EBM-2021) meeting, themed on technology in radiation oncology, were reviewed. The session on quality and safety, which had international experts as speakers, was reviewed. Along with this, we reviewed the literature for preventive and reactive measures proposed to manage errors, including error reporting and learning systems (ILSs). A concise summary was prepared for this article. RESULTS We also reviewed the journey of development of our institutional ILS and present here a summary of achievements, challenges, and future vision. CONCLUSION Preventive and reactive measures must be followed to achieve high-quality and safe radiotherapy. Despite resource constraints, a successful ILS program can be developed in a low- and middle-income country center by first understanding the patterns of error and developing a system that suits the working ecosystem.
... For example, Turner (1976, 1978, 1994) set out the processes by which organizations could incubate the potential for failure within their processes and practices. Reason termed these incubated elements latent conditions (Reason, 1990a, 1990b, 1997) and identified several 'source types' - awareness, commitment and competence - shaping the nature of the failure process (Reason, 1990c). ...
Article
Full-text available
The Goldilocks principle has been widely used in the medical science and psychology fields. However, there appear to be many areas where this principle can be explored in social science research. The main objective of this study is to rationalize the need for the Goldilocks principle by observing the effect of fatigue and stress on human error in manufacturing SMEs. A total of 190 questionnaires were collected and analyzed in SmartPLS version 3. It was found that the alternative hypotheses were supported with p-values below 0.05. The authors suggested that manufacturing companies should keep a work record of task completion times and consider the average time as the grace duration for performing the task. An organization is advised to refrain from giving either too much or too little time to complete a task. The timeframe should be based on the task complexity or task familiarity of the overall organization. This study benefits manufacturing companies as it can serve as a guideline on HR policies and working hours.
Article
Fire and explosion accidents are the most important crises in laboratories at colleges and universities, and human factors (HFs) are widely regarded as major contributing factors to the occurrence of lab accidents. Therefore, it is necessary to use efficient and reliable methods to identify and monitor the key HFs that cause and affect lab fire and explosion accidents. In this paper, to identify the most critical and highly contributing HFs involved in lab fire and explosion accidents, a hybrid method integrating the Human Factors Analysis and Classification System (HFACS), Fuzzy set theory (FST), and Bayesian network (BN) is applied, which can compensate for the static nature of conventional methods in HFs analysis and their inability to deal with uncertainty. The hybrid model was tested on 39 lab fire and explosion accidents from 2008 to 2020 in China and the United States, and a sensitivity analysis was also conducted to recognize the top 10 most critical root events associated with HFs leading to lab fire and explosion accidents. The results demonstrated that organizational influences are the leading contributors to the top 10 most highly contributing root events for lab accidents.
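The abstract combines HFACS coding, fuzzy set theory and a Bayesian network without giving the network itself. The sketch below only illustrates the general mechanics under simplifying assumptions: triangular fuzzy judgements are defuzzified into crisp priors, fed into a single noisy-OR node, and ranked by a crude sensitivity measure. The events, fuzzy numbers and link strengths are invented, not taken from the cited study.

```python
# Minimal illustration of combining fuzzy expert judgements with a Bayesian-network
# style calculation. Structure and numbers are invented for this sketch.

def defuzzify(tri):
    """Centroid of a triangular fuzzy number (a, b, c)."""
    a, b, c = tri
    return (a + b + c) / 3.0

# Expert judgements of root-event likelihood as triangular fuzzy numbers.
fuzzy_priors = {
    "inadequate supervision": (0.10, 0.20, 0.30),
    "poor housekeeping":      (0.20, 0.30, 0.50),
    "procedure violation":    (0.05, 0.10, 0.20),
}
priors = {event: defuzzify(tri) for event, tri in fuzzy_priors.items()}

# Noisy-OR node: each active root event independently causes the accident
# with its link probability.
links = {"inadequate supervision": 0.6, "poor housekeeping": 0.4, "procedure violation": 0.8}

def p_accident(event_probs):
    p_no_accident = 1.0
    for event, p_event in event_probs.items():
        p_no_accident *= 1.0 - p_event * links[event]
    return 1.0 - p_no_accident

baseline = p_accident(priors)
# Crude sensitivity: drop in accident probability if one root event is eliminated.
sensitivity = {e: baseline - p_accident({**priors, e: 0.0}) for e in priors}
for event, delta in sorted(sensitivity.items(), key=lambda kv: -kv[1]):
    print(f"{event}: baseline {baseline:.3f}, reduction if removed {delta:.3f}")
```

A full HFACS-FST-BN analysis would instead place one node per HFACS category and root event and propagate evidence through the whole network rather than a single noisy-OR gate.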
Article
Identifying the types and causes of medical errors is the first step in preventing them. This study therefore aimed to determine the types and causes of medical errors. The research was carried out through face-to-face interviews with 252 people who stated that they or a family member had experienced a medical error, drawn from 500 people considered representative of the Isparta population; a questionnaire consisting of open-ended questions was used in the interviews. The construction of the questionnaire and the analysis of the data were based on the "Alberta Patient Safety Survey 2004", a questionnaire prepared by Northcott and Northcott for the Health Quality Council of Alberta and administered in Alberta, Canada. Based on the participants' statements, the most frequently reported type of error was medication error, and the leading cause of the errors, according to the participants, was the physician's lack of attention. Of the participants who stated that they or a family member had experienced a medical error, 28.45% said they had formed the view that they had experienced a medical error after consulting another physician.
Article
Full-text available
Lampreys (order: Petromyzontiformes) represent one of two extant groups of jawless fishes, also called cyclostomes. Lampreys have a variety of unique features that distinguish them from other fishes. Here we review the physiological features of lampreys that have contributed to their evolutionary and ecological success. The term physiology is used broadly to also include traits involving multiple levels of biological organization, like swimming performance, that have a strong but not exclusively physiological basis. We also provide examples of how sea lamprey traits are currently being used or investigated to control invasive populations in the Great Lakes, such as reduced capacity to detoxify lampricides, inability to surmount low barriers or dams, and sensitivity to several lamprey-specific chemosensory pheromones and alarm cues. Specific suggestions are also provided for how an improved knowledge of lamprey physiological traits could be exploited for more effective conservation of native lampreys and lead to the development of next generation sea lamprey control and conservation tools.
Article
Full-text available
This article takes the Guangdong Province of China as the research object and uses a difference-in-differences model to evaluate the impact of smart city construction on the quality of public occupational health and on intercity differences. The results show that smart city construction significantly improves the quality of public occupational health, and this result remains valid after a series of robustness tests. The effect of this policy is stronger in cities in the Pearl River Delta region or sub-provincial-level cities. This study indicates that the central government should improve the pilot evaluation system and the performance appraisal mechanism of smart cities from the perspective of top-level design during the process of promoting smart city construction, so as to correctly guide local governments in promoting the construction of smart cities. To achieve the full improvement effect of smart city construction on the quality of public occupational health, local governments should implement smart city strategies in a purposeful and planned way according to the actual development situation of their jurisdiction.
Article
Purpose At US passenger stations, train operations approaching terminating tracks rely on the engineer's compliant behavior to safely stop before the end of the tracks. Noncompliant actions by disengaged or inattentive engineers would result in hazards to train passengers, train crewmembers and bystanders at passenger stations. Over the past decade, a series of end-of-track collisions occurred at passenger stations with substantial property damage and casualties. The systemic model developed in this study and the accompanying discussion present policymakers, railway practitioners and academic researchers with a flexible approach for qualitatively assessing railroad safety. Design/methodology/approach To achieve a system-based, micro-level analysis of end-of-track accidents and eventually promote the safety level of passenger stations, the systems-theoretic accident modeling and processes (STAMP) approach, a practical systematic accident model widely used in complex systems, is developed in view of environmental factors, human errors, organizational factors and mechanical failures in this complex socio-technical system. Findings The developed STAMP accident model and analytical results qualitatively provide an explicit understanding of the system hazards, constraints and hierarchical control structure of train operations on terminating tracks in US passenger stations. Furthermore, safety recommendations and practical options related to obstructive sleep apnea screening, positive train control-based collision avoidance mechanisms, robust system safety program plans and bumping posts are proposed and evaluated using the STAMP approach. Originality/value The findings from the STAMP-based analysis can serve as valid references for policymakers, government accident investigators, railway practitioners and academic researchers. Ultimately, they can contribute to establishing effective emergent measures for train operations at passenger stations and promote the level of safety necessary to protect the public. The STAMP approach could be adapted to analyze various other rail safety systems that aim to ultimately improve the safety level of railroad systems.
Article
BACKGROUND/OBJECTIVE: The COVID-19 virus has caused over 582,000 deaths in the United States to date. However, the pandemic has also afflicted the mental health of the population at large in the domains of anxiety and sleep disruption, potentially interfering with cognitive function. From an aviation perspective, safely operating an aircraft requires an airman's cognitive engagement for: 1) situational awareness, 2) spatial orientation, and 3) avionics programming. Since impaired cognitive function could interfere with such tasks, the current study was undertaken to determine if flight safety for a cohort of single-engine, piston-powered light airplanes was adversely affected during a period of the pandemic (March–October 2020) prior to U.S. approval of the first COVID-19 vaccine. METHODS: Airplane accidents were per the National Transportation Safety Board Access database. Fleet times were derived using Automatic Dependent Surveillance-Broadcast. Statistics used Poisson distributions, Chi-squared/Fisher, and Mann-Whitney tests. RESULTS: Little difference in accident rate was evident between the pandemic period (March–October 2020) and the preceding (January–February) months (19 and 22 mishaps/100,000 h, respectively). Similarly, a proportional comparison of accidents occurring in 2020 with those for the corresponding months in 2019 failed to show over-representation of mishaps during the pandemic. Although a trend to a higher injury severity (43% vs. 34% serious/fatal injuries) was evident for pandemic-period mishaps, the proportional difference was not statistically significant when referencing the corresponding months in 2019. CONCLUSION: Surprisingly, using accidents as an outcome, the study herein shows little evidence of diminished flight safety for light aircraft operations during the COVID-19 pandemic. Boyd DD. General aviation flight safety during the COVID-19 pandemic. Aerosp Med Hum Perform. 2021; 92(10):773–779.
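The headline rate comparison can be reproduced arithmetically. In the sketch below, only the 19 and 22 mishaps per 100,000 h come from the abstract; the accident counts and fleet-hour denominators are invented so that the rates match, and a normal approximation to the Poisson rate-ratio test stands in for the exact methods used in the paper.

```python
from math import erf, log, sqrt

def rate_per_100k(accidents: int, hours: float) -> float:
    return accidents / hours * 100_000

def poisson_rate_ratio_test(a: int, ta: float, b: int, tb: float):
    """Normal approximation to a two-sided test of equal Poisson rates."""
    rr = (a / ta) / (b / tb)
    se = sqrt(1 / a + 1 / b)              # SE of the log rate ratio
    z = log(rr) / se
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return rr, z, p

# Invented counts and fleet hours, chosen so the rates match the abstract (19 vs 22).
pandemic = (95, 500_000.0)   # 95 accidents in 500,000 fleet hours -> 19 per 100,000 h
baseline = (66, 300_000.0)   # 66 accidents in 300,000 fleet hours -> 22 per 100,000 h
rr, z, p = poisson_rate_ratio_test(*pandemic, *baseline)
print(f"rates: {rate_per_100k(*pandemic):.0f} vs {rate_per_100k(*baseline):.0f} per 100,000 h")
print(f"rate ratio {rr:.2f}, z = {z:.2f}, p = {p:.2f}")
```

With counts of this size the rate ratio is about 0.86 with p around 0.36, consistent with the abstract's finding of little difference; the actual study also applied Chi-squared/Fisher and Mann-Whitney tests to the underlying data.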
Chapter
Politics has a considerable impact on the aviation industry, which in turn depends on politics itself. This ranges from conventions concerning air traffic rights to regulatory affairs that govern aspects such as safety or market conditions. Beyond the political sphere, technology has provided a major boost to innovations and developments in the aviation industry, thus improving the economic and ecological efficiency of air travel. From an economic perspective, aviation creates considerable economic effects, be it directly at airports or beyond the aviation system. Aviation has also shaped our society by raising living standards and promoting cultural understanding. Nevertheless, the environmental impacts of aviation, such as pollution and noise, cannot be neglected and will remain an important topic for the years to come.
Thesis
Full-text available
Improving risk controls following root cause analysis of serious incidents in healthcare - Mohammad Farhad Peerally. Background Root cause analysis (RCA) is widely used following serious incidents in healthcare, but does not necessarily lead to robust risk controls. This research aimed to examine current practices and to inform an understanding of what good looks like in formulating and implementing risk controls to improve patient safety. Methods First, I undertook a content analysis of 126 RCA reports over a three-year period from an acute NHS trust, with the goals of characterising (i) the contributory factors identified in investigations and (ii) the risk controls proposed in the action plans. Second, I conducted a narrative review of the academic literature on improving risk control practices in safety-critical industries, including but not limited to healthcare. Finally, I undertook a qualitative study involving 52 semi-structured interviews with expert stakeholders in post-incident management, analysed using the framework method. Results Content analysis of serious incident investigation reports identified the preoccupation of RCAs with identifying proximate errors at the sharp end of care, neglecting wider contexts and structures. Most (74%) risk controls proposed could be characterised as weak and were poorly aligned with identified contributory factors. Together, the narrative review and the findings of the interview study suggested eleven features essential to addressing these problems: systems-based investigations; a participatory approach; skilled and independent investigators; clear and shared language; including patients’ views; allocating time and space to risk control formulation; adding structure to risk control formulation; sustainable risk controls mapped to identified problems; purposeful implementation and better tracking of risk controls; a collaborative approach to quality assurance; and improved organisational learning. Discussion and conclusion RCAs as currently conducted, and the action plans that arise from them, are often flawed. The eleven features identified will be important in improving risk control formulation and implementation. To operationalise these features, there is a need for professional and independent investigations, risk controls based on a sound theory of change, and improved cultures and structures for organisational learning.
Article
With a known cause of death (CoD), competing risks survival methods are applicable for estimating disease-specific survival. Relative survival analysis may be used to estimate disease-specific survival when the cause of death is either unknown or subject to misspecification and thus not reliable for practical use. This method is popular in population-based cancer survival studies using registry data and does not require CoD information. The standard estimator is the ratio of all-cause survival in the cancer cohort to the known expected survival from a general reference population. Disease-specific death competes with other causes of mortality, potentially creating dependence among the causes of death. The standard ratio estimate is only valid when death from the disease and death from other causes are independent. To relax the independence assumption, we formulate dependence using a copula-based model. A likelihood-based parametric method is used to fit the distribution of disease-specific death without CoD information, where the copula is assumed known and the distribution of other-cause mortality is derived from the reference population. We propose a sensitivity analysis in which the analysis is conducted across a range of assumed dependence structures. We demonstrate the utility of our method through simulation studies and an application to French breast cancer data.
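As a notational sketch (the symbols below are illustrative and not necessarily those used in the paper), the standard relative survival estimator described here is

    \hat{S}_{\mathrm{rel}}(t) = \frac{\hat{S}_{\mathrm{obs}}(t)}{S_{\mathrm{exp}}(t)},

where \hat{S}_{\mathrm{obs}}(t) is the all-cause survival of the cancer cohort and S_{\mathrm{exp}}(t) is the expected survival of a matched general reference population. The ratio recovers disease-specific survival only when the disease-specific and other-cause failure times T_d and T_o are independent. A copula-based formulation relaxes this assumption by modelling the joint survival as

    \Pr(T_d > t,\; T_o > t) = C_{\theta}\bigl(S_d(t),\, S_o(t)\bigr),

with C_{\theta} an assumed copula; a sensitivity analysis then repeats the fit over a range of copulas or dependence parameters \theta.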
Book
Full-text available
“In what is certain to become an indispensable book on public failures, their origins, and consequences, Wolfgang Seibel builds piece-by-piece a unique contribution to the study of rare events and the search for resilience in public policy. This is a must-read for those who want to better understand such ‘black swan’ events and the search for resilience.” — Andrew B. Whitford, Professor at the School of Public and International Affairs, University of Georgia, USA “A must-read for practitioners and scholars, Wolfgang Seibel’s latest book provides profound insight into the intersection of public administration mismanagement and the absence of responsible leadership. An exceptional contribution to the field.” — Janine O’Flynn, Professor of Public Management, The Australia and New Zealand School of Government, Australia This open access book is about the mismanagement of public agencies as a threat to life and limb. Collapsing bridges and buildings kill people and often leave many more injured. Such disasters do not happen out of the blue, nor are they purely technical in nature, since construction and maintenance are subject to safety regulation and enforcement by governmental agencies. The book analyses four relevant cases from Australia, New Zealand, the USA and Germany. Rather than stressing well-known pathologies of bureaucracy as a potential source of disaster, this book argues that learning for the sake of prevention should aim at neutralizing threats to integrity and strengthening a sense of responsibility among public officials. Wolfgang Seibel is Professor of Politics and Public Administration at the University of Konstanz, and an Adjunct Professor of Public Administration at the Hertie School in Berlin, Germany. His new book is an outcome of the research project “Black Swans in Public Administration: Rare Organizational Failure with Severe Consequences” funded by the German Research Foundation (DFG).
Article
In sectors such as aerospace manufacturing, human errors in the assembly of complex products can negatively impact quality, productivity, and safety. Until now, the analysis of assembly errors has focused more on the immediate human-system interface and less on broader organizational factors. This article presents a case study-based analysis of assembly errors in the aeronautical industry using the systemic methods AcciMap and Systems-Theoretic Accident Model and Processes (STAMP). We seek to provide the company with elements to build a quality improvement strategy that considers human factors and ergonomics from a systemic perspective. The data and information necessary to conduct the analysis came from a project carried out at an aerospace manufacturing facility over a period of 12 months. The team had direct and recurrent access to primary data sources and communication with various stakeholders. A total of 31 influencing factors were identified with AcciMap at different levels within the manufacturing system. STAMP made it possible to model the sociotechnical control structure of the assembly process and identify several control flaws leading to hazards. The analysis shows that systemic methods require a high level of understanding of the manufacturing system and access to relatively high amounts of data and information. Therefore, direct contact with the field and stakeholders is crucial. Training quality specialists on systemic methods could support their use and help to close the gap between theory and practice. Globally, the field of quality in manufacturing could benefit from using systemic methods when deemed necessary.
Article
Human error is a significant factor affecting the quality and safety of industrial production. To better understand the human factors that cause failures in an organization from the perspective of maintenance personnel, we propose an analytical approach combining Fault Tree Analysis with qualitative analysis and apply it to maintenance task failure incidents. The proposed methods are based on human factor classification and the Human Factors Analysis and Classification System (HFACS). We conduct a case study to demonstrate the effectiveness of applying our approach in the Chinese manufacturing industry. The results enabled the disclosure of the “latent factors” behind maintenance incidents, improved human error analysis of such incidents, and helped to clarify the fundamental reasons that affect work reliability and cause maintenance failures. This case study focuses on the impact of critical human factors on organizational effectiveness and operational reliability during maintenance activities.
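For readers unfamiliar with the quantitative side of Fault Tree Analysis, the short Python sketch below illustrates the standard AND/OR gate arithmetic over hypothetical human-factor basic events grouped into HFACS-style levels. The tree structure and probabilities are invented for demonstration and are not taken from the study.

    # Illustrative only: a toy fault-tree calculation over hypothetical
    # human-factor basic events (probabilities per maintenance shift).
    from math import prod

    def and_gate(probs):
        """All inputs must occur (assumes independent events)."""
        return prod(probs)

    def or_gate(probs):
        """At least one input occurs (assumes independent events)."""
        return 1 - prod(1 - p for p in probs)

    unsafe_acts = or_gate([0.02, 0.015])      # skill-based error, violation
    preconditions = or_gate([0.05, 0.03])     # fatigue, poor communication
    unsafe_supervision = or_gate([0.01])      # inadequate oversight
    organisational = or_gate([0.02])          # deficient maintenance planning

    # Toy top event: a maintenance task failure requires an unsafe act
    # together with at least one latent condition at a higher level.
    latent_conditions = or_gate([preconditions, unsafe_supervision, organisational])
    top_event = and_gate([unsafe_acts, latent_conditions])
    print(f"P(maintenance task failure) = {top_event:.4f}")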
Chapter
This chapter deals with quality management in emergency departments and draws on practice-oriented approaches to potential quality improvement and error prevention.
Book
Full-text available
The book WETLANDS: the vital network that connects us presents concepts, ideas, and didactic resources to explain our connection with wetlands and to highlight the need to protect these environments by encouraging their valorization and empowerment based on sustainable use. It is a wide-ranging text written in an accessible, informative style to guide enjoyable learning without abandoning the scientific rigor and up-to-date information that this important topic deserves. It is aimed at students, educators, community leaders, environmentalists and, in general, people who work in wetland science. It is divided into six chapters: Chapter 1 deals with the definition of wetlands, their differences from other aquatic systems, and their notable presence in Venezuela. Chapter 2 highlights the importance of wetlands as the core of evolutionary processes and the production of biodiversity, food, and water. Chapter 3 identifies the main threats facing wetlands. Chapter 4 presents guidelines and ideas for educational topics on wetlands. Chapter 5 summarizes aspects concerning the economic valorization of wetlands, and Chapter 6 highlights cases of citizen empowerment and governance of wetlands.
Article
Recent technological advances have allowed some human activities, including those related to safety, to be automated. However, these advances have increased the complexity of sociotechnical systems, representing an exponential growth in interactions between humans, computers, machines, and the environment. Moreover, growing productive and economic pressures demand safer and more reliable products and projects, at lower cost and in less time than competitors. To cope with this complexity and set of conflicting objectives, STAMP (Systems-Theoretic Accident Model and Processes) emerged as a novel approach to analysing processes and accidents. In this study, an overview of boiler accidents in Brazil is presented and a causal analysis based on STAMP (CAST) is conducted to revisit one of the worst boiler accidents in the Brazilian scenario. Even without direct participation in the original investigations, the analysis reveals additional and more relevant risk factors. Furthermore, it was found that government agencies generally refrain from reviewing their own control actions contributing to the hazard, limiting their potential for improvement. This suggests a need for companies and government agencies to adopt new paradigms of risk and accident analysis and to work together toward a systemic safety improvement approach. Keywords: Accident analysis; Boiler; STAMP; CAST; Systems theory; Labour inspection
Article
Investigations into the causes of maritime incidents and accidents have often identified human error as one of the leading causal factors. Small vessels employed in small-scale fisheries usually have little or no onboard shelter and limited navigation and safety equipment. The operator's effectiveness in performing their duty tasks is therefore critical, and they must be well equipped to succeed. This paper presents a novel generic human factor analysis model proposed for analyzing small fishing vessel operations. Coupled with a Bayesian network, the methodology is tested in a case study focused on a small fishing boat operating in the Atlantic Canada region of Newfoundland and Nova Scotia. The accident occurrence likelihood is estimated, and a sensitivity analysis is also performed on the model. The findings show that the most critical influencing factors for the accident relate to the operator's actions, the natural and technological environment, unsafe management of operations, and factors associated with the vessel itself.
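To make the Bayesian-network step more concrete, the Python sketch below hand-rolls a toy accident node with a small set of hypothetical parent factors and a simple sensitivity analysis by perturbing each prior in turn. The structure, conditional model, and probabilities are invented for illustration and are not taken from the paper.

    # Illustrative only: a toy Bayesian-network style calculation for an
    # accident node with hypothetical parent factors.
    from itertools import product

    # Hypothetical prior probabilities that each contributing factor is adverse
    priors = {
        "operator_action": 0.10,   # unsafe operator action
        "environment": 0.20,       # adverse natural/technological environment
        "management": 0.15,        # unsafe management of operations
        "vessel": 0.05,            # vessel-related deficiency
    }

    def p_accident_given(states):
        """Toy conditional probability: baseline risk plus a fixed increment
        per adverse parent state (capped at 1.0)."""
        return min(1.0, 0.01 + 0.15 * sum(states.values()))

    def marginal_accident_probability(priors):
        """Marginalise over all parent configurations by full enumeration."""
        total = 0.0
        names = list(priors)
        for combo in product([0, 1], repeat=len(names)):
            states = dict(zip(names, combo))
            weight = 1.0
            for name, state in states.items():
                weight *= priors[name] if state else 1 - priors[name]
            total += weight * p_accident_given(states)
        return total

    baseline = marginal_accident_probability(priors)
    print(f"P(accident) = {baseline:.4f}")

    # Simple sensitivity analysis: raise each prior by 10% (relative) in turn
    for name in priors:
        perturbed = dict(priors, **{name: min(1.0, priors[name] * 1.1)})
        delta = marginal_accident_probability(perturbed) - baseline
        print(f"sensitivity of P(accident) to {name}: {delta:+.5f}")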
Article
Introduction: Intraoperative deaths (IODs) are rare but catastrophic. We systematically analyzed IODs to identify clinical and patient safety patterns. Methods: IODs in a large academic center between 2015 and 2019 were included. Perioperative details were systematically reviewed, focusing on (1) identifying phenotypes of IOD, (2) describing emerging themes immediately preceding cardiac arrest, and (3) suggesting interventions to mitigate IOD in each phenotype. Results: Forty-one patients were included. Three IOD phenotypes were identified: trauma (T), nontrauma emergency (NT), and elective (EL) surgery patients, each with 2 sub-phenotypes (e.g., ELm and ELv for elective surgery with medical arrests or vascular injury and bleeding, respectively). In phenotype T, cardiopulmonary resuscitation was initiated before incision in 42%, resuscitative thoracotomy was performed in 33%, and transient return of spontaneous circulation was achieved in 30% of patients. In phenotype NT, ruptured aortic aneurysms accounted for half the cases, and median blood product utilization was 2,694 mL. In phenotype ELm, preoperative evaluation did not include an electrocardiogram in 12%, cardiac consultation in 62%, a stress test in 87%, and a chest x-ray in 37% of patients. In phenotype ELv, 83% had a single peripheral intravenous line, and vascular injury was almost always followed by escalation in monitoring (e.g., central/arterial line), an alert to the blood bank, and a call for surgical backup. Conclusions: We have created a framework for IOD that can help with intraoperative safety and quality analysis. Focusing on interventions that address appropriateness versus futility of care in phenotypes T and NT, and on prevention and mitigation of intraoperative vessel injury (e.g., an intraoperative rescue team) or preoperative optimization in phenotype EL, may help prevent IODs.
Article
The duty to protect patient welfare underpins undergraduate medical ethics and patient safety teaching. The current syllabus for patient safety emphasises the significance of organisational contributions to healthcare failures. However, the ongoing over-reliance on whistleblowing disproportionately emphasises individual contributions, alongside promoting a culture of blame and defensiveness among practitioners. Diane Vaughan’s ‘Normalisation of Deviance’ (NoD) provides a counterpoise to such individualism, describing how signals of potential danger are collectively misinterpreted and incorporated into the accepted margins of safe operation. NoD is an insidious process that often goes unnoticed, thus minimising the efficacy of whistleblowing as a defence against inevitable disaster. In this paper, we illustrate what can be learnt by greater attention to the collective, organisational contributions to healthcare failings by applying NoD to The Morecambe Bay Investigation. By focusing on a cluster of five ‘serious untoward incidents’ occurring in 2008, we describe a cycle of NoD affecting the trust’s handling of events that allowed poor standards of care to persist for several years, before concluding with a poignant example of the limitations of whistleblowing, whereby the raising of concerns by a senior consultant failed to generate a response at trust board level. We suggest that greater space in medical education is needed to develop a thorough understanding of the cultural and organisational processes that underpin healthcare failures, and that medical education would benefit from integrating the teaching of medical ethics and patient safety to resolve the tension between systems approaches to safety and the individualism of whistleblowing.
Article
International oil and gas corporations operating in Brunei may apply process safety management (PSM) and analysis techniques in different ways, resulting in varying approaches and measures to address process safety issues. Global corporations may have developed their own process safety standards, while smaller firms employ established ones. Based on the standards employed, this research compares local PSM systems and standards to international ones and seeks to determine which users face the most hurdles in implementing or improving process safety within their organisations. This study found that OSHA regulations are used by 30% of local users in downstream operations. Common challenges encountered by local users are Management/Leadership Commitment to Process Safety (11.9%), Mechanical Integrity and management of safety-critical devices (5.3%), Management review and intervention for continuous improvement (4.9%), Communication amongst workers (3.8%), Management of change (3.8%), Operational control, permit to work and risk management (3.8%), and Incident reporting (3.8%).