Technische Universität Dortmund
  • Dortmund, North Rhine-Westphalia, Germany
Recent publications
The Large Hadron Collider beauty (LHCb) experiment at CERN is undergoing an upgrade in preparation for the Run 3 data collection period at the Large Hadron Collider (LHC). As part of this upgrade, the trigger is moving to a full software implementation operating at the LHC bunch crossing rate. We present an evaluation of a CPU-based and a GPU-based implementation of the first stage of the high-level trigger. After a detailed comparison, both options are found to be viable. This document summarizes the performance and implementation details of these options, the outcome of which has led to the choice of the GPU-based implementation as the baseline.
Against the background of the digital transformation processes currently changing the world of work, this paper examines general digital competences of beginning trainees in commercial vocational education and training (VET) programs. We are particularly interested in factors influencing digital competence profiles. From survey data on N = 480 trainees in one German federal state, we identify three distinct competence profiles (based on the trainees' self-assessment of their general digital competence). Initial descriptive analysis reveals differences between the competence profiles of different training professions (industrial clerks and retail salespersons reach higher competence levels than salespersons). However, regression results indicate that these differences can be explained by differences in school leaving certificates. Contrary to prior empirical evidence, we find no significant effect of trainees' gender. Finally, the frequency of certain private digital activities (e.g. using office programs, conducting internet searches) affects digital competence profiles. Implications for both VET programs and further research are discussed.
The ATLAS experiment at the Large Hadron Collider has a broad physics programme ranging from precision measurements to direct searches for new particles and new interactions, requiring ever larger and ever more accurate datasets of simulated Monte Carlo events. Detector simulation with Geant4 is accurate but requires significant CPU resources. Over the past decade, ATLAS has developed and utilized tools that replace the most CPU-intensive component of the simulation—the calorimeter shower simulation—with faster simulation methods. Here, AtlFast3, the next generation of high-accuracy fast simulation in ATLAS, is introduced. AtlFast3 combines parameterized approaches with machine-learning techniques and is deployed to meet current and future computing challenges, and simulation needs of the ATLAS experiment. With highly accurate performance and significantly improved modelling of substructure within jets, AtlFast3 can simulate large numbers of events for a wide range of physics processes.
One of the key challenges of uncertainty analysis in model updating is the lack of experimental data. The definition of an appropriate uncertainty quantification metric, capable of extracting as much information as possible from limited and sparse experimental data, is decisive for the outcome of model updating. This work is dedicated to the definition and investigation of a general-purpose uncertainty quantification metric based on sub-interval similarity. The discrepancy between the model prediction and the experimental observation is measured in the form of intervals, instead of the commonly used probabilistic distributions, which require extensive experimental data. An exhaustive definition of the similarity between intervals under different overlapping cases is proposed. A sub-interval strategy is developed to compare similarity considering not only the range of the intervals but, more importantly, the positions of the available observation samples within them. The sub-interval similarity metric is designed to be compatible with different model updating frameworks, e.g. stochastic Bayesian updating and interval model updating. A simulated example employing the widely known 3-dof mass-spring system is presented, performing both stochastic Bayesian updating and interval updating, to demonstrate the universality of the proposed metric. A practical experimental example follows to demonstrate its feasibility in real application cases.
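The abstract does not spell out the metric itself; purely as an illustration, the following Python sketch shows one way an interval similarity with a sub-interval refinement could look. The Jaccard-style overlap measure, the occupancy comparison, and all names are assumptions for this sketch, not the paper's exact formulation.

```python
import numpy as np

def interval_similarity(a, b):
    """Jaccard-style similarity of intervals a = (lo, hi) and b = (lo, hi):
    length of the overlap divided by length of the union (0 if disjoint)."""
    overlap = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    union = max(a[1], b[1]) - min(a[0], b[0])
    return overlap / union if union > 0 else 1.0

def sub_interval_similarity(pred_samples, obs_samples, n_sub=10):
    """Split the joint range into n_sub sub-intervals and compare the
    empirical occupancy of prediction and observation samples, so the
    positions of samples within the intervals matter, not just the ranges."""
    lo = min(pred_samples.min(), obs_samples.min())
    hi = max(pred_samples.max(), obs_samples.max())
    edges = np.linspace(lo, hi, n_sub + 1)
    p, _ = np.histogram(pred_samples, bins=edges)
    o, _ = np.histogram(obs_samples, bins=edges)
    p, o = p / p.sum(), o / o.sum()
    return 1.0 - 0.5 * np.abs(p - o).sum()  # 1 - total variation distance

pred = np.random.default_rng(0).uniform(2.0, 5.0, 200)  # model predictions
obs = np.random.default_rng(1).uniform(2.5, 4.0, 15)    # sparse observations
print(interval_similarity((pred.min(), pred.max()), (obs.min(), obs.max())))
print(sub_interval_similarity(pred, obs))
```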
The ability to infer beliefs and thoughts in interaction partners is essential in social life. However, reasoning about other people’s beliefs might depend on their characteristics or our relationship with them. Recent studies indicated that children’s false-belief attribution was influenced by a protagonist’s age and competence. In the current experiments, we investigated whether group membership influences the way children reason about another person’s beliefs. We hypothesized that 4-year-olds would be less likely to attribute false beliefs to an ingroup member than to an outgroup member. Group membership was manipulated by accent (Experiments 1–3) and gender (Experiment 4). The results indicated that group membership did not consistently influence children’s false-belief attribution. Future research should clarify whether the influence of group membership on false-belief attribution either is absent or depends on other cues that we did not systematically manipulate in our study.
Using data from TIMSS 2015, this study investigated determinants of inequality between classrooms in mathematics performance in Sweden. Applying multiple-group confirmatory factor analysis and measurement invariance frameworks to identify latent constructs with which to build a two-level structural equation model, this study integrated teacher certification, teacher preparedness and school emphasis on academic success into a model of inequality of outcomes and opportunities. The study found evidence that more socioeconomically advantaged classes had better prepared mathematics teachers. School culture towards academic achievement was not associated with mathematics achievement. Finally, the analyses indicated that substantial inequalities exist for students taught by specialist and non-specialist teachers.
We investigated lexical retrieval processes in 4- to 6-year-old German–English bilinguals by exploring cross-language activation during second-language (L2) word recognition of cognates and noncognates in semantically related and unrelated contexts in young learners of English. Both button presses (reaction times and accuracies) and eye-tracking data (percentage looks to target) yielded a significant cognate facilitation effect, indicating that the children’s performance was boosted by cognate words. Nonetheless, the degree of phonological overlap of cognates did not modulate their performance. Moreover, a semantic interference effect was found in the children’s eye movement data. However, in these young L2 learners, cognate status exerted a comparatively stronger impact on L2 word recognition than semantic relatedness. Finally, correlational analyses on the cognate and noncognate performance and the children’s executive function yielded a significant positive correlation between noncognate performance and their inhibitory control, suggesting that noncognate processing depended to a greater extent on inhibitory control than cognate processing.
Non-Newtonian fluids are increasingly employed in various engineering and industrial processes. The Casson fluid model is used to characterize non-Newtonian fluid behavior and is of great importance in polymer processing industries and biomechanics. Motivated by these developments, the present study explores entropy generation for the natural convection of a Casson fluid in a porous, partially heated square enclosure, considering the effects of a horizontal magnetic field, cavity inclination and viscous dissipation. The bottom wall of the enclosure is partially heated, whereas the left and right walls are kept cold at constant temperature. The top wall and the remaining portions of the bottom wall are adiabatic. The dimensionless form of the governing conservation laws is simulated via a higher-order Galerkin finite element method. In particular, the discretization uses biquadratic elements for the velocity and temperature components, whereas discontinuous linear elements are employed for the pressure. The discretized nonlinear systems are handled by an adaptive Newton-multigrid solver. To increase the reliability of the computed results, the solver is validated qualitatively and quantitatively against available numerical and experimental data. The simulated results are analyzed through isotherms, streamlines and two-dimensional plots. Moreover, the average entropy generation, kinetic energy and temperature are computed and analyzed. The controlling parameters are the Hartmann number (Ha = 0–100), Darcy number (Da = 10⁻⁴), Prandtl number (Pr = 7), Rayleigh number (Ra = 10⁵), Casson fluid parameter (γ = 0.1–10), cavity inclination (ϕ = 0°–90°) and Eckert number (Ec = 10⁻⁶–10⁻⁴). It is concluded that the average heat transfer rate decreases whereas the total entropy generation increases with increasing cavity inclination ϕ for each value of Ha. Further, the irreversibilities due to both heat transfer and the magnetic field increase as functions of ϕ.
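For orientation, the controlling parameters above are standard dimensionless groups. A minimal Python sketch of their textbook definitions follows; the property values are purely hypothetical and not taken from the paper.

```python
import numpy as np

# Hypothetical property values, chosen only to illustrate the definitions.
sigma, mu, rho = 2.5, 1.0e-3, 1.0e3   # electric conductivity, viscosity, density
B0, L, K = 0.5, 0.1, 1.0e-6           # magnetic field strength, cavity size, permeability
g, beta, dT = 9.81, 2.1e-4, 10.0      # gravity, thermal expansion coeff., temp. difference
alpha, cp, U = 1.4e-7, 4.2e3, 0.01    # thermal diffusivity, heat capacity, velocity scale

nu = mu / rho                             # kinematic viscosity
Ha = B0 * L * np.sqrt(sigma / mu)         # Hartmann number
Da = K / L**2                             # Darcy number
Pr = nu / alpha                           # Prandtl number
Ra = g * beta * dT * L**3 / (nu * alpha)  # Rayleigh number
Ec = U**2 / (cp * dT)                     # Eckert number
print(f"Ha={Ha:.1f}, Da={Da:.1e}, Pr={Pr:.1f}, Ra={Ra:.2e}, Ec={Ec:.1e}")
```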
Various numerical methods have been extensively studied and used for reliability analysis over the past several decades. However, how to understand the effect of numerical uncertainty (i.e., numerical error due to the discretization of the performance function) on the failure probability remains a challenging issue. The active learning probabilistic integration (ALPI) method offers a principled approach to quantify, propagate and reduce this numerical uncertainty via computation within a Bayesian framework, but it has not been fully investigated in the context of probabilistic reliability analysis. In this study, a novel method termed 'Parallel Adaptive Bayesian Quadrature' (PABQ) is proposed on the theoretical basis of ALPI, aimed at broadening its scope of application. First, the Monte Carlo method used in ALPI is replaced with an importance ball sampling technique so as to reduce the sample size needed for rare failure event estimation. Second, a multi-point selection criterion is proposed to enable parallel distributed processing. Four numerical examples are studied to demonstrate the effectiveness and efficiency of the proposed method. It is shown that PABQ can effectively assess small failure probabilities (e.g., as low as 10⁻⁷) with a minimum number of iterations by taking advantage of parallel computing.
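The paper's importance ball sampling and its exact multi-point criterion are not reproduced here. The sketch below only illustrates the general shape of such a method: an adaptive Gaussian process reliability loop with a batched, parallelizable point selection. The performance function, the U-criterion, and the diversity penalty are illustrative stand-ins.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def g(x):  # hypothetical performance function; failure when g(x) < 0
    return 5.0 - x[:, 0]**2 - 0.5 * x[:, 1]

rng = np.random.default_rng(0)
X = rng.normal(size=(12, 2))        # initial design in standard normal space
y = g(X)
pool = rng.normal(size=(5000, 2))   # candidate pool (plain Monte Carlo here;
                                    # the paper replaces this with importance
                                    # ball sampling for rare events)

for it in range(10):
    gp = GaussianProcessRegressor(RBF(1.0), normalize_y=True).fit(X, y)
    mu, sd = gp.predict(pool, return_std=True)
    u = np.abs(mu) / np.maximum(sd, 1e-12)   # U-criterion: misclassification risk
    batch, score = [], u.copy()
    for _ in range(4):                       # pick 4 points for parallel evaluation
        i = int(np.argmin(score))
        batch.append(i)
        # crude diversity penalty so the batch spreads out (an illustrative
        # stand-in for the paper's multi-point selection criterion)
        score += 10.0 * np.exp(-np.sum((pool - pool[i])**2, axis=1))
    X = np.vstack([X, pool[batch]])
    y = np.concatenate([y, g(pool[batch])])  # these evaluations can run in parallel

pf = float(np.mean(gp.predict(pool) < 0))
print(f"estimated failure probability ≈ {pf:.4f}")
```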
This paper is concerned with approximating the scalar response of a complex computational model subjected to multiple interval input variables. This task is formulated as finding both the global minimum and maximum of a computationally expensive black-box function over a prescribed hyper-rectangle. On this basis, a novel non-intrusive method, called 'triple-engine parallel Bayesian global optimization', is proposed. The method begins by assuming a Gaussian process prior (which can also be interpreted as a surrogate model) over the response function. The main contribution lies in a novel infill sampling criterion, a triple-engine pseudo expected improvement strategy, which identifies multiple promising points for minimization and/or maximization from the past observations at each iteration. These identified points can then be evaluated on the real response function in parallel. A further benefit is that both the lower and upper bounds of the model response are obtained with a single run of the method. Four numerical examples of varying complexity are investigated to benchmark the proposed method against existing techniques, and the results indicate that significant computational savings can be achieved by making full use of prior knowledge and parallel computing.
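As a rough illustration of the two-bound part of the idea (one expected improvement criterion per response bound, evaluated as a batch), the sketch below uses a hypothetical one-dimensional response. The pseudo-EI damping and the third "engine" are only indicated in comments; nothing here is the paper's exact criterion.

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expected_improvement(mu, sd, best, maximize):
    """Closed-form EI of a Gaussian prediction over the current best value."""
    imp = (mu - best) if maximize else (best - mu)
    z = imp / np.maximum(sd, 1e-12)
    return imp * norm.cdf(z) + sd * norm.pdf(z)

def f(x):  # hypothetical expensive black-box response
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 0]

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, size=(6, 1))       # initial design inside the interval box
y = f(X)
cand = np.linspace(-2, 2, 400)[:, None]   # candidate grid over the hyper-rectangle

for it in range(8):
    gp = GaussianProcessRegressor(RBF(0.5), alpha=1e-6, normalize_y=True).fit(X, y)
    mu, sd = gp.predict(cand, return_std=True)
    # one EI "engine" per bound: minimization and maximization
    ei_min = expected_improvement(mu, sd, y.min(), maximize=False)
    ei_max = expected_improvement(mu, sd, y.max(), maximize=True)
    batch = [cand[np.argmax(ei_min)], cand[np.argmax(ei_max)]]
    # a pseudo-EI influence function would damp the criterion near points
    # already in the batch, allowing further batch points to be added here
    X = np.vstack([X] + batch)
    y = np.concatenate([y, f(np.vstack(batch))])  # batch evaluated in parallel

print(f"estimated response bounds ≈ [{y.min():.3f}, {y.max():.3f}]")
```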
Deep drilling with very small diameters is used in applications such as the automotive, aerospace and medical industries. Difficult-to-cut materials in particular require an efficient tool design to enable economical production. Therefore, combined simulation methods are used in this paper to analyze the fluid behavior while taking the transient chip positions into account. To improve tool cooling through a better cutting fluid flow, which also supports chip removal, both the cross-sectional area of the internal cooling channel and the outer and inner cutting edge angles were modified. With the modified model, the cutting fluid velocity increased by 40% and chip evacuation by 60%.
In this paper we derive martingale estimating functions for the dimensionality parameter of a Bessel process based on the eigenfunctions of the diffusion operator. Since a Bessel process is non-ergodic and the theory of martingale estimating functions is developed for ergodic diffusions, we use the space-time transformation of the Bessel process and formulate our results for a modified Bessel process. We deduce consistency and asymptotic normality and discuss optimality. It turns out that the martingale estimating function based on the first eigenfunction of the modified Bessel process coincides with the linear martingale estimating function for the Cox–Ingersoll–Ross process. Furthermore, our results may also be applied to estimating the multiplicity parameter of a one-dimensional Dunkl process and some related polynomial processes.
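For context, eigenfunction-based martingale estimating functions (in the sense of Kessler and Sørensen) take the following general form; the notation is a standard illustration, not necessarily the paper's:

```latex
% General form of an eigenfunction-based martingale estimating function
% for discrete observations X_{t_0}, ..., X_{t_n} with step \Delta:
\begin{align*}
  G_n(\theta)
    = \sum_{i=1}^{n} \alpha\!\left(X_{t_{i-1}};\theta\right)
      \Big[ \varphi\!\left(X_{t_i};\theta\right)
            - e^{-\lambda(\theta)\,\Delta}\,
              \varphi\!\left(X_{t_{i-1}};\theta\right) \Big],
\end{align*}
% where \varphi(\cdot;\theta) is an eigenfunction of the diffusion's
% generator with eigenvalue -\lambda(\theta), so that
% E_\theta[\varphi(X_{t_i};\theta) \mid X_{t_{i-1}}]
%   = e^{-\lambda(\theta)\Delta}\,\varphi(X_{t_{i-1}};\theta),
% making G_n a zero-mean martingale. The estimator solves G_n(\hat\theta) = 0.
```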
In the scientific literature, various temporal resolutions have been used to model electric vehicle charging loads. In most studies, however, the chosen temporal resolution lacks proper justification. To provide a strengthened theoretical background for future studies of electric vehicle charging load modeling, this paper investigates the influence of temporal resolution in different scenarios. To ensure reliable baselines for the comparisons, hardware-in-the-loop simulations with different commercial electric vehicles were carried out, comprising 134 real charging sessions in total. To compare the influence of different temporal resolutions, a simulation model is developed that utilizes comprehensive, measurement-based charging profiles and can model controlled charging in fine detail. The simulation results demonstrate that the model provides sufficiently accurate results in most cases at a temporal resolution of one second. Conversely, a temporal resolution of 3600 s may lead to a modeling error of 50% or higher. Additionally, the paper shows that the resolution necessary to achieve a modeling error of 5% or less varies between 1 s and 900 s depending on the scenario. In most cases, however, a resolution of 60 s is reasonably accurate.
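As a toy illustration of the resolution effect, the sketch below averages a hypothetical 1 s charging profile into coarser windows and reports a simple relative error. Both the profile shape and the error metric are assumptions for this sketch, not the paper's measured data or definitions.

```python
import numpy as np

t = np.arange(0, 3600)  # one hour at 1 s resolution
# Hypothetical 11 kW charging profile: ramp-up, plateau, gentle taper.
p_1s = np.clip(11 * np.minimum(t / 120, 1.0)
               - 0.001 * np.maximum(t - 3000, 0), 0, 11)

def downsample(p, dt):
    """Average the 1 s profile over windows of dt seconds, then hold
    each averaged value constant over its window."""
    n = len(p) // dt
    coarse = p[: n * dt].reshape(n, dt).mean(axis=1)
    return np.repeat(coarse, dt)

for dt in (1, 60, 900, 3600):
    p_dt = downsample(p_1s, dt)
    err = np.abs(p_dt - p_1s[: len(p_dt)]).mean() / p_1s.mean()
    print(f"resolution {dt:>4} s -> mean relative modeling error {100 * err:.1f}%")
```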
Background Targeted therapies for metastatic uveal melanoma have shown limited benefit in biomarker-unselected populations. The Treat20 Plus study prospectively evaluated the feasibility of a precision oncology strategy in routine clinical practice. Patients and methods Fresh biopsies were analyzed by high-throughput genomics (whole-genome, whole-exome, and RNA sequencing). A multidisciplinary molecular and immunologic tumor board (MiTB) made individualized treatment recommendations based on the identified molecular aberrations, patient situation, and drug and clinical trial availability. Therapy selection was at the discretion of the treating physician. The primary endpoint was the feasibility of the precision oncology clinical program. Results Molecular analyses were available for 39/45 patients (87%). The MiTB provided treatment recommendations for 40/45 patients (89%), of whom 27/45 (60%) received ≥1 matched therapy. First-line matched therapies included MEK inhibitors (n = 15), MET inhibitors (n = 10), sorafenib (n = 1), and nivolumab (n = 1). The best response to first-line matched therapy was a partial response in one patient (nivolumab, based on tumor mutational burden), a mixed response in two patients, and stable disease in 12 patients, for a clinical benefit rate of 56%. The matched therapy population had a median progression-free survival and overall survival of 3.3 and 13.9 months, respectively. The growth modulation index with matched therapy was >1.33 in 6/17 patients (35%) with prior systemic therapy, suggesting clinical benefit. Conclusions A precision oncology approach was feasible for patients with metastatic uveal melanoma, with 60% receiving a therapy matched to identified molecular aberrations. The clinical benefit seen after checkpoint inhibition highlights the value of tumor mutational burden testing.
For single-lip drills with small diameters, the cutting fluid is supplied through a kidney-shaped cooling channel inside the tool. In addition to reducing friction, the cutting fluid is also important for dissipating heat at the cutting edge and for chip removal. However, previous investigations of single-lip drills observed that the fluid remains on the back side of the cutting edge, so the cutting edge is insufficiently cooled. In this paper, a simulation-based investigation of an additionally introduced drainage flute and of flank surface modifications is carried out using smoothed particle hydrodynamics as well as computational fluid dynamics. It is found that the additional drainages lead to a slightly changed flow situation, but a significant flow behind the cutting edge and into the drainage flute cannot be achieved, for reasons explained in this paper. Accordingly, not even a much larger drainage flute, with the unwanted side effect of decreased tool strength, is able to achieve a significant improvement of the flow around the cutting edge. Therefore, major changes to the cooling channel, such as the use of two separate channels, the modification of their positions, or modified flank surfaces, are necessary to improve the lubrication of the cutting edge and the heat dissipation.
We investigate the spatio-temporal structure of the most likely configurations realizing extremely high vorticity or strain in the stochastically forced three-dimensional incompressible Navier–Stokes equations. Most likely configurations are computed by numerically finding the highest-probability velocity field realizing an extreme constraint, as the solution of a large optimization problem. High-vorticity configurations are identified as pinched vortex filaments with swirl, while high-strain configurations correspond to counter-rotating vortex rings. We additionally observe that the most likely configurations for vorticity and strain spontaneously break their rotational symmetry for extremely high observable values. Instanton calculus and large deviation theory allow us to show that these maximum likelihood realizations determine the tail probabilities of the observed quantities. In particular, we demonstrate that artificially enforcing rotational symmetry for large strain configurations leads to a severe underestimate of their probability, as it is dominated in likelihood by an exponentially more likely symmetry-broken vortex-sheet configuration. This article is part of the theme issue 'Mathematical problems in physical fluid dynamics (part 2)'.
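Schematically, the large optimization problem behind such most likely configurations is the standard Freidlin–Wentzell action minimization; the notation below is a generic illustration, not the paper's exact setup:

```latex
% Instanton problem: most likely forcing/velocity realizing observable value a.
\begin{align*}
  I(a) \;=\; \min_{u,\,f}\;
     \frac{1}{2}\int_0^T \big\| f(\cdot,t) \big\|_{\chi^{-1}}^2 \,\mathrm{d}t
  \quad \text{s.t.} \quad
  \partial_t u + (u\!\cdot\!\nabla)u = -\nabla p + \nu \Delta u + f,\quad
  \nabla\!\cdot\! u = 0,\quad
  O[u(\cdot,T)] = a,
\end{align*}
% where \chi is the spatial correlation of the stochastic forcing and O the
% observable (e.g. vorticity or strain at a point). Large deviation theory
% then gives the tail scaling P[O \geq a] \asymp \exp(-I(a)/\varepsilon)
% for small noise strength \varepsilon, so the minimizer (the instanton)
% determines the tail probability.
```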
This paper introduces formal monitoring procedures as a risk-management tool. Continuously monitoring risk forecasts allows practitioners to swiftly review and update their forecasting procedures as soon as forecasts become inadequate. Similarly, regulators may take timely action when reported risk forecasts deteriorate. Extant (one-shot) backtests, however, require that all data be available prior to testing and are not informative about when inadequacies occurred. To monitor value-at-risk and expected shortfall forecasts "online", that is, as new observations become available, we construct sequential testing procedures. We derive the exact finite-sample distributions of the proposed procedures and discuss the suitability of asymptotic approximations. Simulations demonstrate the good behavior of our exact procedures in finite samples. An empirical application to major stock indices during the COVID-19 pandemic illustrates the economic benefits of our monitoring approach. This paper was accepted by Agostino Capponi, finance.
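The paper's sequential procedures and their exact finite-sample distributions are not reproduced here. The sketch below only illustrates the generic idea of online monitoring: a one-sided CUSUM over centred value-at-risk violation indicators, with an illustrative (not exact) boundary and simulated losses containing a regime shift.

```python
import numpy as np

rng = np.random.default_rng(3)
alpha = 0.05                 # VaR level: 5% of losses should exceed the forecast
var_forecast = 1.645         # hypothetical static VaR for N(0,1) losses
losses = rng.normal(size=500)
losses[250:] += 0.8          # regime shift: the forecast becomes inadequate

# Under a correct forecast each violation indicator has mean alpha, so the
# one-sided CUSUM of centred indicators drifts around zero; a sustained
# upward drift signals too many violations.
cusum, flagged_at = 0.0, None
for t, loss in enumerate(losses, start=1):
    cusum = max(0.0, cusum + (float(loss > var_forecast) - alpha))
    threshold = 3.0 * np.sqrt(alpha * (1 - alpha) * t)  # illustrative boundary
    if cusum > threshold and flagged_at is None:
        flagged_at = t

print(f"forecasts flagged as inadequate at observation {flagged_at}")
```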
Summary Background During the COVID-19 pandemic (coronavirus disease 2019), resistance to demonstrably effective preventive measures arose repeatedly. According to psychological reactance theory, people experience such "reactance", characterized by anger and negative cognitions, when they perceive a threat to subjectively important freedoms or perceive attempts to change their attitudes or behavior. Objective This article examines the role of defensive processes in the context of the COVID-19 pandemic from the perspective of evidence-based, defense-sensitive risk and crisis communication. After an overview of key triggers and manifestations, options for minimizing defensiveness are discussed. Results Some degree of resistance is always to be expected, but it can be minimized through particular formal and content-related choices in how information is presented. These include, for example, a professional appearance; a respectful, appreciative, and stigma-sensitive basic attitude; positive, self-efficacy-strengthening messaging; and the avoidance of emotionally overwhelming information such as strongly negative emotional appeals or heavy loss framing. Conclusion Actors must be aware that communication can both promote and reduce defense mechanisms. They should know the key triggers and, through consistent, comprehensible, and audience-appropriate communication, help to avoid uncertainty, resistance, and irritation.
Three questions are explored in this paper: (i) To what extent does mining-induced displacement impact livelihood capital? (ii) To what extent does livelihood capital impact livelihood resilience outcomes? (iii) What impact does coping behaviour have on the relationship between livelihood capital and livelihood resilience? A sequential exploratory mixed-methods design is employed to address these questions. The study's first phase comprised 60 interviews and two focus group discussions, while 287 surveys were conducted in the second phase. Our hypothesis that coping behaviour moderates the relationship between livelihood capitals and livelihood resilience is explored with preliminary results from the interviews and focus groups and confirmed with findings from the quantitative study. Based on the study's conceptual model, the results suggest that livelihood capitals positively affect livelihood resilience outcomes, while displacement limits them, with the exception of physical capital. However, the strength of these relationships depends on displaced people's coping behaviour. Finally, the implications of the results for theory and practice are discussed.
7,822 members
Boris Otto
  • Faculty of Mechanical Engineering, Chair of Industrial Information Management
Olga Kunina-Habenicht
  • Psychological Assessment
Peter Padawitz
  • Faculty of Computer Science
Mahadeo R. Halhalli
  • Faculty of Chemistry
Information
Address
August-Schmidt-Straße 4, 44227, Dortmund, North Rhine-Westphalia, Germany
Head of institution
Prof. Dr. Manfred Bayer
Website
www.tu-dortmund.de
Phone
(0231) 755-7550