Technische Universität Dortmund
  • Dortmund, North Rhine-Westphalia, Germany
Recent publications
The Large Hadron Collider beauty (LHCb) experiment at CERN is undergoing an upgrade in preparation for the Run 3 data collection period at the Large Hadron Collider (LHC). As part of this upgrade, the trigger is moving to a full software implementation operating at the LHC bunch crossing rate. We present an evaluation of a CPU-based and a GPU-based implementation of the first stage of the high-level trigger. After a detailed comparison, both options are found to be viable. This document summarizes the performance and implementation details of these options, the outcome of which has led to the choice of the GPU-based implementation as the baseline.
Against the background of digital transformation processes that are currently changing the world of work, this paper examines general digital competences of beginning trainees in commercial vocational education and training (VET) programs. We are particularly interested in factors influencing digital competence profiles. From survey data including N = 480 trainees in one federal state in Germany, we were able to identify three different competence profiles (based on the trainees’ self-assessment of their general digital competence). Initial descriptive analysis reveals differences between competence profiles of different training professions (industrial clerks and retail salespersons reach higher competence levels than salespersons). However, regression results indicate that these differences can be explained by differences in school leaving certificates. Contrary to prior empirical evidence, we find no significant effect of trainees’ gender. Finally, the frequency of certain private digital activities (e.g. using office programs, conducting internet searches) affects digital competence profiles. Implications for both VET programs and further research are discussed.
The ATLAS experiment at the Large Hadron Collider has a broad physics programme ranging from precision measurements to direct searches for new particles and new interactions, requiring ever larger and ever more accurate datasets of simulated Monte Carlo events. Detector simulation with Geant4 is accurate but requires significant CPU resources. Over the past decade, ATLAS has developed and utilized tools that replace the most CPU-intensive component of the simulation—the calorimeter shower simulation—with faster simulation methods. Here, AtlFast3, the next generation of high-accuracy fast simulation in ATLAS, is introduced. AtlFast3 combines parameterized approaches with machine-learning techniques and is deployed to meet current and future computing challenges, and simulation needs of the ATLAS experiment. With highly accurate performance and significantly improved modelling of substructure within jets, AtlFast3 can simulate large numbers of events for a wide range of physics processes.
One of the key challenges of uncertainty analysis in model updating is the lack of experimental data. The definition of an appropriate uncertainty quantification metric, capable of extracting as much information as possible from limited and sparse experimental data, is decisive for the outcome of model updating. This work is dedicated to the definition and investigation of a general-purpose uncertainty quantification metric based on sub-interval similarity. The discrepancy between the model prediction and the experimental observation is measured in the form of intervals, instead of the common probabilistic distributions, which require elaborate experimental data. An exhaustive definition of the similarity between intervals under different overlapping cases is proposed in this work. A sub-interval strategy is developed to compare similarity considering not only the range of the intervals but, more importantly, the positions of the available observation samples within the intervals. This sub-interval similarity metric is designed to be compatible with different model-updating frameworks, e.g. stochastic Bayesian updating and interval model updating. A simulated example employing the widely known 3-dof mass-spring system is presented to perform both stochastic Bayesian updating and interval updating, demonstrating the universality of the proposed sub-interval similarity metric. A practical experimental example follows to demonstrate the feasibility of the proposed metric in practical application cases.
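The abstract does not reproduce the paper's exact sub-interval similarity definition. As a rough illustration of the underlying idea (compare not only the overall ranges of two sample sets, but where the samples actually fall inside those ranges), here is a minimal sketch; `interval_similarity` and `sub_interval_similarity` are hypothetical names, and the per-case treatment is a simplification of the paper's exhaustive overlap cases:

```python
def interval_similarity(a, b):
    """Overlap-based similarity of two closed intervals a=(lo, hi), b=(lo, hi).

    A Jaccard-style ratio: length of the intersection over the length of the
    enclosing hull. Returns 1.0 for identical intervals, 0.0 for disjoint ones.
    """
    overlap = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    hull = max(a[1], b[1]) - min(a[0], b[0])
    return overlap / hull if hull > 0 else 1.0


def sub_interval_similarity(samples_a, samples_b, n_sub=4):
    """Split the joint sample range into n_sub equal sub-intervals and average
    the per-sub-interval similarity of the sample sub-ranges, so that the
    positions of samples matter, not just the overall range."""
    lo = min(min(samples_a), min(samples_b))
    hi = max(max(samples_a), max(samples_b))
    width = (hi - lo) / n_sub
    sims = []
    for k in range(n_sub):
        left, right = lo + k * width, lo + (k + 1) * width
        in_a = [x for x in samples_a if left <= x <= right]
        in_b = [x for x in samples_b if left <= x <= right]
        if not in_a and not in_b:
            sims.append(1.0)   # both sets empty here: agreement
        elif not in_a or not in_b:
            sims.append(0.0)   # only one set present: disagreement
        else:
            sims.append(interval_similarity((min(in_a), max(in_a)),
                                            (min(in_b), max(in_b))))
    return sum(sims) / n_sub
```

The sub-interval average penalizes sample sets whose overall ranges agree but whose samples cluster in different regions, which is the distinction the range-only comparison misses.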
The ability to infer beliefs and thoughts in interaction partners is essential in social life. However, reasoning about other people’s beliefs might depend on their characteristics or our relationship with them. Recent studies indicated that children’s false-belief attribution was influenced by a protagonist’s age and competence. In the current experiments, we investigated whether group membership influences the way children reason about another person’s beliefs. We hypothesized that 4-year-olds would be less likely to attribute false beliefs to an ingroup member than to an outgroup member. Group membership was manipulated by accent (Experiments 1–3) and gender (Experiment 4). The results indicated that group membership did not consistently influence children’s false-belief attribution. Future research should clarify whether the influence of group membership on false-belief attribution either is absent or depends on other cues that we did not systematically manipulate in our study.
Using data from TIMSS 2015, this study investigated determinants of inequality between classrooms in mathematics performance in Sweden. Applying multiple-group confirmatory factor analysis and measurement invariance frameworks to identify latent constructs with which to build a two-level structural equation model, this study integrated teacher certification, teacher preparedness and school emphasis on academic success into a model of inequality of outcomes and opportunities. The study found evidence that more socioeconomically advantaged classes had better prepared mathematics teachers. School culture towards academic achievement was not associated with mathematics achievement. Finally, the analyses indicated that substantial inequalities exist for students taught by specialist and non-specialist teachers.
We investigated lexical retrieval processes in 4- to 6-year-old German–English bilinguals by exploring cross-language activation during second-language (L2) word recognition of cognates and noncognates in semantically related and unrelated contexts in young learners of English. Both button presses (reaction times and accuracies) and eye-tracking data (percentage looks to target) yielded a significant cognate facilitation effect, indicating that the children’s performance was boosted by cognate words. Nonetheless, the degree of phonological overlap of cognates did not modulate their performance. Moreover, a semantic interference effect was found in the children’s eye movement data. However, in these young L2 learners, cognate status exerted a comparatively stronger impact on L2 word recognition than semantic relatedness. Finally, correlational analyses on the cognate and noncognate performance and the children’s executive function yielded a significant positive correlation between noncognate performance and their inhibitory control, suggesting that noncognate processing depended to a greater extent on inhibitory control than cognate processing.
Non-Newtonian fluids are increasingly employed in various engineering and industrial processes. The Casson fluid model is used to characterize non-Newtonian fluid behavior and is of great importance in polymer processing and biomechanics. Motivated by these developments, the present study explores entropy generation for the natural convection of a Casson fluid in a porous, partially heated square enclosure, considering the effects of a horizontal magnetic field, cavity inclination and viscous dissipation. The bottom wall of the enclosure is partially heated, whereas the left and right walls are kept cold at constant temperature. The top wall and the remaining portions of the bottom wall are adiabatic. The dimensionless form of the governing conservation laws is simulated via a higher-order Galerkin finite element method. In particular, the discretization uses biquadratic elements for the velocity and temperature components, whereas discontinuous linear elements are employed for the pressure. The discretized nonlinear systems are handled by an adaptive Newton-multigrid solver. To increase the reliability of the computed results, the solver is validated qualitatively and quantitatively against available numerical and experimental data. The simulated results are analyzed through isotherms, streamlines and two-dimensional plots. Moreover, the average entropy generation, kinetic energy and temperature are also computed and analyzed. The controlling parameters are the Hartmann number (Ha = 0−100), Darcy number (Da = 10^−4), Prandtl number (Pr = 7), Rayleigh number (Ra = 10^5), Casson fluid parameter (γ = 0.1−10), cavity inclination (ϕ = 0°−90°) and Eckert number (Ec = 10^−6−10^−4). It is concluded that the average heat transfer rate decreases, whereas the total entropy generation increases, with increasing cavity inclination ϕ for each value of Ha. Further, the irreversibilities due to both heat transfer and the magnetic field increase as functions of ϕ.
Various numerical methods have been extensively studied and used for reliability analysis over the past several decades. However, how to understand the effect of numerical uncertainty (i.e., numerical error due to the discretization of the performance function) on the failure probability is still a challenging issue. The active learning probabilistic integration (ALPI) method offers a principled approach to quantify, propagate and reduce the numerical uncertainty via computation within a Bayesian framework, which has not been fully investigated in the context of probabilistic reliability analysis. In this study, a novel method termed 'Parallel Adaptive Bayesian Quadrature' (PABQ) is proposed on the theoretical basis of ALPI, aimed at broadening its scope of application. First, the Monte Carlo method used in ALPI is replaced with an importance ball sampling technique so as to reduce the sample size needed for rare failure event estimation. Second, a multi-point selection criterion is proposed to enable parallel distributed processing. Four numerical examples are studied to demonstrate the effectiveness and efficiency of the proposed method. It is shown that PABQ can effectively assess small failure probabilities (e.g., as low as 10^{-7}) with a minimum number of iterations by taking advantage of parallel computing.
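The paper's importance ball sampling scheme is not specified in the abstract. To make the rare-event setting concrete, here is a generic importance-sampling sketch for a small failure probability, assuming a standard normal input and a hypothetical linear limit state; it is not the PABQ method itself:

```python
import math
import random


def failure_probability_is(g, beta, n=20000, seed=0):
    """Crude importance-sampling estimate of P(g(X) < 0) for X ~ N(0, 1).

    Samples are drawn from the shifted density N(beta, 1), placed near the
    failure region, and reweighted by the likelihood ratio. This is a generic
    illustration, not the paper's importance ball sampling technique.
    """
    rng = random.Random(seed)
    phi = lambda x, mu: math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(beta, 1.0)                  # sample from shifted density
        if g(x) < 0:
            total += phi(x, 0.0) / phi(x, beta)   # likelihood ratio weight
    return total / n


# hypothetical limit state g(x) = beta - x, which fails for x > beta;
# the exact failure probability is Phi(-beta)
beta = 4.0
est = failure_probability_is(lambda x: beta - x, beta)
exact = 0.5 * math.erfc(beta / math.sqrt(2.0))
```

With plain Monte Carlo, estimating a probability near 3e-5 to similar accuracy would need millions of samples; the shifted sampling density makes roughly half the draws hit the failure region, which is the effect the abstract's reduced sample size refers to.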
Studies of facilitators of professional development (PD) for mathematics teachers have been increasing in order to improve their preparation for conducting PD. However, specifications of what facilitators should learn often lack a conceptualization that captures facilitators’ expertise for different PD content. In this article, we provide a framework for facilitator expertise that is in line with current conceptualizations but makes explicit the content-related aspects of such expertise. The framework for content-related facilitator expertise combines cognitive and situated perspectives and allows unpacking different components at the PD level and the classroom level. Using two illustrative cases of different PD content (probability education in primary school and language-responsive mathematics teaching in secondary school), we exemplify how the framework can help to analyze facilitators’ practices in content-related ways in a descriptive mode. This analysis reveals valuable insights that support designers of facilitator preparation programs to specify what facilitators should learn in a prescriptive mode. We particularly emphasize the importance of working on content-related aspects, unpacking the PD content goals into the content knowledge and pedagogical content knowledge elements on the classroom level and developing facilitators’ pedagogical content knowledge on the PD level (PCK-PD), which includes curricular knowledge, as well as knowledge about teachers’ typical thinking about a specific PD content. Situated learning opportunities in facilitator preparation programs can support facilitators to activate these knowledge elements for managing typical situational demands in PD.
This paper is concerned with approximating the scalar response of a complex computational model subjected to multiple input interval variables. Such a task is formulated as finding both the global minimum and maximum of a computationally expensive black-box function over a prescribed hyper-rectangle. On this basis, a novel non-intrusive method, called 'triple-engine parallel Bayesian global optimization', is proposed. The method begins by assuming a Gaussian process prior (which can also be interpreted as a surrogate model) over the response function. The main contribution lies in developing a novel infill sampling criterion, i.e., a triple-engine pseudo expected improvement strategy, to identify multiple promising points for minimization and/or maximization based on the past observations at each iteration. By doing so, these identified points can be evaluated on the real response function in parallel. A further benefit is that both the lower and upper bounds of the model response can be obtained with a single run of the developed method. Four numerical examples with varying complexity are investigated to demonstrate the proposed method against some existing techniques, and results indicate that significant computational savings can be achieved by making full use of prior knowledge and parallel computing.
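The pseudo expected improvement variant used to select several points per iteration is the paper's contribution and is not restated in the abstract; for reference, the standard single-point expected improvement on which such criteria build has a simple closed form, given the Gaussian process posterior mean and standard deviation at a candidate point:

```python
import math


def expected_improvement(mu, sigma, best):
    """Closed-form expected improvement for minimization.

    mu, sigma: GP posterior mean and standard deviation at the candidate point.
    best: current best (lowest) observed value.
    This is the classical single-point EI, not the paper's triple-engine
    pseudo-EI multi-point criterion.
    """
    if sigma <= 0.0:
        return max(best - mu, 0.0)      # no posterior uncertainty left
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))       # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return (best - mu) * cdf + sigma * pdf
```

Maximization is handled symmetrically by negating the response, which is how a single surrogate can serve both bound searches.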
Deep drilling with smallest diameters is used for applications such as automotive, aerospace and medicine. Especially difficult to cut materials require an efficient tool design to realize an economical production. Therefore, combined simulation methods are used in this paper to analyze the fluid behavior under consideration of the transient chip positions. To improve tool cooling by an improved cutting fluid flow, which supports chip removal also, both the cross-sectional area of the internal cooling channel and the outer and inner cutting edge angles were modified. With the modified model, cutting fluid velocity increased by 40% and chip evacuation by 60%.
In this paper we derive martingale estimating functions for the dimensionality parameter of a Bessel process based on the eigenfunctions of the diffusion operator. Since a Bessel process is non-ergodic and the theory of martingale estimating functions is developed for ergodic diffusions, we use the space-time transformation of the Bessel process and formulate our results for a modified Bessel process. We deduce consistency and asymptotic normality, and discuss optimality. It turns out that the martingale estimating function based on the first eigenfunction of the modified Bessel process coincides with the linear martingale estimating function for the Cox-Ingersoll-Ross process. Furthermore, our results may also be applied to estimating the multiplicity parameter of a one-dimensional Dunkl process and some related polynomial processes.
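For context, the Bessel process of dimension δ solves the SDE below, and the space-time transformation mentioned turns it into an ergodic radial Ornstein-Uhlenbeck-type process. This is the standard construction from general diffusion theory, sketched here as background rather than taken from the paper:

```latex
% Bessel process of dimension \delta > 1:
\[
  dX_t \;=\; \frac{\delta - 1}{2\,X_t}\,dt \;+\; dW_t .
\]
% Space-time transformation to an ergodic (radial OU-type) process:
\[
  V_t \;=\; e^{-t}\,X_{(e^{2t}-1)/2},
  \qquad
  dV_t \;=\; \left(\frac{\delta - 1}{2\,V_t} \;-\; V_t\right) dt \;+\; d\widetilde{W}_t ,
\]
% where \widetilde{W} is again a standard Brownian motion.
```

Ergodicity of V is what allows the martingale estimating function machinery, built on the stationary distribution, to apply.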
In the scientific literature, various temporal resolutions have been used to model electric vehicle charging loads. However, in most studies, the temporal resolution used lacks a proper justification. To provide a strengthened theoretical background for future studies of electric vehicle charging load modeling, this paper investigates the influence of temporal resolution in different scenarios. To ensure reliable baselines for the comparisons, hardware-in-the-loop simulations with different commercial electric vehicles are carried out. The conducted hardware-in-the-loop simulations consist of 134 real charging sessions in total. In order to compare the influence of different temporal resolutions, a simulation model is developed. The simulation model utilizes comprehensive preliminary measurement-based charging profiles that can be used to model controlled charging in fine detail. The simulation results demonstrate that the simulation model provides sufficiently accurate results in most cases with a temporal resolution of one second. Conversely, a temporal resolution of 3600 s may lead to a modeling error of 50% or even higher. Additionally, the paper shows that the resolution necessary to achieve a modeling error of 5% or less varies between 1 and 900 s depending on the scenario. However, in most cases, a resolution of 60 s is reasonably accurate.
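The mechanism behind the resolution-dependent error can be sketched in a few lines: averaging a 1-s load profile into coarse blocks flattens short charging peaks. The profile and error metric below are hypothetical illustrations, not the paper's measured sessions or its exact error definition:

```python
def resample_mean(profile, step):
    """Average a 1-s-resolution power profile (list of watts) over blocks of
    `step` seconds, then expand back to 1-s length, mimicking a coarser model."""
    out = []
    for i in range(0, len(profile), step):
        block = profile[i:i + step]
        out.extend([sum(block) / len(block)] * len(block))
    return out


def peak_error_pct(profile, step):
    """Relative error (%) in the modeled peak power caused by using a
    `step`-second resolution instead of 1 s (illustrative metric only)."""
    coarse = resample_mean(profile, step)
    return 100.0 * abs(max(profile) - max(coarse)) / max(profile)


# hypothetical profile: a 90-s, 7-kW charging burst within one otherwise idle hour
profile = [0.0] * 3600
for t in range(600, 690):
    profile[t] = 7000.0
```

For this profile, a 3600-s resolution smears the 7-kW burst over the whole hour and underestimates the peak by well over 50%, consistent with the order of magnitude the abstract reports for hourly resolution.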
Quantum effects in novel functional materials and new device concepts represent a potential breakthrough for the development of new information processing technologies based on quantum phenomena. Among the emerging technologies, memristive elements, whose resistive switching relies on the electrochemical formation and rupture of conductive nanofilaments, exhibit quantum conductance effects at room temperature. Although the underlying resistive switching mechanism has been exploited for the realization of next-generation memories and neuromorphic computing architectures, the potential of quantum effects in memristive devices remains largely unexplored. Here, we present a comprehensive review of memristive quantum devices in which quantum conductance effects can be observed by coupling ionics with electronics. Fundamental electrochemical and physicochemical phenomena underlying device functionalities are introduced, together with fundamentals of ballistic electronic transport in nanofilaments. Quantum conductance effects including quantum mode splitting, stability and random telegraph noise are analyzed, reporting experimental techniques and challenges of nanoscale metrology for the characterization of memristive phenomena. Finally, potential applications and future perspectives are envisioned, including how memristive devices with controllable atomic-sized conductive filaments can represent not only suitable platforms for the investigation of quantum phenomena but also promising building blocks for the realization of integrated quantum systems working in air at room temperature.
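The quantum conductance effects referred to here follow the standard quantization of ballistic transport through an atomic-scale constriction, stated for reference (this is textbook background, not a result of the review):

```latex
% Conductance of a ballistic filament with n open spin-degenerate modes:
\[
  G \;=\; n\,G_0,
  \qquad
  G_0 \;=\; \frac{2e^2}{h} \;\approx\; 77.5\ \mu\mathrm{S}
  \quad\bigl(1/G_0 \approx 12.9\ \mathrm{k}\Omega\bigr).
\]
```

Observing conductance steps at integer (or half-integer) multiples of G_0 during filament formation and rupture is the experimental signature that a nanofilament has reached the atomic-contact regime.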
Background Targeted therapies for metastatic uveal melanoma have shown limited benefit in biomarker-unselected populations. The Treat20 Plus study prospectively evaluated the feasibility of a precision oncology strategy in routine clinical practice. Patients and methods Fresh biopsies were analyzed by high-throughput genomics (whole-genome, whole-exome, and RNA sequencing). A multidisciplinary molecular and immunologic tumor board (MiTB) made individualized treatment recommendations based on identified molecular aberrations, patient situation, drug availability, and clinical trial availability. Therapy selection was at the discretion of the treating physician. The primary endpoint was the feasibility of the precision oncology clinical program. Results Molecular analyses were available for 39/45 patients (87%). The MiTB provided treatment recommendations for 40/45 patients (89%), of whom 27/45 (60%) received ≥1 matched therapy. First-line matched therapies included MEK inhibitors (n = 15), MET inhibitors (n = 10), sorafenib (n = 1), and nivolumab (n = 1). The best response to first-line matched therapy was a partial response in one patient (nivolumab based on tumor mutational burden), a mixed response in two patients, and stable disease in 12 patients, for a clinical benefit rate of 56%. The matched therapy population had a median progression-free survival and overall survival of 3.3 and 13.9 months, respectively. The growth modulation index with matched therapy was >1.33 in 6/17 patients (35%) with prior systemic therapy, suggesting clinical benefit. Conclusions A precision oncology approach was feasible for patients with metastatic uveal melanoma, with 60% receiving a therapy matched to identified molecular aberrations. The clinical benefit after checkpoint inhibitors highlights the value of tumor mutational burden testing.
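The growth modulation index used above is not defined in the abstract; as conventionally defined in precision oncology trials, each patient serves as their own control by comparing progression-free survival on the matched therapy against the preceding line:

```latex
% Growth modulation index (conventional definition):
\[
  \mathrm{GMI} \;=\; \frac{\mathrm{PFS}_{\text{matched therapy}}}{\mathrm{PFS}_{\text{most recent prior therapy}}},
  \qquad
  \mathrm{GMI} > 1.33 \ \text{is the customary threshold for clinical benefit.}
\]
```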
For single-lip drills with small diameters, the cutting fluid is supplied through a kidney-shaped cooling channel inside the tool. In addition to reducing friction, the cutting fluid is also important for the dissipation of heat at the cutting edge and for chip removal. However, in previous investigations of single-lip drills, it was observed that the fluid remains on the back side of the cutting edge, so that the cutting edge is insufficiently cooled. In this paper, a simulation-based investigation of an additionally introduced drainage flute and of flank surface modifications is carried out using smoothed particle hydrodynamics as well as computational fluid dynamics. It is determined that the additionally introduced drainages lead to a slightly changed flow situation, but a significant flow behind the cutting edge and into the drainage flute cannot be achieved, for reasons explained in this paper. Accordingly, not even a much larger drainage flute, with the unwanted side effect of decreased tool strength, is able to achieve a significant improvement of the flow around the cutting edge. Therefore, major changes to the cooling channel, such as the use of two separate channels, the modification of their positions, or modified flank surfaces, are necessary in order to achieve an improvement in lubrication of the cutting edge and heat dissipation.
Residual visual capabilities and the associated phenomenological experience can differ significantly between persons with similar visual acuity and the same diagnosis. There is substantial variance in the situations and tasks that persons with low vision find challenging. Smartglasses provide the opportunity to present individualized visual feedback targeted to each user's requirements. Here, we interviewed nine persons with low vision to obtain insight into their subjective perceptual experience associated with factors such as illumination, color, contrast, and movement, as well as context factors. Further, we contribute a collection of everyday activities that rely on visual perception, as well as strategies participants employ in their everyday lives. We find that our participants rely on their residual vision as the dominant sense in many different everyday activities. They prefer vision to other modalities if they can perceive the information visually, which highlights the need for assistive devices with visual feedback.
This paper presents the main work and results of the completed R&D project LernBAR (Learning Based on Augmented Reality). The project ran from June 2018 to January 2022 and aimed to develop a digital learning assistance environment for people with cognitive disabilities in vocational training in the field of home economics. To this end, an online learning platform with embedded AR learning stations was developed and evaluated with the target group. The paper presents and discusses the qualitative and quantitative results of the project's formative evaluation, as well as the main results of an external evaluation.
7,852 members
Boris Otto
  • Faculty of Mechanical Engineering, Chair of Industrial Information Management
Olga Kunina-Habenicht
  • Psychological Assessment
Peter Padawitz
  • Faculty of Computer Science
Mahadeo R. Halhalli
  • Faculty of Chemistry
August-Schmidt-Straße 4, 44227, Dortmund, North Rhine-Westphalia, Germany
Head of institution
Prof. Dr. Manfred Bayer
(0231) 755-7550