The objective of this study is to estimate the (likely correlated) ligament material properties and attachment sites in a highly non-linear musculoskeletal knee model, based on kinematic data from a knee-rig experiment on seven specific specimens. Bayesian parameter estimation is used to account for uncertainty in the limited experimental data by exploring a high-dimensional input parameter space (50 parameters) consistent with all probable solutions. The set of solutions is restricted to physiologically relevant ligament strains (ϵ < 6%). The transitional Markov chain Monte Carlo algorithm was used, with alterations introduced to avoid premature convergence. To keep the computational cost of the parameter estimation feasible, a surrogate model of the knee model was trained. The results show large intra- and inter-specimen variability in ligament properties, and that multiple sets of ligament properties fit the experimentally measured tibio-femoral kinematics. Although all parameters were allowed to vary substantially, strong interdependence is found only between the reference strain and the attachment sites. The large variation between specimens, and the interdependence between reference strain and attachment sites within a single specimen, demonstrate the inability to identify a narrow range of ligament properties representative of the patient population. To limit ligament-property uncertainty in clinical applications, research will need to invest in establishing patient-specific uncertainty ranges, accurate in vivo measurement methods for the attachment sites and reference strain, and/or alternative (combinations of) movements that would allow a unique solution to be identified.
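For readers unfamiliar with the sampler, the tempering-and-resampling idea behind transitional MCMC can be sketched in a few lines. The toy below uses a one-dimensional Gaussian prior and likelihood (not the 50-parameter knee model), a fixed tempering step instead of the adaptive step size used in practice, and a single Metropolis move per stage; all names and values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prior(x):
    # Standard-normal prior on a scalar parameter (illustrative only).
    return -0.5 * x**2

def log_like(x):
    # Toy likelihood peaked at x = 2, standing in for the kinematic misfit.
    return -0.5 * ((x - 2.0) / 0.5) ** 2

def tmcmc(n=2000):
    """Transitional MCMC: temper the likelihood from beta = 0 to beta = 1."""
    samples = rng.normal(size=n)            # particles drawn from the prior
    beta = 0.0
    while beta < 1.0:
        dbeta = min(0.2, 1.0 - beta)        # fixed step; real TMCMC adapts it
        beta += dbeta
        w = np.exp(dbeta * log_like(samples))
        w /= w.sum()
        # Resample particles according to the incremental weights ...
        samples = rng.choice(samples, size=n, p=w)
        # ... then perturb each one with a single Metropolis step targeting
        # prior(x) * likelihood(x)**beta.
        prop = samples + 0.3 * rng.normal(size=n)
        log_a = (beta * (log_like(prop) - log_like(samples))
                 + log_prior(prop) - log_prior(samples))
        accept = np.log(rng.uniform(size=n)) < log_a
        samples = np.where(accept, prop, samples)
    return samples

post = tmcmc()
print(post.mean())  # analytic posterior mean for this conjugate toy is 1.6
```

The resampling step is what makes premature convergence a practical concern: if the weights degenerate, particle diversity collapses, which is the kind of failure mode the abstract's algorithmic alterations address.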
The Large Hadron Collider beauty (LHCb) experiment at CERN is undergoing an upgrade in preparation for the Run 3 data collection period at the Large Hadron Collider (LHC). As part of this upgrade, the trigger is moving to a full software implementation operating at the LHC bunch crossing rate. We present an evaluation of a CPU-based and a GPU-based implementation of the first stage of the high-level trigger. After a detailed comparison, both options are found to be viable. This document summarizes the performance and implementation details of these options, the outcome of which has led to the choice of the GPU-based implementation as the baseline.
Against the background of digital transformation processes that are currently changing the world of work, this paper examines general digital competences of beginning trainees in commercial vocational education and training (VET) programs. We are particularly interested in factors influencing digital competence profiles. From survey data including N = 480 trainees in one federal state in Germany, we were able to identify three different competence profiles (based on the trainees’ self-assessment of their general digital competence). Initial descriptive analysis reveals differences between competence profiles of different training professions (industrial clerks and retail salespersons reach higher competence levels than salespersons). However, regression results indicate that these differences can be explained by differences in school leaving certificates. Contrary to prior empirical evidence, we find no significant effect of trainees’ gender. Finally, the frequency of certain private digital activities (e.g. using office programs, conducting internet searches) affects digital competence profiles. Implications for both VET programs and further research are discussed.
Research on the Dark Tetrad (Machiavellianism, narcissism, psychopathy, and sadism) is burgeoning. Therefore, psychometrically sound measures are required to ensure that predictions based on these constructs are reliable and valid. A recently devised scale to assess all four traits simultaneously is the Short Dark Tetrad (SD4; Paulhus et al., 2021). Previous studies demonstrated the general suitability of the SD4 in terms of classical test theory, but item response theory (IRT) analyses are still missing; IRT approaches extract more information on the items and scales and test assumptions left untested in classical test theory. Thus, we evaluated the subscales of the SD4 using IRT modeling (N = 594). Unlike the sadism subscale, the Machiavellianism, narcissism, and psychopathy subscales showed satisfactory fit to two-parameter polytomous IRT models and provided ample information across the respective trait continua. The sadism subscale, however, exhibited problems related to item discrimination (i.e., differentiating between individuals with similar trait levels) and item (category transition) difficulties (i.e., quantifying how "difficult" it is to endorse a higher rather than the next lower response category). The results thus emphasize the need for a thorough revision of the items of the sadism scale.
Numerical methods play a dominant role in structural reliability analysis, and the goal has long been to produce a failure probability estimate with a desired level of accuracy using a minimum number of performance function evaluations. In the present study, we offer a Bayesian perspective on the estimation of the failure probability integral, as opposed to the classical frequentist perspective. For this purpose, a Bayesian Failure Probability Inference (BFPI) framework is first developed, which allows one to quantify, propagate and reduce the numerical uncertainty in the failure probability due to discretization error. In particular, the posterior variance of the failure probability is derived in semi-analytical form, and the Gaussianity of the posterior failure probability distribution is investigated numerically. A Parallel Adaptive-Bayesian Failure Probability Learning (PA-BFPL) method is then proposed within the Bayesian framework. In the PA-BFPL method, a variance-amplified importance sampling technique is presented to evaluate the posterior mean and variance of the failure probability, and an adaptive parallel active learning strategy is proposed to identify multiple updating points at each iteration. A novel advantage of PA-BFPL is thus that both prior knowledge and parallel computing can be used to make inferences about the failure probability. Four numerical examples are investigated, indicating the potential benefits of adopting a Bayesian approach to failure probability estimation.
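The core idea of placing a Gaussian-process prior on the performance function and reading off a probabilistic estimate of the failure probability can be illustrated with a minimal numpy sketch. The performance function, design points and input distribution below are toy assumptions, and only a pointwise posterior-mean estimate of P(g ≤ 0) is computed; the paper's semi-analytical posterior variance and variance-amplified importance sampling are not reproduced.

```python
import numpy as np
from math import erf

rng = np.random.default_rng(1)

def g(x):
    # Toy performance function: failure is the event g(x) <= 0.
    return 3.0 - x

def gp_fit(X, y, ls=1.0, jitter=1e-8):
    # Zero-mean GP regression with a unit-variance RBF kernel.
    K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / ls**2)
    Kinv = np.linalg.inv(K + jitter * np.eye(len(X)))

    def predict(Xs):
        Ks = np.exp(-0.5 * (Xs[:, None] - X[None, :]) ** 2 / ls**2)
        mean = Ks @ Kinv @ y
        var = 1.0 - np.einsum('ij,jk,ik->i', Ks, Kinv, Ks)
        return mean, np.sqrt(np.maximum(var, 1e-12))

    return predict

# Small design of experiments -> GP surrogate of g.
X_train = np.linspace(-3.0, 5.0, 8)
predict = gp_fit(X_train, g(X_train))

# Monte Carlo sample of the random input X ~ N(0, 1).
xs = rng.normal(size=100_000)
m, s = predict(xs)

# Pointwise failure probability under the GP posterior, averaged over X:
# a posterior-mean style estimate of Pf = P(g(X) <= 0).
Phi = np.vectorize(lambda z: 0.5 * (1.0 + erf(z / np.sqrt(2.0))))
pf_mean = Phi(-m / s).mean()
print(pf_mean)  # should sit near the exact Pf = Phi(-3), roughly 1.35e-3
```

Because the GP's epistemic uncertainty enters through s, the estimate carries numerical uncertainty from the limited design, which is exactly what a Bayesian treatment lets one quantify and reduce.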
The necessity to obtain, parametrize, and maintain models of the underlying dynamics impedes predictive control of energy systems in many real-world applications. To alleviate the need for explicit model knowledge, this paper proposes a framework for the operation of distributed multi-energy systems via data-driven predictive control under stochastic uncertainties. Instead of modeling the dynamics of the individual distributed energy resources, our approach relies only on measured input–output data of the distributed resources. Moreover, we combine data-driven predictive control with Gaussian-process forecasts of exogenous signals (renewable generation and demand). A simulation study based on realistic data illustrates the efficacy of the proposed scheme in handling mild non-linearities and measurement noise.
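A minimal sketch of the data-driven prediction step underlying such schemes (in the spirit of Willems' fundamental lemma): stack one measured input–output trajectory into Hankel matrices and predict the future output of an unseen trajectory without ever identifying a model. The scalar LTI system, window lengths and noise-free data are illustrative assumptions, not the paper's multi-energy setup.

```python
import numpy as np

rng = np.random.default_rng(2)

# Scalar LTI system, unknown to the "controller"; used only to generate data.
a, b = 0.9, 1.0

def simulate(u, y0=0.0):
    y, ys = y0, []
    for uk in u:
        y = a * y + b * uk
        ys.append(y)
    return np.array(ys)

# One persistently exciting input-output trajectory of length T.
T = 60
ud = rng.uniform(-1.0, 1.0, T)
yd = simulate(ud)

def hankel(w, L):
    # Hankel matrix whose columns are all length-L windows of the signal w.
    return np.column_stack([w[i:i + L] for i in range(len(w) - L + 1)])

Tini, N = 4, 6                      # past window and prediction horizon
Up, Uf = np.vsplit(hankel(ud, Tini + N), [Tini])
Yp, Yf = np.vsplit(hankel(yd, Tini + N), [Tini])

# Behavioural prediction: find any g consistent with the measured past
# (u_ini, y_ini) and the chosen future input, then read off the future output.
u_ini = 0.5 * np.ones(Tini)
y_ini = simulate(u_ini)             # past trajectory starting from rest
u_fut = np.zeros(N)
A = np.vstack([Up, Yp, Uf])
rhs = np.concatenate([u_ini, y_ini, u_fut])
g, *_ = np.linalg.lstsq(A, rhs, rcond=None)
y_pred = Yf @ g

y_true = simulate(np.concatenate([u_ini, u_fut]))[Tini:]
print(np.max(np.abs(y_pred - y_true)))  # near machine precision, noise-free data
```

With measurement noise the exact-consistency solve is replaced by a regularized optimization, which is where the stochastic-uncertainty handling in the abstract comes in.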
This work proposes a methodology for the concurrent homogenization-based optimization of the type and the configuration of the microstructure of the structural domain, based on a list of pre-defined composite microstructures. The candidate microstructures are represented on this list by their homogenized mechanical properties, as predicted by means of 3D homogenization theory. As such, each candidate property is tied to a specific microstructure (the one it was derived from) and is conditioned on its topology, i.e. the geometric configuration of its unit cell. As in the standard discrete multi-material optimization problem (DMOP), weights are assigned to the candidate homogenized properties in order to identify, per element/patch in the structural domain, the optimal microstructure from that list; the final discrete microstructure type optimization problem (DMTOP) is posed with respect to these weights. The topology of the optimal microstructure is determined by the volume constraint(s) imposed on one or more of its material components in the homogenization-based topology optimization problem (HTOP). The aim of this work is to combine the DMTOP with the HTOP in a single mathematical framework in order to determine a unique microstructure type per element/patch in the structural domain, concurrently optimized in its topology. The proposed methodology is built step by step through the introduction of four microstructure types composed of two distinct constituent materials. Upon these auxiliary microstructures, the generalized concurrent homogenization-based topology and discrete microstructure type optimization problem (HTDMOP) is formulated for compliance minimization of the structure. Further, it is illustrated how the DMOP can be perceived as a DMTOP, and hence be combined in a similar fashion with the HTOP for the concurrent homogenization-based topology and material optimization (HTMO) of the structural domain.
The paper concludes by demonstrating the developed methodology on the benchmark academic case study of the 3D Messerschmitt-Bölkow-Blohm (MBB) beam for the case where the auxiliary microstructures are considered as candidates for the domain.
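To make the weighting of candidate homogenized properties concrete, the sketch below interpolates an element stiffness as a penalized, normalized weighted sum of candidate tensors. The power-law penalization, the plane-stress isotropic "homogenized" tensors and all parameter values are simple illustrative assumptions, not the paper's exact DMTOP scheme.

```python
import numpy as np

def dmo_interpolate(weights, C_candidates, p=3):
    """Element stiffness as a penalized, normalized weighted sum of the
    candidate homogenized tensors; penalization pushes weights towards 0/1."""
    w = np.asarray(weights, dtype=float) ** p
    w /= w.sum()
    return sum(wi * Ci for wi, Ci in zip(w, C_candidates))

def iso_C(E, nu=0.3):
    # Toy isotropic plane-stress stiffness, standing in for a homogenized tensor.
    f = E / (1.0 - nu**2)
    return f * np.array([[1.0, nu, 0.0],
                         [nu, 1.0, 0.0],
                         [0.0, 0.0, (1.0 - nu) / 2.0]])

C_soft, C_stiff = iso_C(1.0), iso_C(10.0)
C_mix = dmo_interpolate([0.3, 0.7], [C_soft, C_stiff])
print(C_mix[0, 0])  # dominated by the stiff candidate after penalization
```

The penalization is what drives the optimizer towards selecting a single discrete microstructure type per element/patch rather than an unphysical mixture.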
The rise of the sharing economy represents a disruption for incumbent organizations, institutional regimes, and society at large, causing multiple actor groups to engage in institutional work to create, maintain, and disrupt institutions. While research has provided insights into the ways in which individual actors engage in institutional work, we still lack an understanding of the dynamics between actor groups and the institutional work battles they wage in the context of the emerging sharing economy. Drawing on an in-depth analysis of the discursive institutional work during the rise of Uber and Airbnb, as represented in the news media, we first provide new insights into which actor groups engage in institutional work and which rhetoric they utilize. We inductively distill 15 discursive strategies of institutional work related to seven institutional discourses. Moreover, enfolding extant literature, we derive a taxonomy of institutional work battles and introduce the concept of cross-countering—defined as efforts to weaken, oppose, or nullify opposing discursive institutional work. Altogether, our work provides novel insights into institutional work in the sharing economy, highlights the contested nature of institutional discourses and formalizes when such contestation fosters or hinders institutional change.
One of the key challenges of uncertainty analysis in model updating is the lack of experimental data. The definition of an appropriate uncertainty quantification metric, capable of extracting as much information as possible from limited and sparse experimental data, is significant for the outcome of model updating. This work is dedicated to the definition and investigation of a general-purpose uncertainty quantification metric based on sub-interval similarity. The discrepancy between the model prediction and the experimental observation is measured in the form of intervals, instead of the common probabilistic distributions, which require elaborate experimental data. An exhaustive definition of the similarity between intervals under different overlapping cases is proposed in this work. A sub-interval strategy is developed to compare similarity considering not only the range of the intervals but, more importantly, the distributed positions of the available observation samples within the intervals. This sub-interval similarity metric is compatible with different model updating frameworks, e.g. stochastic Bayesian updating and interval model updating. A simulated example employing the widely known 3-dof mass-spring system is presented to perform both stochastic Bayesian updating and interval updating, demonstrating the universality of the proposed sub-interval similarity metric. A practical experimental example follows to demonstrate the feasibility of the proposed metric in practical application cases.
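As a rough illustration of an overlap-based interval similarity and a sub-interval refinement that is sensitive to sample positions, consider the toy functions below. The specific ratio (overlap length over the smallest covering interval) and the consecutive-sample sub-division are simple assumptions chosen for illustration, not the paper's exhaustive definition.

```python
import numpy as np

def interval_similarity(a, b):
    """Similarity of intervals a = (a1, a2) and b = (b1, b2): overlap length
    divided by the length of the smallest interval covering both, in [0, 1]."""
    overlap = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
    span = max(a[1], b[1]) - min(a[0], b[0])
    return overlap / span if span > 0 else 1.0

def sub_interval_similarity(pred, obs):
    """Compare the sub-intervals between consecutive sorted samples of two
    equally sized data sets, so sample positions inside the overall ranges
    influence the score, not just the range endpoints."""
    p, o = np.sort(pred), np.sort(obs)
    return float(np.mean([interval_similarity((p[i], p[i + 1]),
                                              (o[i], o[i + 1]))
                          for i in range(len(p) - 1)]))

pred = np.array([1.0, 1.5, 2.2, 3.0])   # model predictions
obs = np.array([1.2, 1.8, 2.5, 3.1])    # experimental observations
print(interval_similarity((pred.min(), pred.max()), (obs.min(), obs.max())))
print(sub_interval_similarity(pred, obs))
```

The second score is lower than the first here, reflecting that the interior samples of the two data sets line up less well than their overall ranges do.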
Civic education is generally assumed to play a key role in youth’s political sophistication. It aims to equip young people with the necessary competencies and skills to effectively participate in political and civic life. However, few studies have examined the relative importance of different facets of teaching quality within civic education as well as mediating factors for fostering active citizens. The present study seeks to fill this gap by investigating how different facets of teaching quality are associated with adolescents’ willingness to participate in political and civic life and how this relationship is mediated by political knowledge and interest. The study uses original data from N = 250 students (n = 152 7th graders: Mage = 12.54, SD = 0.91, range = 11–14, 45% female; n = 98 10th graders: Mage = 16.12, SD = 0.97, range = 15–18, 35% female). The findings show that not all teaching quality facets are equally important. While perceived cognitive activation and open classroom climate were positively associated with students’ willingness to participate, a statistically significant association with discussions of current political events in the classroom was not found. In addition, the relationship between perceived cognitive activation and willingness to participate is fully mediated by students’ political knowledge and interest. This study illustrates the relative importance of different teaching quality facets in civic education and calls for continued efforts to better understand teaching quality in civic education.
The ability to infer beliefs and thoughts in interaction partners is essential in social life. However, reasoning about other people’s beliefs might depend on their characteristics or our relationship with them. Recent studies indicated that children’s false-belief attribution was influenced by a protagonist’s age and competence. In the current experiments, we investigated whether group membership influences the way children reason about another person’s beliefs. We hypothesized that 4-year-olds would be less likely to attribute false beliefs to an ingroup member than to an outgroup member. Group membership was manipulated by accent (Experiments 1–3) and gender (Experiment 4). The results indicated that group membership did not consistently influence children’s false-belief attribution. Future research should clarify whether the influence of group membership on false-belief attribution either is absent or depends on other cues that we did not systematically manipulate in our study.
Uncertainties existing in physical and engineering systems can be characterized by different kinds of mathematical models according to their respective features. However, efficient propagation of hybrid uncertainties via an expensive-to-evaluate computer simulator is still a computationally challenging task. In this contribution, estimation of the response expectation function (REF), its variable importance and its bounds under hybrid uncertainties in the form of precise probability models, parameterized probability-box models and interval models is investigated through a Bayesian approach. Specifically, a new method, termed "Parallel Bayesian Quadrature Optimization" (PBQO), is developed. The method starts by treating the REF estimation as a Bayesian probabilistic integration (BPI) problem with a Gaussian process (GP) prior, which in turn implies a GP posterior for the REF. Then, one acquisition function originally developed in BPI and two others from Bayesian global optimization are introduced for Bayesian experimental design. In addition, a strategy is proposed to realize multi-point selection at each iteration. Overall, a novel advantage of PBQO is that it yields the REF, its variable importance and its bounds simultaneously via a pure single-loop procedure that allows for parallel computing. Three numerical examples are studied to demonstrate the performance of the proposed method over some existing methods.
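The BPI starting point, a GP prior on the simulator that implies a posterior for the response expectation, can be sketched as follows. The simulator, kernel, design and the crude per-node uncertainty summary are all illustrative assumptions; the paper's acquisition functions and multi-point selection strategy are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

def f(x):
    # Stand-in for the expensive simulator response.
    return np.sin(x) + 0.5 * x

def kern(a, b, ls=0.8):
    # Unit-variance RBF kernel used as the GP prior covariance.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls**2)

# A few "expensive" design evaluations.
X = np.linspace(-2.0, 2.0, 6)
y = f(X)
Kinv = np.linalg.inv(kern(X, X) + 1e-10 * np.eye(len(X)))

# Monte Carlo nodes for the input distribution X ~ N(0, 1).
xs = rng.normal(size=5000)
Ks = kern(xs, X)
m_post = Ks @ Kinv @ y                                  # GP posterior mean at nodes
v_post = 1.0 - np.einsum('ij,jk,ik->i', Ks, Kinv, Ks)   # pointwise posterior variance

# Posterior summary of the response expectation E[f(X)]: its mean, plus an
# indicative (correlation-ignoring) epistemic uncertainty.
ref_mean = m_post.mean()
ref_sd = np.sqrt(np.maximum(v_post, 0.0)).mean()
print(ref_mean, ref_sd)  # E[sin(X) + 0.5 X] = 0 exactly for X ~ N(0, 1)
```

In a full BPI treatment the integral of the GP posterior is itself Gaussian with an analytically available covariance; the pointwise summary above merely indicates where new design points would reduce the epistemic uncertainty.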
Natural products have greatly influenced the development of drugs to combat infectious diseases, cancer, and other disorders affecting human well-being. Only rarely is a natural product used in unmodified form for therapeutic purposes. More often, natural product derivatives are preferred due to improved activity or toxicity profiles. These compounds are usually produced using ‘hybrid’ processes that integrate organic synthesis and biosynthesis: either a natural product is isolated from a biological source and then converted into the final drug by semisynthesis, or a synthetically prepared precursor is introduced into the engineered biosynthesis of a living cell in a procedure called mutasynthesis. In this review, we present recent developments in these two research areas, which take advantage of heterologous biosynthesis.
Electrochemical impedance spectroscopy is widely used to analyze electrochemical systems such as batteries and fuel cells. Most commercial impedance spectroscopes are, however, limited to investigating devices under test at laboratory scale. In this paper, we demonstrate a setup, built entirely from standard measurement equipment, that performs impedance spectroscopy on commercial lithium-ion battery cells and complete packs up to 42 V in a 10S4P configuration. Galvanostatic AC excitation of up to 5 A peak-to-peak can be reached. The obtained impedance spectra qualitatively match theoretical expectations and reference results from the literature. Acquired results can be successfully fitted to a second-order Thevenin equivalent circuit model without requiring specialized analysis software; OriginPro is used as an example for the presented setup. A sensitivity towards external factors such as temperature and probe contacting similar to that of commercial devices is observed. The described setup can be reproduced in most electrical laboratories with similar equipment, and the application scenario is not limited to batteries. This gives access to a powerful analysis tool that complements commercial impedance spectroscopy devices by extending measurement ranges towards higher voltage and excitation current as well as lower impedance.
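The equivalent-circuit fit can be reproduced with generic least-squares tools, e.g. with scipy instead of OriginPro. The sketch below fits a second-order Thevenin model (a series resistance plus two parallel RC branches) to a synthetic, noise-free spectrum; all parameter values are illustrative assumptions, not measured data.

```python
import numpy as np
from scipy.optimize import least_squares

def z_thevenin(p, w):
    # Second-order Thevenin model: series resistance + two parallel RC branches.
    r0, r1, c1, r2, c2 = p
    return (r0
            + r1 / (1.0 + 1j * w * r1 * c1)
            + r2 / (1.0 + 1j * w * r2 * c2))

# Synthetic "measured" spectrum; parameter values are illustrative only.
w = 2.0 * np.pi * np.logspace(-2, 3, 60)        # 10 mHz .. 1 kHz
p_true = [0.020, 0.015, 2.0, 0.010, 120.0]      # ohms and farads
z_meas = z_thevenin(p_true, w)

def residuals(p):
    # Fit real and imaginary parts of the impedance jointly.
    d = z_thevenin(p, w) - z_meas
    return np.concatenate([d.real, d.imag])

fit = least_squares(residuals, x0=[0.01, 0.01, 1.0, 0.02, 100.0],
                    bounds=(1e-6, np.inf), x_scale='jac')
print(np.round(fit.x, 4))
```

Fitting real and imaginary parts jointly is the standard way to use a complex-valued model with a real-valued least-squares routine; on real measurements the residuals would additionally carry noise and possibly frequency-dependent weighting.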
Small molecules targeting the ubiquitous latent ribonuclease (RNase L), which has limited sequence specificity toward single-stranded RNA substrates, hold great potential to be developed as broad-spectrum antiviral drugs by modulating the RNase L-mediated innate immune responses. The recent development of proximity-inducing bifunctional molecules, as described in the strategy of ribonuclease targeting chimeras, demonstrated that small-molecule RNase L activators can function as the essential RNase L-recruiting component to design bifunctional molecules for targeted RNA degradation. However, only a single screening study on small-molecule RNase L activators with poor potency has been reported to date. Herein, we established a FRET assay and conducted a screening of 240,000 small molecules to identify new RNase L activators with improved potency. The extremely low hit rate of less than 0.03% demonstrated the challenging nature of RNase L activation by small molecules available from current screening collections. A few hit compounds induced enhanced thermal stability of RNase L upon binding, although validation assays did not lead to the identification of compounds with significantly improved RNase L activating potency. The sulfonamide compound 17 induced a thermal shift of ~ 0.9 °C upon binding to RNase L, induced significant apoptosis in cancer cells, and showed single-digit micromolar inhibitory activity against cancer cell proliferation. This study paves the way for future structural optimization for the development of small-molecule RNase L binders.
Site-specific heterogeneity of solid protein samples can be exploited as valuable information to answer biological questions ranging from thermodynamic properties determining fibril formation to protein folding and conformational stability upon stress. In particular, for proteins of increasing molecular weight, however, site-resolved assessment without residue-specific labeling is challenging using established methodology, which tends to rely on carbon-detected 2D correlations. Here we develop purely chemical-shift-based approaches for assessment of relative conformational heterogeneity that allow identification of each residue via four chemical-shift dimensions. High dimensionality diminishes the probability of peak overlap in the presence of multiple, heterogeneously broadened resonances. Utilizing backbone dihedral-angle reconstruction from individual contributions to the peak shape, either via suitably adapted prediction routines or direct association with a relational database, the methods may in future studies afford assessment of site-specific heterogeneity of proteins without site-specific labeling.
The COVID-19 pandemic poses specific risks to vulnerable population groups. Informal carers for older adults are especially at risk of increased strain, as support from social networks and professional care services is no longer available or in short supply. Already before the pandemic, caring was unequally distributed within societies, with women and people in lower socio-economic status groups bearing a higher risk of caring strain. In this article, we propose a conceptual framework of (unequal) caring strain during the pandemic. We then summarise the state of empirical research, suggest questions for future studies and outline implications for social policy.
Professional development (PD) courses are the main context for mathematics teachers’ lifelong learning. Leaders with expertise are needed to facilitate these courses; thus, there is a growing interest in understanding the nature of this profession, its core practices, and the challenges it entails. This paper focuses on a specific group of facilitators: experienced mathematics teachers who have just begun facilitating PD courses in addition to their classroom teaching. To better understand these novice facilitators’ practices, a commognitive approach was implemented to examine how they draw on their mathematics teaching experience when leading PD courses. In commognitive terms, these practitioners draw on multiple professional discourses, but mostly on the discourses of teaching and facilitation. The analysis of the challenges and affordances associated with participating in these two professional discourses showed that novice facilitators bring into play their teaching practices in four distinct ways: enacting a familiar practice, negating a familiar practice, questioning the relevance of a familiar practice, and generating a new practice based on their teaching experience. We claim that novice facilitators’ well-established identity as teachers is both a challenge and an asset in grounding successful facilitation practices. Overall, facilitators modify their teaching experience through the adoption, adaptation, and retraction of their teaching practices. Implications for the preparation and support of facilitators, within processes of upscaling PD programs, are discussed.
The field of anion recognition chemistry is dominated by two fundamental approaches to receptor design. One relies on the formation of covalent bonds, resulting in organic and often neutral host species, while the other utilizes metal-driven self-assembly to form charged receptors with well-defined nanocavities. Yet, the combination of their individual advantages in the form of charge-neutral, metal-assembled, bench-stable anion receptors is severely lacking. Herein, we present a fluorescent and uncharged double-stranded hydroxyquinoline-based zinc(II) helicate with the ability to bind environmentally relevant dicarboxylate anions with high fidelity in dimethyl sulfoxide (DMSO) at nanomolar concentrations. These dianions are pinned between the zinc(II) centers with binding constants of up to 1.45 × 10⁸ M⁻¹. The presented investigation exemplifies a pathway to bridge the two design approaches and combine their strengths in one structural motif as an efficient anion receptor.