Climate change has increased the frequency and intensity of natural disasters, making efficient crisis management strategies for population sheltering essential. However, existing research on this topic focuses primarily on public resources such as ambulances and fire trucks, which can be insufficient when demand is high or locations are impacted, worsening resource shortages. This research introduces an ontology-based crisis simulation system for population sheltering management that focuses on integrating citizen-volunteer drivers and vehicles into the evacuation process. Recognizing the limitations of public resources in current crisis management models, our approach incorporates citizen resources to enhance overall evacuation capacity. We develop an ontology to standardize crisis management knowledge, frame vehicle distribution as a recommendation problem, and design a simulation module incorporating a constraint-based recommender system. The proposed scenario illustrates how the simulation system can recommend citizen resources during a crisis while satisfying the relevant constraints. With our system, we aim to help stakeholders prepare for various disaster scenarios by optimizing resource allocation and reducing the time decision-makers need to make decisions.
We present a method to calculate the asymptotic behavior of eigenfunctions of Schrödinger operators that also works at the threshold of the essential spectrum. It can be viewed as a higher-order correction to the well-known WKB method, which requires a safety distance to the essential spectrum. We illustrate its usefulness on examples of quantum particles in a potential well with a long-range repulsive term outside the well.
Mobile communication systems are witnessing an ongoing increase in connected devices and new types of services, which has led to exponential growth in mobile data traffic volume. The dense deployment of small base stations and mobile nodes in traffic hotspots is considered a potential solution for satisfying the emerging requirements of 5G/Beyond-5G wireless networks. However, ultra-densification poses challenges for mobility management, including frequent, unnecessary and ping-pong handovers, along with increased delay and outright failure of the handover process. In this paper, we propose a new handover management approach based on the Software Defined Networking (SDN) paradigm to overcome the performance limitations of handover in dense femtocell environments. Because SDN separates the data plane from the control plane, the handover decision can be made at the SDN controller. In addition, to reduce the complexity and delay of the handover process, a fuzzy logic system is used to decide whether a target candidate is suitable for handover. Simulation results validate the efficiency of our proposal.
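The abstract does not specify the fuzzy system's inputs, membership functions or rule base, so the following is only an illustrative sketch of how such a handover-suitability decision can work: two assumed inputs (candidate-cell signal strength and load) are fuzzified with triangular membership functions, combined by a small Mamdani-style rule base, and defuzzified to a suitability score in [0, 1]. All numeric universes and rules are invented for illustration.

```python
# Illustrative sketch (not the paper's exact design) of a fuzzy handover decision.
# Membership function parameters, the rule base, and variable names are assumptions.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def handover_suitability(rssi_dbm, load_pct):
    # Fuzzify the inputs (assumed universes of discourse).
    rssi_good = tri(rssi_dbm, -90, -60, -30)
    rssi_poor = tri(rssi_dbm, -120, -100, -70)
    load_low  = tri(load_pct, -1, 0, 60)
    load_high = tri(load_pct, 40, 100, 101)

    # Rule base: min for AND, max to aggregate rules with the same consequent.
    suit_high = min(rssi_good, load_low)                    # good signal AND light load
    suit_low  = max(rssi_poor, min(rssi_good, load_high))   # poor signal, or good but loaded

    # Weighted-average defuzzification onto [0, 1].
    if suit_high + suit_low == 0:
        return 0.0
    return (suit_high * 1.0 + suit_low * 0.0) / (suit_high + suit_low)

print(handover_suitability(-55, 20))   # strong signal, light load -> 1.0
print(handover_suitability(-105, 80))  # weak signal, heavy load -> 0.0
```

A threshold on this score (e.g. accept the candidate only if suitability exceeds 0.5) would then filter out targets likely to cause ping-pong or failed handovers.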
Nowadays, information is generated and propagated by many means. In particular, social networks generate many rumors that spread very quickly, so studying this phenomenon is important. Indeed, if the rumor is true, decision makers will foster its propagation, i.e., they will try to minimize the number of people who remain unaware of the rumor. In this work, we consider the case in which the rumor can be spread by two different types of spreaders: those who transmit the rumor believing it is true and those who spread it while claiming it is false. First, we model the propagation of the rumor by a dynamical system. The objective function of the control problem combines three goals: minimizing the number of ignorant individuals, maximizing the number of spreaders and minimizing the propagation costs. We consider three possible combinations of these goals and study the corresponding optimal control models. For each case, we prove the existence and uniqueness of the solution and characterize it using the Pontryagin Minimum Principle. Numerical examples show that the rumor actually propagates better when the optimal control is used. Comparisons of the number of iterations, the value of the objective function and the level of propagation of the rumor make it possible to decide which model is the best option.
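The paper's exact dynamical system is not reproduced in the abstract, so the following is a hypothetical sketch of the kind of model described: an SIR-like system with a fraction of ignorants and two spreader classes (believers and deniers), integrated with forward Euler, where a control u boosts transmission by believing spreaders. All equations and parameter values are assumptions for illustration.

```python
# Hypothetical sketch of controlled rumor dynamics: ignorants i, believing
# spreaders s1, denying spreaders s2 (population fractions). The control u
# models effort fostering propagation; parameters are invented.

def simulate(beta1=0.4, beta2=0.2, delta=0.1, u=0.0, T=100, dt=0.1):
    i, s1, s2 = 0.95, 0.04, 0.01        # initial fractions
    for _ in range(int(T / dt)):
        new1 = (beta1 + u) * i * s1     # ignorants met by believing spreaders
        new2 = beta2 * i * s2           # ignorants met by denying spreaders
        di  = -(new1 + new2)
        ds1 = new1 - delta * s1         # spreaders eventually stop spreading
        ds2 = new2 - delta * s2
        i, s1, s2 = i + dt * di, s1 + dt * ds1, s2 + dt * ds2
    return i                            # final fraction still ignorant

# A positive control leaves fewer individuals unaware of the rumor.
print(simulate(u=0.0) > simulate(u=0.2))  # True
```

In the actual optimal control problem u would be time-varying and chosen via the Pontryagin Minimum Principle; the constant u here only illustrates the direction of its effect on the ignorant population.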
The United Nations’ Sustainable Development Goals (SDGs) provide a framework for building sustainable marketing strategies to improve social and natural environments. Marketing is a key factor for achieving SDG12, “Ensuring sustainable production and consumption patterns,” and a new kind of marketing education is needed to engage and transform students accordingly. In a sustainable marketing class, we asked students to weave a rug from recycled clothes, then estimate its selling price, considering its production and environmental costs. This arts-and-crafts-based pedagogy gave students a deep understanding of the requirements of SDG12. It was engaging for students and highlighted the role marketing can play in raising willingness to pay higher prices for more sustainable products. In sum, this class achieved transformative sustainability learning in line with the Head, Hands, and Heart framework proposed by Sipos et al.
The development of models that can cope with noisy input preferences is a critical topic in artificial intelligence methods for interactive preference elicitation. A Bayesian representation of the uncertainty in the user preference model can be used to successfully handle this, but there are large costs in terms of the processing time required to update the probabilistic model upon receiving the user’s answers, to compute the optimal recommendation and to select the next queries to ask; these costs limit the adoption of these techniques in real-time contexts. A Bayesian approach also requires one to assume a prior distribution over the set of user preference models. In this work, dealing with multi-criteria decision problems, we consider instead a more qualitative approach to preference uncertainty, focusing on the most plausible user preference models, and aim to generate a query strategy that enables us to find an alternative that is optimal in all of the most plausible preference models. We develop a non-Bayesian algorithmic method for recommendation and interactive elicitation that considers a large number of possible user models that are evaluated with respect to their degree of consistency of the input preferences. This suggests methods for generating queries that are reasonably fast to compute. Our test results demonstrate the viability of our approach, including in real-time contexts, with high accuracy in recommending the most preferred alternative for the user.
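The abstract's qualitative approach can be sketched in miniature: sample many candidate user models (here, weight vectors of an additive multi-criteria value function), score each by how many of the user's stated preferences it reproduces, keep the maximally consistent ones as "most plausible", and recommend an alternative only if it is optimal in all of them. The additive value function, the consistency count and all data are assumptions, not the paper's actual formulation.

```python
# Hedged sketch of plausible-model preference elicitation. The choice of an
# additive value function and a simple consistency count are assumptions.
import random

random.seed(0)

alternatives = {            # alternatives scored on 3 criteria in [0, 1]
    "a": (0.9, 0.2, 0.4),
    "b": (0.5, 0.6, 0.5),
    "c": (0.2, 0.9, 0.8),
}
answers = [("b", "a"), ("c", "b")]   # user said: b preferred to a, c to b

def value(alt, w):
    return sum(wi * xi for wi, xi in zip(w, alternatives[alt]))

def consistency(w):
    """Number of observed preference statements the model w reproduces."""
    return sum(value(x, w) > value(y, w) for x, y in answers)

# Sample many models; keep those maximally consistent with the answers.
models = [[random.random() for _ in range(3)] for _ in range(1000)]
best = max(consistency(w) for w in models)
plausible = [w for w in models if consistency(w) == best]

# Recommend an alternative that is optimal in every plausible model.
winners = {max(alternatives, key=lambda a: value(a, w)) for w in plausible}
print(winners)  # {'c'}: optimal in all fully consistent models here
```

If `winners` contained more than one alternative, no single recommendation would be safe, which is exactly the situation in which a further query to the user is worth asking.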
The Dirac–Fock (DF) model replaces the Hartree–Fock (HF) approximation in quantum chemistry when relativistic effects cannot be neglected. Since the Dirac operator is not bounded from below, the notion of ground state is problematic in this model, and several definitions have been proposed in the literature. We give a new definition for the ground state of the DF energy, inspired by Lieb’s relaxed variational principle for HF. Our definition and existence proof are simpler and more natural than in previous works on DF, but remain more technical than in the nonrelativistic case. One first needs to construct a set of physically admissible density matrices that satisfy a certain nonlinear fixed-point equation: we do this by introducing an iterative procedure, described in an abstract context. Then the ground state is found as a minimizer of the DF energy on this set.
Gray’s index, since its introduction by Gray (Gray, S. J. 1980. “The Impact of International Accounting Differences from a Security-Analysis Perspective: Some European Evidence.” Journal of Accounting Research 18 (1): 64–76), has been used to compare accounting variables such as earnings across samples of financial accounting figures, with the aim of indicating the relative degree of accounting conservatism between sets of accounting standards. This note aims to improve the index, especially when the variables under investigation can be negative; in that case, applying the index may hide the proportionality between the two variables. We therefore suggest amendments that better reveal this proportionality and improve the statistical comparison of accounting figures generated under alternative sets of accounting standards.
Digital technologies can augment civic participation by facilitating the expression of detailed political preferences. Yet digital participation efforts often rely on methods optimized for elections involving a few candidates. Here we present data collected in an online experiment in which participants built personalized government programmes by combining policies proposed by the candidates of the 2022 French and Brazilian presidential elections. We use these data to explore aggregates complementing those used in social choice theory, finding that a metric of divisiveness, which is uncorrelated with traditional aggregation functions, can identify polarizing proposals. This metric scores the divisiveness of each proposal, can be estimated in the absence of data on the demographic characteristics of participants, and explains which issues divide a population. These findings suggest that divisiveness metrics can be useful complements to traditional aggregation functions in direct forms of digital participation.
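The claim that divisiveness is uncorrelated with traditional aggregation functions can be illustrated with a toy example. The paper's actual metric compares how subgroups of participants evaluate each proposal; here we use the dispersion of scores as a simple stand-in, and the data are invented, so this is a proxy rather than the published definition.

```python
# Illustrative stand-in for a divisiveness metric: two proposals with the
# same mean support but very different polarization. Data are invented.
from statistics import mean, pstdev

scores = {
    "consensual_proposal": [3, 3, 3, 3, 3, 3],   # everyone mildly in favour
    "divisive_proposal":   [1, 5, 1, 5, 1, 5],   # population split in two camps
}

for name, s in scores.items():
    print(name, "mean:", mean(s), "divisiveness proxy:", pstdev(s))
# Both proposals have mean 3, but only the second one polarizes participants.
```

A mean-based aggregation function ranks these two proposals identically, which is precisely why a separate divisiveness score adds information.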
We suggest an explanation for the existence of “mission drift,” the tendency for Microfinance Institutions (MFIs) to lend money to wealthier borrowers rather than to the very poor. We focus on the relationship between MFIs and external funding institutions. We assume that both the MFIs and the funding institutions are pro‐poor. However, asymmetric information on the effort chosen by the MFI to identify higher‐quality projects may increase the share of loans attributed to wealthier borrowers. This occurs because funding institutions have to build incentives for MFIs, creating a trade‐off between the quality of the funded projects and the attribution of loans to poorer borrowers.
Background Direct, non-vitamin K antagonist oral anticoagulants (DOACs/NOACs) have become the first-choice therapy for stroke prevention in patients with atrial fibrillation (AF) at risk of stroke. Data on long-term effectiveness and safety of NOACs are scarce and not available from randomised clinical trials. Purpose We analysed 4-year outcome data in 13,632 European patients with AF treated with the NOAC, edoxaban. Methods This is the first report of the four-year follow-up information collected in the ETNA-AF-Europe (NCT02944019) study, a prospective registry conducted in 825 centres enrolling edoxaban-treated patients in 10 European countries. Design and follow-up were agreed with the European Medicines Agency as part of the post-approval safety assessment of edoxaban. Key efficacy and safety outcomes were adjudicated. Results 13,164 patients (56.7% males) with AF treated with edoxaban (60 mg OD n=9617; 30 mg OD n=3042; missing dose n=505) were analysed: median (interquartile range [IQR]) age was 75 (68–80) years, weight 80 (70–90) kg, creatinine clearance (CrCl) 68.9 (52.7–87.9) mL/min, and modified CHA2DS2-VASc and HAS-BLED scores 3 (2–4) and 2 (1–2), respectively, in the overall population. Annualised all-cause and cardiovascular deaths in the overall population were 4.1%/year and 1.0%/year, respectively; both were higher in the edoxaban 30 mg vs 60 mg cohort (Figure 1). Annualised rates of stroke, transient ischaemic attack and systemic embolic events were <1% (0.6%/year, 0.3%/year and 0.1%/year, respectively); these rates were similar between the two dose cohorts (Figure 1). The time-to-first-event curves for all-cause death, stroke and major bleeding were almost linear throughout the 4-year follow-up (Figure 2), irrespective of dose cohort.
Compared with patients receiving edoxaban 60 mg, those prescribed 30 mg were older (median [IQR] age: 80 [75–85] vs 73 [66–78] years), included a higher proportion of frail patients (27.0% vs 6.6%), and had a lower CrCl (46.1 [37.4–59.0] vs 75.9 [61.9–93.8] mL/min) and a higher CHA2DS2-VASc score (4 [3–5] vs 3 [2–4]). The annualised bleeding rate was 3.0%/year overall and higher in the 30 mg vs 60 mg cohort (Figure 1). Annualised major bleeding, intracranial haemorrhage (ICH) and major gastrointestinal (GI) bleeding rates were low (0.9%/year, 0.2%/year and 0.4%/year, respectively), with higher major bleeding and major GI bleeding rates in patients treated with edoxaban 30 mg OD (Figure 1). ICH rates were similar in both dose cohorts. Conclusion These findings illustrate the long-term effectiveness and safety of edoxaban. Treatment with edoxaban over 4 years in patients with AF is associated with a relatively low, linearly increasing rate of all-cause death and ischaemic stroke. Bleeding rates, and especially ICH rates, were low.
Background Despite the widespread use of telemonitoring (TLM) for chronic heart failure (CHF) patients, outcome data remain scarce and medico-economic assessment is not routinely addressed. In this setting, the French National Health Data System (SNDS) can be used to comprehensively collect health care costs for the entire French population, regardless of the type of TLM system. Purpose The aim of this study is to assess the health economic impact of telemonitoring among 1,057 heart failure patients matched with a control group selected from the SNDS. Methods All patients telemonitored for at least 3 months between 2017 and 2020 were included. For each patient, identified by his or her unique social security number (case), anonymised data were retrieved from the SNDS. A control group (CG) was randomly selected from an exhaustive SNDS extraction of heart failure patients without TLM. This CG was matched in a 1:3 ratio to TLM patients using a propensity score including baseline characteristics (gender, age group, social disadvantage index, presence of comorbidities (renal insufficiency, coronary disease, diabetes, cancer), and presence of treatments (beta blockers, sacubitril, aldosterone antagonists, aspirin/clopidogrel, anticoagulants)). Costs were evaluated from the French National Health Insurance perspective in 2022 euros. Negative binomial GEE models were used on the matched data sets to assess differences. Results The matching rate between patients included in the TLM arm and the SNDS database was excellent at 99%. Hence, 1,057 TLM patients were matched to 3,171 heart failure patients. In the telemonitoring (TLM) group, patients were 74 years old (±12.3), 68.4% were male, mean left ventricular ejection fraction was 41%, and mean follow-up by the TLM system was 618 days. At 1-year, 2-year and 3-year follow-up, survival rates were 96%, 91% and 87% for TLM patients vs 93%, 83% and 76% for CG patients, respectively (p<0.0001, figure 1).
At 6 months, the monthly cost for managing patients was €1,403.2 in the TLM group vs €1,251.5 in the CG (p=0.0064); at 12 months, €949.52 in the TLM group vs €1,114.40 in the CG (p<0.0001); at 24 months, €591.31 in the TLM group vs €1,064.20 in the CG (p<0.0001). Conclusion While common evaluation methods seem unsatisfactory here, an economic evaluation covering both the costs and the outcomes of an intervention is more appropriate, and the French SNDS is a powerful tool for implementing such a study. The results presented here show, on the one hand, a significant decrease in mortality in the TLM group. On the economic side, costs increase during the first 6 months of telemonitoring but decrease significantly at 12 and 24 months. In particular, these data position telemonitoring of heart failure as a real improvement in the care pathway: it needs time to become effective, but it can have an effect even on the mortality rate.
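The 1:3 propensity-score matching step described above can be sketched as a greedy nearest-neighbour search on the score. The study estimated its score from the baseline characteristics listed in the Methods; here the scores, patient identifiers and the greedy-without-replacement strategy are illustrative assumptions rather than the study's exact algorithm.

```python
# Schematic sketch of 1:3 propensity-score matching (greedy nearest neighbour,
# without replacement). Scores and identifiers are invented for illustration.

def match_1_to_3(treated, controls):
    """treated/controls: lists of (patient_id, propensity_score) pairs.
    Returns a dict mapping each treated id to its 3 matched control ids."""
    pool = dict(controls)                       # controls still available
    matches = {}
    for pid, score in sorted(treated, key=lambda t: t[1]):
        # Take the 3 controls whose scores are closest to this case's score.
        picked = sorted(pool, key=lambda c: abs(pool[c] - score))[:3]
        matches[pid] = picked
        for c in picked:                        # matching without replacement
            del pool[c]
    return matches

treated  = [("T1", 0.62), ("T2", 0.35)]
controls = [("C%d" % k, s) for k, s in enumerate(
    [0.30, 0.33, 0.36, 0.40, 0.58, 0.60, 0.63, 0.70])]
print(match_1_to_3(treated, controls))
```

Real implementations typically also enforce a caliper (a maximum allowed score distance) and may match on the logit of the score; both refinements are omitted here to keep the sketch minimal.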
Vishkin (2022) shows that female participation in chess is lower in more gender equal countries (the “gender-equality paradox”) but that this relation is driven by the mean age of the players in a country, which makes it more of an epiphenomenon than a real paradox. Relying on the same data on competitive chess players (N = 768,480 from 91 countries) as well as on data on 15-year-old students (N = 312,571 from 64 countries), we show that the gender-equality paradox for chess holds among young players. The paradox also remains on the whole population of chess players when controlling for the age of the players at the individual rather than at the country level or when controlling for age differences across countries. Therefore, there is a gender-equality paradox in chess that is not entirely driven by a generational shift mechanism as argued by Vishkin (2022), and previous explanations for the paradox cannot be dismissed.
A substantial number of studies, reports, and policies—often advocated by financial regulators or think tanks—state that long-term investments in the equity market are underweighted compared to investments in the fixed income market, and that portfolio reallocation towards riskier assets would benefit both investors and firms. Can an optimal financial structure be determined ex ante at the macroeconomic level? How could financial innovations and the engineering of structured products contribute to the welfare of the economy? While mainstream financial theories provide some (albeit incomplete) elements of an answer, the Austrian school of economics has not yet developed a comprehensive financial theoretical framework to approach these questions. This article has three main objectives. Firstly, it provides the basis for the development of an authentic Austrian financial theoretical framework, inherited from Austrian capital theory. Secondly, it uses this framework to analyse the economic benefits of financial innovations. Finally, it studies whether there is any theoretical justification and/or empirical evidence for implementing public policies to channel savings from fixed income to equity. The approach followed in this article shares some conclusions with mainstream financial theories, but also presents some key differences. An original feature of this article, from an Austrian perspective, is the integration of an empirical test into the analysis, in the form of a cross-sectional study. This approach may allow mainstream and Austrian economists to mutually enrich and reconcile their theories and methods, in order to reach some consensus concerning different policies and recommendations.