In this note we study the conditions for convergence of the recently introduced dynamic regressor extension and mixing (DREM) parameter estimator when the extended regressor is generated using LTI filters. In particular, we are interested in relating these conditions with those required for convergence of the classical gradient estimator, namely the well-known persistency of excitation (PE) requirement on the original regressor vector, phi in R^q, with q the number of unknown parameters. Moreover, we study the case when only interval excitation (IE) is available, under which DREM, concurrent, and composite learning schemes ensure global convergence, with DREM converging in finite time. Regarding PE we prove, under some mild technical assumptions, that if phi is PE then the scalar regressor of DREM, Delta_N in R, is also PE, ensuring exponential convergence. Concerning IE we prove that if phi is IE then Delta_N is also IE. All these results are established in the almost-sure sense, namely by proving that the set of filter parameters for which the claims do not hold has zero measure. The main technical tool used in our proofs is inspired by a study of Luenberger observers for nonautonomous nonlinear systems recently reported in the literature.
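As a rough numerical companion to this abstract, the sketch below simulates a discrete-time DREM estimator whose extended regressor is generated by two first-order LTI filters. The plant y = phi^T theta, the filter poles, and the adaptation gain are illustrative assumptions, not values from the note.

```python
import numpy as np

# Toy DREM simulation. The regressor extension uses two first-order LTI
# filters; the plant y = phi^T theta, the poles `a`, and the gain `gamma`
# are illustrative choices, not taken from the note.

q = 2
theta = np.array([1.0, -2.0])      # true parameters (unknown to the estimator)
dt, N = 0.01, 20_000
a = [0.9, 0.995]                   # distinct poles of the q extension filters
gamma = 30.0                       # adaptation gain

x_phi = np.zeros((q, q))           # filter states: row i = filter i applied to phi
x_y = np.zeros(q)                  # filter states for the output y
theta_hat = np.zeros(q)

for k in range(N):
    t = k * dt
    phi = np.array([np.sin(t), np.cos(0.5 * t)])   # PE regressor
    y = phi @ theta
    for i in range(q):             # regressor extension: q LTI filters
        x_phi[i] = a[i] * x_phi[i] + (1 - a[i]) * phi
        x_y[i] = a[i] * x_y[i] + (1 - a[i]) * y
    Delta = np.linalg.det(x_phi)   # scalar regressor Delta_N
    if abs(Delta) > 1e-9:
        # Mixing: adj(Phi_e) @ Y_e = Delta * inv(Phi_e) @ Y_e
        Y = Delta * np.linalg.solve(x_phi, x_y)
        # q decoupled scalar gradient estimators driven by Delta
        theta_hat += gamma * dt * Delta * (Y - Delta * theta_hat)
```

Because the mixing step multiplies by the adjugate of the extended regressor matrix, each parameter obeys its own scalar gradient law, so a PE regressor phi translating into a PE scalar Delta_N is exactly what makes theta_hat converge here.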
The Mexican scientific production published in mainstream journals included in the Web of Science (Clarivate Analytics) for the period 1995–2015 is analyzed. To this end, the bibliometric data of the 32 states were organized into five groups according to the following criteria: research production, institutional sectors, and number of research centers. Our findings suggest that there has been an important deconcentration of scientific activities, mainly towards public state universities, as a consequence of various public policies. While institutions located in Mexico City (CX) published 48.1% of the whole Mexican scientific production during this period, there are two other groups of states with rather different productivity: 13 of them published 40% of the output, and the remaining 18 entities published just 11%. Our findings suggest that the highest research performance corresponds to those federal entities where there are branches of higher education institutions located in CX. We also identify the institutional sectors that contribute importantly to the specific research output of each federal entity. The results of this study could be useful to improve science and technology public policies in each state.
Existing theoretical literature on justice, law, and community typically treats them as abstract ideas, studying them through an analytical and rational approach. In this article, I propose to investigate these concepts through aesthetic experience, as an attempt both to sharpen our imagination of such concepts and to demonstrate that they are inseparable. I do this by painstakingly examining the movies Shoplifters by Kore-eda and The House That Jack Built by Von Trier. Rather than focusing on thematic analysis, I claim and show that film form is crucial for aesthetic and affective experience. Furthermore, and against the conventional view, I argue that both movies articulate a spatialized vision of justice defined by its materiality. Together these aspects help us to keep (re)imagining law, justice, and community, and to better grasp their worldmaking properties and powers.
We present a model that explicitly links the epidemiological Ross-Macdonald model with a simple immunological model through a virus inoculation term that depends on the abundance of infected mosquitoes. We explore the relationship between the reproductive numbers at the population (between-host) and individual (within-host) levels, in particular the role that a certain measure of infectivity (defined in terms of the number of target cells infected) and the viral clearance rate play in the coupled dynamics. The conditions for a disease outbreak require the average individual in the population to have an active (within-host) viral infection. This infection depends on the viral load and the proportion of infected cells, quantities that change in time. Only when these two quantities are sufficiently large may an epidemic outbreak occur. We also describe a particular kind of transmission-clearance trade-off for vector-host systems with a simple structure.
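A minimal numerical caricature of this coupling idea can be written as a target-cell-limited within-host model (target cells T, infected cells I, viral load V) forced by an inoculation term proportional to infected-mosquito abundance. All parameter values, and the constant mosquito forcing, are illustrative assumptions, not the paper's model.

```python
# Toy within-host model (T, I, V) with a vector inoculation term a * M_I
# standing in for the infected-mosquito abundance of the coupled model.
# All parameters below are illustrative assumptions.

def simulate(days=30.0, dt=1e-3):
    T, I, V = 1e6, 0.0, 0.0          # target cells, infected cells, viral load
    beta, delta, p, c = 3e-7, 1.0, 100.0, 5.0
    a, M_I = 0.1, 10.0               # inoculation rate times infected mosquitoes
    for _ in range(int(days / dt)):  # forward Euler integration
        dT = -beta * T * V
        dI = beta * T * V - delta * I
        dV = p * I - c * V + a * M_I  # within-host dynamics + vector inoculation
        T += dt * dT
        I += dt * dI
        V += dt * dV
    return T, I, V

T_end, I_end, V_end = simulate()     # after the within-host outbreak has run
```

With these toy parameters the within-host reproductive number beta * T(0) * p / (delta * c) exceeds one, so the continuous seeding from infected mosquitoes triggers an active infection that depletes the target-cell pool, mirroring the abstract's point that an outbreak requires the viral load and infected-cell proportion to become sufficiently large.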
Proteins are macromolecules essential for living organisms. However, to perform their function, proteins must first reach their Native Structure (NS). In nature the NS is reached quickly; in silico, by contrast, it is obtained by solving the Protein Folding Problem (PFP), which currently demands long execution times. Computationally, PFP is NP-hard and is considered one of the biggest current challenges. There are several methods following different strategies for solving PFP. The most successful combine computational methods and biological information: I-TASSER, Rosetta (Robetta server), AlphaFold2 (CASP14 champion), QUARK, PEP-FOLD3, TopModel, and GRSA2-SSP. The first three of these obtained the highest quality at CASP events, and all apply Simulated Annealing or Monte Carlo methods, neural networks, and fragment-assembly methodologies. In the present work, we propose the GRSA2-FCNN methodology, which assembles fragments for peptides and is based on GRSA2 and Convolutional Neural Networks (CNN). We compare GRSA2-FCNN with the best state-of-the-art algorithms for PFP, such as I-TASSER, Rosetta, AlphaFold2, QUARK, PEP-FOLD3, TopModel, and GRSA2-SSP. Applied to a dataset of 60 peptides, our methodology achieves the best performance of all methods tested on the common metrics TM-score, RMSD, and GDT-TS.
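As a small aside on the evaluation metrics, RMSD (one of the three metrics named above) is meaningful only after optimally superposing the predicted and reference structures, typically with the Kabsch algorithm. The sketch below uses synthetic coordinates as stand-ins for C-alpha traces; real use would read coordinates from PDB files.

```python
import numpy as np

# RMSD after optimal superposition via the Kabsch algorithm.
# The coordinates below are synthetic stand-ins, not real structures.

def kabsch_rmsd(P, Q):
    """RMSD between two N x 3 coordinate sets after optimal rotation/translation."""
    P = P - P.mean(axis=0)                     # remove translation
    Q = Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)          # SVD of the covariance matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # avoid improper rotations
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T    # optimal rotation
    return np.sqrt(np.mean(np.sum((P @ R.T - Q) ** 2, axis=1)))

rng = np.random.default_rng(1)
P = rng.standard_normal((12, 3))               # toy "predicted" structure
angle = 0.7
R0 = np.array([[np.cos(angle), -np.sin(angle), 0.0],
               [np.sin(angle),  np.cos(angle), 0.0],
               [0.0,            0.0,           1.0]])
Q = P @ R0.T + np.array([1.0, -2.0, 0.5])      # same structure, rotated + shifted
rmsd = kabsch_rmsd(P, Q)                       # near zero: structures are identical
```

Unlike raw RMSD, TM-score and GDT-TS are length-normalized, which is why papers such as this one report all three.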
CSmoothing allows an analyst to use the so-called Controlled Smoothing technique to estimate trends in a time series framework. In this Web tool (Shiny), the analyst may apply the methodology to at most three mortality time series simultaneously, as well as to other kinds of time series individually. Likewise, this smoothing approach allows the analyst to establish one, two, or three segments in order to take into account possible changes in variance regimes. For estimating trends it uses different amounts of smoothness, both globally for the total data set and through some partial indices for each selected segment. It is also possible to endogenously fix the points where the segments start and end (the cutoff points) with continuous joints. Additionally, intervals of different standard deviations for the respective trends are given. Particular emphasis is placed on a big data set of log mortality rates, log(qx), taken from period life tables of the Human Mortality Database (HMD) (University of California, Berkeley (USA) and Max Planck Institute for Demographic Research (Germany), 2021; www.mortality.org; www.humanmortality.de; data downloaded on 10/10/21). In all cases, dynamic graphs and several statistics related to the Controlled Smoothing technique are illustrated.
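The trend estimation behind controlled-smoothing tools of this kind can be illustrated with a generic Whittaker-Henderson-type penalized smoother. In the sketch below, the penalty weight plays the role that the smoothness index controls in the methodology; the toy log-rate series is an assumption, not HMD data or CSmoothing's API.

```python
import numpy as np

# Generic penalized trend smoother: minimize ||y - g||^2 + lam * ||D2 g||^2,
# where D2 takes second differences. `lam` and the toy series are assumptions.

def smooth_trend(y, lam):
    """Trend minimizing squared fit error plus lam times squared roughness."""
    n = len(y)
    D = np.diff(np.eye(n), n=2, axis=0)       # (n-2) x n second-difference operator
    return np.linalg.solve(np.eye(n) + lam * D.T @ D, y)

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 120)
# Toy noisy log-mortality-like curve (illustrative, not HMD data)
y = np.log(0.02 + 0.5 * t**2) + 0.05 * rng.standard_normal(t.size)
trend = smooth_trend(y, lam=100.0)
```

Controlled Smoothing goes further by fixing a smoothness percentage (globally and per segment) rather than the raw penalty weight, and by allowing segment-specific weights joined continuously at the cutoff points.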
Throughout this century, and with increasing frequency, the National Electoral Institute (INE) has added the production of quick counts to the tasks that accompany the electoral processes in Mexico. Quick counts were initially conceived to produce estimates of the results of the Presidential elections; at present, these statistical exercises are also used to estimate the integration of the federal chamber of deputies as well as the results of the elections of Governors in 31 states of the country and the election of the Mexico City Head of Government. Given the role of the INE as an electoral authority, the levels of demand and quality of the quick counts it organizes are extremely high, and to carry them out the Institute relies on a Committee made up of specialists who are in charge of both the sample design and the inference procedures. In this committee, conventional techniques for analyzing finite samples are combined with more modern procedures that use parametric models and simulation algorithms. For the 2020–2021 electoral process, the committee simultaneously estimated the results of the election of federal deputies and those of fifteen governorships. The concurrency of elections, and the possibility that the planned sample may not be received in its entirety, have revealed the need to process the available information efficiently and to design measures to evaluate the potential biases that may arise when samples are incomplete. In this work, we present some mechanisms developed to make the computations more efficient and faster when the estimation in quick counts is carried out with the model proposed by Mendoza and Nieto (2016).
In Mexican elections, a quick count consists of selecting a random sample of polling stations to forecast the election results. Its main challenge is that the estimation is done with incomplete samples, where the missingness is not at random. We present one of the statistical models used in the quick counts of the gubernatorial elections of 2021. The model is a negative binomial regression with a hierarchical structure. The prior distributions are thoroughly tested for consistency. We also present a fitting procedure with an adjustment for bias, capable of running in less than 5 min. The model yields probability intervals with approximately 95% coverage, even with certain patterns of biased samples observed in previous elections. Furthermore, the robustness of the negative binomial distribution translates into robustness of the model, which fits both large and small candidates well and provides an additional layer of protection when there are database errors.
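For intuition about the estimation task only (not the paper's model), here is a deliberately simplified design-based sketch: a ratio estimator of vote shares from a random sample of polling stations, with percentile-bootstrap intervals. The station counts are simulated; the hierarchical negative binomial regression, the informative missingness, and the bias adjustment are not reproduced.

```python
import numpy as np

# Simplified quick-count illustration: ratio estimation of vote shares from
# a random sample of polling stations. Everything below is synthetic and a
# stand-in for intuition, NOT the hierarchical negative binomial model.

rng = np.random.default_rng(42)
n_stations, n_sample = 5_000, 400
voters = rng.integers(300, 751, size=n_stations)             # nominal list sizes
p_true = np.array([0.45, 0.35, 0.20])                        # latent support
votes = np.vstack([rng.binomial(voters, p) for p in p_true]).T  # station x candidate

sample = rng.choice(n_stations, size=n_sample, replace=False)
est_share = votes[sample].sum(axis=0) / votes[sample].sum()  # ratio estimator

# Percentile bootstrap over the sampled stations.
boots = np.empty((1_000, 3))
for b in range(1_000):
    idx = rng.choice(sample, size=n_sample, replace=True)
    boots[b] = votes[idx].sum(axis=0) / votes[idx].sum()
lo, hi = np.percentile(boots, [2.5, 97.5], axis=0)

pop_share = votes.sum(axis=0) / votes.sum()                  # full-count "truth"
```

The paper's setting is harder than this sketch precisely because arriving stations are not a simple random sample, which is what motivates the model-based approach and its bias adjustment.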
A great deal of research has investigated how various aspects of ethnic identity influence consumer behavior, yet this literature is fragmented. The objective of this paper is to present an integrative theoretical model of how individuals are motivated to think and act in a manner consistent with their salient ethnic identities. The model emerges from a review of social science and consumer research about U.S. Hispanics, but researchers could apply it in its general form and/or adapt it to other populations. Our model extends Oyserman's (2009) identity-based motivation (IBM) model by differentiating between two types of antecedents of ethnic identity salience: longitudinal cultural processes and situational activation by contextual cues, each with different implications for the availability and accessibility of ethnic cultural knowledge. We provide new insights by introducing three ethnic identity motives that are unique to ethnic (non-majority) cultural groups: belonging, distinctiveness, and defense. These three motives are in constant tension with one another, guide longitudinal processes like acculturation, and ultimately influence consumers' procedural readiness and action readiness. Our integrative framework organizes and offers insights into the current body of Hispanic consumer research, and highlights gaps in the literature that present opportunities for future research.
Chili peppers are among the most important vegetables in the world. Demand for this fruit is increasing noticeably and rapidly, mainly because of its nutraceutical composition. These fruits are rich in capsaicinoids, phenolic compounds, carotenoids, and other compounds, including vitamins. In this study, a comparative evaluation of two methods for extracting bioactive compounds from fourteen chili pepper cultivars was performed. Two antioxidant extraction methods, time-solvent and ultrasound, were evaluated. The experiment followed a completely randomized design with three repetitions; the variables evaluated were total phenolic compounds, flavonoid content, antioxidant capacity, and capsaicin. Results showed that phenolic compounds ranged from 48.7 to 634.1 mg GAE/100 g dry weight (DW), flavonoid content from 1 to 97 mg QE/100 g DW, antioxidant activity from 65 to 348 µmol Trolox/g DW, and capsaicin content from 0.3 to 922 mg/100 g DW. Ultrasound extraction yielded the higher values of bioactive compounds for each chili pepper type across all measured variables.
This paper proposes an alternative to the methodology the Ministry of Finance and Public Credit (SHCP) applies to estimate the annual Mexican Crude Oil Mix Export Price (MXM), a crucial element of the General Economic Policy Criteria in the Economic Package. We first identify the relation between the MXM and the West Texas Intermediate (WTI), computing the tail conditional dependence between both series. Subsequently, we use a market risk analysis approach that considers several methodologies to estimate the value at risk (VaR), including an ARIMA-TGARCH model for the innovations of the MXM's price, to forecast its behavior using daily data from January 3rd, 1996, to December 30th, 2021. Once we identify the VaR and the ARIMA-TGARCH components, we design an alternative method to estimate the annual average MXM price.
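To fix ideas on the VaR component only, the sketch below computes 99% historical-simulation and normal (variance-covariance) VaR on a synthetic heavy-tailed return series. The ARIMA-TGARCH dynamics of the paper are not reproduced, so volatility here is constant rather than time-varying and asymmetric in negative shocks.

```python
import numpy as np

# Minimal VaR sketch on simulated heavy-tailed "oil return" data.
# The return process is synthetic (Student-t draws), an assumption made
# only to illustrate the two classic VaR estimators.

rng = np.random.default_rng(7)
returns = 0.02 * rng.standard_t(df=5, size=2_500)   # heavy-tailed daily log-returns

alpha = 0.99
z99 = 2.3263                                        # standard normal 99% quantile
var_hist = -np.quantile(returns, 1 - alpha)         # historical-simulation VaR
var_norm = z99 * returns.std() - returns.mean()     # variance-covariance VaR
```

A TGARCH specification would replace the constant `returns.std()` with a conditional volatility forecast, so the resulting VaR would widen after turbulent days, which is the behavior needed for a price series like the MXM.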
This paper employs newly collected historical data from Finland to present evidence of historically contingent, long-run consequences of a famine. We document high levels of local inequality in terms of income and land distribution until a violent uprising in 1918. These inequalities partly originated in the famine of 1866–1868, which increased the concentration of land and power in the hands of large landowners. We further show that regions with more exposure to the famine exhibited more labor coercion by the early 1900s. These inner tensions led to violent conflict following the Russian Revolution and Finnish independence from the Russian Empire. Using micro-data on all the casualties of the 1918 Finnish Civil War, we demonstrate that the famine plausibly contributed to local insurgency participation through these factors. Although unsuccessful in replacing the government, the insurgency led to significant policy changes, including radical land redistribution and a full extension of the franchise. These national reforms produced a more drastic shift towards equality in locations that were more affected by the famine and had greater pre-conflict inequality. Our findings highlight how historical shocks can have large and long-lasting, but not straightforward, impacts.
We consider planar and spatial autonomous Newtonian systems with Coriolis forces and study the existence of branches of periodic orbits emanating from equilibria. We investigate both degenerate and nondegenerate situations. While Lyapunov's center theorem applies locally in the nondegenerate, nonresonant context, our result provides a global answer which is significant also in some degenerate cases. We apply our abstract results to a problem from Celestial Mechanics. More precisely, in the three-dimensional version of the Restricted Triangular Four-Body Problem with possibly different primaries, our results show the existence of at least seven branches of periodic orbits emanating from the stationary points.
The application of artificial intelligence (AI) to judicial decision-making has already begun in many jurisdictions around the world. While AI seems to promise greater fairness, access to justice, and legal certainty, issues of discrimination and transparency have emerged and put liberal democratic principles under pressure, most notably in the context of bail decisions. Despite this, there has been no systematic analysis of the risks to liberal democratic values from implementing AI into judicial decision-making. This article sets out to fill this void by identifying and engaging with challenges arising from artificial judicial decision-making, focusing on three pillars of liberal democracy, namely equal treatment of citizens, transparency, and judicial independence. Methodologically, the work takes a comparative perspective between human and artificial decision-making, using the former as a normative benchmark to evaluate the latter. The article first argues that AI that would improve on equal treatment of citizens has already been developed, but not yet adopted. Second, while the lack of transparency in AI decision-making poses severe risks which ought to be addressed, AI can also increase the transparency of options and trade-offs that policy makers face when considering the consequences of artificial judicial decision-making. Such transparency of options offers tremendous benefits from a democratic perspective. Third, the overall shift of power from human intuition to advanced AI may threaten judicial independence, and with it the separation of powers. While improvements regarding discrimination and transparency are available or on the horizon, it remains unclear how judicial independence can be protected, especially with the potential development of advanced artificial judicial intelligence (AAJI).
Working out the political and legal infrastructure to reap the fruits of artificial judicial intelligence in a safe and stable manner should become a priority of future research in this area.
On March 26, 2021, the Inter-American Court of Human Rights found Honduras responsible for the killing of Vicky Hernández, a trans woman and human rights defender. The Vicky Hernández et al. v. Honduras judgment is the first in which an international court has protected a trans woman by applying a human rights treaty that protects women. It thus provides an opportunity to analyze the impact of feminist ideas on the system of human rights protection at the regional level, with implications for international law more generally. In this essay, I defend the Inter-American Court's majority decision against the dissenting opinions, by arguing that the political subject of human rights is dynamic and emergent and, therefore, positive law is often one step behind in the struggles for recognition. For this reason, we need interpretations of rights that are inclusive, that evolve, and that push for the destabilization of law as binary, allowing the emergence of a more egalitarian legal system that recognizes intersectionality.
This article argues that to disrupt legal education in a radical sense, students need to become acquainted with the art of worldmaking and the view that law is a “way of worldmaking”. First, I show that law is a cultural semiotic practice that requires decoding and, for that reason, demands a creative intervention by those who want to know, understand, and do things with law. Altogether this amounts to recognizing the different modes in which law creates, and is part of, worlds. Second, I propose that due to different features of their aesthetic form, comics are a particularly effective medium to confront students with the myriad ways in which law and lawyers make and reproduce worlds. Third, I illustrate the argument by exploring how the Saga comic series, through its formal multimodality and narrative and cultural complexity, can make good on that challenge.