Published by Wiley
Online ISSN: 1099-0526 · Print ISSN: 1076-2787
Disciplines: Nonlinear and complex systems
Performance Evaluation of Control Strategies for Autonomous Quadrotors: A Review (November 2024) · 104 Reads · 104 reads in the past 30 days
Chaotic Image Encryption Scheme Based on Improved Z-Order Curve, Modified Josephus Problem, and RNA Operations: An Experimental Li-Fi Approach (November 2024) · 64 Reads · 64 reads in the past 30 days
Compliance Risk Assessment in the Banking Sector: Application of a Novel Pairwise Comparison-Based PRISM Method (May 2023) · 672 Reads · 3 Citations · 53 reads in the past 30 days
An Alternative Statistical Model to Analysis Pearl Millet (Bajra) Yield in Province Punjab and Pakistan (April 2023) · 402 Reads · 2 Citations · 52 reads in the past 30 days
Ethiopian Consumer’s Behavior towards Purchasing Locally Produced Apparel Products: An Extended Model of the Theory of Planned Behavior (March 2024) · 239 Reads · 39 reads in the past 30 days
Complexity is an open access journal publishing original research and review articles across a broad range of disciplines with the purpose of reporting important advances in the scientific study of complex systems.
As part of Wiley’s Forward Series, this journal offers a streamlined, faster publication experience with a strong emphasis on integrity. Authors receive practical support to maximize the reach and discoverability of their work.
November 2024 · 1 Read
Through theoretical analysis and empirical testing, the following conclusions are drawn. First, the baseline regressions show that transfer payments (TPs) significantly promote regional total factor productivity (TFP). Second, the conclusions remain valid after using the system GMM model and the instrumental variable method to address endogeneity. Third, heterogeneity analysis shows that TPs have a greater impact on TFP in the central and western regions, and that both general TPs and special TPs significantly promote regional TFP. Fourth, mechanism analysis shows that TPs promote regional TFP through two channels: narrowing the financial gap between local governments and reducing the vertical fiscal imbalance between the central and local governments. Because a balanced distribution of financial resources enables the effective supply of public goods, all regions gain the conditions to optimize the allocation of resources and improve the efficiency of resource use. Accordingly, the TP system should be maintained, and its allocation rules should be optimized.
November 2024 · 25 Reads
Conventional synergy theory explains the inhibitory effects of drug combinations at specific times, and determining the magnitude of inhibition is crucial for exploring the synergy effect. In previous studies, Chou–Talalay multiple drug effect analysis demonstrated that the combination of a mutant oncolytic herpes virus (G207) and the chemotherapeutic agent paclitaxel is the most effective strategy for treating anaplastic thyroid cancer, compared to other combinations such as G207 and NV1023 or paclitaxel and doxorubicin. However, the mechanism behind the synergy of G207 and paclitaxel remains unknown, and measuring the synergy effect over time is challenging and expensive. In this study, we formulated a mathematical model to quantify the synergy of G207, paclitaxel, and their combination over time using this dataset. We conducted a Bayesian estimation of tumor cell proliferation over 16 days using Markov chain Monte Carlo sampling. Bliss independence was incorporated into the model to compare the observed and expected responses to combination therapy. The expected antitumor effect was significantly lower than the experimental data, suggesting a synergistic effect. Our results showed that the antitumor effect was influenced by the rate of inhibition of tumor growth and the absolute growth delay. Additionally, we found that combination therapy achieved an additional 24% antitumoral effect and a 12-day delay in cell growth. This modeling approach suggests the possibility of quantifying synergistic effects.
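The Bliss independence comparison can be sketched in a few lines; the inhibition fractions below are hypothetical placeholders, not values from the study.

```python
def bliss_expected(e_a, e_b):
    """Expected fractional inhibition of a two-drug combination under
    Bliss independence (the two drugs act as independent events)."""
    return e_a + e_b - e_a * e_b

# Hypothetical single-agent inhibition fractions (not the paper's data):
e_g207, e_paclitaxel = 0.40, 0.30
expected = bliss_expected(e_g207, e_paclitaxel)   # 0.58
observed = 0.82                                   # hypothetical combined effect
synergy = observed - expected                     # positive gap suggests synergy
```

A positive observed-minus-expected gap is the signature of synergy that the model tests for.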
November 2024 · 104 Reads
The recent progress in the fields of sensor miniaturization, light materials, automatic control, and battery management systems has opened up new opportunities for low-cost unmanned aerial vehicles (UAVs), such as quadrotors. In fact, quadrotors have transitioned from a primarily military application to being widely used almost everywhere. Evidently, controlling such robots requires a deep understanding of their dynamic behavior and the use of robust strategies to accomplish the flight missions without compromising users’ safety. This study presents a comprehensive survey of control strategies for unmanned quadrotors. In our examination, the performance assessment of widely used control algorithms is discussed. Furthermore, the concept of model-based design is presented as a solution for bridging the gap between simulation and experimental validation of control systems. It is anticipated that the present study will provide the reader with a clear vision of quadrotor UAV control theory.
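As a flavor of the surveyed strategies, a minimal PID altitude hold on a quadrotor reduced to a vertical point mass might look as follows; the mass, gains, and setpoint are illustrative assumptions, not tuned values.

```python
def simulate_altitude(kp, ki, kd, z_ref=1.0, steps=4000, dt=0.005):
    """Toy PID altitude hold for a quadrotor modeled as a point mass.
    Model parameters and gains are assumptions for illustration."""
    g, m = 9.81, 0.5                     # gravity (m/s^2), mass (kg)
    z, vz, integ, prev_err = 0.0, 0.0, 0.0, z_ref
    for _ in range(steps):
        err = z_ref - z
        integ += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        thrust = m * (g + kp * err + ki * integ + kd * deriv)
        vz += (thrust / m - g) * dt      # vertical dynamics, Euler step
        z += vz * dt
    return z

final_z = simulate_altitude(kp=8.0, ki=4.0, kd=4.0)   # settles near z_ref
```

Robust strategies discussed in the review (sliding mode, backstepping, MPC) replace this simple loop while keeping the same plant model.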
November 2024 · 64 Reads
Image encryption schemes are predominantly software-based. Only a select few have been implemented in real-life communication systems. This paper introduces a novel chaotic image encryption scheme based on a modified Z-order curve, a modified Josephus problem, and an improved Vigenère cipher–based ribonucleic acid (RNA) operation. It is implemented and assessed within a light-fidelity (Li-Fi) infrastructure, comprising two core components: software and hardware. The software component manages data encryption and decryption, while the hardware ensures efficient data transmission. The proposed encryption scheme starts with a pixel-level permutation based on an improved Z-order curve, applicable to rectangular images, optimizing efficiency and increasing permutation ability. This is followed by a bit-level permutation using a modified Josephus problem, which enhances the diversity of generated sequences and introduces additional dislocation effects. Subsequently, a Vigenère cipher–based RNA operation serves for diffusion alongside basic RNA operations and the cipher block chaining (CBC) mode. Theoretical analyses and experimental findings demonstrate that the proposed encryption scheme is highly robust, outperforming several existing cryptosystems. Moreover, owing to its successful implementation, the proposed encryption scheme signifies a compelling stride toward bolstering secure visible light communication systems.
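The Josephus-style scrambling at the heart of the bit-level permutation can be illustrated with the classical Josephus visiting order (the paper's modified variant differs in details not reproduced here):

```python
def josephus_order(n, k):
    """Visit order of n positions when every k-th remaining one is
    removed from a circle (0-indexed): a simple scrambling key."""
    items = list(range(n))
    order, idx = [], 0
    while items:
        idx = (idx + k - 1) % len(items)
        order.append(items.pop(idx))
    return order

perm = josephus_order(8, 3)   # a permutation of 8 bit positions
```

Because the output is a permutation, decryption simply applies its inverse.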
November 2024 · 37 Reads
This article investigates the finite-time (FT) boundedness problem for the time delay (TD) Takagi–Sugeno fuzzy model (TSFM) with conformable derivative (CD) and in the presence of certain actuator faults. Through the reconstruction of an appropriate Lyapunov–Krasovskii functional, some sufficient conditions expressed by the linear matrix inequalities (LMIs) are given to ensure the FT boundedness of the proposed model not only during regular operation but also when encountering certain actuator faults. Finally, a numerical example and an inverted pendulum system are presented to illustrate our theoretical results.
November 2024 · 4 Reads
The focus on high-quality development in regional tourism involves not only transforming the previous extensive development model and improving the efficiency of tourism development but also promoting the coordinated development of the tourism industry across different regions. Taking the 21 prefecture-level cities in Guangdong Province as the research object and guided by high-quality development, a tourism efficiency measurement index system that includes carbon emissions as an undesirable output has been established. By comprehensively applying methods such as the Super-SBM model, the LISA temporal path, and the standard deviation ellipse, this study addresses the neglect of spatial relationships in the existing literature and measures the tourism efficiency of Guangdong’s 21 prefecture-level administrative units from 2009 to 2019, exploring its spatiotemporal evolution and collaborative trends. The results show that during the research period, the average tourism efficiency in Guangdong Province was 0.807, a medium–high efficiency level. Spatially, the tourism efficiency of the province consists of a main peak and side peaks, with a general leftward shift of the main peak, a fluctuating decrease in peak height, and an expanding width. The evolution of spatial patterns reveals that regions with similar tourism efficiency in Guangdong tend to be spatially concentrated, with strong local stability and clear spatial dependency in the change process of tourism efficiency. The study’s insights suggest strategies for Guangdong’s tourism sector, advocating technological innovation, sustainable development practices, and a robust evaluation framework. It emphasizes leveraging regional tourism assets, fostering collaboration, and promoting the “Great Lingnan Tourism Circle” for balanced industry growth.
October 2024 · 18 Reads
Regionalization is the basic feature of cruise shipping network organization. We argue that the cruise networks of Alaska, Hawaii, and neighboring regions have developed into a whole with the scaling up of cruise tourism. To test this, we used complex network analysis methods to explore the port connections and the spatial structure of the cruise shipping network in these regions. We found that Alaska, Hawaii, and the west coast of Mexico all belong to seasonal cruise market areas. Cruise itineraries in these areas fall into one-way and round-trip itineraries, and more than 70% of the itineraries are of short or medium duration. These areas build cruise shipping networks around Vancouver, Los Angeles, Anchorage, San Francisco, Honolulu, and other cruise ports, which can be subdivided into nine single-core cruise shipping network systems and two dual-core cruise shipping network systems. The interconnection of different systems forms a T-shaped cruise shipping network in geographical space.
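A minimal version of the port-connection analysis, run on hypothetical itinerary legs rather than the study's itinerary data, could look like this:

```python
from collections import defaultdict

# Hypothetical itinerary legs (port pairs); not the paper's dataset.
legs = [("Vancouver", "Anchorage"), ("Vancouver", "Juneau"),
        ("Los Angeles", "Honolulu"), ("San Francisco", "Honolulu"),
        ("Vancouver", "San Francisco"), ("Los Angeles", "Cabo San Lucas")]

degree = defaultdict(int)
for a, b in legs:
    degree[a] += 1
    degree[b] += 1

# Ports with the highest degree act as cores of the shipping network.
cores = sorted(degree, key=degree.get, reverse=True)
```

In a real analysis the edge list would be weighted by sailing frequency before core ports are ranked.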
October 2024 · 42 Reads
Financial technology is crucial for the sustainable development of financial systems. Algorithmic trading, a key area in financial technology, involves automated trading based on predefined rules. However, investors cannot manually analyze all market patterns and establish rules, necessitating the development of supervised learning trading systems that can discover market patterns using machine or deep learning techniques. Many studies on supervised learning trading systems rely on up–down labeling based on price differences, which overlooks the issues of nonstationarity, complexity, and noise in stock data. Therefore, this study proposes an N-period volatility trading system that addresses the limitations of up–down labeling systems. The N-period volatility trading system measures price volatility to address uncertainty and enables the construction of a stable, long-term trading system. Additionally, an instance‐selection technique is utilized to address the limitations of stock data, including noise, nonlinearity, and complexity, while effectively reducing the data size. The effectiveness of the proposed model is evaluated through trading simulations of stocks comprising the NASDAQ 100 index and compared with up–down labeling trading systems. The experimental results demonstrate that the proposed N-period volatility trading system exhibits higher stability and profitability than other trading systems.
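A sketch of the N-period volatility labeling idea, with a made-up rule (forward return compared against trailing volatility) that may differ from the paper's exact definition:

```python
import statistics

def n_period_volatility_label(prices, t, n, k=1.0):
    """Label time t by comparing the forward n-period return against
    k times the trailing n-period return volatility. A sketch of the
    idea only, not the paper's exact rule."""
    past = [prices[i + 1] / prices[i] - 1 for i in range(t - n, t)]
    vol = statistics.pstdev(past)
    fwd = prices[t + n] / prices[t] - 1
    if fwd > k * vol:
        return "buy"
    if fwd < -k * vol:
        return "sell"
    return "hold"

# Flat history followed by a jump: the forward move exceeds trailing noise.
label = n_period_volatility_label([100, 100, 100, 100, 120], t=2, n=2)
```

Unlike plain up-down labeling, small moves inside the volatility band fall into "hold" instead of forcing a directional label.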
October 2024 · 15 Reads
Q&A platforms are vital sources of information but often face challenges related to their high ratios of passive to active contributors, which can impede knowledge construction and information exchange on the platforms. This study introduced a novel method for identifying trend discoverers, key users who can detect and initiate discussions on emerging question trends, through response order analysis of data from Zhihu and Stack Overflow. This study underscores the significant role of trend discoverers in influencing question popularity. Trend discoverers not only exhibit higher engagement in knowledge-sharing activities but also participate in discussions across a broader range of topics compared to regular users. The insights derived from this research have crucial implications for improving the development and functionality of Q&A platforms.
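Response-order analysis of the kind described can be sketched as follows; the answer records, popularity scores, and thresholds are all hypothetical.

```python
# answers: (user, question, response_rank); popularity: question -> final score.
answers = [("u1", "q1", 1), ("u2", "q1", 5), ("u1", "q2", 1),
           ("u3", "q2", 2), ("u2", "q3", 1)]
popularity = {"q1": 900, "q2": 750, "q3": 40}

def trend_discoverers(answers, popularity, hot=500, early=2):
    """Users who repeatedly answer early (rank <= early) on questions
    that later become hot (popularity >= hot)."""
    hits = {}
    for user, q, rank in answers:
        if rank <= early and popularity[q] >= hot:
            hits[user] = hits.get(user, 0) + 1
    return {u for u, c in hits.items() if c >= 2}

found = trend_discoverers(answers, popularity)
```

Here u1 answers first on two questions that later become hot, so only u1 qualifies as a trend discoverer.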
October 2024 · 18 Reads
With phasor measurement units (PMUs) widely deployed in power systems, a large amount of data can be stored. If a transient stability assessment (TSA) method based on a deep learning model is trained on this full dataset, the computational cost is high. Furthermore, the fact that unstable cases rarely occur leads to an imbalanced dataset, so power system transient stability prediction suffers from bias caused by the imbalance of sample size and class importance. Faced with this problem, a TSA model based on a sample selection method is proposed in this paper. Sample selection aims to optimize the training set to speed up the training process while improving the performance of the TSA model. The typical samples that accurately express the spatial distribution of the raw dataset are selected by the proposed method. First, based on the location of training samples in the feature space, the border samples are selected by a trained support vector machine (SVM), and the edge samples are selected with the assistance of the approximated tangent hyperplane of a class surface. Then, the selected samples are input to a stacked sparse autoencoder (SSAE) as the final classifier. Simulation results on the IEEE 39-bus system and a realistic regional power system of Eastern China show the high performance of the proposed method.
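A lightweight stand-in for border-sample selection (the paper uses a trained SVM; here the samples nearest to the opposite class are picked instead):

```python
def border_samples(data, k):
    """Pick the k samples closest to the opposite class: a simple proxy
    for SVM-based border selection, not the paper's method."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    scored = []
    for x, label in data:
        enemy = min(dist2(x, y) for y, l in data if l != label)
        scored.append((enemy, x, label))
    scored.sort(key=lambda t: t[0])
    return [(x, label) for _, x, label in scored[:k]]

# Toy 1-D dataset: class 0 clustered near 0, class 1 near 10,
# with two points near the boundary at 4 and 6.
data = [((0.0,), 0), ((1.0,), 0), ((4.0,), 0),
        ((6.0,), 1), ((9.0,), 1), ((10.0,), 1)]
border = border_samples(data, k=2)
```

The two boundary points are retained while the well-separated cluster interiors are dropped, shrinking the training set.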
October 2024 · 45 Reads
This paper begins by analyzing the key mathematical properties of diffusive vaccinated models, including existence, uniqueness, positivity, and boundedness. Equilibria are identified, and the basic reproductive number is calculated. The Banach contraction mapping principle is applied to rigorously establish the existence and uniqueness of solutions. To understand the disease’s transmission over time, it is important to examine the global stability of the equilibrium points. The model has two equilibria: the disease-free equilibrium and the endemic equilibrium. We demonstrate that the endemic equilibrium is globally asymptotically stable when the basic reproductive number is greater than 1, and the disease-free equilibrium is globally asymptotically stable whenever the basic reproductive number is less than 1. Moreover, based on the Caputo fractional derivative and the implicit Euler approximation, we offer an unconditionally stable numerical solution for the resulting system. This work also explores the solution of some significant population models of noninteger order using the iterative Laplace transform, an approach developed by combining the Laplace transformation with an iterative procedure. A series-form solution that converges towards the precise solution can be attained, and the obtained and precise solutions agree closely. Moreover, the suggested method can handle a variety of fractional-order derivative problems because it involves minimal computation. This information will be helpful in further studies to determine the ideal course of action for preventing or stopping disease transmission.
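The unconditional stability claimed for the implicit Euler scheme is easiest to see in the integer-order special case of a stiff decay equation:

```python
def implicit_euler_decay(lam, y0, h, steps):
    """Implicit (backward) Euler for y' = -lam * y: each step solves
    y_{n+1} = y_n - h * lam * y_{n+1}, i.e. y_{n+1} = y_n / (1 + h*lam).
    The update factor is below 1 for any step size h > 0, so the scheme
    is unconditionally stable (the fractional scheme shares this trait)."""
    y = y0
    for _ in range(steps):
        y = y / (1.0 + h * lam)
    return y

# Even a very large step keeps the solution bounded and decaying:
val = implicit_euler_decay(lam=50.0, y0=1.0, h=1.0, steps=10)
```

An explicit Euler step with the same h would diverge, since |1 - h*lam| = 49 here.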
October 2024 · 30 Reads
Employing an improved adaptive Type-II progressively censored sample, this paper provides various point and interval estimations when the parent distribution of the population under study is the inverted Lomax distribution. The estimations cover both the model parameters and two reliability metrics, namely, the reliability and failure rate functions. Both maximum likelihood and maximum product of spacings are studied from the conventional estimation standpoint. Along with the point estimations from the two traditional approaches, the approximate confidence intervals based on both are also examined. The Bayesian point estimations with the squared error loss function and credible intervals for the various parameters are investigated. Depending on the source of the observed data, Bayesian estimations are obtained using two different types of posterior distributions. Numerous censoring designs are examined in the simulation study to compare the accuracy of the classical and Bayesian estimations. Using both conventional methodologies, several optimality metrics are suggested to identify the best removal design. One chemical and two engineering real data sets are analyzed to support the significance of the indicated methodologies. The analysis showed that the Bayesian estimation, using the likelihood function, is preferable to the other classical and Bayesian methods.
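The maximum product of spacings idea can be sketched with an exponential stand-in for the inverted Lomax, using a crude grid search instead of a proper optimizer:

```python
import math

def mps_objective(rate, data):
    """Log maximum-product-of-spacings objective for an exponential fit:
    the sum of log differences of the CDF evaluated at the ordered
    sample, with endpoints 0 and 1 included. The exponential here is a
    stand-in for the inverted Lomax of the paper."""
    cdf = [0.0] + [1 - math.exp(-rate * x) for x in sorted(data)] + [1.0]
    return sum(math.log(cdf[i + 1] - cdf[i]) for i in range(len(cdf) - 1))

data = [0.3, 0.9, 1.4, 2.2, 3.1]            # toy uncensored sample
rates = [0.1 * k for k in range(1, 41)]     # grid search over the rate
best = max(rates, key=lambda r: mps_objective(r, data))
```

Maximizing the product of CDF spacings, rather than the likelihood, gives an estimator that stays consistent even where the likelihood is unbounded.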
October 2024 · 42 Reads
Baroreflex is critical to maintaining blood pressure homeostasis, and the quantification of baroreflex regulation function (BRF) can provide guidance for disease diagnosis, treatment, and healthcare. Current quantifications of BRF, such as baroreflex sensitivity, cannot represent BRF systematically. From the perspective of complex systems, we regard BRF as the emergent result of fluctuating states and interactions among physiological mechanisms. Therefore, this work studies a three-layer emergence, from physiological mechanisms to physiological indexes and then to BRF. On this basis, since entropy in statistical physics macroscopically measures the fluctuations of a system’s states, the principle of maximum entropy is adopted, and a new index called PhysioEnt is proposed to quantify the fluctuations of four physiological indexes, i.e., baroreflex sensitivity, heart rate, heart rate variability, and systolic blood pressure, aiming to represent BRF in the resting condition. Further, two datasets with different subjects are analyzed, yielding new findings such as the contributions of physiological interactions among organs and tissues. With measurable indexes, the proposed method is expected to support individualized medicine.
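The entropy-based reasoning can be illustrated with plain Shannon entropy: under no constraints beyond normalization, the maximum entropy principle selects the most uniform, least presumptive distribution of index states.

```python
import math

def shannon_entropy(probs):
    """Entropy (in nats) of a discrete distribution over index states."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# The uniform distribution maximizes entropy; a peaked distribution
# encodes structure that the observed constraints do not justify.
uniform_h = shannon_entropy([0.25] * 4)
peaked_h = shannon_entropy([0.85, 0.05, 0.05, 0.05])
```

PhysioEnt builds on the same principle, with the distributions constrained by measured physiological statistics.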
September 2024 · 5 Reads
Social networks are important channels for people to obtain information, make comments, and exchange opinions. The public opinion generated in social networks strongly affects public life and value guidance, and its effects are worth analyzing. Research on the influence of hot events can help us better understand public attention and responses to an event, and help industries and social organizations judge market trends and predict future directions. The social phenomena and trends reflected by hot events can also serve as basic data and research objects for scientific research, promoting the development and progress of the discipline. Focusing on social events in social networks, this paper analyzes user behavior data such as retweets, comments, and likes, together with topic characteristics such as event duration and other key features, and proposes an influence effectiveness analysis model for events, referred to as the caloric value model. The model is based on factors such as user interactivity, event activity, event duration, event freshness, media attention, and the event heat decay ratio. It evaluates event heat, calculates the influence of events, identifies hot events with a high level of discussion, and validates the findings through experiments. Experiments show that the proposed model yields reasonable and feasible heat evaluations.
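A toy version of such a heat score, combining weighted interactions with exponential decay; the weights and half-life are assumptions, not the paper's fitted values:

```python
def caloric_value(retweets, comments, likes, age_hours,
                  w=(0.5, 0.3, 0.2), half_life=48.0):
    """Toy event heat score: a weighted interaction sum damped by
    exponential time decay. Weights and half-life are illustrative
    assumptions, not the paper's calibrated parameters."""
    interaction = w[0] * retweets + w[1] * comments + w[2] * likes
    decay = 0.5 ** (age_hours / half_life)
    return interaction * decay

fresh = caloric_value(1000, 400, 2000, age_hours=2)
stale = caloric_value(1000, 400, 2000, age_hours=96)
```

The same interaction volume scores far lower for an old event, capturing the freshness and decay factors in the model.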
September 2024 · 25 Reads
Taking into account the most recent improvements in graph theory and algebra, we can associate graphs of some mathematical structures with certifiable, widely known applications. This paper seeks to explore the connections established through edge labeling among Latin squares derived from Moufang quasigroups, which are constructed using additive abelian and multiplicative groups, along with their substructures and complete bipartite graphs. The algebraic characteristics of quasigroups exhibiting the antiautomorphic inverse property have been extensively examined in this study. These characteristics encompass identities associated with fixed element maps. To analyze the behavior of these groups under holomorphism, we utilize similar conditions.
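The Latin squares discussed above arise directly from group multiplication tables; the Cayley table of the additive group Z_n is the simplest example:

```python
def cayley_table_mod(n):
    """Cayley table of the additive group Z_n: always a Latin square,
    since each row and column is a permutation of the elements."""
    return [[(i + j) % n for j in range(n)] for i in range(n)]

def is_latin_square(sq):
    n = len(sq)
    symbols = set(range(n))
    rows_ok = all(set(row) == symbols for row in sq)
    cols_ok = all({sq[i][j] for i in range(n)} == symbols for j in range(n))
    return rows_ok and cols_ok

table = cayley_table_mod(5)
```

Quasigroups generalize this: the Latin square property survives even when associativity is dropped.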
September 2024 · 54 Reads
In recent years, the development of metrics to evaluate image aesthetics and photographic quality has proliferated. However, validating these metrics presents challenges due to the inherently subjective nature of aesthetics and photographic quality, which can be influenced by cultural contexts and individual preferences that evolve over time. This article presents a novel validation methodology utilizing a dataset assessed by individuals from two distinct nationalities: the United States and Spain. Evaluation criteria include photographic quality and aesthetic value, with the dataset comprising images previously rated on the DPChallenge photographic portal. We analyze the correlation between these values and provide the dataset for future research endeavors. Our investigation encompasses several metrics, including BRISQUE for assessing photographic quality, NIMA aesthetic and NIMA technical for evaluating both aesthetic and technical aspects, Diffusion Aesthetics (employed in Stable Diffusion), and PhotoILike for gauging the commercial appeal of real estate images. Our findings reveal a significant correlation between the Diffusion Aesthetics metric and aesthetic measures, as well as with the NIMA aesthetics metric, suggesting them as good potential candidates to capture aesthetic value.
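The metric-validation step reduces to computing correlations between score lists; a self-contained Pearson correlation on hypothetical scores:

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two metric score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores from a metric and from human raters on five images:
metric_scores = [0.2, 0.4, 0.5, 0.7, 0.9]
human_ratings = [2.1, 3.0, 3.2, 4.1, 4.8]
r = pearson(metric_scores, human_ratings)   # near 1: strong agreement
```

Rank-based alternatives such as Spearman correlation are often preferred when ratings are ordinal rather than interval-scaled.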
September 2024 · 56 Reads
Normal and aberrant cognitive functions are the result of the dynamic interplay between large-scale neural circuits. Describing the nature of these interactions has been a challenging yet important task for understanding neurodegenerative disease evolution. Fusing modern dynamic graph network theory and control theory applied to complex brain networks creates a new framework for neurodegenerative disease research: it determines disease evolution at the subject level, facilitates prediction of treatment response, and reveals key mechanisms responsible for disease alterations. It has been shown that two types of controllability, the average and the modal controllability, are relevant for the mechanistic explanation of how the brain navigates between cognitive states. Average controllability favors highly connected areas which move the brain to easily reachable states, while modal controllability favors weakly connected areas representative of difficult-to-reach states. We propose two different techniques to measure these two types of controllability: a centrality measure based on a sensitivity analysis of the Laplacian matrix, built on the concept of choosing the best driver set, is employed to determine the average controllability, while graph distances form the basis of the modal controllability. Based on these new techniques, we obtain disease descriptors that visualize alterations in the disease trajectory by revealing densely connected hubs or sparser areas. Our results suggest that these two techniques can accurately describe the different node roles in controlling trajectories of brain networks.
September 2024 · 33 Reads
We study the existence of fixed points, local stability, bifurcation sets at fixed points, codimension-one and codimension-two bifurcations, and chaos control in a predator-prey model with Holling type I and III functional responses. It is proven that the model has a trivial equilibrium point for all parameter values, but interior and semitrivial equilibria only under certain parameter conditions. Furthermore, local stability at the trivial, semitrivial, and interior equilibria is investigated using linear stability theory. We also explore the bifurcation sets for the trivial, semitrivial, and interior equilibria and prove that a flip bifurcation occurs at the semitrivial equilibrium. It is also proven that Neimark–Sacker and flip bifurcations occur at an interior equilibrium, where we additionally study the codimension-two 1:2 strong resonance bifurcation. Then, OGY and hybrid control strategies are employed to manage the chaos arising from the Neimark–Sacker and flip bifurcations, respectively. We also examine the preservation of the positive solution of the model under study. Finally, numerical simulations are given to verify the theoretical results.
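The hybrid control strategy mixes the original map with the identity; applied to the chaotic logistic map, used here as a generic one-dimensional stand-in for the predator-prey map, it stabilizes the fixed point:

```python
def hybrid_control(r=3.9, alpha=0.5, x0=0.3, steps=500):
    """Hybrid control of the chaotic logistic map x -> r*x*(1-x):
    x_{n+1} = alpha * r * x_n * (1 - x_n) + (1 - alpha) * x_n.
    For a suitable alpha the otherwise chaotic fixed point x* = 1 - 1/r
    becomes stable. The logistic map is an illustrative stand-in, not
    the paper's model."""
    x = x0
    for _ in range(steps):
        x = alpha * r * x * (1 - x) + (1 - alpha) * x
    return x

controlled = hybrid_control()   # converges near 1 - 1/3.9
```

The fixed point itself is unchanged by the mixing; only its stability multiplier shrinks, here from 2 - r = -1.9 to 1 - 2.9*alpha = -0.45.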
September 2024 · 23 Reads
This paper studies the complexity of the decision system when a brand enterprise introduces new products and makes dynamic quality decisions to cope with imitation threats caused by a counterfeit enterprise. Using game theory, nonlinear system theory, and numerical simulation, the complexity characteristics of this system and consumer utility are further discussed; moreover, delay feedback control is used to restrain chaos. The conclusions are as follows: (1) The stable scope of the brand enterprise price adjustment enlarges with the increase of the adjustment parameters of new product quality and imitation price. (2) The attraction domain of the initial decisions of the counterfeit enterprise decreases when the brand enterprise makes dynamic decisions on new product quality; a higher new product quality adjustment parameter can reduce the imitation ability of the counterfeit enterprise. (3) With the growth of the degree of substitution between new products and existing products, the quality of new products is improved, the output of existing products is reduced, and the counterfeit enterprise withdraws from the market if the degree of substitution between existing products and imitations is high. (4) The dynamic system can suppress chaos and restore stability by using the delay feedback control method. Results are of great importance for managers to make reasonable decisions when dealing with imitation threats.
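Delay feedback control can be sketched on the chaotic logistic map (a generic stand-in for the paper's quality-decision map); the control input vanishes once the orbit settles, so the target equilibrium itself is unchanged:

```python
def delay_feedback(r=3.9, K=-0.6, x0=0.72, steps=2000):
    """Delayed feedback control of the logistic map:
    x_{n+1} = r*x_n*(1-x_n) + K*(x_{n-1} - x_n).
    Since the feedback term is zero at any fixed point, the controlled
    system keeps the original equilibrium but alters its stability.
    Delayed feedback is a local method, so we start near the fixed
    point; r, K, and x0 are illustrative values."""
    x_prev, x = x0, r * x0 * (1 - x0)
    for _ in range(steps):
        x_next = r * x * (1 - x) + K * (x_prev - x)
        x_prev, x = x, x_next
    return x

stabilized = delay_feedback()   # settles near the fixed point 1 - 1/3.9
```

Without the feedback (K = 0), the same map at r = 3.9 wanders chaotically instead of settling.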
September 2024 · 54 Reads
Infectious diseases pose a significant threat to global health, necessitating the development of effective vaccination strategies. This study examines the dynamics of viral infections and immune responses, with a particular focus on the roles of antibodies and CD8+ T cells induced by vaccination. Through a mathematical model, we explore the intricate interactions between host cells and viruses to assess the impact of vaccination on viral replication. Our findings align with experimental results, demonstrating that vaccination substantially enhances immune responses and reduces viral replication. The contributions of both antibody and CD8+ T cell responses are shown to be vital for achieving optimal vaccine efficacy. The model’s predictions, validated against experimental observations, emphasize the need to incorporate mechanisms that induce robust immune responses in vaccine design. This study underscores the critical role of mathematical modeling in understanding immune dynamics and in refining vaccination strategies to develop more effective treatments against viral infections.
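A deliberately minimal stand-in for the host-virus dynamics, in which vaccination simply raises the clearance side of a single growth-clearance balance; the rates are illustrative, not fitted values:

```python
def viral_load(days, growth=0.8, clearance=0.3, immune_boost=0.0,
               v0=1.0, dt=0.01):
    """Euler simulation of dV/dt = (growth - clearance - immune_boost)*V,
    a one-state caricature of the paper's host-virus model. The
    immune_boost term stands in for vaccine-induced antibody and
    CD8+ T cell responses."""
    v = v0
    for _ in range(int(days / dt)):
        v += (growth - clearance - immune_boost) * v * dt
    return v

unvaccinated = viral_load(10)                   # net rate +0.5: expands
vaccinated = viral_load(10, immune_boost=0.9)   # net rate -0.4: declines
```

The full model tracks target cells, infected cells, and both immune arms separately, but the sign flip of the net growth rate is the core effect.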
August 2024 · 38 Reads
Network analysis involves using graph theory to understand networks. This knowledge is valuable across various disciplines like marketing, management, epidemiology, homeland security, and psychology. An essential task within network analysis is deciphering the structure of complex networks including technological, informational, biological, and social networks. Understanding this structure is crucial for comprehending network performance and organization, shedding light on their underlying structure and potential functions. Community structure detection aims to identify clusters of nodes with high internal link density and low external link density. While there has been extensive research on community structure detection in single-layer networks, the development of methods for detecting community structure in multilayer networks is still in its nascent stages. In this paper, a new method, namely, IGA-MCD, has been proposed to tackle the problem of community structure detection in multiplex networks. IGA-MCD consists of two general phases: flattening and community structure detection. In the flattening phase, the input multiplex network is converted to a weighted monoplex network. In the community structure detection phase, the community structure of the resulting weighted monoplex network is determined using the Improved Genetic Algorithm (IGA). The main aspects that differentiate IGA from other algorithms presented in the literature are as follows: (a) instead of randomly generating the initial population, it is smartly generated using the concept of diffusion, which makes the algorithm converge faster; (b) a dedicated local search is employed at the end of each cycle of the algorithm, which leads the algorithm to better new solutions around the currently found solutions; (c) in the algorithm process, chaotic numbers are used instead of random numbers, which ensures that the diversity of the population is preserved and the algorithm does not get stuck in a local optimum. Experiments on various benchmark networks indicate that IGA-MCD outperforms state-of-the-art algorithms.
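The chaotic-number idea in point (c) can be sketched with the logistic map as the number generator:

```python
def logistic_numbers(n, seed=0.123, r=4.0):
    """Chaotic pseudo-random numbers in (0, 1) from the logistic map,
    used in place of ordinary random draws to help preserve population
    diversity. Seed and r are illustrative choices."""
    x, out = seed, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(x)
    return out

draws = logistic_numbers(5)
# e.g. compare draws[i] against the mutation rate to decide whether
# to mutate gene i of a chromosome.
```

At r = 4 the sequence is deterministic but densely fills (0, 1), which is what makes it a workable substitute for uniform random draws.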
August 2024 · 91 Reads
This work explores the complicated realm of fullerene structures through an innovative algebraic lens to unravel their chemical intricacies. We reveal a more profound comprehension of the structural subtleties of fullerenes by computing modified polynomials customized to their distinct geometric and electronic characteristics. In addition to enhancing the theoretical underpinnings, the interaction between algebraic characteristics and fullerene structures creates opportunities for real-world applications in materials science and nanotechnology. Our results provide a novel viewpoint that bridges the gap between algebraic abstraction and chemical reality, and they open up new avenues for the manipulation and construction of fullerene-based materials with customized features. Topological (numerical) descriptors are used to associate key physicochemical constraints with molecular structural features such as periodicity, melting and boiling points, and heat content for various 2D and 3D molecular graphs and networks. The degree of an atom in a molecular network or structure is used in this study to calculate degree-based numerical descriptors. The modified polynomial technique is a recent way of assessing molecular systems and geometries in chemoinformatics; it emphasizes the polynomial nature of molecular features and expresses the numerics in algebraic form. In this context, we describe multiple cages based topologically on the fullerene molecular form as polynomials, and several algebraic properties, including the Randić number and the modified polynomials of the first and second Zagreb numbers, are measured. By applying algebraic methods, we computed topological descriptors such as the Randić number and the Zagreb indices. Our qualitative analysis shows that these descriptors significantly improve the prediction of molecular behavior. For instance, the Randić index provided insights into the stability and reactivity of fullerene structures, while the Zagreb indices helped us understand their potential in electronic applications. Our results suggest that modified polynomials not only offer a refined perspective on fullerene structures but also enable the design of materials with tailored properties. This study highlights the potential of these algebraic tools to bridge the gap between theoretical models and practical applications in nanotechnology and materials science, paving the way for innovations in drug delivery, electronic devices, and catalysis.
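The degree-based descriptors named above can be computed directly from an edge list; the 4-cycle below is a toy graph, not a fullerene:

```python
import math

def degree_indices(edges):
    """Randić index and first/second Zagreb indices from an edge list.
    Randić: sum over edges of 1/sqrt(d_u * d_v).
    First Zagreb: sum over vertices of d^2.
    Second Zagreb: sum over edges of d_u * d_v."""
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    randic = sum(1 / math.sqrt(deg[u] * deg[v]) for u, v in edges)
    zagreb1 = sum(d * d for d in deg.values())
    zagreb2 = sum(deg[u] * deg[v] for u, v in edges)
    return randic, zagreb1, zagreb2

# C4 cycle: every vertex has degree 2.
randic, m1, m2 = degree_indices([(0, 1), (1, 2), (2, 3), (3, 0)])
```

For a fullerene the same function applies; every vertex then has degree 3, so the indices reduce to simple multiples of the atom and bond counts.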
August 2024 · 25 Reads
A high braking risk easily causes safety accidents. In this paper, driving experiments were conducted on the G350 Gengda–Yingxiu section, which contains a continuous downhill tunnel group, with three-axle trucks under standard load and overload conditions. The research proposes a method for ensuring safe truck driving on continuous downhill sections of mountain roads based on the rise in brake drum temperature. The study collects data on brake drum temperature and braking duration from an experimental vehicle under the coupled action of human, vehicle, road, and environment. Through comparative analysis, theoretical derivation, and model construction, the following conclusions are drawn. The rise in brake drum temperature is influenced by factors such as overload, alignment, road slope, and sections with bright and dark lines. The initial brake drum temperature, operating speed, and total vehicle mass are identified as the main controlling factors for the change in brake drum temperature. The study also demonstrates that water drenching can significantly reduce the rate of brake drum temperature rise, thereby ensuring driving safety. Furthermore, a model is constructed relating brake drum temperature rise to these factors, which allows the calculation of the corresponding safe slope length and average slope gradient and can be used for evaluating or designing overall load requirements. The research on brake drum temperature rise characteristics and braking behaviour under drenching conditions provides effective support for route design, traffic management, and the establishment of safety service facilities.
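A first-order energy balance conveys why total mass, drop height, and drum parameters control the temperature rise; every numeric value below is an illustrative assumption, and cooling is ignored, so the estimate is an upper bound rather than the paper's fitted model:

```python
def drum_temperature_rise(total_mass, drop_height, brake_fraction=0.6,
                          drum_mass=160.0, specific_heat=460.0):
    """Crude energy balance for a continuous downhill: the share of
    potential energy absorbed by the brake drums heats them by
    dT = f * m * g * h / (c * m_drum). Convective cooling and engine
    braking are ignored; all parameter values are assumptions."""
    g = 9.81                               # m/s^2
    heat = brake_fraction * total_mass * g * drop_height   # joules
    return heat / (specific_heat * drum_mass)              # kelvin

# A hypothetical 25 t truck descending 300 m:
dT = drum_temperature_rise(total_mass=25000.0, drop_height=300.0)
```

The linear dependence on total mass is why overload appears as a main controlling factor, and water drenching acts by adding a cooling term this sketch omits.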
August 2024 · 32 Reads
In this article, the fixed-time stabilization problem is investigated for a class of general p-norm stochastic nonlinear systems. A feature of the considered systems is that all powers may be arbitrary positive rational numbers, so this class includes many previously studied systems. First, a more accurate upper-bound estimate of the settling time is given by applying an ingenious variable transformation and the definition of the Gamma function, thereby yielding an improved stochastic fixed-time stability theorem. Then a continuous state-feedback controller is designed for the general p-norm stochastic systems using the adding-a-power-integrator method, and the designed controller is proved to ensure the fixed-time stability of the considered systems in light of the stochastic fixed-time stability theorem. Finally, numerical simulation results for the fixed-time and finite-time stabilizers indicate the effectiveness of the developed scheme.