February 2025
·
1 Read
Engineering Applications of Artificial Intelligence
February 2025
·
1 Read
Engineering Applications of Artificial Intelligence
January 2025
·
15 Reads
IEEE Transactions on Knowledge and Data Engineering
In multisource information fusion (MSIF), Dempster–Shafer evidence (DSE) theory offers a useful framework for reasoning under uncertainty. However, measuring the divergence between belief functions within this theory remains an unresolved challenge, particularly in managing conflicts in MSIF, which is crucial for improving the quality of decision making. In this paper, several divergence and distance functions are proposed to quantitatively measure the discrimination between belief functions in DSE theory, including the reverse evidential Kullback–Leibler (REKL) divergence, evidential Jeffrey's (EJ) divergence, evidential Jensen–Shannon (EJS) divergence, evidential χ² (Eχ²) divergence, evidential symmetric χ² (ESχ²) divergence, evidential triangular (ET) discrimination, evidential Hellinger (EH) distance, and evidential total variation (ETV) distance. On this basis, a generalized f-divergence, also called the evidential f-divergence (Ef divergence), is proposed. Depending on the choice of kernel function, the Ef divergence degrades into several specific classes: the EKL, REKL, EJ, EJS, Eχ², and ESχ² divergences, ET discrimination, and the EH and ETV distances. Notably, when basic belief assignments (BBAs) are transformed into probability distributions, these classes of Ef divergence revert to their classical counterparts in statistics and information theory. In addition, several Ef-MSIF algorithms are proposed for pattern classification based on the classes of Ef divergence. These Ef-MSIF algorithms are evaluated on real-world datasets to demonstrate their practical effectiveness in solving classification problems. In summary, this work represents the first attempt to extend the classical f-divergence within the DSE framework, capitalizing on the distinct properties of BBA functions.
Experimental results show that the proposed Ef-MSIF algorithms improve classification accuracy, with the best-performing Ef-MSIF algorithm achieving an overall performance difference approximately 1.22 times smaller than that of the suboptimal method and 14.12 times smaller than that of the worst-performing method.
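The abstract notes that, once BBAs are transformed into probability distributions, the evidential divergences revert to their classical counterparts. A minimal sketch of that probability-level special case for the EJS divergence, using the standard pignistic transform to turn a BBA into a probability distribution (the function names and example BBAs are illustrative, not the paper's notation):

```python
import math

def pignistic(bba):
    # BetP transform: spread each focal set's mass uniformly over its elements
    p = {}
    for focal, mass in bba.items():
        for elem in focal:
            p[elem] = p.get(elem, 0.0) + mass / len(focal)
    return p

def js_divergence(p, q):
    # classical Jensen-Shannon divergence (base-2 logs, so result lies in [0, 1])
    keys = set(p) | set(q)
    m = {k: 0.5 * (p.get(k, 0.0) + q.get(k, 0.0)) for k in keys}
    def kl(a, b):
        return sum(a.get(k, 0.0) * math.log2(a.get(k, 0.0) / b[k])
                   for k in keys if a.get(k, 0.0) > 0)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# two illustrative BBAs over the frame {a, b}; focal elements are tuples
m1 = {("a",): 0.6, ("b",): 0.1, ("a", "b"): 0.3}
m2 = {("a",): 0.2, ("b",): 0.5, ("a", "b"): 0.3}
print(js_divergence(pignistic(m1), pignistic(m2)))
```

The evidential versions defined in the paper additionally account for the cardinality of focal elements; this sketch only shows the probabilistic limit the abstract refers to.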
December 2024
·
48 Reads
Chinese Journal of Aeronautics
September 2024
·
1,368 Reads
·
9 Citations
Chinese Journal of Information Fusion
Data fusion is a prevalent technique for assembling imperfect raw data from multiple sources to capture reliable and accurate information. Dempster–Shafer evidence theory is a useful methodology for fusing uncertain multisource information. The existing literature lacks a thorough and comprehensive review of recent advances in Dempster–Shafer evidence theory for data fusion. Therefore, the state of the art has to be surveyed to gain insight into how Dempster–Shafer evidence theory benefits data fusion and how it has evolved over time. In this paper, we first provide a comprehensive review of data fusion methods based on Dempster–Shafer evidence theory and its extensions, collectively referred to as classical evidence theory, from the three aspects of uncertainty modeling, fusion, and decision making. Next, we study and explore complex evidence theory for data fusion in both closed-world and open-world contexts, which benefits from modeling on the complex plane. We then present classical and complex evidence theory framework-based multisource data fusion algorithms, which are applied to pattern classification to compare and demonstrate their applicability. The research results indicate that the complex evidence theory framework can enhance the capabilities of uncertainty modeling and reasoning by generating constructive interference through the fusion of appropriate complex basic belief assignment functions modeled by complex numbers. Through analysis and comparison, we finally propose several challenges and identify open future research directions in evidence theory-based data fusion.
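The core fusion step that the surveyed methods build on is Dempster's rule of combination. A self-contained sketch over BBAs represented as dicts mapping frozenset focal elements to masses (the example masses are illustrative):

```python
def dempster_combine(m1, m2):
    """Dempster's rule: conjunctive combination with conflict renormalization."""
    combined, conflict = {}, 0.0
    for A, mA in m1.items():
        for B, mB in m2.items():
            inter = A & B
            if inter:   # non-empty intersection supports the joint hypothesis
                combined[inter] = combined.get(inter, 0.0) + mA * mB
            else:       # empty intersection: conflicting mass K
                conflict += mA * mB
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence; rule undefined")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

a, b, ab = frozenset("a"), frozenset("b"), frozenset("ab")
fused = dempster_combine({a: 0.6, b: 0.1, ab: 0.3},
                         {a: 0.2, b: 0.5, ab: 0.3})
```

Here the conflicting mass K = 0.32 is discarded and the remaining masses are renormalized; the divergence measures discussed throughout this listing are largely motivated by deciding when such renormalization is trustworthy.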
August 2024
·
25 Reads
·
19 Citations
IEEE Transactions on Knowledge and Data Engineering
Time series data contains a wealth of information reflecting the development process and state of a subject. In particular, complexity is a valuable factor for characterizing a time series. However, measuring the complexity of sophisticated time series remains an open issue due to their uncertainty. In this study, based on the belief Rényi divergence, a novel time series complexity measurement algorithm, called belief Rényi divergence of divergence (BRDOD), is proposed. Specifically, the BRDOD algorithm takes the boundaries of the time series values into account. Moreover, following Dempster-Shafer (D-S) evidence theory, the time series is converted into basic probability assignments (BPAs), and the divergence of the resulting divergence sequence is measured. This secondary divergence of the time series is then computed to represent its complexity. In addition, the BRDOD algorithm is applied to sets of cardiac inter-beat interval time series, which shows the superiority of the proposed method over classical machine learning methods and recent well-known works.
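The belief Rényi divergence used by BRDOD is paper-specific, but it generalizes the classical Rényi divergence between probability distributions, which can be sketched as follows (the order alpha and the example distributions are illustrative):

```python
import math

def renyi_divergence(p, q, alpha=2.0):
    """Classical Rényi divergence of order alpha (alpha > 0, alpha != 1).

    Assumes q is strictly positive wherever p is positive
    (absolute continuity), otherwise the divergence is infinite.
    """
    s = sum(pi ** alpha * qi ** (1.0 - alpha)
            for pi, qi in zip(p, q) if pi > 0.0)
    return math.log(s) / (alpha - 1.0)
```

For identical distributions the sum collapses to 1 and the divergence is 0; BRDOD applies a belief-function analogue of this quantity twice, first along the BPA sequence and then on the resulting divergence sequence.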
June 2024
·
17 Reads
·
5 Citations
Pattern Recognition
May 2024
·
16 Reads
·
9 Citations
IEEE Transactions on Fuzzy Systems
Uncertainty modeling and reasoning in intelligent systems are crucial for effective decision-making, with complex evidence theory (CET) being particularly promising for dynamic information processing. Within CET, the complex basic belief assignment (CBBA) can model uncertainty accurately, while the complex rule of combination can effectively reason over uncertainty from multiple sources of information to reach a consensus. However, determining the CBBA, as the key component of CET, remains an open issue. To mitigate this issue, we propose a novel method for generating CBBAs using high-level features extracted via the Box–Cox transformation and the discrete Fourier transform (DFT). Specifically, our method deploys the complex Gaussian fuzzy number (CGFN) to generate CBBAs, which provides a more accurate representation of information. The proposed method is applied to pattern classification tasks through a multisource information fusion algorithm and is compared with several well-known methods to demonstrate its effectiveness. Experimental results indicate that our proposed CGFN-based method outperforms existing methods by achieving the highest average classification rate in multisource information fusion for pattern classification tasks. We found that the Box–Cox transformation contributes significantly to the CGFN by reshaping data toward a normal distribution, and that the DFT can effectively extract high-level features. Our method offers a practical approach for generating CBBAs in CET, precisely representing uncertainty and enhancing decision-making in uncertain scenarios.
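A minimal sketch of the feature-extraction pipeline described above — Box–Cox transform, normalization toward a standard normal, then DFT magnitudes as high-level features. The fixed lambda and feature count are chosen for illustration only, and the paper's CGFN construction itself is not reproduced:

```python
import numpy as np

def box_cox(x, lam):
    """Box–Cox power transform for strictly positive data."""
    x = np.asarray(x, dtype=float)
    return np.log(x) if lam == 0 else (x ** lam - 1.0) / lam

def high_level_features(x, lam=0.5, k=3):
    y = box_cox(x, lam)                  # push data toward normality
    z = (y - y.mean()) / y.std()         # Z-score to ~standard normal
    return np.abs(np.fft.rfft(z))[:k]    # low-frequency DFT magnitudes
```

Because the Z-scored signal has zero mean, the DC bin of the spectrum vanishes and the remaining low-frequency magnitudes summarize the shape of the transformed data.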
April 2024
·
15 Reads
·
14 Citations
Information Sciences
March 2024
·
17 Reads
·
2 Citations
Information Sciences
January 2024
·
22 Reads
·
2 Citations
Computational and Applied Mathematics
Dempster–Shafer (D–S) evidence theory is used to handle multisource data fusion and uncertainty problems. When faced with strongly contradictory evidence, however, some surprising phenomena arise. To address this problem, we propose a new generalized distance, based on Li et al.'s Hellinger distance, to assess the distinction between basic probability assignments (BPAs). The generalized Hellinger distance keeps the basic structure of Li et al.'s Hellinger distance while achieving certain advancements. It considers the differences between both the focal elements and the subsets of the sets of belief functions, enabling a wider range of applications. Additionally, we prove that the generalized Hellinger distance satisfies nonnegativity, symmetry, definiteness, and the triangle inequality. Through several comparative examples, we show that the new distance has better universality than some well-known works. Finally, we propose a novel generalized Hellinger distance-based multisource data fusion approach and use it to solve a real-world pattern classification problem.
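For reference, the classical Hellinger distance between two BPAs over matching focal elements — the base form that the generalized version extends with subset-level terms — can be sketched as follows (the subset-aware weighting of the paper is not reproduced here, and the example BPAs are illustrative):

```python
import math

def hellinger(m1, m2):
    """Classical Hellinger distance between two BPAs (dict: focal element -> mass)."""
    keys = set(m1) | set(m2)
    s = sum((math.sqrt(m1.get(k, 0.0)) - math.sqrt(m2.get(k, 0.0))) ** 2
            for k in keys)
    return math.sqrt(s / 2.0)

m1 = {"A": 0.7, "B": 0.3}
m2 = {"A": 0.3, "B": 0.7}
print(hellinger(m1, m2))
```

This base form already satisfies the four metric properties listed above; the generalization additionally compares masses through the subset structure of the focal elements, which is what widens its applicability.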
... Evidence fusion is one of the effective methods for addressing the above problem [11,12]. In recent years, various improvements based on evidence fusion theory have been proposed, including time series complexity measurement-based methods [13], fractal confidence-based methods [14,15], and multi-source data fusion-based methods [16]. However, the above methods ignore the correlation between attributes in the evidence fusion process. ...
September 2024
Chinese Journal of Information Fusion
... We also presented results from deep learning benchmarks, including DeepAR, N-BEATS, as well as clustering-based and ensemble-based global models as presented in (Godahewa et al. 2021). Furthermore, we compared our results with a new time series forecasting method based on complexity network analysis proposed in Zhan and Xiao (2024) and with a transformer-based model (Vaswani et al. 2017). Since all our implemented models use a multistep-ahead forecasting strategy, we selected only multistep-ahead forecasting models from the literature for benchmarking. ...
June 2024
Pattern Recognition
... The integration of evidence theory into real-time applications requires effective solutions. Recently, [124] presented a novel quantum Dempster's rule of combination, which constructs quantum circuits using quantum logical gates, significantly reducing the computational complexity of Dempster's rule of combination without information loss. It is believed that [124] provides a promising way to handle such complexity and real-time problems, making it worthy of further investigation. ...
April 2024
Information Sciences
... To solve the first question, we propose to introduce the Box-Cox transformation [6]. Box-Cox transformation is a statistical method that can effectively map any input data into an approximately normal distribution while maintaining its relative relationship, and has been widely employed in various fields, such as anomaly detection [26,94], fuzzy systems [92], and Monte Carlo denoising [58]. Then, we nest a Z-score normalization to further turn the data output by Box-Cox transformation into approximately standard normal distribution, which eventually maps both the dwelling time and average ratings to the same space (standard normal space). ...
May 2024
IEEE Transactions on Fuzzy Systems
... Evidence fusion is one of the effective methods for addressing the above problem [11,12]. In recent years, various improvements based on evidence fusion theory have been proposed, including time series complexity measurement-based methods [13], fractal confidence-based methods [14,15], and multi-source data fusion-based methods [16]. However, the above methods ignore the correlation between attributes in the evidence fusion process. ...
August 2024
IEEE Transactions on Knowledge and Data Engineering
... Arora et al. 22 suggested similarity measures for q-ROFSs and applications to decision making. Ziyue & Xiao 23 proposed generalized Hellinger distance for multisource information fusion with applications. Wang et al. 24 proposed novel distance measures of q-ROFSs with their applications. ...
January 2024
Computational and Applied Mathematics
... The second category involves using different credibility distances to measure the similarity between pieces of evidence, followed by a fusion process based on similarity [55][56][57]. Evidence distance is used to measure and compare the differences and consistencies between different sources of evidence, where smaller distances indicate greater consistency between the evidence, and larger distances indicate greater inconsistency. ...
December 2023
Applied Intelligence
... Evidence fusion is one of the effective methods for addressing the above problem [11,12]. In recent years, various improvements based on evidence fusion theory have been proposed, including time series complexity measurement-based methods [13], fractal confidence-based methods [14,15], and multi-source data fusion-based methods [16]. However, the above methods ignore the correlation between attributes in the evidence fusion process. ...
January 2023
IEEE Transactions on Knowledge and Data Engineering
... The following two examples demonstrate the superiority of RFBD for the similarity factor G and the quantity factor F proposed in this paper. We compare it with several recent divergence measures, including BJS [32], RB [27], MRBD [26], EMJSD [33], IBχ² [34], FBDSKL [35], FBJS [40] and Liu et al.'s method [41] through numerical examples. In addition, we use BJS as the baseline method. ...
November 2023
Engineering Applications of Artificial Intelligence
... Notably, CET, as a generalization of classical DSET, was presented by Xiao [73,74] to be a solution. CET extends the classical DSET into the complex plane and is capable of modeling and handling uncertainty by means of complex numbers [102][103][104][105][106][107][108]. The main concepts of CET are introduced below [73, 74]. ...
November 2023
Engineering Applications of Artificial Intelligence