A fractal belief KL divergence for decision fusion

Engineering Applications of Artificial Intelligence (2023)

Abstract
Dempster-Shafer (D-S) evidence theory is widely used in multi-source data fusion. However, counterintuitive results may be obtained when the basic probability assignments (BPAs) are highly conflicting. To overcome this flaw, this paper proposes a symmetric fractal-based belief Kullback-Leibler divergence (FBDSKL). It measures the divergence between BPAs and, in numerical examples, captures the conflict between two BPAs better than existing belief divergence measures. Furthermore, the proposed FBDSKL is proved to have desirable properties, including nonnegativity, nondegeneracy, and symmetry. To apply the FBDSKL divergence measure to practical problems, a novel FBDSKL-based multi-source data fusion (FBDSKL-MSDF) algorithm is designed. Comparisons with well-known related methods show the proposed FBDSKL-MSDF algorithm to be more accurate and more robust. Finally, FBDSKL-MSDF is applied to two real-world classification problems to verify its practicability.
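To make the kind of quantity the abstract describes concrete, the sketch below computes a plain symmetric belief Kullback-Leibler divergence between two BPAs over a frame of discernment. This is an illustrative baseline only, not the paper's fractal-based FBDSKL: the BPA encoding as dicts of focal elements, the epsilon smoothing, and the helper name `symmetric_belief_kl` are all assumptions introduced here for demonstration.

```python
import math

def symmetric_belief_kl(m1, m2, eps=1e-12):
    """Symmetric KL divergence between two BPAs.

    Each BPA is a dict mapping focal elements (frozensets of
    hypotheses) to mass values summing to 1. A small epsilon
    avoids log(0) when a focal element has zero mass in one BPA.
    NOTE: generic illustrative measure, not the paper's FBDSKL.
    """
    focal_elements = set(m1) | set(m2)
    d = 0.0
    for fe in focal_elements:
        p = m1.get(fe, 0.0) + eps
        q = m2.get(fe, 0.0) + eps
        # Average the two directed KL contributions -> symmetry
        d += 0.5 * (p * math.log(p / q) + q * math.log(q / p))
    return d

# Example: two mildly conflicting BPAs on the frame {a, b}
m1 = {frozenset({"a"}): 0.6, frozenset({"b"}): 0.4}
m2 = {frozenset({"a"}): 0.4, frozenset({"b"}): 0.6}
```

By construction the measure is nonnegative, symmetric, and (up to the epsilon smoothing) zero only when the two BPAs coincide, mirroring the three properties the abstract proves for FBDSKL.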
Key words
Dempster-Shafer evidence theory, Fractal belief KL divergence, Conflict management, Multi-source data fusion, Classification