Informatics and Applications

2019, Volume 13, Issue 4, pp 18-26

ON COMPARATIVE EFFICIENCY OF CLASSIFICATION SCHEMES IN AN ENSEMBLE OF DATA SOURCES USING AVERAGE MUTUAL INFORMATION

  • M. M. Lange

Abstract

Given an ensemble of data sources and different fusion schemes, the accuracy of multiclass classification of collections of the source objects is investigated. Using the average mutual information between the datasets of the sources and the set of classes, a new approach to comparing lower bounds on the error probability in two fusion schemes is developed. The author considers the WMV (Weighted Majority Vote) scheme, which uses a composition of the class decisions made on the objects of the individual sources, and the GDM (General Dissimilarity Measure) scheme, which is based on a composition of metrics in the datasets of the sources. For these fusion schemes, the mean values of the average mutual information per source are estimated. It is proved that this mean in the WMV scheme is less than the corresponding mean in the GDM scheme. As a corollary, the lower bound on the error probability in the WMV scheme exceeds the corresponding bound in the GDM scheme. This theoretical result is confirmed by experimental error rates in face recognition of HSI color images, whose H, S, and I channels form the ensemble of sources.
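The paper relates the average mutual information between the source data and the classes to lower bounds on the error probability; the two fusion schemes themselves can be sketched roughly as follows. This is a minimal illustration in Python, not the paper's formulation: the source weights, class counts, and function names are hypothetical.

```python
import numpy as np

def wmv_fuse(per_source_decisions, weights, num_classes):
    """Weighted Majority Vote (WMV): each source contributes its class
    decision, decisions are accumulated with source weights, and the
    class with the largest total vote wins."""
    votes = np.zeros(num_classes)
    for decision, weight in zip(per_source_decisions, weights):
        votes[decision] += weight
    return int(np.argmax(votes))

def gdm_fuse(per_source_dissimilarities, weights):
    """General Dissimilarity Measure (GDM): per-source dissimilarity
    vectors (one entry per class) are combined into a single measure,
    and the class with the smallest combined dissimilarity wins."""
    combined = sum(w * np.asarray(d, dtype=float)
                   for d, w in zip(per_source_dissimilarities, weights))
    return int(np.argmin(combined))

# Hypothetical example: 3 sources (H, S, and I channels) and 4 classes.
weights = [0.5, 0.3, 0.2]
decisions = [2, 2, 1]                      # per-source class decisions (WMV input)
dissims = [[0.9, 0.4, 0.1, 0.8],           # per-source, per-class dissimilarities (GDM input)
           [0.7, 0.5, 0.3, 0.9],
           [0.2, 0.1, 0.6, 0.7]]
print(wmv_fuse(decisions, weights, 4))     # -> 2
print(gdm_fuse(dissims, weights))          # -> 2
```

The key structural difference, which drives the comparison in the paper, is where fusion happens: WMV fuses hard decisions already made per source, while GDM fuses the underlying dissimilarities before any decision is made.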
