Charmgil Hong, Iyad Batal, Milos Hauskrecht
Abstract
We develop a novel probabilistic ensemble framework for multi-label classification that is based on the mixtures-of-experts architecture. In this framework, we combine multi-label classification models in the classifier chains family that decompose the class posterior distribution P(Y1, …, Yd | X) using a product of posterior distributions over components of the output space. Our approach captures different input-output and output-output relations that tend to change across data. As a result, we can recover a rich set of dependency relations among inputs and outputs that a single multi-label classification model cannot capture due to its modeling simplifications. We develop and present algorithms for learning the mixtures-of-experts models from data and for performing multi-label predictions on unseen data instances. Experiments on multiple benchmark datasets demonstrate that our approach achieves highly competitive results and outperforms the existing state-of-the-art multi-label classification methods.
Keywords: Mixtures-of-experts; Multi-label classification
Year: 2015 PMID: 26613069 PMCID: PMC4657574 DOI: 10.1137/1.9781611974010.80
Source DB: PubMed Journal: Proc SIAM Int Conf Data Min
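The chain-rule factorization described in the abstract, P(Y1, …, Yd | X) decomposed as a product of per-label conditionals, combined under a mixture of experts, can be sketched with a toy example. This is not the authors' algorithm: the counting-based conditionals, the two fixed chain orders, and the uniform gating weights are all simplifying assumptions made here for illustration only.

```python
# Toy sketch (assumed, not the paper's method): a mixture of two classifier
# chains over binary data.  Each "expert" factorizes P(y1, y2 | x) in a
# different label order; the mixture averages them with gating weights
# (uniform here for simplicity; the paper learns input-dependent gates).
from collections import Counter
from itertools import product

# Toy dataset: (x, (y1, y2)) with a binary input and two binary labels.
data = [(0, (0, 0)), (0, (0, 1)), (0, (1, 1)), (1, (1, 1)),
        (1, (1, 0)), (1, (0, 0)), (1, (1, 1)), (0, (0, 0))]

def cond_prob(counts, context_counts, event, context, alpha=1.0, k=2):
    """Laplace-smoothed estimate of P(event | context) from counts."""
    return (counts[(context, event)] + alpha) / (context_counts[context] + alpha * k)

def fit_chain(data, order):
    """Fit one classifier chain for a given label order by counting.
    Returns a function (x, (y1, y2)) -> P(y1, y2 | x)."""
    first, second = order
    c1, n1 = Counter(), Counter()   # stats for P(y_first | x)
    c2, n2 = Counter(), Counter()   # stats for P(y_second | x, y_first)
    for x, y in data:
        c1[(x, y[first])] += 1
        n1[x] += 1
        c2[((x, y[first]), y[second])] += 1
        n2[(x, y[first])] += 1
    def prob(x, y):
        p1 = cond_prob(c1, n1, y[first], x)
        p2 = cond_prob(c2, n2, y[second], (x, y[first]))
        return p1 * p2  # chain rule: P(y1, y2 | x) = P(y_first | x) P(y_second | x, y_first)
    return prob

# Two experts with different chain orders; uniform gating weights.
experts = [fit_chain(data, (0, 1)), fit_chain(data, (1, 0))]
gates = [0.5, 0.5]

def mixture_prob(x, y):
    """Mixture-of-experts posterior: sum_k gate_k * P_k(y | x)."""
    return sum(g * e(x, y) for g, e in zip(gates, experts))

# Each expert is a proper distribution over the label space, so the
# mixture is too: probabilities over all (y1, y2) sum to 1 for each x.
for x in (0, 1):
    total = sum(mixture_prob(x, y) for y in product((0, 1), repeat=2))
    print(f"sum_y P(y | x={x}) = {total:.3f}")
```

Because each expert fixes a different label order, the two chains capture different output-output dependencies, and the mixture can represent posteriors that neither single chain can.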