| Literature DB >> 31254769 |
İlkay Yıldız1, Peng Tian2, Jennifer Dy3, Deniz Erdoğmuş3, James Brown4, Jayashree Kalpathy-Cramer4, Susan Ostmo5, J Peter Campbell5, Michael F Chiang5, Stratis Ioannidis3.
Abstract
We consider learning from comparison labels generated as follows: given two samples in a dataset, a labeler produces a label indicating their relative order. Such comparison labels scale quadratically with the dataset size; most importantly, in practice, they often exhibit lower variance compared to class labels. We propose a new neural network architecture based on siamese networks to incorporate both class and comparison labels in the same training pipeline, using Bradley-Terry and Thurstone loss functions. Our architecture leads to a significant improvement in predicting both class and comparison labels, increasing classification AUC by as much as 35% and comparison AUC by as much as 6% on several real-life datasets. We further show that, by incorporating comparisons, training from few samples becomes possible: a deep neural network of 5.9 million parameters trained on 80 images attains a 0.92 AUC when incorporating comparisons.
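The joint training idea in the abstract can be illustrated with a minimal NumPy sketch. The Bradley-Terry model assigns P(i beats j) = sigmoid(s_i - s_j), where s_i and s_j are scalar scores produced by the *same* (siamese) scoring function applied to both inputs; comparison labels supervise the score difference, while class labels supervise each score directly. The scorer, the `alpha` mixing weight, and the class-readout choice below are illustrative assumptions, not the paper's actual architecture or hyperparameters.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def bradley_terry_loss(s_i, s_j, y):
    """Negative log-likelihood of an observed comparison.

    y = 1 if sample i is ranked above sample j, else 0.
    P(i > j) = sigmoid(s_i - s_j).
    """
    p = sigmoid(s_i - s_j)
    return -(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

def joint_loss(score, x_i, x_j, y_cmp, y_cls_i, y_cls_j, alpha=0.5):
    """Mix comparison and class losses through one shared scorer (siamese idea).

    The class loss here is binary cross-entropy on sigmoid(s), which is
    Bradley-Terry against a fixed reference score of 0 -- an assumption
    made to keep the sketch minimal.
    """
    s_i, s_j = score(x_i), score(x_j)
    cmp_loss = bradley_terry_loss(s_i, s_j, y_cmp)
    cls_loss = (bradley_terry_loss(s_i, 0.0, y_cls_i)
                + bradley_terry_loss(s_j, 0.0, y_cls_j))
    return alpha * cmp_loss + (1.0 - alpha) * cls_loss

# Toy scorer standing in for the shared network: sum of features.
score = lambda x: float(np.sum(x))
x_hi, x_lo = np.array([1.0, 1.0]), np.array([-1.0, -1.0])
consistent = joint_loss(score, x_hi, x_lo, y_cmp=1, y_cls_i=1, y_cls_j=0)
inconsistent = joint_loss(score, x_hi, x_lo, y_cmp=0, y_cls_i=0, y_cls_j=1)
```

Because both branches share the scorer, every comparison label adds gradient signal to the same weights the classifier uses, which is what makes the few-sample regime reported above plausible: a loss consistent with the labels (comparison and class agreeing with the scores) should be smaller than an inconsistent one.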
Keywords: Classification; Comparison; Joint learning; Neural network; Siamese network
Year: 2019 PMID: 31254769 PMCID: PMC6718310 DOI: 10.1016/j.neunet.2019.06.004
Source DB: PubMed Journal: Neural Netw ISSN: 0893-6080