TY - JOUR
T1 - Heterogeneous Double-Head Ensemble for Deep Metric Learning
AU - Ro, Youngmin
AU - Choi, Jin Young
N1 - Publisher Copyright:
© 2013 IEEE.
PY - 2020
Y1 - 2020
N2 - The structure of a multi-head ensemble has been employed by many algorithms in various applications, including deep metric learning. However, these structures have typically been designed empirically in a simple way, such as reusing the same head structure, which limits the ensemble effect due to a lack of head diversity. In this paper, to design the multi-head ensemble structure elaborately, we establish design concepts based on three structural factors: designing the feature layer to extract an ensemble-favorable feature vector, designing the shared part for memory savings, and designing diverse multi-heads for performance improvement. Through rigorous evaluation of variants based on these design concepts, we propose a heterogeneous double-head ensemble structure that drastically increases the ensemble gain while saving memory. In verification experiments on image retrieval datasets, the proposed ensemble structure outperforms state-of-the-art algorithms by margins of over 5.3%, 6.1%, 5.9%, and 1.8% on CUB-200, Cars-196, SOP, and In-shop, respectively.
AB - The structure of a multi-head ensemble has been employed by many algorithms in various applications, including deep metric learning. However, these structures have typically been designed empirically in a simple way, such as reusing the same head structure, which limits the ensemble effect due to a lack of head diversity. In this paper, to design the multi-head ensemble structure elaborately, we establish design concepts based on three structural factors: designing the feature layer to extract an ensemble-favorable feature vector, designing the shared part for memory savings, and designing diverse multi-heads for performance improvement. Through rigorous evaluation of variants based on these design concepts, we propose a heterogeneous double-head ensemble structure that drastically increases the ensemble gain while saving memory. In verification experiments on image retrieval datasets, the proposed ensemble structure outperforms state-of-the-art algorithms by margins of over 5.3%, 6.1%, 5.9%, and 1.8% on CUB-200, Cars-196, SOP, and In-shop, respectively.
KW - Ensemble learning
KW - deep architecture design
KW - deep metric learning
KW - image retrieval
KW - multi-head structure
UR - http://www.scopus.com/inward/record.url?scp=85088129536&partnerID=8YFLogxK
U2 - 10.1109/ACCESS.2020.3004579
DO - 10.1109/ACCESS.2020.3004579
M3 - Article
AN - SCOPUS:85088129536
SN - 2169-3536
VL - 8
SP - 118525
EP - 118533
JO - IEEE Access
JF - IEEE Access
M1 - 9123761
ER -