TY - GEN
T1 - Sequential recommendation with relation-aware kernelized self-attention
AU - Ji, Mingi
AU - Joo, Weonyoung
AU - Song, Kyungwoo
AU - Kim, Yoon Yeong
AU - Moon, Il Chul
N1 - Publisher Copyright:
© 2020, Association for the Advancement of Artificial Intelligence.
PY - 2020
Y1 - 2020
N2 - Recent studies have identified that sequential recommendation is improved by the attention mechanism. Following this development, we propose Relation-Aware Kernelized Self-Attention (RKSA), which adopts the self-attention mechanism of the Transformer and augments it with a probabilistic model. The original self-attention of the Transformer is a deterministic measure without relation awareness. Therefore, we introduce a latent space to the self-attention, and the latent space models the recommendation context from relations as a multivariate skew-normal distribution with a kernelized covariance matrix built from co-occurrences, item characteristics, and user information. This work merges the self-attention of the Transformer and sequential recommendation by adding a probabilistic model of the recommendation task specifics. We evaluated RKSA on benchmark datasets, and RKSA shows significant improvements over recent baseline models. Also, RKSA was able to produce a latent space model that explains the reasons for a recommendation.
AB - Recent studies have identified that sequential recommendation is improved by the attention mechanism. Following this development, we propose Relation-Aware Kernelized Self-Attention (RKSA), which adopts the self-attention mechanism of the Transformer and augments it with a probabilistic model. The original self-attention of the Transformer is a deterministic measure without relation awareness. Therefore, we introduce a latent space to the self-attention, and the latent space models the recommendation context from relations as a multivariate skew-normal distribution with a kernelized covariance matrix built from co-occurrences, item characteristics, and user information. This work merges the self-attention of the Transformer and sequential recommendation by adding a probabilistic model of the recommendation task specifics. We evaluated RKSA on benchmark datasets, and RKSA shows significant improvements over recent baseline models. Also, RKSA was able to produce a latent space model that explains the reasons for a recommendation.
UR - http://www.scopus.com/inward/record.url?scp=85101745491&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85101745491
T3 - AAAI 2020 - 34th AAAI Conference on Artificial Intelligence
SP - 4304
EP - 4311
BT - AAAI 2020 - 34th AAAI Conference on Artificial Intelligence
PB - AAAI Press
T2 - 34th AAAI Conference on Artificial Intelligence, AAAI 2020
Y2 - 7 February 2020 through 12 February 2020
ER -