Sequential Recommendation with Relation-Aware Kernelized Self-Attention
- Category
- Machine Learning
- Conference Name
- AAAI Conference on Artificial Intelligence (AAAI 2020)
- Presentation Date
- February 7-12, 2020
- City
- New York
- Country
- USA
- File
- Camera Ready - RKSA.pdf (878.0K)
- File
- supplementary 1.pdf (158.6K)
Mingi Ji, Weonyoung Joo, Kyungwoo Song, Yoon-Yeong Kim, and Il-Chul Moon, Sequential Recommendation with Relation-Aware Kernelized Self-Attention. AAAI Conference on Artificial Intelligence (AAAI 2020), New York, USA, February 7-12, 2020 (Acceptance Rate: 20.6%)
Abstract
Recent studies have identified that sequential recommendation is improved by the attention mechanism. Following this development, we propose Relation-Aware Kernelized Self-Attention (RKSA), which adopts the self-attention mechanism of the Transformer augmented with a probabilistic model. The original self-attention of the Transformer is a deterministic measure without relation-awareness. Therefore, we introduce a latent space to the self-attention, and the latent space models the recommendation context from relations as a multivariate skew-normal distribution with a kernelized covariance matrix built from co-occurrences, item characteristics, and user information. This work merges the self-attention of the Transformer with sequential recommendation by adding a probabilistic model of the recommendation task specifics. We evaluated RKSA on benchmark datasets, and it shows significant improvements over recent baseline models. RKSA was also able to produce a latent space model that explains the reasons behind a recommendation.
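A minimal, illustrative sketch of the idea in the abstract is given below: the deterministic attention logits are replaced by latent variables whose covariance is a kernel over item relations, and a skew-normal sample is passed through the softmax. The RBF kernel, the co-occurrence scaling, the user-dependent length scale, and all tensor names and dimensions are assumptions made for illustration only; this is not the authors' implementation.

```python
# Sketch of relation-aware kernelized self-attention (illustrative only).
# All modeling choices below (RBF kernel, co-occurrence Gram scaling,
# user-dependent length scale) are assumptions, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

n_items, d = 6, 16                                  # sequence length, embedding dim
items = rng.normal(size=(n_items, d))               # item embeddings (hypothetical)
cooc = rng.poisson(3.0, size=(n_items, n_items)).astype(float)  # co-occurrence counts
user = rng.normal(size=(d,))                        # user embedding (hypothetical)

# Standard scaled dot-product logits give the mean of the latent attention.
W_q, W_k = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Q, K = items @ W_q, items @ W_k
mean_logits = Q @ K.T / np.sqrt(d)

# Kernelized covariance over item pairs: an RBF kernel on item features,
# Hadamard-scaled by a PSD Gram matrix of co-occurrences, with a
# user-dependent length scale (assumption: user info sets the scale).
length = 1.0 + abs(user.mean())
sq_dist = ((items[:, None, :] - items[None, :, :]) ** 2).sum(-1)
C = cooc @ cooc.T
Sigma = np.exp(-sq_dist / (2 * length ** 2)) * (C / C.max())
Sigma += 1e-3 * np.eye(n_items)                     # jitter for positive definiteness

# Sample each row of logits from a skew-normal via the hidden-truncation
# construction: x = mu + delta * |z0| + sqrt(1 - delta^2) * (L z).
delta = 0.5
L = np.linalg.cholesky(Sigma)
z0 = np.abs(rng.normal(size=(n_items, 1)))
z = rng.normal(size=(n_items, n_items)) @ L.T       # rows ~ N(0, Sigma)
logits = mean_logits + delta * z0 + np.sqrt(1 - delta ** 2) * z

# Softmax over the relation-aware, stochastic logits.
attn = np.exp(logits - logits.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)
print(attn.round(3))
```

Because the covariance is built from interpretable relation signals (co-occurrence and item features), inspecting which kernel entries dominate a sampled attention row is one way such a model can expose the reasons behind a recommendation.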
@inproceedings{Ji2020,
author = {Ji, Mingi and Joo, Weonyoung and Song, Kyungwoo and Kim, Yoon-Yeong and Moon, Il-Chul},
year = {2020},
title = {Sequential Recommendation with Relation-Aware Kernelized Self-Attention},
booktitle = {AAAI Conference on Artificial Intelligence (AAAI 2020)}
}