Publications

International Conference

Implicit Kernel Attention
Category: Machine Learning
Author: Kyungwoo Song, Yohan Jung, Dongjun Kim, and Il-Chul Moon
Year: 2021
Conference Name: AAAI Conference on Artificial Intelligence (AAAI 2021)
Presentation Date: Feb 2-9
City: Virtual Conference
File: Implicit Kernel Attention.pdf (1.3M)
File: Supplementary Material for Implicit Kernel Attention.pdf (3.3M)

Kyungwoo Song, Yohan Jung, Dongjun Kim, and Il-Chul Moon, Implicit Kernel Attention, AAAI Conference on Artificial Intelligence (AAAI 2021), Virtual Conference, Feb 2-9 (Acceptance Rate: 21%)


Abstract

Attention computes the dependency between representations, and it encourages the model to focus on important, selective features. Attention-based models, such as Transformer and the graph attention network (GAT), are widely utilized for sequential data and graph-structured data. This paper suggests a new interpretation and generalized structure of the attention in Transformer and GAT. For the attention in Transformer and GAT, we derive that the attention is a product of two parts: 1) the RBF kernel to measure the similarity of two instances and 2) the exponential of the L2 norm to compute the importance of individual instances. From this decomposition, we generalize the attention in three ways. First, we propose implicit kernel attention with an implicit kernel function instead of manual kernel selection. Second, we generalize the L2 norm to the Lp norm. Third, we extend our attention to structured multi-head attention. Our generalized attention shows better performance on classification, translation, and regression tasks.
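As a sketch of the decomposition described in the abstract (not quoted from the paper; the 1/sqrt(d) scaling of dot-product attention is omitted for brevity), the unnormalized attention score between a query $q$ and a key $k$ factors as

$$\exp(q^\top k) \;=\; \underbrace{\exp\!\Big(-\tfrac{1}{2}\lVert q-k\rVert_2^2\Big)}_{\text{RBF kernel (similarity)}} \cdot \underbrace{\exp\!\Big(\tfrac{1}{2}\lVert q\rVert_2^2\Big)\exp\!\Big(\tfrac{1}{2}\lVert k\rVert_2^2\Big)}_{\text{exponential of } L_2 \text{ norms (importance)}},$$

which follows from the identity $q^\top k = \tfrac{1}{2}\big(\lVert q\rVert_2^2 + \lVert k\rVert_2^2 - \lVert q-k\rVert_2^2\big)$. Replacing the RBF kernel with an implicit (learned) kernel and the $L_2$ norm with an $L_p$ norm corresponds to the first two generalizations listed above.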


@inproceedings{song2021implicit,
  title={Implicit kernel attention},
  author={Song, Kyungwoo and Jung, Yohan and Kim, Dongjun and Moon, Il-Chul},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={35},
  number={11},
  pages={9713--9721},
  year={2021}
}


Source Website: 

https://ojs.aaai.org/index.php/AAAI/article/view/17168/16975