Publications

International Conference

Adversarial Dropout for Recurrent Neural Networks
Category
Machine Learning
Author
Sungrae Park, Kyungwoo Song, Mingi Ji, Wonsung Lee, and Il-Chul Moon
Year
2019
Conference Name
AAAI Conference on Artificial Intelligence (AAAI 2019)
Presentation Date
Jan 27-Feb 1
City
Hawaii
Country
USA
File
AAAI19_Adv_drop_camera_ready.pdf (5.9M)

Sungrae Park, Kyungwoo Song, Mingi Ji, Wonsung Lee, and Il-Chul Moon, Adversarial Dropout for Recurrent Neural Networks, AAAI Conference on Artificial Intelligence (AAAI 2019), Hawaii, USA, Jan 27-Feb 1, 2019


Abstract

Successful applications that process sequential data, such as text and speech, require improved generalization performance from recurrent neural networks (RNNs). Dropout techniques for RNNs were introduced to respond to these demands, but we conjecture that dropout on RNNs could be improved by adopting the adversarial concept. This paper investigates ways to improve dropout for RNNs by utilizing intentionally generated dropout masks. Specifically, the guided dropout used in this research is called adversarial dropout, which adversarially disconnects neurons that are dominantly used to predict correct targets over time. Our analysis showed that our regularizer, which consists of a gap between the original and the reconfigured RNNs, was the upper bound of the gap between the training and the inference phases of random dropout. We demonstrated that minimizing our regularizer improved the effectiveness of dropout for RNNs on sequential MNIST tasks, semi-supervised text classification tasks, and language modeling tasks.
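The abstract describes two ingredients: a dropout mask that is reconfigured adversarially (disconnecting the neurons the network relies on most), and a regularizer measured as the output gap between the original and the reconfigured RNN. The toy NumPy sketch below illustrates that idea on a single-layer RNN; the greedy one-bit mask search and all variable names are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

# Illustrative sketch only: a tiny RNN whose hidden state is masked by
# dropout, plus a greedy search for an "adversarial" mask reconfiguration.
rng = np.random.default_rng(0)
H, T, D = 8, 5, 3                      # hidden size, sequence length, input dim
Wx = rng.normal(size=(H, D)) * 0.5     # input-to-hidden weights (toy values)
Wh = rng.normal(size=(H, H)) * 0.5     # hidden-to-hidden weights (toy values)
xs = rng.normal(size=(T, D))           # a toy input sequence

def run_rnn(mask):
    """Run the toy RNN with a fixed dropout mask applied to the hidden state."""
    h = np.zeros(H)
    for x in xs:
        h = np.tanh(Wx @ x + Wh @ h) * mask
    return h

# Start from an ordinary random (Bernoulli) dropout mask ...
base_mask = (rng.random(H) > 0.3).astype(float)
base_out = run_rnn(base_mask)

# ... then greedily flip the single mask entry whose flip maximizes the
# output gap: a crude stand-in for the adversarial mask reconfiguration.
def adversarial_flip(mask, out):
    best_gap, best_mask = -1.0, mask
    for i in range(H):
        cand = mask.copy()
        cand[i] = 1.0 - cand[i]
        gap = float(np.sum((run_rnn(cand) - out) ** 2))
        if gap > best_gap:
            best_gap, best_mask = gap, cand
    return best_mask, best_gap

adv_mask, adv_gap = adversarial_flip(base_mask, base_out)

# The squared gap between the original and reconfigured outputs is the
# quantity the paper's regularizer would penalize during training.
print(adv_gap)
```

In the actual method this gap term would be added to the training loss and minimized by gradient descent, tightening (per the abstract's analysis) an upper bound on the train/inference discrepancy of random dropout.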


@inproceedings{Park2019,
  author    = {Park, Sungrae and Song, Kyungwoo and Ji, Mingi and Lee, Wonsung and Moon, Il-Chul},
  title     = {Adversarial Dropout for Recurrent Neural Networks},
  booktitle = {AAAI Conference on Artificial Intelligence (AAAI 2019)},
  year      = {2019}
}


Source Website:

https://ojs.aaai.org/index.php/AAAI/article/view/4395/4273